CN107101584B - Object displacement measurement method, device and system based on image recognition - Google Patents


Info

Publication number
CN107101584B
Authority
CN
China
Prior art keywords
image
detection image
detection
feature point
point set
Prior art date
Legal status
Expired - Fee Related
Application number
CN201710285370.8A
Other languages
Chinese (zh)
Other versions
CN107101584A
Inventor
陈长征
聂婷
何斌
薛金来
Current Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN201710285370.8A
Publication of CN107101584A
Application granted
Publication of CN107101584B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/02: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/022: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to an object displacement measurement method based on image recognition, which comprises the following steps: acquiring first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target image; performing edge detection on the first detection image data and the second detection image data to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image; locating feature points with a sub-pixel centroid method on the first and second edge map data to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image; obtaining an optimal projection relation matrix of the first and second detection images from the two feature point sets; and obtaining the relative distance between the object at the first position and the object at the second position from a preset focal length, a preset object distance and the optimal projection relation matrix of the first and second detection images.

Description

Object displacement measurement method, device and system based on image recognition
Technical Field
The invention relates to the technical field of measurement, in particular to an object displacement measurement method, a measurement device and a measurement system based on image recognition.
Background
With the development of technology, when an object undergoes a small displacement, its relative position often needs to be measured. One type of existing relative-position measuring device has a large measurement error and cannot meet the precision requirement; another type can measure the micro-displacement of an object only by contacting it, but contact measurement easily scratches or abrades the object surface and cannot meet the measurement requirement.
Disclosure of Invention
The invention aims to solve the technical problem that the relative position of an object is difficult to measure accurately in the prior art, and provides an object displacement measurement method, measurement device and measurement system based on image recognition that meet the measurement-accuracy requirement without damaging the surface of the object.
The invention provides an object displacement measuring method based on image recognition, which comprises the following steps:
acquiring first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target image;
performing edge detection processing on the first detection image data and the second detection image data to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
positioning according to first edge map data corresponding to the first detection image, second edge map data corresponding to the second detection image and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
obtaining an optimal projection relation matrix of the first detection image and the second detection image according to the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image;
and obtaining the relative distance between the object at the first position and the object at the second position according to the preset focal distance, the preset object distance and the optimal projection relation matrix of the first detection image and the second detection image.
The invention provides an object displacement measuring device based on image recognition, which comprises:
the device comprises an acquisition module, a detection module and a processing module, wherein the acquisition module is used for acquiring first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target pattern;
the edge detection processing module is used for carrying out edge detection processing on the first detection image data and the second detection image data to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
the positioning module is used for positioning according to first edge map data corresponding to the first detection image, second edge map data corresponding to the second detection image and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
the first processing module is used for obtaining an optimal projection relation matrix of the first detection image and the second detection image according to the characteristic point set corresponding to the first detection image and the characteristic point set corresponding to the second detection image;
and the second processing module is used for obtaining the relative distance between the object at the first position and the object at the second position according to the preset camera focal length, the camera object distance and the optimal projection relation matrix of the first detection image and the second detection image.
The invention provides an object displacement measuring system based on image recognition, which comprises a photographing device, a light source, a beam splitter and the object displacement measuring device based on image recognition, wherein the beam splitter is arranged along the optical axis of the light source, a preset target image is positioned between the light source and the beam splitter so as to project the target image onto a measured object, the photographing device is connected with the object displacement measuring device, and the optical axis of the photographing device is perpendicular to the object surface of the measured object.
Compared with the prior art, the technical scheme of the invention has the following beneficial effects: a preset target image is projected onto the measured object and photographed twice by a camera; the imaging relation between object and image converts the relative position of the measured object into a displacement relation between the two images; edge detection, centroid extraction and sub-pixel registration of the two images yield an accurate spatial relation matrix between them; and finally the displacement of the measured object is obtained. The image recognition method requires no contact measurement, achieves high measurement accuracy with a simple and efficient procedure, and obtains displacement information quickly.
Drawings
FIG. 1 is a flow chart of an embodiment of an object displacement measurement method based on image recognition according to the present invention;
FIG. 2 is a schematic diagram of an embodiment of a predetermined target image according to the present invention;
FIG. 3 is a schematic diagram of an embodiment of capturing an image according to the present invention;
FIG. 4 is a schematic diagram of the present invention capturing an image for edge detection and centroid localization;
FIG. 5 is a schematic structural diagram of an embodiment of an object displacement measuring device based on image recognition according to the present invention;
fig. 6 is a schematic structural diagram of an embodiment of the object displacement measurement system based on image recognition according to the present invention.
In the figure, 1, a photographing device, 2, a light source, 3, a beam splitter, 4, a preset target image, 50, an obtaining module, 51, an edge detection processing module, 52, a positioning module, 53, a first processing module, 54, and a second processing module.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings.
The invention provides an embodiment of an object displacement measurement method based on image recognition, as shown in fig. 1, the measurement method includes:
step S11, acquiring first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target image;
step S12, performing edge detection processing on the first detection image data and the second detection image data to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
step S13, positioning according to the first edge map data corresponding to the first detection image, the second edge map data corresponding to the second detection image and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
step S14, obtaining an optimal projection relation matrix of the first detection image and the second detection image according to the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image;
and step S15, obtaining the relative distance between the object at the first position and the object at the second position according to the preset focal distance and the object distance and the optimal projection relation matrix of the first detection image and the second detection image.
In a specific implementation, as shown in fig. 2, the preset target image 4 is projected onto the measured object, and the photographing device 1 photographs with its optical axis perpendicular to the object plane of the measured object, obtaining the first detection image Img1 data. After the measured object has moved slightly, the preset target image 4 is again projected onto the measured object and photographed in the same way, obtaining the second detection image Img2 data. Fig. 3 shows a schematic of a captured image.
Specifically, the edge detection processing is Canny edge detection; that is, in step S12, Canny edge detection is performed on the first detection image Img1 data and the second detection image Img2 data to obtain first edge map data Img1_b corresponding to the first detection image and second edge map data Img2_b corresponding to the second detection image. The Canny edge detection algorithm is implemented as follows: convert the color image to a grayscale image; apply Gaussian blur to the grayscale image; compute the image gradient, and from the gradient compute the edge magnitude and angle; apply non-maximum suppression (edge thinning); perform double-threshold edge linking; and output the binarized result image. In a specific implementation, as shown in fig. 4, the centroids of the target image are obtained by sub-pixel centroid localization. Since each centroid is a feature point of the corresponding image, localizing n centroids in the first edge map data Img1_b yields the feature point set Pt1 = {pt11, pt12, …, pt1n} corresponding to the first detection image, and localizing n centroids in the second edge map data Img2_b yields the feature point set Pt2 = {pt21, pt22, …, pt2n} corresponding to the second detection image.
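As a concrete illustration, the gradient-and-threshold core of this edge-detection step can be sketched as follows. This is a simplified stand-in for the full Canny pipeline: it uses Sobel gradients and a single threshold, and omits the Gaussian blur, non-maximum suppression and double-threshold linking steps; the function names are illustrative, not from the patent.

```python
import numpy as np

def conv2_valid(img, k):
    """Plain 'valid' 2-D correlation, written out with shifted slices."""
    kh, kw = k.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * img[i:i + oh, j:j + ow]
    return out

def sobel_edges(img, threshold):
    """Binary edge map: Sobel gradient magnitude compared to a threshold."""
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])
    ky = kx.T
    gx = conv2_valid(img, kx)          # horizontal gradient
    gy = conv2_valid(img, ky)          # vertical gradient
    magnitude = np.hypot(gx, gy)       # edge magnitude
    return magnitude > threshold
```

A vertical step in gray level produces a two-pixel-wide band of detected edge pixels along the step; in the patent's pipeline the thinning and hysteresis stages refine such bands into single-pixel edges.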
In a specific implementation, the row coordinate of each feature point in the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image, that is, the row coordinate of each centroid, is expressed by the following formula:
    u = [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} u_i · g(u_i, v_j) ] / [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} g(u_i, v_j) ],  summed over pixels with g(u_i, v_j) > T    (1)
the column coordinates of each feature point in the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image, that is, the column coordinates of each centroid, are expressed by the following formula:
    v = [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} v_j · g(u_i, v_j) ] / [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} g(u_i, v_j) ],  summed over pixels with g(u_i, v_j) > T    (2)
where T is the set threshold, g(u_i, v_j) is the gray value of the pixel in row u_i and column v_j of image g, i indexes the rows of the image over the range i1 to i2, j indexes the columns of the image over the range j1 to j2, and g is the image under detection.
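A minimal sketch of this thresholded gray-weighted centroid, assuming the summation window covers the whole image region passed in (the function name is illustrative):

```python
import numpy as np

def subpixel_centroid(g, T):
    """Sub-pixel centroid of image region g: the gray-weighted mean
    position of all pixels whose gray value exceeds the threshold T."""
    w = np.where(g > T, g.astype(float), 0.0)  # zero out sub-threshold pixels
    total = w.sum()
    u_idx, v_idx = np.indices(g.shape)         # row and column index grids
    u = (u_idx * w).sum() / total              # row coordinate
    v = (v_idx * w).sum() / total              # column coordinate
    return u, v
```

Because the result is a gray-weighted average of pixel indices, the returned coordinates are generally fractional, which is what gives the method its sub-pixel localization accuracy.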
Because the shooting angles and illumination conditions of the feature points to be registered differ, translation, rotation and scaling may exist between the first detection image and the second detection image. To improve the measurement accuracy, a sub-pixel extended correlation registration algorithm is adopted to solve the spatial relation matrix between the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image.
In a specific implementation, step S14 specifically includes:
a phase correlation method is applied to the row coordinates and column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image to obtain an initialized projection relation matrix;
and, starting from the initialized projection relation matrix, a sub-pixel extended correlation registration algorithm is used to solve for the optimal projection relation matrix of the first detection image and the second detection image.
Specifically, when the measured object undergoes a small displacement between the two photographs, the feature points of the feature point set Pt1 corresponding to the first detection image and of the feature point set Pt2 corresponding to the second detection image satisfy, via the phase correlation method, the following relation:
    (x2, y2, 1)^T = p · (x1, y1, 1)^T    (3)
where (x1, y1) and (x2, y2) are corresponding points of the same name on the two images, and p is the initialized projection relation matrix:

    p = [ cosθ   -sinθ   dx ]
        [ sinθ    cosθ   dy ]
        [ 0       0      1  ]

where dx is the horizontal offset between the first detection image and the second detection image, dy is the vertical offset between the first detection image and the second detection image, and θ is the rotation angle between the first detection image and the second detection image.
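Under this rigid-transform reading (parameters dx, dy and θ only), the projection relation matrix and the point mapping above can be sketched as follows (function names are illustrative):

```python
import numpy as np

def projection_matrix(dx, dy, theta):
    """Homogeneous rigid-transform matrix p; theta is in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  dx],
                     [s,   c,  dy],
                     [0.0, 0.0, 1.0]])

def map_point(p, x1, y1):
    """Map a point (x1, y1) on the first image to (x2, y2) on the second."""
    x2, y2, _ = p @ np.array([x1, y1, 1.0])
    return x2, y2
```

With θ = 0 the matrix reduces to a pure translation by (dx, dy), matching the small-displacement case the patent targets.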
In a specific implementation, starting from the initialized projection relation matrix, the sub-pixel extended correlation registration algorithm solves for the optimal projection relation matrix of the first detection image and the second detection image; the specific formula is as follows:
    t(p) = ‖ (i_r − ī_r) − (i_w(p) − ī_w(p)) ‖    (4)

where ‖·‖ denotes the Euclidean distance; i_r = (pt11, pt12, …, pt1n) is the vector formed by the feature point set corresponding to the first detection image, and ī_r is the arithmetic mean of that feature point set; i_w(p) = (pt21, pt22, …, pt2n) is the vector formed by the feature point set corresponding to the second detection image under the projection relation matrix p, and ī_w(p) is the arithmetic mean of that feature point set; p is the initialized projection relation matrix. That is, i_r = (pt11, pt12, …, pt1n) and i_w(p) = (pt21, pt22, …, pt2n).
The arithmetic mean of a vector is the average of its entries; for example, (a1 + a2 + … + an)/n is the arithmetic mean of a1, a2, …, an. Determining the optimal projection relation matrix is an iterative solution process. First, the initialized projection relation matrix p is solved by the phase correlation method; then the feature point set corresponding to the first detection image, the feature point set corresponding to the second detection image and the initialized projection relation matrix p are substituted into formula (4) to obtain a similarity value t for the two images. If t is smaller than a given threshold T, the projection relation matrix p is the optimal projection matrix and the iteration stops; if not, the projection relation matrix is updated as p_{k+1} = p_k + Δp, where k denotes the iteration count, p is the relation matrix to be solved and Δp is the iteration step, and the next iteration is carried out until the similarity of the two images meets the given threshold T.
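A toy sketch of this iterative search follows, with the transform reduced to its (dx, dy, θ) parameters and a simple keep-if-better perturbation rule standing in for the unspecified update Δp. All names are illustrative; the patent does not give the update rule, so this only demonstrates the loop structure of "evaluate similarity, stop below threshold T, otherwise step the parameters".

```python
import numpy as np

def transform(pts, dx, dy, theta):
    """Apply the rigid transform to an (n, 2) array of (x, y) points."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return pts @ R.T + np.array([dx, dy])

def refine(pt1, pt2, init, T=1e-3, max_iter=500):
    """Iteratively adjust (dx, dy, theta), starting from the
    phase-correlation initialisation `init`, until the residual
    between the mapped first point set and the second falls below T."""
    params = np.asarray(init, dtype=float)
    steps = np.array([0.5, 0.5, 0.05])   # per-parameter iteration steps

    def t(q):
        # similarity value: residual between transformed Pt1 and Pt2
        return np.linalg.norm(transform(pt1, *q) - pt2)

    best = t(params)
    for _ in range(max_iter):
        if best < T:
            break
        improved = False
        for k in range(3):
            for sign in (1.0, -1.0):
                cand = params.copy()
                cand[k] += sign * steps[k]
                v = t(cand)
                if v < best:
                    best, params, improved = v, cand, True
        if not improved:
            steps *= 0.5                 # shrink the step and retry
    return params, best
```

Shrinking the step when no perturbation helps is what gives the search its sub-pixel resolution: the parameters converge on values finer than the initial integer-pixel estimate.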
In a specific implementation, with the focal length f and the object distance H known, the relative distance between the object at the first position and the object at the second position is obtained from the optimal projection relation matrix of the first detection image and the second detection image: the horizontal distance Dx = dx·H/f, the vertical distance Dy = dy·H/f, and the rotation angle θ.
The object displacement measurement method based on image recognition projects a preset target image onto the measured object and photographs it twice with a camera, converts the relative position of the measured object into a displacement relation between the images using the object-image imaging relation, accurately calculates the spatial relation matrix between the two images through edge detection, centroid extraction and sub-pixel registration, and finally obtains the displacement of the measured object. The image recognition method requires no contact measurement, achieves high measurement accuracy with a simple and efficient procedure, and obtains displacement information quickly.
In a specific implementation, the present invention further provides an object displacement measurement device based on image recognition, as shown in fig. 5, where the object displacement measurement device includes:
an obtaining module 50, configured to obtain first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target pattern;
an edge detection processing module 51, configured to perform edge detection processing on the first detected image data and the second detected image data to obtain first edge map data corresponding to the first detected image and second edge map data corresponding to the second detected image;
the positioning module 52 is configured to perform positioning according to first edge map data corresponding to the first detected image, second edge map data corresponding to the second detected image, and a sub-pixel centroid method, so as to obtain a feature point set corresponding to the first detected image and a feature point set corresponding to the second detected image;
the first processing module 53 is configured to obtain an optimal projection relationship matrix of the first detection image and the second detection image according to the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image;
and the second processing module 54 is configured to obtain a relative distance between the object at the first position and the object at the second position according to a preset camera focal length, a preset camera object distance, and an optimal projection relationship matrix of the first detection image and the second detection image.
In a specific implementation, as shown in fig. 2, the preset target image 4 is projected onto the measured object, and the photographing device 1 photographs with its optical axis perpendicular to the object plane of the measured object, obtaining the first detection image Img1 data. After the measured object has moved slightly, the preset target image 4 is again projected onto the measured object and photographed in the same way, obtaining the second detection image Img2 data. Fig. 3 shows a schematic of a captured image.
Specifically, the edge detection processing is Canny edge detection; that is, the edge detection processing module 51 is further configured to perform Canny edge detection on the first detection image Img1 data and the second detection image Img2 data to obtain first edge map data Img1_b corresponding to the first detection image and second edge map data Img2_b corresponding to the second detection image. The Canny edge detection algorithm is implemented as follows: convert the color image to a grayscale image; apply Gaussian blur to the grayscale image; compute the image gradient, and from the gradient compute the edge magnitude and angle; apply non-maximum suppression (edge thinning); perform double-threshold edge linking; and output the binarized result image.
In a specific implementation, as shown in fig. 4, the centroids of the target image are obtained by sub-pixel centroid localization. Since each centroid is a feature point of the corresponding image, localizing n centroids in the first edge map data Img1_b yields the feature point set Pt1 = {pt11, pt12, …, pt1n} corresponding to the first detection image, and localizing n centroids in the second edge map data Img2_b yields the feature point set Pt2 = {pt21, pt22, …, pt2n} corresponding to the second detection image.
In a specific implementation, the row coordinate of each feature point in the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image, that is, the row coordinate of each centroid, is expressed by the following formula:
    u = [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} u_i · g(u_i, v_j) ] / [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} g(u_i, v_j) ],  summed over pixels with g(u_i, v_j) > T    (1)
the column coordinates of each feature point in the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image, that is, the column coordinates of each centroid, are expressed by the following formula:
    v = [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} v_j · g(u_i, v_j) ] / [ Σ_{i=i1}^{i2} Σ_{j=j1}^{j2} g(u_i, v_j) ],  summed over pixels with g(u_i, v_j) > T    (2)
where T is the set threshold, g(u_i, v_j) is the gray value of the pixel in row u_i and column v_j of image g, i indexes the rows of the image over the range i1 to i2, j indexes the columns of the image over the range j1 to j2, and g is the image under detection.
Because the shooting angles and illumination conditions of the feature points to be registered differ, translation, rotation and scaling may exist between the first detection image and the second detection image. To improve the measurement accuracy, a sub-pixel extended correlation registration algorithm is adopted to solve the spatial relation matrix between the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image.
In a specific implementation, the first processing module 53 is further configured to:
apply a phase correlation method to the row coordinates and column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image to obtain an initialized projection relation matrix;
and, starting from the initialized projection relation matrix, use a sub-pixel extended correlation registration algorithm to solve for the optimal projection relation matrix of the first detection image and the second detection image.
Specifically, when the measured object undergoes a small displacement between the two photographs, the feature points of the feature point set Pt1 corresponding to the first detection image and of the feature point set Pt2 corresponding to the second detection image satisfy the following relation:
    (x2, y2, 1)^T = p · (x1, y1, 1)^T    (3)
where (x1, y1) and (x2, y2) are corresponding points of the same name on the two images, and p is the initialized projection relation matrix:

    p = [ cosθ   -sinθ   dx ]
        [ sinθ    cosθ   dy ]
        [ 0       0      1  ]

where dx is the horizontal offset between the first detection image and the second detection image, dy is the vertical offset between the first detection image and the second detection image, and θ is the rotation angle between the first detection image and the second detection image.
In a specific implementation, starting from the initialized projection relation matrix solved by the phase correlation method, the sub-pixel extended correlation registration algorithm solves for the optimal projection relation matrix of the first detection image and the second detection image; the specific formula is as follows:
    t(p) = ‖ (i_r − ī_r) − (i_w(p) − ī_w(p)) ‖    (4)

where ‖·‖ denotes the Euclidean distance; i_r = (pt11, pt12, …, pt1n) is the vector formed by the feature point set corresponding to the first detection image, and ī_r is the arithmetic mean of that feature point set; i_w(p) = (pt21, pt22, …, pt2n) is the vector formed by the feature point set corresponding to the second detection image under the projection relation matrix p, and ī_w(p) is the arithmetic mean of that feature point set; p is the initialized projection relation matrix. That is, i_r = (pt11, pt12, …, pt1n) and i_w(p) = (pt21, pt22, …, pt2n).
The arithmetic mean of a vector is the average of its entries; for example, (a1 + a2 + … + an)/n is the arithmetic mean of a1, a2, …, an. Determining the optimal projection relation matrix is an iterative solution process. First, the initialized projection relation matrix p is solved by the phase correlation method; then the feature point set corresponding to the first detection image, the feature point set corresponding to the second detection image and the initialized projection relation matrix p are substituted into formula (4) to obtain a similarity value t for the two images. If t is smaller than a given threshold T, the projection relation matrix p is the optimal projection matrix and the iteration stops; if not, the projection relation matrix is updated as p_{k+1} = p_k + Δp, where k denotes the iteration count, p is the relation matrix to be solved and Δp is the iteration step, and the next iteration is carried out until the similarity of the two images meets the given threshold T.
In a specific implementation, with the focal length f and the object distance H known, the relative distance between the object at the first position and the object at the second position is obtained from the optimal projection relation matrix of the first detection image and the second detection image: the horizontal distance Dx = dx·H/f, the vertical distance Dy = dy·H/f, and the rotation angle θ.
The object displacement measurement device based on image recognition projects a preset target image onto the measured object and photographs it twice with a camera, converts the relative position of the measured object into a displacement relation between the images using the object-image imaging relation, accurately calculates the spatial relation matrix between the two images through edge detection, centroid extraction and sub-pixel registration, and finally obtains the displacement of the measured object. The image recognition method requires no contact measurement, achieves high measurement accuracy with a simple and efficient procedure, and obtains displacement information quickly.
The invention also provides an embodiment of an object displacement measurement system based on image recognition. As shown in fig. 6, the system includes a photographing device 1, a light source 2, a beam splitter 3 and the object displacement measurement device based on image recognition. The beam splitter 3 is arranged along the optical axis of the light source 2, and a preset target image 4 is positioned between the light source and the beam splitter so that the target image is projected onto the measured object. The photographing device 1 is connected with the object displacement measurement device, and the optical axis of the photographing device 1 is perpendicular to the object plane of the measured object.
In a specific implementation, taking a micro displacement of the object to be measured as an example, the photographing device 1 is a self-designed small area-array camera with a pixel size of 7 μm × 7 μm, 4098 pixel × 4098 pixel effective pixels and an optical focal length f = 70 mm, mounted on a translation stage at a distance of 2 m from the object to be measured. The light source 2 is a bromine tungsten lamp, which illuminates the target through an integrating sphere; the designed target image is a circular light-transmitting plate. Before the object moves, a first detection image Img1 is captured; after the object moves slightly, a second detection image Img2 is captured. Image edge detection, feature centroid extraction and sub-pixel registration yield the translation between the two images: dx = 10.25 pixel, dy = 12.35 pixel, and rotation angle θ = 1.3 degrees. Finally, from the focal length and object distance of the camera, the displacement of the measured object is:
Dx = 10.25 × 7×10^-6 × 2/(70×10^-3) = 2.05 mm, Dy = 12.35 × 7×10^-6 × 2/(70×10^-3) = 2.47 mm, and the rotation angle is 1.3 degrees.
The object displacement measurement system based on image recognition projects a preset target image onto the measured object, photographs it twice with a camera, and converts the relative position of the measured object into a displacement relation between the images using the object-image imaging relation. By performing edge detection, centroid point extraction and sub-pixel registration on the two images, it accurately calculates the spatial relation matrix between them and finally obtains the displacement of the measured object. This image recognition method requires no contact measurement, achieves high accuracy, is simple and efficient, and yields displacement information quickly.
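The phase correlation step used to initialize the projection relation matrix can be sketched with a plain FFT cross-power spectrum (a generic illustration, not the patent's exact implementation): the normalized spectrum of two translated images is a pure phase ramp whose inverse transform peaks at the translation.

```python
import numpy as np

def phase_correlation(img1, img2):
    """Estimate the integer translation taking img1 to img2 from the
    peak of the inverse FFT of the normalized cross-power spectrum."""
    F1 = np.fft.fft2(img1)
    F2 = np.fft.fft2(img2)
    cps = F2 * np.conj(F1)
    cps /= np.abs(cps) + 1e-12          # keep only the phase
    corr = np.fft.ifft2(cps).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks past the half-size wrap around to negative shifts
    return tuple(p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape))
```

Sub-pixel accuracy (as in the patent's sub-pixel registration) would come from interpolating around the correlation peak; only the integer initialization is shown here.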
The foregoing embodiments and description have been presented only to illustrate the principles and preferred embodiments of the invention, and various changes and modifications may be made therein without departing from the spirit and scope of the invention as hereinafter claimed.

Claims (3)

1. An object displacement measurement method based on image recognition is characterized in that: the measuring method comprises the following steps:
acquiring first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target image;
performing edge detection processing on the first detection image data and the second detection image data to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
positioning according to first edge map data corresponding to the first detection image, second edge map data corresponding to the second detection image and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
obtaining an initialized projection relation matrix by using a phase correlation method according to the row coordinate and the column coordinate of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image;
according to the initialized projection relation matrix, adopting a registration algorithm of sub-pixel extension correlation to solve the initialized projection relation matrix and obtain an optimal projection relation matrix E_CC(p) of the first detection image and the second detection image;
The specific formula is as follows:
E_CC(p) = ‖ î_r/‖î_r‖ − î_w(p)/‖î_w(p)‖ ‖²
wherein ‖·‖ denotes the Euclidean distance; i_r = (pt1_1, pt1_2, …, pt1_n) is the vector formed by the feature points extracted from the first image, and î_r is i_r minus the arithmetic mean of the feature point set corresponding to the first detection image; i_w = (pt2_1, pt2_2, …, pt2_n) is the vector formed by the feature points extracted from the second image, and î_w is i_w minus the arithmetic mean of the feature point set corresponding to the second detection image; p is the initialized projection relation matrix;
the formula of the initialized projection relation matrix P is as follows:
P = [ cos θ   −sin θ   dx ]
    [ sin θ    cos θ   dy ]
    [   0        0      1 ]
where dx is a horizontal distance of the first detection image and the second detection image, dy is a vertical distance of the first detection image and the second detection image, and θ is a rotation angle of the first detection image and the second detection image;
obtaining the relative distance between the object at the first position and the object at the second position according to the preset focal length, the preset object distance and the optimal projection relation matrix of the first detection image and the second detection image;
the edge detection processing specifically comprises Canny edge detection;
the line coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image are as follows:
u_0 = ( Σ_{i=i1..i2} Σ_{j=j1..j2} u_i · g(u_i, v_j) ) / ( Σ_{i=i1..i2} Σ_{j=j1..j2} g(u_i, v_j) ), taken over pixels with g(u_i, v_j) > T
the column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image are as follows:
v_0 = ( Σ_{i=i1..i2} Σ_{j=j1..j2} v_j · g(u_i, v_j) ) / ( Σ_{i=i1..i2} Σ_{j=j1..j2} g(u_i, v_j) ), taken over pixels with g(u_i, v_j) > T
where T is the set threshold, g(u_i, v_j) is the gray value of the pixel in row u_i and column v_j of image g, i indexes the rows of the image over the range i1 to i2, j indexes the columns over the range j1 to j2, the image g is the image to be detected, and v_j denotes the v_j-th column of image g.
2. An object displacement measuring device based on image recognition is characterized in that: operating the method of claim 1, the measuring device comprising:
the device comprises an acquisition module, a detection module and a processing module, wherein the acquisition module is used for acquiring first detection image data of an object at a first position and second detection image data of the object at a second position based on a preset target pattern;
the edge detection processing module is used for carrying out edge detection processing on the first detection image data and the second detection image data to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
the positioning module is used for positioning according to first edge map data corresponding to the first detection image, second edge map data corresponding to the second detection image and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
the first processing module is used for obtaining an optimal projection relation matrix of the first detection image and the second detection image according to the characteristic point set corresponding to the first detection image and the characteristic point set corresponding to the second detection image;
the second processing module is used for obtaining the relative distance between the object at the first position and the object at the second position according to the preset camera focal length, the camera object distance and the optimal projection relation matrix of the first detection image and the second detection image;
the edge detection processing module is further used for performing edge detection processing on the first detection image data and the second detection image data by adopting Canny edge detection to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
the line coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image are as follows:
u_0 = ( Σ_{i=i1..i2} Σ_{j=j1..j2} u_i · g(u_i, v_j) ) / ( Σ_{i=i1..i2} Σ_{j=j1..j2} g(u_i, v_j) ), taken over pixels with g(u_i, v_j) > T
the column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image are as follows:
v_0 = ( Σ_{i=i1..i2} Σ_{j=j1..j2} v_j · g(u_i, v_j) ) / ( Σ_{i=i1..i2} Σ_{j=j1..j2} g(u_i, v_j) ), taken over pixels with g(u_i, v_j) > T
where T is the set threshold, g(u_i, v_j) is the gray value of the pixel in row u_i and column v_j of image g, i indexes the rows of the image over the range i1 to i2, j indexes the columns over the range j1 to j2, the image g is the image to be detected, and v_j denotes the v_j-th column of image g.
3. An object displacement measurement system based on image recognition is characterized in that: the system comprises a photographing device, a light source, a beam splitter and the object displacement measuring device based on image recognition of claim 2; the beam splitter is arranged along the optical axis of the light source; the preset target image is located between the light source and the beam splitter so that the target image is projected onto the measured object; the photographing device is connected with the object displacement measuring device, and the optical axis of the photographing device is perpendicular to the object plane of the measured object.
CN201710285370.8A 2017-04-27 2017-04-27 Object displacement measurement method, device and system based on image recognition Expired - Fee Related CN107101584B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710285370.8A CN107101584B (en) 2017-04-27 2017-04-27 Object displacement measurement method, device and system based on image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710285370.8A CN107101584B (en) 2017-04-27 2017-04-27 Object displacement measurement method, device and system based on image recognition

Publications (2)

Publication Number Publication Date
CN107101584A CN107101584A (en) 2017-08-29
CN107101584B true CN107101584B (en) 2020-06-12

Family

ID=59656813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710285370.8A Expired - Fee Related CN107101584B (en) 2017-04-27 2017-04-27 Object displacement measurement method, device and system based on image recognition

Country Status (1)

Country Link
CN (1) CN107101584B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107633507A (en) * 2017-09-02 2018-01-26 南京理工大学 LCD defect inspection methods based on contour detecting and characteristic matching
CN109959335B (en) * 2017-12-22 2021-09-21 北京金风科创风电设备有限公司 Method, device and system for measuring displacement of tower top
CN110415264B (en) * 2018-04-25 2023-10-24 奇景光电股份有限公司 Motion detection circuit and method
CN111829439B (en) * 2020-07-21 2021-06-25 中山大学 High-precision translation measuring method and device
CN114264227B (en) * 2021-11-26 2023-07-25 武汉联影生命科学仪器有限公司 Device and method for measuring focal spot size and position

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1280607C (en) * 2005-03-23 2006-10-18 西安交通大学 Laser dam safety monitoring method
CN100360898C (en) * 2006-03-02 2008-01-09 浣石 Small-displacement measuring method in long-distance plane
CN101408985B (en) * 2008-09-22 2010-09-22 北京航空航天大学 Method and apparatus for extracting circular luminous spot second-pixel center
CN104482861B (en) * 2014-12-08 2017-09-19 苏州市计量测试研究所 It is a kind of to measure object length and/or the method for displacement
CN204718553U (en) * 2015-06-12 2015-10-21 北京光电技术研究所 Buildings displacement measurement system
DE102015110289A1 (en) * 2015-06-26 2016-12-29 Werth Messtechnik Gmbh Method for determining measuring points on the surface of a tool piece with an optical sensor

Also Published As

Publication number Publication date
CN107101584A (en) 2017-08-29

Similar Documents

Publication Publication Date Title
CN107101584B (en) Object displacement measurement method, device and system based on image recognition
CN109146980B (en) Monocular vision based optimized depth extraction and passive distance measurement method
EP3028252B1 (en) Rolling sequential bundle adjustment
US20200132451A1 (en) Structural Light Parameter Calibration Device and Method Based on Front-Coating Plane Mirror
US10529076B2 (en) Image processing apparatus and image processing method
CN109035320A (en) Depth extraction method based on monocular vision
Lébraly et al. Flexible extrinsic calibration of non-overlapping cameras using a planar mirror: Application to vision-based robotics
US20160187130A1 (en) Method for determining a position and orientation offset of a geodetic surveying device and such a surveying device
US20150262346A1 (en) Image processing apparatus, image processing method, and image processing program
CN109754429A (en) A kind of deflection of bridge structure measurement method based on image
Ellmauthaler et al. A novel iterative calibration approach for thermal infrared cameras
TW201118791A (en) System and method for obtaining camera parameters from a plurality of images, and computer program products thereof
KR101759798B1 (en) Method, device and system for generating an indoor two dimensional plan view image
TW201439665A (en) Panoramic lens calibration for panoramic image and/or video capture apparatus
CN109855602A (en) Move the monocular visual positioning method under visual field
CN103544699B (en) Method for calibrating cameras on basis of single-picture three-circle template
CN105758337A (en) Method for obtaining included angel between a lens plane and an image sensor plane
CN104937608B (en) Road area detection
CN106796726A (en) Method for testing motion and system
KR101705330B1 (en) Keypoints Selection method to Find the Viewing Angle of Objects in a Stereo Camera Image
US10912444B2 (en) Image processing apparatus, image processing method, and endoscope
CN116205993A (en) Double-telecentric lens high-precision calibration method for 3D AOI
WO2019100216A1 (en) 3d modeling method, electronic device, storage medium and program product
CN115631245A (en) Correction method, terminal device and storage medium
CN112734838B (en) Space target positioning method, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200612
