CN109086795A - A kind of accurate elimination method of image mismatch - Google Patents

A kind of accurate elimination method of image mismatch

Info

Publication number
CN109086795A
CN109086795A CN201810679354.1A
Authority
CN
China
Prior art keywords
matching
matching pair
pair
pole
matches
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810679354.1A
Other languages
Chinese (zh)
Inventor
赵攀攀
丁德锐
何壮壮
黄颖
冯汉
余玉琴
陈晗
张震
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201810679354.1A priority Critical patent/CN109086795A/en
Publication of CN109086795A publication Critical patent/CN109086795A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a method for accurately eliminating image mismatches. A grid-based motion statistics constraint quickly distinguishes correct matching pairs from erroneous ones within grid regions of a given size, so that erroneous pairs are rejected and correct pairs retained, yielding a rough match set. An improved epipolar constraint model, combined with a newly proposed projection error function, then further rejects the matching pairs in the rough match set that do not satisfy the improved epipolar constraint model, producing a high-quality match set. The method combines the advantage of the grid-based motion statistics constraint, which quickly separates correct from erroneous matches within grid regions and rejects the erroneous ones, with the advantage of the improved epipolar constraint model, which quickly rejects matching pairs that violate the epipolar constraint; it therefore obtains high-quality matching pairs in less time and with higher matching accuracy. The method adapts to scenes of different sizes and complexities, and it performs well when applied to fields such as visual simultaneous localization and mapping, three-dimensional reconstruction, and visual tracking.

Description

A kind of accurate elimination method of image mismatch
Technical field
The present invention relates to image processing techniques, and in particular to a method for accurately eliminating image mismatches that fuses a grid-based motion statistics constraint with an improved epipolar constraint model.
Background art
Image matching is one of the most important steps in fields such as visual simultaneous localization and mapping, three-dimensional reconstruction, and visual tracking; it requires the matching algorithm to be real-time, accurate, and robust at the same time. Image matching algorithms fall into direct methods based on grayscale invariance and methods based on feature similarity. Feature-based matching is generally accomplished by finding local feature correspondences between two images, and it includes point matching, line matching, and region matching. Point-based matching is the most commonly used in image matching because feature extraction is simple, the matching strategy is flexible, and the computation time is short.
Matching-related work largely developed after Lowe [1] et al. proposed the scale-invariant SIFT algorithm, which adapts well to scale, rotation, and illumination changes; however, SIFT is computationally intensive, time-consuming, and not real-time. Bay [2] et al. reduced the feature descriptor to 64 dimensions and proposed the SURF matching algorithm, which improves matching efficiency but still falls short of real-time performance when used for mobile-robot visual localization and navigation. Visual odometry currently mostly uses the ORB (Oriented FAST and Rotated BRIEF) feature matching algorithm proposed by Rublee [3] et al.; ORB extracts and matches feature points quickly and achieves preliminary real-time performance when extracting sparse feature points. However, the initial matches produced by ORB contain a large number of mismatched points, which leads to inaccurate pose estimation and poor robustness in visual odometry.
Rejecting mismatches while retaining high-quality matches is therefore an important part of the fields mentioned above. The traditional approach is to first obtain rough matching points with the ORB algorithm and then remove mismatched points with the RANSAC [4] algorithm, but RANSAC has requirements on its initialization and needs a reasonably accurate preliminary match set. Marius Muja [5] et al. proposed the FLANN algorithm, which can be used to remove erroneous matching points and obtain a rough match set, but the number of matches obtained with this approach is small and their quality is low, which hinders subsequent work. Xing Kaisheng [6] et al. combined the ORB and RANSAC algorithms to delete ORB mismatches, but because RANSAC requires an accurate initial value while the raw ORB matches are flooded with erroneous pairs, the inaccurate initialization causes large errors when RANSAC removes mismatches. Qin Xiaofei [7] et al. rejected ORB mismatches by introducing an epipolar-line constraint, but that algorithm first has to reject mismatched points with a manually set threshold to obtain a rough match set and then combines the epipolar-line constraint with RANSAC to reject mismatches; the manual threshold introduces human interference, and training the threshold is time-consuming, so the algorithm is not real-time. Bian Jiawang [8] et al. proposed the GMS (grid-based motion statistics) algorithm, which quickly rejects mismatches from the initial match set without a manually set threshold; however, it only produces a rough match set that still contains mismatches and cannot yield a large, high-quality match set.
Summary of the invention
Aiming at the problems that current image matching algorithms produce a large number of mismatches and lack real-time performance, the present invention proposes a method for accurately eliminating image mismatches. A grid-based motion statistics constraint quickly distinguishes correct matching pairs from erroneous ones within grid regions of a given size, so that erroneous pairs are rejected and correct pairs retained, yielding a rough match set. An improved epipolar constraint model, combined with a newly proposed projection error function, then further rejects the pairs in the rough match set that do not satisfy the improved model, producing a high-quality match set and improving the matching accuracy to a certain extent.
The technical solution of the present invention is as follows: a method for accurately eliminating image mismatches, comprising the following steps:
1) Photograph the same scene with a camera to obtain two images related by an angular change of about 10°~15°, and use the two images as the input of the feature matching algorithm;
2) Compute ORB feature points and the corresponding ORB descriptors for the two input images, and perform a simple matching based on the Hamming distance to obtain initial matching pairs;
3) Compute the grid-based motion statistics constraint on the initial matching pairs to reject the matching pairs that do not satisfy the constraint, obtaining a rough match set:
Suppose the two frames to be matched are images I_a, I_b and X = {X_1, X_2, ..., X_k} is the set of k initial matching pairs. The two images I_a, I_b are first partitioned with grids of a given size, and the number of matching pairs in each grid cell is then counted in order to distinguish correct pairs from erroneous ones;
Define the matching statistics score S_i of each matching pair X_i in any grid cell as
S_i = M_i − 1,
where M_i is the number of matching pairs in the selected grid cell and 1 is subtracted to exclude the pair itself. Define a score threshold
T_i = α·√(M_i),
where α is determined experimentally. When S_i > T_i, this pair is a correct match; otherwise it is an erroneous match. This yields a rough match set;
4) Sort the matching pairs in the rough match set of step 3) in ascending order of Hamming distance to obtain a sorted match set M, then select the first eight matching pairs of M to compute the epipolar constraint model C from
r_1^T · C · r_1' = 0
r_1 = K⁻¹ p_1
r_1' = K⁻¹ p_1'
where K is the camera intrinsic matrix, obtained by calibration; r_1, r_1' are the normalized coordinates of a matched pair of feature points p_1, p_1', namely r_1 = [u_1, v_1, 1] and r_1' = [u_1', v_1', 1], where u_1, v_1 are the normalized image pixel coordinates of p_1 and u_1', v_1' those of p_1'. The epipolar constraint model C has nine unknowns and is solved, up to scale, from the homogeneous linear system obtained by stacking the above constraint for the eight selected matching pairs.
A projection error function is then defined:
d_s(C, p_i, p_i') = p_i' − C·p_i
which expands into the scalar projection error, with the epipolar line written as C·p_i = [a_i, b_i, c_i] and p_i' = [u_i', v_i', 1]:
PE(C, p_i, p_i') = |a_i·u_i' + b_i·v_i' + c_i| / √(a_i² + b_i²)
where p_i' is the point on the corresponding epipolar line in image I_b onto which an arbitrary feature point p_i of image I_a is projected. Combining the epipolar constraint model C with the projection error function PE gives the improved epipolar constraint model PE-PCM, which is then used to test the remaining matching pairs in the match set M: if a remaining pair does not satisfy the model, that is, the above projection error exceeds a set threshold, the pair is rejected; otherwise the pair is kept, yielding a high-quality match set. The overall flow of steps 1) to 4) is sketched below.
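Purely as an illustration of how steps 1) to 4) fit together, the following Python sketch (OpenCV and NumPy) runs the whole flow on two hypothetical image files. It works in pixel coordinates and estimates the epipolar geometry with OpenCV's eight-point fundamental-matrix routine rather than the normalized-coordinate model C of step 4) (the two descriptions are related through the intrinsics K); the file names, feature count, grid size, α and error threshold are illustrative assumptions, not values fixed by the invention.

```python
import math
from collections import defaultdict

import cv2
import numpy as np

# 1) Two views of the same scene, roughly 10-15 degrees apart (hypothetical files).
img_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

# 2) ORB feature points + simple Hamming-distance matching -> initial pairs.
orb = cv2.ORB_create(nfeatures=2000)
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)

# 3) Grid statistics: keep a pair when its grid cell holds enough other pairs
#    (score M_i - 1 tested against the threshold alpha * sqrt(M_i)).
GRID, ALPHA = 20, 4.0
h, w = img_a.shape
def cell(m):
    x, y = kp_a[m.queryIdx].pt
    return (min(int(x * GRID / w), GRID - 1), min(int(y * GRID / h), GRID - 1))
counts = defaultdict(int)
for m in matches:
    counts[cell(m)] += 1
rough = [m for m in matches
         if counts[cell(m)] - 1 > ALPHA * math.sqrt(counts[cell(m)])]

# 4) Epipolar model from the 8 pairs with the smallest Hamming distance, then
#    reject pairs lying too far from the corresponding epipolar line.
rough.sort(key=lambda m: m.distance)
pts_a = np.float32([kp_a[m.queryIdx].pt for m in rough])
pts_b = np.float32([kp_b[m.trainIdx].pt for m in rough])
F, _ = cv2.findFundamentalMat(pts_a[:8], pts_b[:8], cv2.FM_8POINT)

lines_b = cv2.computeCorrespondEpilines(pts_a.reshape(-1, 1, 2), 1, F).reshape(-1, 3)
ERR_THRESHOLD = 1.0  # pixels (assumed)
good = [m for m, (a, b, c), (x, y) in zip(rough, lines_b, pts_b)
        if abs(a * x + b * y + c) / math.hypot(a, b) < ERR_THRESHOLD]
print(len(matches), "initial ->", len(rough), "rough ->", len(good), "good matches")
```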
The beneficial effects of the present invention are as follows: the method for accurately eliminating image mismatches combines the advantage of the grid-based motion statistics constraint, which quickly distinguishes correct matching pairs from erroneous ones within grid regions and rejects the erroneous ones, with the advantage of the improved epipolar constraint model, which quickly rejects matching pairs that do not satisfy the epipolar constraint. It therefore obtains high-quality matching pairs in less time and with higher matching accuracy. The method adapts to scenes of different sizes and complexities and rejects mismatches well in these scenes. It performs well when applied to fields such as visual simultaneous localization and mapping, three-dimensional reconstruction, and visual tracking.
Description of the drawings
Fig. 1 is the general flow chart of the method of the present invention for accurately rejecting image mismatches by fusing the grid-based motion statistics constraint with the improved epipolar constraint model;
Fig. 2 shows the two images used as the algorithm input;
Fig. 3 is a schematic diagram of ORB feature point detection;
Fig. 4 shows the initial matches obtained by the ORB algorithm;
Fig. 5a shows the mismatch rejection result of the traditional algorithm;
Fig. 5b shows the mismatch rejection result of the method described herein.
Specific embodiment
The present invention first captures two images with a certain angular change using a camera to obtain the input of the algorithm. Next, ORB feature points and the corresponding ORB descriptors are computed for the two input images, and a simple matching based on the Hamming distance produces the initial matching pairs. Then the grid-based motion statistics constraint (Grid-based Motion Statistics Constraint, GMSC) is computed on the initial matching pairs to reject the pairs that do not satisfy the constraint, yielding a rough match set. Finally, the eight matching pairs with the smallest Hamming distance are selected from the rough match set to compute the epipolar constraint model (Polar Constraint Model, PCM); combining the model with the newly proposed projection error function (Projection Error function, PE) gives the improved epipolar constraint model (PE-PCM), which is used to reject the matching pairs that do not satisfy it, producing high-quality matching pairs.
In the present embodiment, the above method is tested on image pairs of varying matching difficulty taken from several sequences of the TUM dataset of the Technical University of Munich, Germany. The results verify that the method rejects erroneous matches well in arbitrarily complex scenes, obtains a high-quality match set, and improves real-time performance compared with existing matching algorithms.
Fig. 1 shows the general flow chart of the method for accurately rejecting image mismatches by fusing the grid-based motion statistics constraint with the improved epipolar constraint model; the method includes the following steps.
In the first step, two images of the same scene with a certain angular change are captured with a camera as the input of the algorithm. Specifically, two images of the same scene are taken with the camera; the two images must differ by a certain angular deflection, roughly 10°~15°, and are then used as the input of the feature matching algorithm. Fig. 2 shows two pictures from the rgbd_dataset_freiburg2_rpy sequence of the TUM dataset with a small angular change between them.
In the second step, ORB feature points and the corresponding ORB descriptors are computed for the two input images, and a simple matching based on the Hamming distance yields the initial matching pairs.
Specifically: 1) ORB features are computed for each of the two input images as follows. First, a candidate point p with gray value I_p is selected on the image (Fig. 3 shows the schematic of ORB feature point detection). The gray values of the 16 points on the circle of radius 3 around p are then considered, and the feature point response is
N = Σ_{q ∈ circle(p)} [ |I_q − I_p| > θ_d ]
where circle(p) is the set of the 16 pixels on the circle, I_q is the gray value of any point in the set, and θ_d is a given threshold (for example, 20% of I_p). N is therefore the number of pixels on the circle whose absolute gray difference from p exceeds θ_d; when N consecutive pixels all satisfy the condition, p is declared an ORB feature point.
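As a minimal illustration of this response test, the Python sketch below checks a single candidate pixel against the 16 pixels of the radius-3 circle. The circle offsets, the 20% threshold ratio and the required run of consecutive differing pixels are illustrative assumptions rather than values prescribed by the invention; the input is assumed to be a 2-D grayscale array such as the one returned by cv2.imread(path, cv2.IMREAD_GRAYSCALE).

```python
# 16 offsets (row, col) of the radius-3 circle around the candidate pixel.
CIRCLE = [(-3, 0), (-3, 1), (-2, 2), (-1, 3), (0, 3), (1, 3), (2, 2), (3, 1),
          (3, 0), (3, -1), (2, -2), (1, -3), (0, -3), (-1, -3), (-2, -2), (-3, -1)]

def is_orb_corner(img, r, c, ratio=0.2, run=9):
    """FAST-style segment test for one candidate pixel (r, c).

    A circle point "differs" from the centre when |I_q - I_p| > theta_d,
    with theta_d = ratio * I_p as suggested in the text; the pixel is kept
    when `run` consecutive circle points all differ (run is an assumption).
    """
    I_p = float(img[r, c])
    theta_d = ratio * I_p
    differs = [abs(float(img[r + dr, c + dc]) - I_p) > theta_d for dr, dc in CIRCLE]
    N = sum(differs)                      # number of differing circle points
    longest, current = 0, 0
    for d in differs + differs:           # doubled list to wrap around the circle
        current = current + 1 if d else 0
        longest = max(longest, current)
    return N >= run and longest >= run

# Usage (hypothetical grayscale NumPy array `gray`):
# corner = is_orb_corner(gray, 120, 200)
```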
2) After the ORB feature points have been obtained for both images, the ORB descriptors of these feature points are computed as follows. τ tests are performed in the neighborhood centered on a feature point p; the test function is
τ(p; x, y) = 1 if I(x) < I(y), and 0 otherwise,
where I(x) and I(y) are the gray values of pixels x and y in the neighborhood. m pairs of test points are randomly selected near the feature point p according to a Gaussian distribution (a distribution model in mathematics), and a 2 × m matrix is defined as
S = [ x_1 x_2 ... x_m ; y_1 y_2 ... y_m ]
where each column (x_i, y_i) is a selected pair of test points and x_i, y_i are the pixels corresponding to that test pair. The ORB descriptor g(p) is finally obtained through the fixed set of τ tests:
g(p) = Σ_{i=1..m} 2^(i−1) · τ(p; x_i, y_i).
3) A simple matching based on the Hamming distance is performed on the obtained ORB descriptors; the Hamming distance is the number of positions at which two strings of equal length differ. When the Hamming distance between two ORB descriptors is smaller than a given threshold, the two descriptors are taken as an initial matching pair.
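A minimal sketch of this second step with OpenCV, assuming the two input images are available as files; the file names, the number of features and the Hamming-distance threshold are illustrative assumptions.

```python
import cv2

# Hypothetical input files: two views of the same scene, ~10-15 degrees apart.
img_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)          # ORB keypoints + binary descriptors
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)

# Simple matching by Hamming distance between binary ORB descriptors.
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = bf.match(des_a, des_b)

# Keep a pair only when its Hamming distance is below a threshold (assumed value).
HAMMING_THRESHOLD = 64
initial_matches = [m for m in matches if m.distance < HAMMING_THRESHOLD]
print(len(initial_matches), "initial matching pairs")
```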
In the third step, the grid-based motion statistics constraint (Grid-based Motion Statistics Constraint, GMSC) is computed on the initial matching pairs obtained in the previous step to reject the pairs that do not satisfy the constraint, producing a rough match set. As can be seen in Fig. 4, the initial matching contains a large number of erroneous pairs, so the grid-based motion statistics constraint is used to reject erroneous pairs preliminarily. The idea behind the constraint is as follows: for a pair of images taken from different views of the same 3D scene under smooth motion, the correctly matched point pairs within a small grid region see the same 3D scene, whereas mismatched points see different 3D scenes. Within such a small region, a correct match tends to have many matches around it, whereas an erroneous match has few; therefore, by reasonably counting the matching pairs in the grid region, correct pairs can be distinguished from erroneous ones. Specifically: suppose the two frames to be matched are images I_a, I_b and X = {X_1, X_2, ..., X_k} is the set of k initial matching pairs; the goal of the matching statistics constraint is to distinguish the correct pairs from the erroneous ones among these k pairs and to reject the erroneous pairs as outliers. The two images I_a, I_b are first partitioned with grids of a given size, and the number of matching pairs in each grid cell is counted to distinguish correct pairs from erroneous ones. Define the matching statistics score S_i of each matching pair X_i in any grid cell as
S_i = M_i − 1,
where M_i is the number of matching pairs in the selected grid cell and 1 is subtracted to exclude the pair itself. Define a score threshold
T_i = α·√(M_i),
where α is determined experimentally. When S_i > T_i, this pair is a correct match; otherwise it is an erroneous match. This yields a rough match set.
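The grid scoring just described can be sketched as follows in Python, reusing the keypoints and initial matches from the previous sketch; image I_a is partitioned into a square grid and each pair is scored with M_i − 1 against the threshold α·√(M_i). The grid size and α are illustrative assumptions.

```python
import math
from collections import defaultdict

def grid_statistics_filter(kp_a, matches, img_shape, grid=20, alpha=4.0):
    """Keep a match when the grid cell containing it holds enough other matches.

    Score of a match:  S_i = M_i - 1          (M_i = matches in its grid cell)
    Threshold:         T_i = alpha * sqrt(M_i)  (alpha set experimentally)
    """
    h, w = img_shape[:2]
    cell_w, cell_h = w / grid, h / grid

    def cell_of(m):
        x, y = kp_a[m.queryIdx].pt
        return (min(int(x / cell_w), grid - 1), min(int(y / cell_h), grid - 1))

    counts = defaultdict(int)
    for m in matches:
        counts[cell_of(m)] += 1

    rough = []
    for m in matches:
        M_i = counts[cell_of(m)]
        if (M_i - 1) > alpha * math.sqrt(M_i):
            rough.append(m)
    return rough

# Usage with names from the previous sketch:
# rough_matches = grid_statistics_filter(kp_a, initial_matches, img_a.shape)
```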
In the fourth step, the eight matching pairs with the smallest Hamming distance are selected from the rough match set and used to compute the epipolar constraint model (Polar Constraint Model, PCM); the model is then combined with the newly proposed projection error function (Projection Error function, PE) to obtain the improved epipolar constraint model (PE-PCM), which is used to reject the matching pairs that do not satisfy it. This yields high-quality matching pairs and improves the matching accuracy to a certain extent.
Specifically: the matching pairs in the rough match set of the third step are sorted in ascending order of Hamming distance to obtain a sorted match set M, and the first eight matching pairs of M are selected to compute the epipolar constraint model C from
r_1^T · C · r_1' = 0
r_1 = K⁻¹ p_1
r_1' = K⁻¹ p_1'
where K is the camera intrinsic matrix, obtained by calibration; r_1, r_1' are the normalized coordinates of a matched pair of feature points p_1, p_1', namely r_1 = [u_1, v_1, 1] and r_1' = [u_1', v_1', 1], where u_1, v_1 are the normalized image pixel coordinates of p_1 and u_1', v_1' those of p_1'. The epipolar constraint model C has nine unknowns, but since the above equation holds equally under any scale, one scale factor remains undetermined, so C can be solved from only eight pairs of matched points in the match set.
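A sketch of estimating the nine-parameter model C from the eight best pairs: the pixel points are normalized with K and the stacked homogeneous system is solved by SVD. The intrinsic matrix shown is a placeholder (a real K comes from calibration), and the convention r_b^T·C·r_a = 0 is adopted here so that C·r_a is directly the epipolar line used by the projection error below; this convention is an assumption about how the constraint is written, not a statement of the patented formulation.

```python
import numpy as np

# Placeholder intrinsics; in practice K comes from camera calibration.
K = np.array([[520.9, 0.0, 325.1],
              [0.0, 521.0, 249.7],
              [0.0, 0.0, 1.0]])

def normalize(pts, K):
    """Pixel points (N, 2) -> normalized homogeneous coordinates r = K^-1 * p."""
    pts_h = np.hstack([np.asarray(pts, float), np.ones((len(pts), 1))])
    return (np.linalg.inv(K) @ pts_h.T).T

def epipolar_model_from_eight(pts_a, pts_b, K):
    """Solve r_b^T C r_a = 0 for the 3x3 model C (up to scale) from 8 pairs."""
    r_a, r_b = normalize(pts_a, K), normalize(pts_b, K)
    # Each matched pair contributes one linear equation in the 9 entries of C.
    A = np.array([np.outer(rb, ra).ravel() for ra, rb in zip(r_a, r_b)])
    _, _, vt = np.linalg.svd(A)
    C = vt[-1].reshape(3, 3)      # right null-space vector; overall scale is free
    return C / np.linalg.norm(C)

# Usage with the 8 smallest-Hamming-distance pairs (names from the earlier sketches):
# best8 = sorted(rough_matches, key=lambda m: m.distance)[:8]
# pts_a = [kp_a[m.queryIdx].pt for m in best8]
# pts_b = [kp_b[m.trainIdx].pt for m in best8]
# C = epipolar_model_from_eight(pts_a, pts_b, K)
```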
The epipolar constraint model C found in this way is then used to define a projection error function:
d_s(C, p_i, p_i') = p_i' − C·p_i
which expands into the scalar projection error, with the epipolar line written as C·p_i = [a_i, b_i, c_i] and p_i' = [u_i', v_i', 1]:
PE(C, p_i, p_i') = |a_i·u_i' + b_i·v_i' + c_i| / √(a_i² + b_i²)
where p_i' is the point on the corresponding epipolar line in the right image onto which an arbitrary feature point p_i of the left image of Fig. 4 is projected. Combining the epipolar constraint model C with the projection error function (PE) gives the improved epipolar constraint model (PE-PCM), which is then used to test the remaining matching pairs in the match set M: if a remaining pair does not satisfy the model, that is, the above projection error exceeds a certain threshold, the pair is rejected; otherwise the pair is kept, yielding a high-quality match set.
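A sketch of the PE-PCM test, computing the expanded projection error as the distance from the normalized point of p_i' to the epipolar line C·r_a produced by the model of the previous sketch, and rejecting pairs whose error exceeds a threshold; the threshold value is an illustrative assumption expressed in normalized coordinates.

```python
import numpy as np

def projection_error(C, r_a, r_b):
    """Distance from the normalized point r_b to the epipolar line C * r_a.

    For a correct match, r_b should lie (almost) on the epipolar line induced
    by r_a, so this scalar error is close to zero.
    """
    a, b, c = C @ r_a
    return abs(a * r_b[0] + b * r_b[1] + c) / np.hypot(a, b)

def pe_pcm_filter(C, r_a_all, r_b_all, threshold=0.01):
    """Indices of pairs whose projection error stays below `threshold`
    (the value 0.01 is an assumption, measured in normalized coordinates)."""
    return [i for i, (ra, rb) in enumerate(zip(r_a_all, r_b_all))
            if projection_error(C, ra, rb) <= threshold]

# Usage with names from the earlier sketches:
# r_a_all = normalize([kp_a[m.queryIdx].pt for m in rough_matches], K)
# r_b_all = normalize([kp_b[m.trainIdx].pt for m in rough_matches], K)
# keep = pe_pcm_filter(C, r_a_all, r_b_all)
# good_matches = [rough_matches[i] for i in keep]
```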
Figs. 5a and 5b show tests on several groups of pictures selected from the TUM dataset: Fig. 5a is the mismatch rejection result of the traditional algorithm and Fig. 5b that of the method described herein. The matching difficulty of these picture groups ranges from easy to hard, and the present method rejects mismatches well in all of these scenes. The matching-time comparison of several different algorithms in Table 1 below shows that the present method takes less time than the traditional matching algorithm and several classical matching algorithms, that is, it offers better real-time performance. The traditional algorithm here obtains initial matches with the ORB algorithm and then combines the FLANN and RANSAC algorithms to reject mismatches; SURF and SIFT are classical image matching algorithms, likewise combined with FLANN and RANSAC for mismatch rejection. The total time used by each of them is computed and compared with that of the present method.
Table 1

Claims (1)

1. A method for accurately eliminating image mismatches, characterized in that it comprises the following steps:
1) photographing the same scene with a camera to obtain two images related by an angular change of about 10°~15°, and using the two images as the input of a feature matching algorithm;
2) computing ORB feature points and the corresponding ORB descriptors for the two input images, and performing a simple matching based on the Hamming distance to obtain initial matching pairs;
3) computing a grid-based motion statistics constraint on the initial matching pairs to reject the matching pairs that do not satisfy the constraint, obtaining a rough match set:
suppose the two frames to be matched are images I_a, I_b and X = {X_1, X_2, ..., X_k} is the set of k initial matching pairs; the two images I_a, I_b are first partitioned with grids of a given size, and the number of matching pairs in each grid cell is counted in order to distinguish correct pairs from erroneous ones;
define the matching statistics score S_i of each matching pair X_i in any grid cell as
S_i = M_i − 1,
where M_i is the number of matching pairs in the selected grid cell and 1 is subtracted to exclude the pair itself; define a score threshold
T_i = α·√(M_i),
where α is determined experimentally; when S_i > T_i, this pair is a correct match, otherwise it is an erroneous match, thereby obtaining a rough match set;
4) sorting the matching pairs in the rough match set of step 3) in ascending order of Hamming distance to obtain a sorted match set M, and selecting the first eight matching pairs of M to compute the epipolar constraint model C from
r_1^T · C · r_1' = 0
r_1 = K⁻¹ p_1
r_1' = K⁻¹ p_1'
where K is the camera intrinsic matrix, obtained by calibration; r_1, r_1' are the normalized coordinates of a matched pair of feature points p_1, p_1', namely r_1 = [u_1, v_1, 1] and r_1' = [u_1', v_1', 1], with u_1, v_1 the normalized image pixel coordinates of p_1 and u_1', v_1' those of p_1'; the epipolar constraint model C has nine unknowns and is solved, up to scale, from the homogeneous linear system obtained by stacking the above constraint for the eight selected matching pairs;
then defining a projection error function:
d_s(C, p_i, p_i') = p_i' − C·p_i
which expands into the scalar projection error, with the epipolar line written as C·p_i = [a_i, b_i, c_i] and p_i' = [u_i', v_i', 1]:
PE(C, p_i, p_i') = |a_i·u_i' + b_i·v_i' + c_i| / √(a_i² + b_i²)
where p_i' is the point on the corresponding epipolar line in image I_b onto which an arbitrary feature point p_i of image I_a is projected; combining the epipolar constraint model C with the projection error function PE gives the improved epipolar constraint model PE-PCM, which is used to test the remaining matching pairs in the match set M: if a remaining pair does not satisfy the model, that is, the above projection error exceeds a set threshold, the pair is rejected; otherwise the pair is kept, obtaining a high-quality match set.
CN201810679354.1A 2018-06-27 2018-06-27 A kind of accurate elimination method of image mismatch Pending CN109086795A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810679354.1A CN109086795A (en) 2018-06-27 2018-06-27 A kind of accurate elimination method of image mismatch

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810679354.1A CN109086795A (en) 2018-06-27 2018-06-27 A kind of accurate elimination method of image mismatch

Publications (1)

Publication Number Publication Date
CN109086795A true CN109086795A (en) 2018-12-25

Family

ID=64839904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810679354.1A Pending CN109086795A (en) 2018-06-27 2018-06-27 A kind of accurate elimination method of image mismatch

Country Status (1)

Country Link
CN (1) CN109086795A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109784308A (en) * 2019-02-01 2019-05-21 腾讯科技(深圳)有限公司 A kind of address error correction method, device and storage medium
CN109829459A (en) * 2019-01-21 2019-05-31 重庆邮电大学 Based on the vision positioning method for improving RANSAC
CN109903343A (en) * 2019-02-28 2019-06-18 东南大学 A kind of feature matching method based on inertial attitude constraint
CN110163273A (en) * 2019-05-14 2019-08-23 西安文理学院 It is a kind of that genic image matching method is had based on RANSAC algorithm
CN110443295A (en) * 2019-07-30 2019-11-12 上海理工大学 Improved images match and error hiding reject algorithm
CN112016610A (en) * 2020-08-25 2020-12-01 济南大学 Image feature matching method and system
CN112419314A (en) * 2020-12-10 2021-02-26 易思维(杭州)科技有限公司 Characteristic point eliminating method based on correlation
CN115115861A (en) * 2022-08-31 2022-09-27 中国民航大学 Image correction method applied to rotating binocular stereoscopic vision system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101799939A (en) * 2010-04-02 2010-08-11 天津大学 Rapid and self-adaptive generation algorithm of intermediate viewpoint based on left and right viewpoint images
CN101833765A (en) * 2010-04-30 2010-09-15 天津大学 Characteristic matching method based on bilateral matching and trilateral restraining
CN108171787A (en) * 2017-12-18 2018-06-15 桂林电子科技大学 A kind of three-dimensional rebuilding method based on the detection of ORB features

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101799939A (en) * 2010-04-02 2010-08-11 天津大学 Rapid and self-adaptive generation algorithm of intermediate viewpoint based on left and right viewpoint images
CN101833765A (en) * 2010-04-30 2010-09-15 天津大学 Characteristic matching method based on bilateral matching and trilateral restraining
CN108171787A (en) * 2017-12-18 2018-06-15 桂林电子科技大学 A kind of three-dimensional rebuilding method based on the detection of ORB features

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PANPAN ZHAO ET AL.: "An improved GMS-PROSAC algorithm for image mismatch elimination", Systems Science & Control Engineering *
黄丽 et al.: "Research on an image mismatch rejection method based on ORB features" (基于ORB特征的图像误匹配剔除方法研究), Journal of Mechanical & Electrical Engineering (机电工程) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109829459B (en) * 2019-01-21 2022-05-17 重庆邮电大学 Visual positioning method based on improved RANSAC
CN109829459A (en) * 2019-01-21 2019-05-31 重庆邮电大学 Based on the vision positioning method for improving RANSAC
CN109784308B (en) * 2019-02-01 2020-09-29 腾讯科技(深圳)有限公司 Address error correction method, device and storage medium
CN109784308A (en) * 2019-02-01 2019-05-21 腾讯科技(深圳)有限公司 A kind of address error correction method, device and storage medium
CN109903343A (en) * 2019-02-28 2019-06-18 东南大学 A kind of feature matching method based on inertial attitude constraint
CN109903343B (en) * 2019-02-28 2023-05-23 东南大学 Feature matching method based on inertial attitude constraint
CN110163273A (en) * 2019-05-14 2019-08-23 西安文理学院 It is a kind of that genic image matching method is had based on RANSAC algorithm
CN110443295A (en) * 2019-07-30 2019-11-12 上海理工大学 Improved images match and error hiding reject algorithm
CN112016610A (en) * 2020-08-25 2020-12-01 济南大学 Image feature matching method and system
CN112016610B (en) * 2020-08-25 2022-05-31 济南大学 Image feature matching method and system
CN112419314B (en) * 2020-12-10 2023-02-28 易思维(杭州)科技有限公司 Characteristic point eliminating method based on correlation
CN112419314A (en) * 2020-12-10 2021-02-26 易思维(杭州)科技有限公司 Characteristic point eliminating method based on correlation
CN115115861A (en) * 2022-08-31 2022-09-27 中国民航大学 Image correction method applied to rotating binocular stereoscopic vision system

Similar Documents

Publication Publication Date Title
CN109086795A (en) A kind of accurate elimination method of image mismatch
CN105701820B (en) A kind of point cloud registration method based on matching area
CN109102547A (en) Robot based on object identification deep learning model grabs position and orientation estimation method
CN108717531B (en) Human body posture estimation method based on Faster R-CNN
CN106709950B (en) Binocular vision-based inspection robot obstacle crossing wire positioning method
CN109190446A (en) Pedestrian's recognition methods again based on triple focused lost function
CN109211198B (en) Intelligent target detection and measurement system and method based on trinocular vision
CN106570491A (en) Robot intelligent interaction method and intelligent robot
CN102521616B (en) Pedestrian detection method on basis of sparse representation
CN109377513A (en) A kind of global credible estimation method of 3 D human body posture for two views
CN109241983A (en) A kind of cigarette image-recognizing method of image procossing in conjunction with neural network
CN107564059A (en) Object positioning method, device and NI Vision Builder for Automated Inspection based on RGB D information
CN108334881A (en) A kind of licence plate recognition method based on deep learning
CN110111316A (en) Method and system based on eyes image identification amblyopia
CN102704215A (en) Automatic cutting method of embroidery cloth based on combination of DST file parsing and machine vision
CN110490913A (en) Feature based on angle point and the marshalling of single line section describes operator and carries out image matching method
CN113393524B (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
US11361534B2 (en) Method for glass detection in real scenes
CN110490067A (en) A kind of face identification method and device based on human face posture
CN110443295A (en) Improved images match and error hiding reject algorithm
Zou et al. Microarray camera image segmentation with Faster-RCNN
CN104851095A (en) Workpiece image sparse stereo matching method based on improved-type shape context
CN111881716A (en) Pedestrian re-identification method based on multi-view-angle generation countermeasure network
CN110472662A (en) Image matching algorithm based on improved ORB algorithm
CN110110793A (en) Binocular image fast target detection method based on double-current convolutional neural networks

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181225

WD01 Invention patent application deemed withdrawn after publication