CN113689403B - Feature description system based on inter-feature azimuth distance - Google Patents

Feature description system based on inter-feature azimuth distance

Info

Publication number
CN113689403B
Authority
CN
China
Prior art keywords
feature
main
points
point
characteristic
Prior art date
Legal status
Active
Application number
CN202110974875.1A
Other languages
Chinese (zh)
Other versions
CN113689403A (en)
Inventor
陶淑苹
冯钦评
刘春雨
曲宏松
徐伟
Current Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN202110974875.1A
Publication of CN113689403A
Application granted
Publication of CN113689403B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/40: Image enhancement or restoration using histogram techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/70: Denoising; Smoothing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

A feature description system based on inter-feature azimuth distance, relating to the field of image feature extraction. A feature detector detects all feature points of an image, a feature screening unit screens them, and an inter-feature relation calculating unit then computes the azimuth and distance between each main feature point and all other secondary feature points and derives the relation strength from them. Finally, the feature description generating unit produces the feature description, i.e. a feature description vector. Because the system quantitatively describes features using only the azimuth-distance information between them, it can describe features without a scale space or grey-gradient information, while retaining a degree of distinctiveness and scale invariance. The method has low computational complexity and is easy to implement in hardware, which helps accelerate applications such as feature matching and image registration and opens the way to fast, real-time use.

Description

Feature description system based on inter-feature azimuth distance
Technical Field
The invention relates to the field of image feature extraction, in particular to a feature description system based on inter-feature azimuth distance.
Background
Feature extraction detects features such as points and corners in an image and describes them quantitatively by a suitable method, in preparation for subsequent processing such as image registration, image stitching and image fusion; it is therefore a key step in many image applications. Feature extraction divides into two stages, feature detection and feature description. Of the two, feature description has the higher computational complexity, because it must exploit the grey-level variation of the pixels around each detected feature point to determine attributes such as orientation and scale. This has kept many feature-based image processing methods out of real-time applications.
Disclosure of Invention
Aiming at the problems of high computational complexity and poor real-time performance of feature description in the prior art, the invention provides a feature description system based on inter-feature azimuth distance.
The feature description system based on the inter-feature azimuth distance comprises a Gaussian smoothing unit, a feature detector, a feature screening unit, an inter-feature relation calculating unit and a feature descriptor generating unit;
the feature screening unit comprises a secondary feature screening unit and a main feature screening unit;
the feature descriptor generating unit comprises a direction estimating unit and a direction intensity histogram statistics unit;
the Gaussian smoothing unit performs smoothing processing on the input image I (x, y) and then performs feature detection through a feature detector;
the feature detector detects the feature point set F^(I) of the image, obtaining at the same time each feature point f_i^(I), its position in the image Loc(f_i^(I)) = (x_i, y_i), and its response intensity Mag(f_i^(I)); the feature point set is expressed as:

$$F^{(I)} = \{\, f_i^{(I)} \mid i \in [1, 2, \ldots, N_f] \,\}$$

where i is the ordinal of the detected feature point f_i^(I), i.e. the feature point set of the image contains N_f feature points;
the secondary feature screening unit screens the feature point set F^(I) obtained by the feature detector for secondary feature points; the method comprises the following steps:
a modulated characteristic response intensity Mag_m is introduced, computed from the response intensity Mag(f_i^(I)) and the position (x_i, y_i) of each feature point, where M and N denote the width and height of the image; all feature points are sorted in descending order of modulated response intensity, the first half are selected as secondary feature points, and ordinals are then reassigned, giving f_i^(I,SF); the secondary feature points form the secondary feature point set F^(I,SF), expressed as:

$$F^{(I,SF)} = \{\, f_i^{(I,SF)} \mid i \in [1, 2, \ldots, N_f/2] \,\}$$
the secondary feature point set F^(I,SF) is screened by the main feature screening unit to determine the main feature points, the specific process being as follows:
initializing the main feature point set F^(I,PF) of the image to the empty set, i.e. F^(I,PF) = ∅;
for each secondary feature point f_i^(I,SF), defining a circular domain D(f_i^(I,SF)) of radius R, expressed by the following formula:

$$D(f_i^{(I,SF)}) = \{\, (x, y) \mid (x - x_i)^2 + (y - y_i)^2 < R^2 \,\}$$

if no other secondary feature point f_j^(I,SF) in the circle satisfies Mag(f_j^(I,SF)) > Mag(f_i^(I,SF)), taking f_i^(I,SF) as a main feature point, i.e. f_i^(I,SF) ∈ F^(I,PF);
reassigning ordinals to all main feature points in the main feature point set F^(I,PF) to obtain the main feature points f_i^(I,PF);
the feature point sets then satisfy:

$$F^{(I,PF)} \subseteq F^{(I,SF)} \subseteq F^{(I)}$$

the inter-feature relation calculating unit calculates the relative azimuth and distance between each main feature point f_i^(I,PF) and every other secondary feature point f_j^(I,SF);
the direction estimating unit determines the main direction of each main feature point, takes the main direction as the reference direction, and maps the relative azimuths of all secondary feature points with respect to the main feature point clockwise into the range [0, 2π);
the main direction Ori(f_i^(I,PF)) is determined by the relative azimuth of the secondary feature point most strongly related to the main feature point, namely:

$$\mathrm{Ori}(f_i^{(I,PF)}) = \mathrm{Azim}(f_{j^*}^{(I,SF)} \mid f_i^{(I,PF)})$$

where j* satisfies:

$$j^* = \arg\max_{j}\; \mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)})$$

the relative azimuth after remapping is denoted Azim_rm(f_j^(I,SF) | f_i^(I,PF));
the direction intensity histogram statistics unit gathers statistics over the main direction Ori(f_i^(I,PF)) and the remapped relative azimuths Azim_rm(f_j^(I,SF) | f_i^(I,PF)) to generate the feature description vector; the specific method comprises the following steps:
for each main feature point, taking it as the centre, dividing the image into n sector areas starting from its main direction, representing n direction intervals;
then calculating, for each direction interval, the sum of the relation strengths of all secondary feature points in that interval relative to the main feature point, which forms the intensity of the direction interval;
after the intensities of all direction intervals are obtained, they compose the feature description vector V(f_i^(I,PF) | F^(I,SF)) of f_i^(I,PF), expressed by the following formula:

$$V(f_i^{(I,PF)} \mid F^{(I,SF)}) = [\, O_1(f_i^{(I,PF)} \mid F^{(I,SF)}), \ldots, O_n(f_i^{(I,PF)} \mid F^{(I,SF)}) \,]$$

until the feature description vectors of all main feature points are obtained.
The invention has the following beneficial effects:
The secondary feature screening unit combines the response intensities of the detected feature points with their position coordinates in the image to determine the secondary features, improving the repeatability of the system; the main feature screening unit determines the main feature points from among the secondary feature points, improving the robustness of the system to changes of viewing angle.
During feature description the system quantitatively describes features using the azimuth-distance information between them, so it can describe features without a scale space or grey-gradient information, while retaining a degree of distinctiveness and scale invariance. The method has low computational complexity, is easy to implement in hardware, helps accelerate applications such as feature matching and image registration, and opens the way to fast, real-time use.
Drawings
FIG. 1 is a functional block diagram of a feature description system based on inter-feature bearing distances according to the present invention;
FIG. 2 is a schematic diagram of principal feature point determination;
FIG. 3 is a schematic diagram of the calculation of bearing and distance between features;
FIG. 4 is a histogram of relative azimuth versus relation strength between features;
FIG. 5 is a histogram of remapped relative azimuth versus relation strength between features;
FIG. 6 is an azimuth interval histogram.
Detailed Description
The feature description system based on inter-feature azimuth distance of this embodiment is described with reference to FIG. 1 to FIG. 6. It comprises a Gaussian smoothing unit, a feature detector, a feature screening unit, an inter-feature relation calculating unit and a feature descriptor generating unit; the feature screening unit comprises a secondary feature screening unit and a main feature screening unit, and the feature descriptor generating unit comprises a direction estimating unit and a direction intensity histogram statistics unit.
firstly, an image I (x, y) is acquired from an external device (such as a camera, a CMOS detector and the like), the Gaussian smoothing unit smoothes the acquired image so as to reduce uncertainty of noise on subsequent feature detection, improve repeatability of subsequent feature screening and feature description, and acquire a smoothed image.
The feature detector performs feature detection on the smoothed image, obtaining at the same time the positions of the feature points in the image and their response intensities. Any feature detection method may be used, for example FAST, SIFT or SURF, but FAST is typically used to improve computational performance. Once all features have been acquired, a feature set for the image I(x, y) is obtained, comprising all feature points f_i^(I) together with their position coordinates Loc(f_i^(I)) = (x_i, y_i) and response intensities Mag(f_i^(I)). Detection thus yields the feature set

$$F^{(I)} = \{\, f_i^{(I)} \mid i \in [1, 2, \ldots, N_f] \,\} \quad (1)$$

where i is the ordinal of the detected feature point, i.e. the feature point set of the image contains N_f feature points.
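For illustration, a sketch of this detection stage follows, using OpenCV's FAST as the text suggests; the detection threshold is an assumed parameter, and the keypoint coordinates and responses stand in for Loc(f_i^(I)) and Mag(f_i^(I)).

```python
# Sketch of the feature detector stage using OpenCV's FAST (threshold assumed).
# Keypoint coordinates and responses play the roles of Loc(f_i) and Mag(f_i).
import cv2
import numpy as np

def detect_features(smoothed: np.ndarray, threshold: int = 20):
    fast = cv2.FastFeatureDetector_create(threshold=threshold)
    keypoints = fast.detect(smoothed, None)
    locs = np.array([kp.pt for kp in keypoints], dtype=np.float64)   # (x_i, y_i)
    mags = np.array([kp.response for kp in keypoints])               # Mag(f_i)
    return locs, mags
```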
In order to improve the repeatability of the system, the secondary feature screening unit screens the detected feature points using both their response intensities and their image coordinates. A feature point near the centre of the image can be described by the azimuth-distance information of other feature points in every direction, whereas a feature point near an edge or a corner can only be described by feature points over half or a quarter of the directions; feature points near the centre can therefore be described more accurately. Accordingly, the response intensities of features near the image centre are weighted more heavily, and those of features near the edges or corners more lightly. For this purpose a modulated characteristic response intensity Mag_m is introduced, computed from the response intensity and the position (x_i, y_i) of each feature point, where M and N denote the width and height of the image. All feature points are sorted in descending order of modulated response intensity, the first half are selected as secondary feature points, and ordinals are reassigned, giving f_i^(I,SF). These secondary feature points constitute the secondary feature point set

$$F^{(I,SF)} = \{\, f_i^{(I,SF)} \mid i \in [1, 2, \ldots, N_f/2] \,\} \quad (3)$$
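The following sketch illustrates this screening step. Since the exact modulation formula is not legible in this text, a separable sine window that peaks at the image centre and decays toward the borders is assumed purely for illustration; any modulation with the weighting behaviour described above could be substituted.

```python
# Sketch of the secondary feature screening unit. The patent's modulation
# formula is not reproduced here; a separable sine window (ASSUMED, for
# illustration only) that favours features near the image centre is used.
import numpy as np

def screen_secondary(locs: np.ndarray, mags: np.ndarray, M: int, N: int):
    x, y = locs[:, 0], locs[:, 1]
    mag_m = mags * np.sin(np.pi * x / M) * np.sin(np.pi * y / N)  # assumed Mag_m
    order = np.argsort(-mag_m)          # descending modulated intensity
    keep = order[: len(order) // 2]     # first half become secondary features
    return locs[keep], mag_m[keep]
```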
Since the response intensity returned by the feature detector may vary with the parallax of the image, the stability of the feature description can be affected. To mitigate this, the main feature screening unit further determines main feature points from among the secondary feature points, according to the following steps (a code sketch follows the list):
a) Initialize the main feature point set of the image to the empty set, i.e. F^(I,PF) = ∅;
b) For each secondary feature point f_i^(I,SF), define a circular domain D(f_i^(I,SF)) of radius R:

$$D(f_i^{(I,SF)}) = \{\, (x, y) \mid (x - x_i)^2 + (y - y_i)^2 < R^2 \,\} \quad (4)$$

where R = min(M, N)/20;
c) If no other secondary feature point f_j^(I,SF) inside the circle satisfies Mag(f_j^(I,SF)) > Mag(f_i^(I,SF)), take f_i^(I,SF) as a main feature point, i.e. f_i^(I,SF) ∈ F^(I,PF);
d) Reassign ordinals to all main feature points in F^(I,PF), obtaining f_i^(I,PF) (how the ordinals are assigned does not matter).
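A minimal sketch of steps a) to d) under the stated radius rule follows; it assumes the modulated response intensities are passed in as mags and uses a brute-force neighbour search rather than any particular spatial index.

```python
# Sketch of main feature screening steps a)-d): promote a secondary point if
# no stronger secondary point lies inside its circle of radius R = min(M, N)/20.
# Brute-force O(n^2) neighbour search, kept simple for clarity.
import numpy as np

def screen_main(locs: np.ndarray, mags: np.ndarray, M: int, N: int):
    R2 = (min(M, N) / 20.0) ** 2
    keep = []
    for i in range(len(locs)):
        d2 = np.sum((locs - locs[i]) ** 2, axis=1)
        inside = (d2 < R2) & (np.arange(len(locs)) != i)   # other points in D(f_i)
        if not np.any(mags[inside] > mags[i]):             # step c) criterion
            keep.append(i)                                 # f_i joins F^(I,PF)
    return locs[keep], mags[keep]
```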
As shown in FIG. 2, suppose an image contains 14 secondary feature points A to N, the numbers beside the labels give their modulated response intensities, and the circle radius is as drawn; then A, B, C, F, J, K, M and N are determined to be main feature points by criterion c) above.
It can be seen that these feature point sets satisfy

$$F^{(I,PF)} \subseteq F^{(I,SF)} \subseteq F^{(I)}$$
After all main feature points are determined, the inter-feature relation calculating unit calculates the relative azimuth and distance between each main feature point and every other secondary feature point. Suppose there is a main feature point f_i^(I,PF) and another secondary feature point f_j^(I,SF); then the azimuth of f_j^(I,SF) relative to f_i^(I,PF), Azim(f_j^(I,SF) | f_i^(I,PF)), and the distance, Dist(f_j^(I,SF) | f_i^(I,PF)), are

$$\mathrm{Azim}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) = \operatorname{atan2}(y_j - y_i,\; x_j - x_i) \quad (5)$$

$$\mathrm{Dist}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) = \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2} \quad (6)$$

where (x_i, y_i) = Loc(f_i^(I,PF)) and (x_j, y_j) = Loc(f_j^(I,SF)). The relation strength between f_j^(I,SF) and f_i^(I,PF) is defined as the reciprocal of the distance:

$$\mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) = \frac{1}{\mathrm{Dist}(f_j^{(I,SF)} \mid f_i^{(I,PF)})} \quad (7)$$
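The sketch below illustrates the relation calculating unit under the formulas above; the atan2 azimuth convention is an assumption, since the source's azimuth formula is not legible and image coordinates can orient the angle either way.

```python
# Sketch of the inter-feature relation calculating unit: azimuth (atan2
# convention ASSUMED), Euclidean distance, and reciprocal-distance strength.
import numpy as np

def relations(main_pt: np.ndarray, secondary: np.ndarray):
    dx = secondary[:, 0] - main_pt[0]
    dy = secondary[:, 1] - main_pt[1]
    azim = np.mod(np.arctan2(dy, dx), 2 * np.pi)   # Azim(f_j | f_i) in [0, 2*pi)
    dist = np.hypot(dx, dy)                        # Dist(f_j | f_i)
    rel = np.where(dist > 0, 1.0 / dist, 0.0)      # Rel = 1 / Dist
    return azim, dist, rel
```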
taking the feature point A as an example, as shown in FIG. 3, the rest of the feature points are regarded as secondary feature points, and after one direction is selected as the absolute direction, the relative orientation theta between the rest of the feature points and the point A is obtained by taking the absolute direction as a reference i And the distance, and the relationship strength was obtained at the same time, as shown in table 1.
TABLE 1
After the relations between every main feature point and all other secondary feature points are determined, the direction estimating unit determines the main direction of each main feature point (FIG. 3 also shows the main direction of one main feature point), takes that direction as the reference (zero) direction, and maps the relative azimuths of all secondary feature points with respect to the main feature point clockwise into the range [0, 2π). The histogram in FIG. 4 illustrates another example, showing the relations between one main feature point and all other secondary feature points.
The main direction Ori(f_i^(I,PF)) is determined by the relative azimuth of the secondary feature point most strongly related to the main feature point, i.e.

$$\mathrm{Ori}(f_i^{(I,PF)}) = \mathrm{Azim}(f_{j^*}^{(I,SF)} \mid f_i^{(I,PF)}) \quad (8)$$

where j* satisfies

$$j^* = \arg\max_{j}\; \mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) \quad (9)$$

The relative azimuth after remapping is denoted Azim_rm(f_j^(I,SF) | f_i^(I,PF)).
If the main direction in FIG. 4 is taken as the reference direction and the relative azimuths of all secondary feature points with respect to the main feature point are mapped clockwise into the range [0, 2π), FIG. 5 is obtained.
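A sketch of the direction estimating unit follows; expressing the clockwise remapping as (Ori - Azim) mod 2π is one possible reading of the clockwise convention in image coordinates, and is an assumption.

```python
# Sketch of the direction estimating unit: the main direction is the azimuth
# of the strongest-related secondary point (formulas (8)-(9)); azimuths are
# then remapped clockwise into [0, 2*pi) relative to it. The (ori - azim)
# form of "clockwise" is an assumed convention.
import numpy as np

def remap_azimuths(azim: np.ndarray, rel: np.ndarray):
    ori = azim[np.argmax(rel)]                 # Ori(f_i)
    azim_rm = np.mod(ori - azim, 2 * np.pi)    # Azim_rm(f_j | f_i)
    return ori, azim_rm
```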
Based on this, the direction intensity histogram statistics unit gathers statistics over this information to generate the feature description vector. The specific method is as follows: for each main feature point, taking it as the centre, the image is divided into n sector areas starting from its main direction (n is generally greater than 10; the larger n is, the more distinctive the generated descriptors, but the computation during feature matching also grows), representing n direction intervals. Then the sum of the relation strengths, relative to the main feature point, of all secondary feature points falling in each direction interval is counted and called the intensity of that direction interval. For example, the sum of the relation strengths of all secondary feature points in the p-th sector area relative to the main feature point is O_p(f_i^(I,PF) | F^(I,SF)):

$$O_p(f_i^{(I,PF)} \mid F^{(I,SF)}) = \sum_{j} \mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) \cdot \mathbf{1}\!\left(\mathrm{Azim}_{rm}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) \in \left[\tfrac{2\pi(p-1)}{n}, \tfrac{2\pi p}{n}\right)\right) \quad (10)$$

where 1(·) is the indicator function, equal to 1 when its argument holds and 0 otherwise.
After the intensities of all direction intervals are obtained, they compose the feature description vector V(f_i^(I,PF) | F^(I,SF)) of f_i^(I,PF):

$$V(f_i^{(I,PF)} \mid F^{(I,SF)}) = [\, O_1(f_i^{(I,PF)} \mid F^{(I,SF)}), \ldots, O_n(f_i^{(I,PF)} \mid F^{(I,SF)}) \,] \quad (12)$$

For example, in FIG. 5 all directions are divided into n = 10 direction intervals starting from the main direction; computing the intensity of each interval yields the azimuth interval histogram, i.e. the feature description vector of the feature, as shown in FIG. 6.
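The following sketch illustrates the histogram statistics unit, accumulating relation strengths into n sector bins per formula (10); n = 10 matches the FIG. 5 example.

```python
# Sketch of the direction intensity histogram statistics unit: accumulate
# relation strengths into n sector bins starting from the main direction,
# giving the n-dimensional descriptor V(f_i | F^(I,SF)) of formula (12).
import numpy as np

def describe(azim_rm: np.ndarray, rel: np.ndarray, n: int = 10) -> np.ndarray:
    sector = (azim_rm // (2 * np.pi / n)).astype(int) % n  # sector index per point
    v = np.zeros(n)
    np.add.at(v, sector, rel)    # O_p: sum of relation strengths in sector p
    return v
```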
The above steps are repeated until the feature description vectors of all main feature points are obtained.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described therein may still be modified, or some of their technical features replaced by equivalents, and that such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the invention.

Claims (3)

1. A feature description system based on inter-feature azimuth distance, characterized in that: the system comprises a Gaussian smoothing unit, a feature detector, a feature screening unit, an inter-feature relation calculating unit and a feature descriptor generating unit;
the feature screening unit comprises a secondary feature screening unit and a main feature screening unit;
the feature descriptor generating unit comprises a direction estimating unit and a direction intensity histogram statistics unit;
the Gaussian smoothing unit performs smoothing processing on the input image I (x, y) and then performs feature detection through a feature detector;
the feature detector detects the feature point set F^(I) of the image, obtaining at the same time each feature point f_i^(I), its position in the image Loc(f_i^(I)) = (x_i, y_i), and its response intensity Mag(f_i^(I)); the feature point set is expressed as:

$$F^{(I)} = \{\, f_i^{(I)} \mid i \in [1, 2, \ldots, N_f] \,\}$$

where i is the ordinal of the detected feature point f_i^(I), i.e. the feature point set of the image contains N_f feature points;
the secondary feature screening unit screens the feature point set F^(I) obtained by the feature detector for secondary feature points; the method comprises the following steps:
a modulated characteristic response intensity Mag_m is introduced, computed from the response intensity Mag(f_i^(I)) and the position (x_i, y_i) of each feature point, where M and N denote the width and height of the image; all feature points are sorted in descending order of modulated response intensity, the first half are selected as secondary feature points, and ordinals are then reassigned, giving f_i^(I,SF); the secondary feature points form the secondary feature point set F^(I,SF), expressed as:

$$F^{(I,SF)} = \{\, f_i^{(I,SF)} \mid i \in [1, 2, \ldots, N_f/2] \,\}$$
the secondary feature point set F^(I,SF) is screened by the main feature screening unit to determine the main feature points, the specific process being as follows:
initializing the main feature point set F^(I,PF) of the image to the empty set, i.e. F^(I,PF) = ∅;
for each secondary feature point f_i^(I,SF), defining a circular domain D(f_i^(I,SF)) of radius R, expressed by the following formula:

$$D(f_i^{(I,SF)}) = \{\, (x, y) \mid (x - x_i)^2 + (y - y_i)^2 < R^2 \,\}$$

if no other secondary feature point f_j^(I,SF) in the circle satisfies Mag(f_j^(I,SF)) > Mag(f_i^(I,SF)), taking f_i^(I,SF) as a main feature point, i.e. f_i^(I,SF) ∈ F^(I,PF);
reassigning ordinals to all main feature points in the main feature point set F^(I,PF) to obtain the main feature points f_i^(I,PF);
the feature point sets then satisfy:

$$F^{(I,PF)} \subseteq F^{(I,SF)} \subseteq F^{(I)}$$

the inter-feature relation calculating unit calculates the relative azimuth and distance between each main feature point f_i^(I,PF) and every other secondary feature point f_j^(I,SF);
the direction estimating unit determines the main direction of each main feature point, takes the main direction as the reference direction, and maps the relative azimuths of all secondary feature points with respect to the main feature point clockwise into the range [0, 2π);
the main direction Ori(f_i^(I,PF)) is determined by the relative azimuth of the secondary feature point most strongly related to the main feature point, namely:

$$\mathrm{Ori}(f_i^{(I,PF)}) = \mathrm{Azim}(f_{j^*}^{(I,SF)} \mid f_i^{(I,PF)})$$

where j* satisfies:

$$j^* = \arg\max_{j}\; \mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)})$$

the relative azimuth after remapping is denoted Azim_rm(f_j^(I,SF) | f_i^(I,PF));
the direction intensity histogram statistics unit gathers statistics over the main direction Ori(f_i^(I,PF)) and the remapped relative azimuths Azim_rm(f_j^(I,SF) | f_i^(I,PF)) to generate the feature description vector; the specific method comprises the following steps:
for each main feature point, taking it as the centre, dividing the image into n sector areas starting from its main direction, representing n direction intervals;
then calculating, for each direction interval, the sum of the relation strengths of all secondary feature points in that interval relative to the main feature point, which forms the intensity of the direction interval;
after the intensities of all direction intervals are obtained, they compose the feature description vector V(f_i^(I,PF) | F^(I,SF)) of f_i^(I,PF), expressed by the following formula:

$$V(f_i^{(I,PF)} \mid F^{(I,SF)}) = [\, O_1(f_i^{(I,PF)} \mid F^{(I,SF)}), \ldots, O_n(f_i^{(I,PF)} \mid F^{(I,SF)}) \,]$$

until the feature description vectors of all main feature points are obtained.
2. The feature description system based on inter-feature azimuth distance according to claim 1, characterized in that: the inter-feature relation calculating unit calculates the relative azimuth and distance between each main feature point f_i^(I,PF) and every other secondary feature point f_j^(I,SF) as follows:
let f_i^(I,PF) be a main feature point and f_j^(I,SF) another secondary feature point; the azimuth Azim(f_j^(I,SF) | f_i^(I,PF)) and distance Dist(f_j^(I,SF) | f_i^(I,PF)) of f_j^(I,SF) relative to f_i^(I,PF) are expressed by the following formulas:

$$\mathrm{Azim}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) = \operatorname{atan2}(y_j - y_i,\; x_j - x_i)$$

$$\mathrm{Dist}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) = \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2}$$

where (x_i, y_i) = Loc(f_i^(I,PF)) and (x_j, y_j) = Loc(f_j^(I,SF)); and the relation strength between f_j^(I,SF) and f_i^(I,PF) is defined as the reciprocal of their distance:

$$\mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) = \frac{1}{\mathrm{Dist}(f_j^{(I,SF)} \mid f_i^{(I,PF)})}$$
3. The feature description system based on inter-feature azimuth distance according to claim 1, characterized in that: the sum of the relation strengths of all secondary feature points in the p-th sector area relative to the main feature point is O_p(f_i^(I,PF) | F^(I,SF)), expressed by the following formula:

$$O_p(f_i^{(I,PF)} \mid F^{(I,SF)}) = \sum_{j} \mathrm{Rel}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) \cdot \mathbf{1}\!\left(\mathrm{Azim}_{rm}(f_j^{(I,SF)} \mid f_i^{(I,PF)}) \in \left[\tfrac{2\pi(p-1)}{n}, \tfrac{2\pi p}{n}\right)\right)$$

where 1(·) is the indicator function, equal to 1 when its argument holds and 0 otherwise;
after the intensities of all direction intervals are obtained, they compose the feature description vector V(f_i^(I,PF) | F^(I,SF)) of f_i^(I,PF), until the feature description vectors of all main feature points are obtained.
CN202110974875.1A 2021-08-24 2021-08-24 Feature description system based on inter-feature azimuth distance Active CN113689403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110974875.1A CN113689403B (en) 2021-08-24 2021-08-24 Feature description system based on inter-feature azimuth distance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110974875.1A CN113689403B (en) 2021-08-24 2021-08-24 Feature description system based on inter-feature azimuth distance

Publications (2)

Publication Number Publication Date
CN113689403A (en) 2021-11-23
CN113689403B (en) 2023-09-19

Family

ID=78581910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110974875.1A Active CN113689403B (en) 2021-08-24 2021-08-24 Feature description system based on inter-feature azimuth distance

Country Status (1)

Country Link
CN (1) CN113689403B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8494227B2 (en) * 2007-04-17 2013-07-23 Francine J. Prokoski System and method for using three dimensional infrared imaging to identify individuals
CN107633526B (en) * 2017-09-04 2022-10-14 腾讯科技(深圳)有限公司 Image tracking point acquisition method and device and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011149303A2 (en) * 2010-05-27 2011-12-01 Samsung Electronics Co., Ltd. Image capturing and display apparatus and method
CN104680550A (en) * 2015-03-24 2015-06-03 江南大学 Method for detecting defect on surface of bearing by image feature points
CN107945221A (en) * 2017-12-08 2018-04-20 北京信息科技大学 A kind of three-dimensional scenic feature representation based on RGB D images and high-precision matching process
CN108664983A (en) * 2018-05-21 2018-10-16 天津科技大学 A kind of scale and the adaptive SURF characteristic point matching methods of characteristic strength
CN111738278A (en) * 2020-06-22 2020-10-02 黄河勘测规划设计研究院有限公司 Underwater multi-source acoustic image feature extraction method and system
CN112085772A (en) * 2020-08-24 2020-12-15 南京邮电大学 Remote sensing image registration method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Selection and Fusion of Color Models for Image Feature Detection; Harro Stokman; IEEE Transactions on Pattern Analysis and Machine Intelligence; full text *
Wavelet SCM algorithm for autonomous cloud detection by remote sensing cameras (实现遥感相机自主辨云的小波SCM算法); Tao Shuping (陶淑苹); Acta Geodaetica et Cartographica Sinica (测绘学报); full text *
Improved image matching algorithm based on deep convolutional networks (改进的基于深度卷积网的图像匹配算法); Lei Ming, Liu Chuancai (雷鸣, 刘传才); Computer Systems & Applications (No. 01); 168-174 *

Also Published As

Publication number Publication date
CN113689403A (en) 2021-11-23

Similar Documents

Publication Publication Date Title
CN110097093B (en) Method for accurately matching heterogeneous images
CN111080529A (en) Unmanned aerial vehicle aerial image splicing method for enhancing robustness
WO2018219054A1 (en) Method, device, and system for license plate recognition
CN106960451B (en) Method for increasing number of feature points of image weak texture area
CN110992263B (en) Image stitching method and system
AU2007285683A1 (en) Method of image processing
CN111667506B (en) Motion estimation method based on ORB feature points
CN109472770B (en) Method for quickly matching image characteristic points in printed circuit board detection
CN111680699B (en) Air-ground infrared time-sensitive weak small target detection method based on background suppression
WO2022267287A1 (en) Image registration method and related apparatus, and device and storage medium
CN113643334A (en) Different-source remote sensing image registration method based on structural similarity
TW201926244A (en) Real-time video stitching method
CN113421206B (en) Image enhancement method based on infrared polarization imaging
CN108229500A (en) A kind of SIFT Mismatching point scalping methods based on Function Fitting
CN112861870B (en) Pointer instrument image correction method, system and storage medium
CN109410235B (en) Target tracking method fusing edge features
WO2015113608A1 (en) Method for recognizing objects
CN114331879A (en) Visible light and infrared image registration method for equalized second-order gradient histogram descriptor
CN113689403B (en) Feature description system based on inter-feature azimuth distance
CN113744307A (en) Image feature point tracking method and system based on threshold dynamic adjustment
CN114358166A (en) Multi-target positioning method based on self-adaptive k-means clustering
CN114549400A (en) Image identification method and device
Cai et al. Feature detection and matching with linear adjustment and adaptive thresholding
CN111950563A (en) Image matching method and device and computer readable storage medium
US11238309B2 (en) Selecting keypoints in images using descriptor scores

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant