CN115775324A - Phase correlation image matching method under guidance of cross-scale filtering - Google Patents

Phase correlation image matching method under guidance of cross-scale filtering

Info

Publication number
CN115775324A
Authority
CN
China
Prior art keywords
image
scale
images
matching
phase
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211603863.9A
Other languages
Chinese (zh)
Other versions
CN115775324B (en)
Inventor
程翔
周伟
张永军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN202211603863.9A priority Critical patent/CN115775324B/en
Publication of CN115775324A publication Critical patent/CN115775324A/en
Application granted granted Critical
Publication of CN115775324B publication Critical patent/CN115775324B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a phase correlation image matching method under the guidance of cross-scale filtering, comprising the following steps: stable structural information in the image is extracted by simulating the visual cortex receptive field response with Log-Gabor filtering, and image matching is then realized by extending phase correlation and calculating the change relation between the phases of the structure images. By extracting stable structural information with Log-Gabor filtering, the method weakens the influence of radiometric change on matching; by taking extended phase correlation as the basic means of computing inter-image change, it can effectively handle radiometric differences, scale differences and rotation changes between the images to be matched, realizes overall matching between images with rotation, scale difference and relative translation, improves image matching accuracy, and can be used for conventional assisted navigation based on matching of downward-looking images or downward-looking corrected images.

Description

Phase correlation image matching method under guidance of cross scale filtering
Technical Field
The invention relates to the technical field of image processing, and in particular to a phase correlation image matching method under the guidance of cross-scale filtering.
Background
Unmanned aerial vehicles (UAVs) have the advantages of small size, light weight, high flexibility, strong concealment, low cost, and no safety risk to onboard personnel, and are widely used in civil and military fields such as disaster monitoring, geological exploration, surveying and mapping, military reconnaissance, target attack, and battlefield situation monitoring. An advanced navigation system is of great significance for the safe application of UAVs, especially in working environments that are difficult to reach by manual wire control or remote control, such as long-range and long-duration operations; a complete, high-precision autonomous navigation system is a key guarantee for a UAV to improve its survivability and successfully complete its tasks.
At present, research on and application of scene-matching assisted navigation inspired by visual perception have achieved some results, but under the influence of complex natural environments and flight-state-induced image changes, its positioning performance still lags far behind the human visual perception system. Image matching is the core technology of scene-matching assisted navigation: an image captured in real time during UAV flight is matched against a pre-prepared image with geographic reference information, and the spatial plane position of the UAV at imaging time is then computed from the matching and positioning result of the real-time image, thereby realizing UAV positioning.
Disclosure of Invention
In view of the above problems, the present invention provides a phase correlation image matching method under the guidance of cross-scale filtering, which solves the problem that existing image matching methods cannot effectively cope with radiometric differences, scale differences and rotation changes between the images to be matched and are therefore prone to mismatches.
In order to achieve the purpose of the invention, the invention is realized by the following technical scheme: a cross-scale filtering guided phase correlation image matching method comprises the following steps:
Step one: geometrically correcting a real-time image into a temporary image according to attitude data in a conventional navigation environment, taking the real-time image as the image to be registered, taking a waypoint image as the reference image, and combining the reference image and the image to be registered as image input data;
Step two: first performing Fourier transform processing on the reference image and the temporary image, calculating the change relation between the images by Log-Gabor filtering, then taking phase correlation as the basic matching model, and converting the scale and rotation variation between the images into translation parameters that can be solved by phase correlation under a log-polar coordinate system;
Step three: after obtaining the translation parameters, simultaneously performing Fourier transform on the image input data to convert it from the spatial domain to the frequency domain, simulating the visual cortex receptive field response by Log-Gabor filtering, and extracting stable structural information from the image input data;
Step four: based on the extracted structural information and combined with phase correlation, accurately calculating the relative translation between the reference image and the image to be registered, and with extended phase correlation sequentially performing two-step phase correlation under a log-polar coordinate system and a Cartesian coordinate system to solve the rotation parameter and the scale parameter between the image to be registered and the reference image respectively;
Step five: calculating the geometric transformation relation between the image to be registered and the reference image from the solved rotation, scale and translation parameters, constructing multi-scale image pyramids of the two images based on the geometric transformation relation, and finally realizing image matching by solving the global transformation parameters between the two images.
A further improvement is that in the first step, the attitude data in the conventional navigation environment is provided by camera downward-looking imaging or by an inertial navigation system, and the temporary image is a downward-looking image.
A further improvement is that in the first step, when the waypoint image is used as the reference, control points are extracted through feature matching between the real-time image and the waypoint image; when no valid waypoint information exists within the range of the real-time image, feature matching is carried out using the previous frame image as the reference, and the matched connection points are converted into control points by means of the positioning parameters of the previous frame image.
A further improvement is that in the second step, the change relation between the images is calculated as follows: different hypercolumn responses are simulated by Log-Gabor filtering to construct hypercolumn vectors, the local image is converted into a bionic visual-cell coordinate system, the consistency of the hypercolumn vectors of the same target across different images is then taken as the basic criterion, and the change relation between the images is calculated according to an affine transformation model.
A further improvement is that in the second step, during the Fourier transform processing, the image signal S(x, y) is Fourier transformed to obtain a complex-valued spectrum:
F(ω_x, ω_y) = R(ω_x, ω_y) + i·I(ω_x, ω_y)
where (ω_x, ω_y) are the frequency-domain coordinates, R is the real part and I is the imaginary part; expressed in exponential form as a combination of amplitude and phase:
F(ω_x, ω_y) = |F(ω_x, ω_y)| · e^(i·φ(ω_x, ω_y))
where |F(ω_x, ω_y)| denotes the amplitude at frequency (ω_x, ω_y) and φ(ω_x, ω_y) denotes the phase at (ω_x, ω_y).
A further improvement is that in the third step, when the structural information is extracted, cross-scale Log-Gabor filtering is used to extract the multi-scale structural features of the two images respectively, enlarging the image scale overlap rate and thereby enhancing the robustness of the extended phase correlation to scale differences.
A further improvement is that in the fourth step, before the two-step phase correlation is carried out sequentially under the log-polar and Cartesian coordinate systems, the relative rotation angle and scale difference between the images are calculated through coordinate-system conversion, realizing overall matching between images with rotation, scale difference and relative translation.
A further improvement is that in the fifth step, when the image pyramids are constructed, the images are resampled in both directions simultaneously, enlarging the scale overlap range between the two image pyramids.
The invention has the following beneficial effects: the invention fully exploits the stable structural information in an image through Log-Gabor filtering with good bionic properties, weakening the influence of radiometric change on matching. Inspired by the basic principle that the biological visual system achieves target matching by comparing the geometric change of the same target in different scenes, it combines the basic properties that phase correlation can accurately calculate the relative offset between translated images and that extended phase correlation can effectively calculate the relative rotation angle and scale difference between images through coordinate-system conversion. The method can therefore cope effectively with radiometric differences, scale differences and rotation changes between the images to be matched, realizes overall matching of images with rotation, scale difference and relative translation, improves image matching accuracy, and can be used in conventional assisted navigation based on downward-looking images or downward-looking corrected images.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flow chart of the image matching method of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the present embodiment provides a phase correlation image matching method under the guidance of cross-scale filtering, comprising the following steps:
Step one: in a conventional navigation environment, the real-time image is geometrically corrected into a downward-looking image, which serves as the temporary image, according to attitude data provided by camera downward-looking imaging or by an inertial navigation system; the real-time image is taken as the image to be registered, the waypoint image as the reference image, and the two are combined as the image input data. When the waypoint image is used as the reference, control points are extracted through feature matching between the real-time image and the waypoint image; when no valid waypoint information exists within the range of the real-time image, feature matching is carried out using the previous frame image as the reference, and the matched connection points are converted into control points by means of the positioning parameters of the previous frame image;
Step two: Fourier transform processing is first performed on the reference image and the temporary image, the change relation between the images is calculated by Log-Gabor filtering, phase correlation is then taken as the basic matching model, and the scale and rotation variation between the images is converted into translation parameters that can be solved by phase correlation under a log-polar coordinate system;
The change relation between the images is calculated as follows: different hypercolumn responses are simulated by Log-Gabor filtering to construct hypercolumn vectors, the local image is converted into a bionic visual-cell coordinate system, the consistency of the hypercolumn vectors of the same target across different images is then taken as the basic criterion, and the change relation between the images is calculated according to an affine transformation model;
in the process of Fourier transform processing, the image signal S (x, y) is subjected to Fourier transform to obtain a complex function frequency spectrum:
F(ω_x, ω_y) = R(ω_x, ω_y) + i·I(ω_x, ω_y)
where (ω_x, ω_y) are the frequency-domain coordinates, R is the real part and I is the imaginary part; expressed in exponential form as a combination of amplitude and phase:
F(ω_x, ω_y) = |F(ω_x, ω_y)| · e^(i·φ(ω_x, ω_y))
where |F(ω_x, ω_y)| denotes the amplitude at frequency (ω_x, ω_y) and φ(ω_x, ω_y) denotes the phase at (ω_x, ω_y);
Step three: after the translation parameters are obtained, Fourier transform is simultaneously performed on the image input data to convert it from the spatial domain to the frequency domain, the visual cortex receptive field response is simulated by Log-Gabor filtering, and stable structural information is extracted from the image input data; when the structural information is extracted, cross-scale Log-Gabor filtering is used to extract the multi-scale structural features of the two images respectively, enlarging the image scale overlap rate and enhancing the robustness of the extended phase correlation to scale differences;
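A hedged sketch of the cross-scale structure extraction in step three: an isotropic Log-Gabor filter bank is applied in the frequency domain and the responses are pooled into a single structure image. The center-frequency set {f0·mult^s} mirrors the cross-scale frequency set of the CSLGPC procedure below, but the particular f0, multiplier, bandwidth ratio and the pooling by summed magnitude are illustrative assumptions, not values specified by the patent:

```python
import numpy as np

def log_gabor_structure(image, f0=0.1, mult=2.0, scales=(-2, -1, 0, 1, 2),
                        sigma_ratio=0.55):
    """Pool isotropic Log-Gabor responses over several scales into one
    structure image (input: 2-D grayscale array)."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)
    radius[0, 0] = 1.0                       # avoid log(0) at the DC term
    F = np.fft.fft2(image.astype(np.float64))
    structure = np.zeros((h, w))
    for s in scales:
        fc = f0 * (mult ** s)                # center frequency of this scale
        lg = np.exp(-(np.log(radius / fc) ** 2) /
                    (2.0 * np.log(sigma_ratio) ** 2))
        lg[0, 0] = 0.0                       # Log-Gabor filters pass no DC
        structure += np.abs(np.fft.ifft2(F * lg))
    return structure / len(scales)
```

Applying the same filter bank to both images enlarges the range of scales at which their structure responses overlap, which is what makes the later extended phase correlation more tolerant of scale differences.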
Step four: based on the extracted structural information and combined with phase correlation, the relative translation between the reference image and the image to be registered is accurately calculated; extended phase correlation calculates the relative rotation angle and scale difference between the images through coordinate-system conversion, realizing overall matching between images with rotation, scale difference and relative translation; two-step phase correlation is then carried out sequentially under a log-polar coordinate system and a Cartesian coordinate system to solve the rotation parameter and the scale parameter between the image to be registered and the reference image respectively;
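The two-step phase correlation of step four can be sketched as follows: rotation and scale are read from a phase-correlation peak between log-polar resamplings of the two fftshifted amplitude spectra, and translation from a phase-correlation peak in Cartesian coordinates. The sampling density, interpolation order, peak handling and the assumption of equally sized inputs are illustrative choices, not details taken from the patent:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def phase_correlation(a, b):
    """Return the (row, col) shift of b relative to a and the peak value."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12            # keep only the phase difference
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, dtype=float)
    wrap = shift > np.array(a.shape) / 2.0    # large shifts wrap to negative offsets
    shift[wrap] -= np.array(a.shape, dtype=float)[wrap]
    return shift, float(corr.max())

def log_polar(amplitude, n_angles=360, n_radii=256):
    """Resample a centered (fftshifted) amplitude spectrum to a log-polar grid."""
    h, w = amplitude.shape
    cy, cx = h / 2.0, w / 2.0
    log_base = np.log(min(cy, cx)) / n_radii
    theta = np.linspace(0.0, np.pi, n_angles, endpoint=False)[:, None]
    radius = np.exp(log_base * np.arange(n_radii))[None, :]
    coords = [cy + radius * np.sin(theta), cx + radius * np.cos(theta)]
    return map_coordinates(amplitude, coords, order=1, mode='constant'), log_base

def estimate_rotation_scale(img1, img2):
    """First step: rotation (degrees) and scale factor between two same-size images."""
    A1 = np.fft.fftshift(np.abs(np.fft.fft2(img1)))
    A2 = np.fft.fftshift(np.abs(np.fft.fft2(img2)))
    lp1, base = log_polar(A1)
    lp2, _ = log_polar(A2)
    (d_theta, d_log_r), _ = phase_correlation(lp1, lp2)
    rotation_deg = d_theta * 180.0 / lp1.shape[0]   # the angle axis spans 180 degrees
    scale = float(np.exp(base * d_log_r))
    return rotation_deg, scale
```

The second step is then phase_correlation itself, applied in Cartesian coordinates to the structure images after one of them has been corrected by the estimated rotation and scale; its peak gives the translation parameters.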
Step five: the geometric transformation relation between the image to be registered and the reference image is calculated from the solved rotation, scale and translation parameters; multi-scale image pyramids of the two images are constructed based on this geometric transformation relation, with the images resampled in both directions to enlarge the scale overlap range between the two pyramids; image matching is finally realized by solving the global transformation parameters between the two images.
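A minimal sketch of the "resample in both directions" idea in step five, assuming a √2 step per level; the level set and the interpolation order are illustrative, not the patent's values:

```python
import numpy as np
from scipy.ndimage import zoom

def two_direction_pyramid(image, levels=(-2, -1, 0, 1, 2), step=np.sqrt(2.0)):
    """Return {level: image resampled by step**level}; negative levels shrink the
    image, positive levels enlarge it, so the two pyramids overlap over more scales."""
    return {lvl: zoom(image, float(step) ** lvl, order=1) for lvl in levels}
```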
The processing flow of the cross-scale Log-Gabor filtering based phase correlation image matching (CSLGPC) algorithm is as follows:
Input: reference image S1, real-time image S2
Output: geographic coordinates X, Y of the center point of the real-time image S2
1. Set a group of center frequencies for the different scales
{... f0(-2), f0(-1), f0(0), f0(1), f0(2) ...}
and apply Log-Gabor filtering to S1 and S2 respectively to obtain the aggregated filter maps LG1, LG2
2. Perform the extended phase correlation calculation on LG1, LG2. Is the phase correlation successful?
3. No: { matching and positioning fail, end }
4. Yes:
{
5. Estimate the rotation angle θ and scale factor f between LG1 and LG2 from the peak of the correlation calculation
6. Geometrically correct LG2 according to θ and f to generate LG2'
7. Calculate the phase correlation of LG1 and LG2'; if the correlation succeeds, calculate the translation parameters (dx, dy); if the correlation fails, matching and positioning fail, end
8. Calculate the geographic coordinates X, Y of the center point of the real-time image S2 from the geometric transformation parameters θ, f, (dx, dy); matching and positioning succeed, end
}
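For reference, a minimal Python driver tying the CSLGPC steps together, reusing the helper functions sketched earlier (log_gabor_structure, estimate_rotation_scale, phase_correlation). The correlation-peak threshold, the sign conventions of the geometric correction and the common crop are illustrative assumptions; recovering the geographic coordinates X, Y additionally requires the georeferencing of the reference image and is omitted here:

```python
import numpy as np
from scipy.ndimage import rotate, zoom

def cslgpc_match(reference, realtime, peak_threshold=0.03):
    """Hedged sketch of the CSLGPC flow: returns (theta, f, (dx, dy)) or None."""
    lg1 = log_gabor_structure(reference)                 # step 1: LG1
    lg2 = log_gabor_structure(realtime)                  # step 1: LG2
    theta, f = estimate_rotation_scale(lg1, lg2)         # steps 2/5: theta, f
    # step 6: geometrically correct LG2 by the estimated scale and rotation
    lg2_corr = rotate(zoom(lg2, 1.0 / f, order=1), -theta, reshape=False, order=1)
    h = min(lg1.shape[0], lg2_corr.shape[0])             # common crop for correlation
    w = min(lg1.shape[1], lg2_corr.shape[1])
    (dy, dx), peak = phase_correlation(lg1[:h, :w], lg2_corr[:h, :w])  # step 7
    if peak < peak_threshold:
        return None                                      # step 3: matching failed
    return theta, f, (dx, dy)                            # step 8 (geo-coords omitted)
```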
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A cross-scale filtering guided phase correlation image matching method is characterized by comprising the following steps:
Step one: geometrically correcting a real-time image into a temporary image according to attitude data in a conventional navigation environment, taking the real-time image as the image to be registered, taking a waypoint image as the reference image, and combining the reference image and the image to be registered as image input data;
Step two: first performing Fourier transform processing on the reference image and the temporary image, calculating the change relation between the images by Log-Gabor filtering, then taking phase correlation as the basic matching model, and converting the scale and rotation variation between the images into translation parameters that can be solved by phase correlation under a log-polar coordinate system;
Step three: after obtaining the translation parameters, simultaneously performing Fourier transform on the image input data to convert it from the spatial domain to the frequency domain, simulating the visual cortex receptive field response by Log-Gabor filtering, and extracting stable structural information from the image input data;
Step four: based on the extracted structural information and combined with phase correlation, accurately calculating the relative translation between the reference image and the image to be registered, and with extended phase correlation sequentially performing two-step phase correlation under a log-polar coordinate system and a Cartesian coordinate system to solve the rotation parameter and the scale parameter between the image to be registered and the reference image respectively;
Step five: calculating the geometric transformation relation between the image to be registered and the reference image from the solved rotation, scale and translation parameters, constructing multi-scale image pyramids of the two images based on the geometric transformation relation, and finally realizing image matching by solving the global transformation parameters between the two images.
2. The cross-scale filtering guided phase-correlation image matching method of claim 1, wherein: in the first step, the attitude data in the conventional navigation environment is provided by camera downward-looking imaging or by an inertial navigation system, and the temporary image is a downward-looking image.
3. The cross-scale filtering guided phase-correlation image matching method of claim 1, wherein: in the first step, when the waypoint image is used as the reference, control points are extracted through feature matching between the real-time image and the waypoint image; when no valid waypoint information exists within the range of the real-time image, feature matching is carried out using the previous frame image as the reference, and the matched connection points are converted into control points by means of the positioning parameters of the previous frame image.
4. The cross-scale filtering guided phase-correlation image matching method of claim 1, wherein: in the second step, the change relation between the images is calculated as follows: different hypercolumn responses are simulated by Log-Gabor filtering to construct hypercolumn vectors, the local image is converted into a bionic visual-cell coordinate system, the consistency of the hypercolumn vectors of the same target across different images is then taken as the basic criterion, and the change relation between the images is calculated according to an affine transformation model.
5. The cross-scale filtering guided phase-correlation image matching method according to claim 1, wherein: in the second step, during the Fourier transform processing, the image signal S(x, y) is Fourier transformed to obtain a complex-valued spectrum:
F(ω_x, ω_y) = R(ω_x, ω_y) + i·I(ω_x, ω_y)
where (ω_x, ω_y) are the frequency-domain coordinates, R is the real part and I is the imaginary part; expressed in exponential form as a combination of amplitude and phase:
F(ω_x, ω_y) = |F(ω_x, ω_y)| · e^(i·φ(ω_x, ω_y))
where |F(ω_x, ω_y)| denotes the amplitude at frequency (ω_x, ω_y) and φ(ω_x, ω_y) denotes the phase at (ω_x, ω_y).
6. The cross-scale filtering guided phase-correlation image matching method of claim 1, wherein: in the third step, when the structural information is extracted, cross-scale Log-Gabor filtering is used to extract the multi-scale structural features of the two images respectively, enlarging the image scale overlap rate and thereby enhancing the robustness of the extended phase correlation to scale differences.
7. The cross-scale filtering guided phase-correlation image matching method according to claim 1, wherein: in the fourth step, before the two-step phase correlation is carried out sequentially under the log-polar and Cartesian coordinate systems, the relative rotation angle and scale difference between the images are calculated through coordinate-system conversion, realizing overall matching of images with rotation, scale difference and relative translation.
8. The cross-scale filtering guided phase-correlation image matching method of claim 1, wherein: in the fifth step, when the image pyramids are constructed, the images are resampled in both directions simultaneously, enlarging the scale overlap range between the two image pyramids.
CN202211603863.9A 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering Active CN115775324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211603863.9A CN115775324B (en) 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211603863.9A CN115775324B (en) 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering

Publications (2)

Publication Number Publication Date
CN115775324A true CN115775324A (en) 2023-03-10
CN115775324B CN115775324B (en) 2024-01-02

Family

ID=85392197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211603863.9A Active CN115775324B (en) 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering

Country Status (1)

Country Link
CN (1) CN115775324B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160217577A1 (en) * 2015-01-22 2016-07-28 Bae Systems Information And Electronic Systems Integration Inc. Enhanced phase correlation for image registration
KR20210094517A (en) * 2018-09-18 2021-07-29 Nearmap Australia Pty Ltd System and method for selecting complementary images from multiple images for 3D geometry extraction
CN111462198A (en) * 2020-03-10 2020-07-28 西南交通大学 Multi-mode image registration method with scale, rotation and radiation invariance
CN112233225A (en) * 2020-10-14 2021-01-15 中国科学技术大学 Three-dimensional reconstruction method and system for translational motion object based on phase correlation matching
CN113223066A (en) * 2021-04-13 2021-08-06 浙江大学 Multi-source remote sensing image matching method and device based on characteristic point fine tuning
CN113552585A (en) * 2021-07-14 2021-10-26 浙江大学 Mobile robot positioning method based on satellite map and laser radar information
CN113763274A (en) * 2021-09-08 2021-12-07 湖北工业大学 Multi-source image matching method combining local phase sharpness orientation description

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BO LI et al.: "Automatic Example-Based Image Colorization Using Location-Aware Cross-Scale Matching", IEEE Transactions on Image Processing, vol. 28, no. 9, pages 4606, XP011735629, DOI: 10.1109/TIP.2019.2912291 *
XIAOMIN LIU et al.: "Image Matching Using Phase Congruency and Log-Gabor Filters in the SAR Images and Visible Images", ICGEC 2019: Genetic and Evolutionary Computing, pages 270 - 278 *
LING XIAO: "Research on automatic matching methods for multi-source optical satellite images based on multiple constraints", China Doctoral Dissertations Full-text Database, Basic Sciences, pages 008 - 36 *
ZHANG YONGJUN et al.: "Automatic registration of urban airborne LiDAR data and aerial images", Journal of Remote Sensing, vol. 16, no. 03, pages 579 - 595 *
LI JIAYUAN: "Research on key issues of robust feature matching for remote sensing images", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II, pages 028 - 15 *

Also Published As

Publication number Publication date
CN115775324B (en) 2024-01-02

Similar Documents

Publication Publication Date Title
CN110675450B (en) Method and system for generating orthoimage in real time based on SLAM technology
Sim et al. Integrated position estimation using aerial image sequences
CN101598556B (en) Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment
CN102506868B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system
CN109087359A (en) Pose determines method, pose determining device, medium and calculates equipment
TWI820395B (en) Method for generating three-dimensional(3d) point cloud of object, system for 3d point set generation and registration, and related machine-readable medium
CN104833354A (en) Multibasic multi-module network integration indoor personnel navigation positioning system and implementation method thereof
CN105865454A (en) Unmanned aerial vehicle navigation method based on real-time online map generation
CN106767785B (en) Navigation method and device of double-loop unmanned aerial vehicle
CN108917753B (en) Aircraft position determination method based on motion recovery structure
KR20190051703A (en) Stereo drone and method and system for calculating earth volume in non-control points using the same
CN102506867B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris corner matching and combined navigation system
CN112184786B (en) Target positioning method based on synthetic vision
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform
CN112580683A (en) Multi-sensor data time alignment system and method based on cross correlation
CN117232499A (en) Multi-sensor fusion point cloud map construction method, device, equipment and medium
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
JP2022130588A (en) Registration method and apparatus for autonomous vehicle, electronic device, and vehicle
CN109443355B (en) Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF
Burkard et al. User-aided global registration method using geospatial 3D data for large-scale mobile outdoor augmented reality
CN110160503A (en) A kind of unmanned plane landscape matching locating method for taking elevation into account
CN105389819A (en) Robust semi-calibrating down-looking image epipolar rectification method and system
CN108921896A (en) Downward-looking visual compass fusing point and line features
CN115775324A (en) Phase correlation image matching method under guidance of cross-scale filtering
CN113538579B (en) Mobile robot positioning method based on unmanned aerial vehicle map and ground binocular information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant