CN110443776A - Data registration and fusion method based on a UAV pod - Google Patents
Data registration and fusion method based on a UAV pod
- Publication number
- CN110443776A CN110443776A CN201910724461.6A CN201910724461A CN110443776A CN 110443776 A CN110443776 A CN 110443776A CN 201910724461 A CN201910724461 A CN 201910724461A CN 110443776 A CN110443776 A CN 110443776A
- Authority
- CN
- China
- Prior art keywords
- image
- registration
- infrared
- frequency coefficient
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06N3/006 — Artificial life, i.e. computing arrangements simulating life, based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T7/30 — Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/90 — Determination of colour characteristics
- G06V20/10 — Terrestrial scenes
- G07C1/20 — Checking timed patrols, e.g. of watchman
- G06T2207/10048 — Infrared image
- G06T2207/20064 — Wavelet transform [DWT]
- G06T2207/20221 — Image fusion; Image merging
Abstract
The present invention relates to the technical field of remote-sensing image processing, and in particular to a data registration and fusion method based on a UAV pod. Before image fusion, the images are first registered by adaptive mutual-information registration, after which the visible and infrared images are effectively fused through an HIS transform and a lifting wavelet transform. The visible and infrared images of the same scene, captured by the UAV at the same moment, are thus effectively registered and fused in real time. The final result retains the colour and the clear detail contours and edges of the visible image while also preserving the brightness information of infrared objects in the infrared image, so that the infrared target stands out against the background and is easier to identify. This fusion of infrared and visible images not only reduces the false-detection rate for infrared targets, but also makes them easier to detect and track.
Description
Technical field
The present invention relates to the technical field of remote-sensing image processing, and in particular to a data registration and fusion method based on a UAV pod.
Background art
As requirements on power transmission keep rising, transmission voltages grow higher and higher, and high-voltage/ultra-high-voltage inspection and maintenance are especially important for the safety, stability and efficiency of the grid. In recent years UAVs have been used more and more widely in many fields, inspection included: a pod mounted on the UAV carries cameras and other sensing elements and performs the inspection. For high-voltage/ultra-high-voltage line inspection, however, the strong electromagnetic field near the line can interfere with the UAV's electrical equipment when it flies close to the line, affecting its operation and service life. Meanwhile, the number of ultra-high-voltage transmission lines grows year by year, and inspection quality requirements are high.
At present a single-source sensor on the UAV pod is used to inspect the transmission line, but because of environmental conditions such as illumination and weather, the image captured by a single-source sensor is degraded and can lead to misjudgement of the target, which no longer meets practical requirements.
An infrared sensor relies on the principle of thermal radiation: in its images the infrared target is bright, but the target is indistinct and its edges are blurred. A visible-light sensor relies on reflected light and captures colour images with clear detail, but under low-visibility conditions the captured image is of limited use.
Summary of the invention
In view of the above shortcomings of the prior art, the present invention provides a data registration and fusion method based on a UAV pod, which effectively fuses infrared and visible images so that the picture clearly shows the temperature state of fittings, insulators and other important components on the transmission line, improving inspection efficiency.
The technical solution adopted by the present invention to solve the technical problem is a data registration and fusion method based on a UAV pod, comprising the following steps:
Step 1. Acquire the visible image A and the infrared image B of scene 1; take the acquired visible image A as the reference image and the infrared image B as the floating image, and estimate the transformation parameters tform11 based on an affine transformation model.
Step 2. Apply the transformation parameters tform11 as a spatial transformation to the infrared image B to obtain the infrared image B1, the coarse registration result.
Step 3. Perform mutual-information registration on the coarsely registered images using a hybrid Powell and particle-swarm optimisation algorithm, correcting the transformation parameters tform11 to obtain the optimised transformation parameters tform12.
Step 4. Apply the optimised transformation parameters tform12 as a spatial geometric transformation to the coarsely registered infrared image B1 to obtain the fine registration result, the infrared image B2.
Step 5. For the visible image A and the infrared image B of every scene n, use the transformation parameters tform11 from step 1 and carry out the processing of steps 2 to 4 to obtain the fine registration result of the corresponding scene, where n ≥ 2 and n is a positive integer.
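As an illustration of step 1 (not code from the patent), the six affine parameters can be estimated by least squares from a handful of control-point pairs; the embodiment later selects three pairs by hand. The point coordinates below are invented for the example:

```python
import numpy as np

# Sketch: estimate the 6-parameter affine transform (tform11) from
# 3 control-point pairs. src are points in the floating (infrared)
# image, dst the matching points in the reference (visible) image.
def affine_from_points(src, dst):
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Each pair gives two equations: x' = a*x + b*y + c, y' = d*x + e*y + f
    A = np.zeros((2 * len(src), 6))
    b = dst.reshape(-1)
    for i, (x, y) in enumerate(src):
        A[2 * i, 0:3] = [x, y, 1]
        A[2 * i + 1, 3:6] = [x, y, 1]
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)   # 2x3 affine matrix

# A pure translation by (+5, -2) is recovered from 3 pairs:
src = [(0, 0), (10, 0), (0, 10)]
dst = [(5, -2), (15, -2), (5, 8)]
M = affine_from_points(src, dst)
```

With three non-collinear pairs the system is exactly determined; more pairs would be fitted in the least-squares sense.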
Step 6. For each scene, fuse the images obtained after fine registration based on the HIS transform and the lifting wavelet transform, specifically:
6a. Apply the HIS colour-space transform to the visible image A and extract the H, I and S components.
6b. Apply a grey-scale transform to the infrared image B2 to obtain the grey-scale infrared image B3.
6c. Apply a two-level lifting-wavelet decomposition to the I component extracted in step 6a and to the infrared image B3 to obtain high- and low-frequency coefficients, and fuse them, using different fusion rules for the high- and low-frequency coefficients.
6d. Apply the inverse lifting wavelet transform to the fused new low-frequency and high-frequency coefficients.
6e. Take the image obtained in step 6d as the I component of the new image and, together with the H and S components obtained in step 6a, apply the inverse HIS transform to obtain the fused image.
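The skeleton of step 6 can be sketched as follows. The patent does not spell out its HIS transform, so a common linear intensity-hue-saturation variant (an I, v1, v2 matrix transform) stands in for steps 6a and 6e; `fuse_intensity` and the injected intensity are illustrative, and in the actual method the new intensity would come from the lifting-wavelet fusion of steps 6c and 6d:

```python
import numpy as np

# Linear IHS-style transform: row 0 is the intensity I = (R+G+B)/3,
# rows 1-2 span the chromatic (hue/saturation) plane.
M = np.array([[1/3, 1/3, 1/3],
              [-np.sqrt(2)/6, -np.sqrt(2)/6, np.sqrt(2)/3],
              [1/np.sqrt(2), -1/np.sqrt(2), 0.0]])
M_inv = np.linalg.inv(M)

def fuse_intensity(rgb, new_i):
    """Replace the intensity channel of an HxWx3 RGB image and invert."""
    ihs = rgb @ M.T          # forward transform (step 6a): I, v1, v2
    ihs[..., 0] = new_i      # substitute the fused intensity (6c/6d result)
    return ihs @ M_inv.T     # inverse transform (step 6e)

rgb = np.random.default_rng(1).random((4, 4, 3))
same_i = rgb @ M.T
out = fuse_intensity(rgb, same_i[..., 0])   # unchanged I -> exact round trip
```

Injecting the original intensity reproduces the input, which is the invertibility the 6a/6e pair relies on.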
Further, step 3 is implemented as follows:
3a. Compute the mutual information of the visible image A and the coarsely registered infrared image B1; the mutual information is maximal when the joint entropy is minimal. The formula is:
I(A, B1) = Σ_a Σ_b P_AB1(a, b) · log [ P_AB1(a, b) / (P_A(a) · P_B1(b)) ]
where
I(A, B1) is the mutual information of the visible image A and the infrared image B1;
P_A(a) is the marginal probability distribution of the visible image A;
P_B1(b) is the marginal probability distribution of the infrared image B1;
P_AB1(a, b) is the joint probability distribution of A and B1;
a is a grey value of the visible image A;
b is a grey value of the infrared image B1;
A is the visible image and B1 the infrared image.
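A minimal sketch of this similarity measure, computing the mutual information from a joint 256-bin grey-level histogram (8-bit images assumed; `mutual_information` is an illustrative helper, not the patent's code):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=256):
    # Joint grey-level histogram -> empirical joint distribution
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(),
                                bins=bins, range=[[0, 256], [0, 256]])
    p_ab = hist / hist.sum()
    p_a = p_ab.sum(axis=1)          # marginal of image A
    p_b = p_ab.sum(axis=0)          # marginal of image B
    nz = p_ab > 0                   # avoid log(0) on empty bins
    return np.sum(p_ab[nz] * np.log2(p_ab[nz] / np.outer(p_a, p_b)[nz]))

rng = np.random.default_rng(0)
x = rng.integers(0, 256, size=(64, 64))
mi_self = mutual_information(x, x)                               # well aligned
mi_noise = mutual_information(x, rng.integers(0, 256, (64, 64))) # unrelated
```

The aligned case scores far higher than the unrelated one, which is why maximising this quantity drives the registration.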
3b. Take the joint entropy as the objective function; under the given objective function, use the hybrid Powell and particle-swarm optimisation algorithm to find the best extreme point at which the objective function attains its minimum, and from the best extreme point derive the optimised transformation parameters tform12 used for registration.
The specific steps are as follows:
① Set the number of iterations m, where 100 ≤ m ≤ 500 and m is a positive integer; select the search range; choose an arbitrary point Pk in the search space as the initial point, where k ≥ 0 and k is an integer; initialise a particle swarm in the space centred on Pk, giving the particles random initial positions and velocities.
② Iteratively update the particles of the swarm with the PSO algorithm and obtain the current global optimum point Pk+1.
③ Substitute Pk and Pk+1 into the given objective function to compute the objective value TIk of Pk and TIk+1 of Pk+1, and compare the two: if TIk < TIk+1, go to step ④; otherwise go to step ⑥.
④ Take the obtained optimum point Pk+1 as the initial point of the Powell algorithm and perform an n-dimensional search to obtain the Powell-optimised optimum point Pk+2, where n ≥ 1 and n is a positive integer.
⑤ Take Pk+2 as the initial point of a new round, reinitialise the particle swarm, set k = k + 2, and return to step ②.
⑥ Output the optimum point Pk+1 and its objective value TIk+1; the point Pk+1 is the optimised set of transformation parameters tform12.
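The hybrid search of steps ① to ⑥ can be sketched roughly as below: a plain PSO global phase followed by a Powell local refinement via SciPy. The swarm size, inertia and acceleration coefficients are assumptions, and a simple quadratic stands in for the joint-entropy objective:

```python
import numpy as np
from scipy.optimize import minimize

def pso_powell(objective, lo, hi, swarm=20, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lo)
    pos = rng.uniform(lo, hi, size=(swarm, dim))
    vel = np.zeros((swarm, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    for _ in range(iters):                      # PSO global search phase
        g = pbest[pbest_val.argmin()]
        r1, r2 = rng.random((2, swarm, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        val = np.array([objective(p) for p in pos])
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
    g = pbest[pbest_val.argmin()]
    # Powell local refinement of the swarm's best point
    return minimize(objective, g, method="Powell").x

# Stand-in objective: quadratic bowl with minimum at (1, -2)
f = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2
best = pso_powell(f, np.array([-10.0, -10.0]), np.array([10.0, 10.0]))
```

PSO supplies a good global starting point and the derivative-free Powell search polishes it, matching the division of labour the text describes.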
Further, in step 6a, the I component is the luminance (intensity) component of the visible image A, the H component is its hue component, and the S component is its saturation component.
Further, the grey-scale transform in step 6b refers to pre-processing the image into grey scale.
Further, in step 6c the low- and high-frequency coefficients are fused using different fusion rules, specifically:
① Fusion rule for the low-frequency coefficients
The low-frequency coefficients after the lifting-wavelet decomposition represent the approximation of the image. They are fused using a region-energy-based rule so that the overall appearance of the fused image is preserved to the greatest extent. The region energy of a pixel p(i, j) is defined as:
E(i, j) = Σ_(m,n) f²(m, n)
where E(i, j) is the region energy of the image at pixel p(i, j), f(m, n) is the luminance of the image, m and n range over the window centred on (i, j), and i, j are the coordinates of pixel p.
The region-energy fusion rule for the low-frequency coefficients is: for the two images to be fused, if the region energy at pixel (i, j) of image M is greater than the region energy at pixel (i, j) of image N, take image M's low-frequency coefficient as the fusion coefficient; otherwise take image N's, i.e.:
Cp(i, j) = CM(i, j), if EM(i, j) > EN(i, j); otherwise Cp(i, j) = CN(i, j)
where Cp(i, j) is the fused low-frequency coefficient at (i, j), EM(i, j) is the region energy of image M at pixel p(i, j) and EN(i, j) that of image N; the algorithm uses a 3 × 3 window.
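A sketch of this region-energy rule on a low-frequency sub-band (3 × 3 window as in the text; edge replication at the borders is an assumption, and the toy 2 × 2 sub-bands are invented):

```python
import numpy as np

def region_energy(band):
    """Sum of squared coefficients over a 3x3 neighbourhood of each pixel."""
    padded = np.pad(band.astype(float) ** 2, 1, mode="edge")
    h, w = band.shape
    return sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))

def fuse_low(c_m, c_n):
    # Keep the coefficient from whichever image has the larger local energy
    return np.where(region_energy(c_m) > region_energy(c_n), c_m, c_n)

a = np.array([[9.0, 0.0], [0.0, 0.0]])   # image M: strong detail top-left
b = np.array([[0.0, 0.0], [0.0, 7.0]])   # image N: strong detail bottom-right
fused_low = fuse_low(a, b)
```

Each corner of the fused band comes from the image that is locally more energetic, preserving the dominant approximation content.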
② Fusion rule for the high-frequency coefficients
The fusion rule for the high-frequency coefficients is to compare the high-frequency coefficients of the two images M and N and take, as the coefficient of the fused image, the one with the larger absolute value. The high-frequency fusion formulas are:
HLp(i, j) = max(|HLM(i, j)|, |HLN(i, j)|)
LHp(i, j) = max(|LHM(i, j)|, |LHN(i, j)|)
HHp(i, j) = max(|HHM(i, j)|, |HHN(i, j)|)
where HL, LH and HH denote the horizontal, vertical and diagonal high-frequency coefficients respectively; HLp(i, j), LHp(i, j) and HHp(i, j) are those of the fused image at pixel p(i, j); HLM(i, j), LHM(i, j) and HHM(i, j) are those of image M at pixel (i, j); and HLN(i, j), LHN(i, j) and HHN(i, j) are those of image N at pixel (i, j).
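A sketch of this absolute-maximum rule on one sub-band; here the signed coefficient of larger magnitude is kept (a common reading of the rule, so that edge polarity survives), with toy 2 × 2 sub-bands:

```python
import numpy as np

def fuse_high(h_m, h_n):
    # At each position keep the coefficient with the larger magnitude,
    # so the stronger edge/texture response survives in the fused band.
    return np.where(np.abs(h_m) >= np.abs(h_n), h_m, h_n)

hl_m = np.array([[3.0, -1.0], [0.0, 5.0]])
hl_n = np.array([[-4.0, 0.0], [2.0, 1.0]])
fused_high = fuse_high(hl_m, hl_n)   # -> [[-4., -1.], [2., 5.]]
```

The same function would be applied independently to the HL, LH and HH bands at every decomposition level.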
Further, the visible image A of any scene m is acquired by a visible-light camera, and the infrared image B of that scene by an infrared camera. The physical positions of the visible-light camera and the infrared camera are fixed relative to each other, the optical axes of the two cameras are parallel, and the distance between the two axes is 2 cm to 5 cm, where m ≥ 1 and m is a positive integer.
Further, the phase and resolution of the visible-light camera and the infrared camera are fixed, and the two cameras photograph the same scene simultaneously.
Further, the visible-light camera and the infrared camera are both fixed in the pod, and the pod is fixed to the UAV. The pod contains a power module, a digital-image transceiver module A, an information analysis module and a data registration and fusion module. The power module supplies power to the visible-light camera, the infrared camera, the digital-image transceiver module A, the information analysis module and the data registration and fusion module; the visible-light camera, infrared camera, digital-image transceiver module A, information analysis module and data registration and fusion module are electrically connected to one another by data lines, and the digital-image transceiver module A communicates with the ground control centre over a wireless link.
Further, the ground control centre comprises a digital-image transceiver module B, an information processing module, a hover instruction module, a display module and a power module; the power module supplies power to the digital-image transceiver module B, the information processing module, the hover instruction module and the display module.
The digital-image transceiver module B exchanges data with the digital-image transceiver module A over a wireless link.
The information processing module is connected to the digital-image transceiver module B by a data line for data transmission.
The hover instruction module exchanges data wirelessly with the digital-image transceiver module B and the information processing module.
The display module exchanges data wirelessly with the hover instruction module.
The hover instruction module is used to keep the UAV's position in the air stable;
the display module displays the registered and fused image;
the information processing module handles the UAV's positional relationship to obstacles and the corresponding adjustment.
The invention has the following beneficial effects: using adaptive mutual-information registration, the HIS transform and the lifting wavelet transform, the visible and infrared images of the same scene captured by the UAV at the same moment are registered and fused effectively in real time. The final result retains the colour and the clear detail contours and edges of the visible image while also preserving the brightness information of infrared objects in the infrared image, so that the infrared target stands out against the background and is easier to identify. Fusing infrared and visible images not only reduces the false-detection rate for infrared targets, but also makes them easier to detect and track.
Brief description of the drawings
Fig. 1 is the flow chart of the adaptive mutual-information registration algorithm of the embodiment provided by the present invention;
Fig. 2 is the image-fusion flow chart of the embodiment provided by the present invention;
Fig. 3 is the flow chart of the Powell and PSO hybrid optimisation algorithm of the embodiment provided by the present invention.
Specific embodiment
The present invention is described further with reference to the accompanying drawing.
Embodiment one:
During take-off and flight, the UAV carries an infrared camera and a visible-light camera and acquires infrared and visible images simultaneously. Within the field of view there may be suspected high-temperature objects (infrared targets), such as fires or vehicles, which need to be discriminated or tracked. Discrimination of an infrared target cannot rely one-sidedly on the infrared image or the visible image alone; the infrared and visible images must be fused before the infrared target can be identified effectively. Differences between the airborne infrared camera and the visible-light camera in shooting angle, phase and field of view cause rotation, translation, scaling and other transformations between the two captured images, so the images must first be registered before fusion in order to improve the fusion accuracy.
With the pod carried on the UAV acquiring the two channels of infrared and visible images, the data registration and fusion method based on a UAV pod of the present invention, as shown in Figs. 1 to 3, comprises the following steps:
Step 1. Acquire the visible image A and the infrared image B of scene 1; take the acquired visible image A as the reference image and the infrared image B as the floating image; manually select 3 pairs of control points in the two images and estimate the transformation parameters tform11 based on an affine transformation model.
Step 2. Apply the transformation parameters tform11 as a spatial transformation to the infrared image B to obtain the infrared image B1, the coarse registration result.
Step 3. Perform mutual-information registration on the coarsely registered images using a hybrid Powell and particle-swarm optimisation algorithm, correcting the transformation parameters tform11 to obtain the optimised transformation parameters tform12.
The specific steps are as follows:
3a. Compute the mutual information of the visible image A and the coarsely registered infrared image B1; the mutual information is maximal when the joint entropy is minimal. The formula is:
I(A, B1) = Σ_a Σ_b P_AB1(a, b) · log [ P_AB1(a, b) / (P_A(a) · P_B1(b)) ]
where
I(A, B1) is the mutual information of the visible image A and the infrared image B1;
P_A(a) is the marginal probability distribution of the visible image A;
P_B1(b) is the marginal probability distribution of the infrared image B1;
P_AB1(a, b) is the joint probability distribution of A and B1;
a is a grey value of the visible image A;
b is a grey value of the infrared image B1;
A is the visible image and B1 the infrared image.
The transformation parameters corresponding to the maximum mutual information are the optimised transformation parameters tform12.
3b. Take the joint entropy as the objective function; under the given objective function, use the hybrid Powell and particle-swarm optimisation algorithm to find the best extreme point at which the objective function attains its minimum, and from the best extreme point derive the optimised transformation parameters tform12 used for registration.
The specific steps are as follows:
① Set the number of iterations m, where 100 ≤ m ≤ 500 and m is a positive integer; select the search range; choose an arbitrary point Pk in the search space as the initial point, where k ≥ 0 and k is an integer; initialise a particle swarm in the space centred on Pk, giving the particles random initial positions and velocities.
② Iteratively update the particles of the swarm with the PSO algorithm and obtain the current global optimum point Pk+1.
③ Substitute Pk and Pk+1 into the given objective function to compute the objective value TIk of Pk and TIk+1 of Pk+1, and compare the two: if TIk < TIk+1, go to step ④; otherwise go to step ⑥.
④ Take the obtained optimum point Pk+1 as the initial point of the Powell algorithm and perform an n-dimensional search to obtain the Powell-optimised optimum point Pk+2, where n ≥ 1 and n is a positive integer.
⑤ Take Pk+2 as the initial point of a new round, reinitialise the particle swarm, set k = k + 2, and return to step ②.
⑥ Output the optimum point Pk+1 and its objective value TIk+1; the point Pk+1 is the optimised set of transformation parameters tform12.
Step 4. Apply the optimised transformation parameters tform12 as a spatial geometric transformation to the coarsely registered infrared image B1 to obtain the fine registration result, the infrared image B2.
Step 5. For the visible image A and the infrared image B of every scene n, use the transformation parameters tform11 from step 1 and carry out the processing of steps 2 to 4 to obtain the fine registration result of the corresponding scene, where n ≥ 2 and n is a positive integer.
The following steps form the image-fusion stage. The purpose of image fusion is to increase the useful information in the image: the image information obtained by the multi-source sensors becomes complementary, image clarity is improved so that targets are easier to identify, image resolution is improved, and the image carries richer information.
Step 6. For each scene, fuse the images obtained after fine registration based on the HIS transform and the lifting wavelet transform, specifically:
6a. Apply the HIS colour-space transform to the visible image A and extract the H, I and S components. The I component is the luminance (intensity) component of the visible image A, the H component its hue component and the S component its saturation component. The HIS transform converts the RGB colour space into the HIS colour space and isolates the main luminance component, the I component. Since the very purpose of fusing infrared and visible images is to make the infrared target stand out in brightness against the background, the HIS transform helps raise the brightness of infrared objects in the image.
6b. Apply a grey-scale transform to the infrared image B2 to obtain the grey-scale infrared image B3. The grey-scale transform is a pre-processing step that converts the image to grey scale. On a grey-scale image the value of each pixel is called its grey level, i.e. the colour depth of the point in a black-and-white image; the range is usually 0 to 255, with white at 255 and black at 0. The grey level thus expresses how dark a point is, and the grey-level histogram of a digital image counts, for each grey level, the number of pixels having that level.
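A sketch of step 6b's grey-scale pre-processing; a standard luminance weighting (ITU-R BT.601, an assumption since the patent does not name one) maps an RGB frame to a single 0-255 channel:

```python
import numpy as np

def to_gray(rgb):
    # Weighted sum of the R, G, B channels -> one grey level per pixel
    gray = rgb @ np.array([0.299, 0.587, 0.114])
    return np.clip(np.rint(gray), 0, 255).astype(np.uint8)

img = np.zeros((2, 2, 3), dtype=float)
img[0, 0] = [255, 255, 255]   # white pixel -> grey level 255
img[1, 1] = [0, 0, 0]         # black pixel -> grey level 0
g = to_gray(img)
```

The endpoints behave as the text describes: white maps to 255 and black to 0.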
6c. Apply a two-level lifting-wavelet decomposition to the I component extracted in step 6a and to the infrared image B3 to obtain high- and low-frequency coefficients, and fuse them, using different fusion rules for the high- and low-frequency coefficients.
The low- and high-frequency coefficients are fused using different fusion rules, specifically:
① Fusion rule for the low-frequency coefficients
The low-frequency coefficients after the lifting-wavelet decomposition represent the approximation of the image. They are fused using a region-energy-based rule (preserving the overall appearance of the fused image to the greatest extent). The region energy of a pixel p(i, j) is defined as:
E(i, j) = Σ_(m,n) f²(m, n)
where E(i, j) is the region energy of the image at pixel p(i, j), f(m, n) is the luminance of the image, m and n range over the window centred on (i, j), and i, j are the coordinates of pixel p.
The region-energy fusion rule for the low-frequency coefficients is: for the two images to be fused, if the region energy at pixel (i, j) of image M is greater than the region energy at pixel (i, j) of image N, take image M's low-frequency coefficient as the fusion coefficient; otherwise take image N's, i.e.:
Cp(i, j) = CM(i, j), if EM(i, j) > EN(i, j); otherwise Cp(i, j) = CN(i, j)
where Cp(i, j) is the fused low-frequency coefficient at (i, j), EM(i, j) is the region energy of image M at pixel p(i, j) and EN(i, j) that of image N; the algorithm uses a 3 × 3 window.
2. for the fusion rule of high frequency coefficient
High frequency coefficient includes the other details and texture information other than the approximate information of image, fused in order to make
Image retains edge, details, texture information as far as possible, therefore also weighs very much to the selection for the fusion rule that high frequency coefficient uses
It wants.
The fusion rule of high frequency coefficient is the high frequency coefficient of this two images of movement images M and image N, maximum absolute value
That high frequency coefficient as blending image, high frequency fusion formula is as follows:
HLp(i, j)=max (| HLM(i,j)|,|HLN(i,j)|)
LHp(i, j)=max (| LHM(i,j)|,|LHN(i,j)|)
HHp(i, j)=max (| HHM(i,j)|,|HHN(i, j) |),
In the formulas, HL, LH and HH denote the horizontal, vertical and diagonal high-frequency coefficients respectively;
HLp(i, j) is the horizontal high-frequency coefficient of pixel p(i, j);
HLM(i, j) is the horizontal high-frequency coefficient of pixel (i, j) in image M;
HLN(i, j) is the horizontal high-frequency coefficient of pixel (i, j) in image N;
LHp(i, j) is the vertical high-frequency coefficient of pixel p(i, j);
LHM(i, j) is the vertical high-frequency coefficient of pixel (i, j) in image M;
LHN(i, j) is the vertical high-frequency coefficient of pixel (i, j) in image N;
HHp(i, j) is the diagonal high-frequency coefficient of pixel p(i, j);
HHM(i, j) is the diagonal high-frequency coefficient of pixel (i, j) in image M;
HHN(i, j) is the diagonal high-frequency coefficient of pixel (i, j) in image N.
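The maximum-absolute-value selection above can be sketched as follows; this is an illustrative NumPy fragment, and keeping the signed coefficient with the larger magnitude (rather than the bare maximum of absolute values) is the usual reading of such rules:

```python
import numpy as np

def fuse_highfreq(hM, hN):
    """For each detail subband (HL, LH or HH), keep per pixel the
    coefficient whose absolute value is larger, as in the formulas above."""
    return np.where(np.abs(hM) >= np.abs(hN), hM, hN)

# Applied to each of the three detail subbands in turn, e.g.:
# HL_fused = fuse_highfreq(HL_M, HL_N)
```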
The lifting wavelet transform does not rely on Fourier-domain convolution; the values obtained after the transform are integers, so the image can be reconstructed exactly from integers, and the transform is faster and uses less memory than a traditional wavelet transform.
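The integer-to-integer property of the lifting scheme can be demonstrated with a one-level integer Haar (S-transform) lifting step. This is an illustrative sketch under the assumption of an even-length signal, not the patent's two-level 2-D transform:

```python
import numpy as np

def haar_lift_forward(x):
    """Integer Haar lifting (S-transform): split, predict, update.
    No convolutions or Fourier transforms; all-integer arithmetic.
    Assumes an even-length input."""
    even, odd = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = odd - even          # predict step: detail (high-frequency)
    s = even + (d >> 1)     # update step: approximation (low-frequency)
    return s, d

def haar_lift_inverse(s, d):
    """Exact inverse: undo the update, then the predict, then merge."""
    even = s - (d >> 1)
    odd = d + even
    x = np.empty(even.size + odd.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x
```

Since every step is an integer add/subtract/shift, the inverse recovers the original samples exactly, which is the property the paragraph above relies on.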
6d. Apply the inverse lifting-wavelet transform to the fused new low-frequency and high-frequency coefficients;
6e. Use the image obtained in step 6d as the I component of the new image and, together with the H and S components obtained in step 6a, apply the inverse HIS transform to obtain the fused image.
This image fusion algorithm combines the HIS transform with the lifting wavelet transform. Through luminance replacement it retains the luminance information of the infrared object while fully preserving the colour detail of the whole scene; most importantly, it exploits the speed advantage of lifting-wavelet fusion over traditional wavelet fusion, so the algorithm is highly practical for real-time fusion on a UAV platform.
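The patent does not state which HIS (hue, intensity, saturation) formulas it uses in steps 6a and 6e; as a hedged sketch, one common RGB-to-HSI definition is:

```python
import numpy as np

def rgb_to_hsi(rgb):
    """One common RGB -> HSI conversion (an assumption; the patent does
    not specify its variant). rgb: float array in [0, 1], last axis = 3."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0                                   # intensity
    s = 1.0 - np.minimum(np.minimum(r, g), b) / np.maximum(i, 1e-12)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b > g, 2.0 * np.pi - theta, theta)         # hue, radians
    # Note: hue is undefined for achromatic (grey) pixels.
    return h, s, i
```

In the fusion pipeline, the I channel produced here would be the component decomposed by the lifting wavelet and replaced by the fused result before the inverse transform.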
The present invention was developed primarily to improve the intelligence of multi-rotor UAV power-transmission-line inspection, enabling the UAV, during inspection of ultra-high-voltage lines, to fuse the visible-light data and infrared temperature-measurement data from the dual-light pod and thus judge whether key components such as fittings and insulators on UHV AC/DC lines are damaged or missing. The fusion of infrared and visible images is used in systems such as on-board infrared target detection and tracking, allowing the UAV to locate infrared targets more easily. Exploiting the thermal-radiation characteristics of the infrared image and the light-reflection characteristics of the visible image, the two captured images are effectively registered and fused in real time: the final result retains the colour, clear detail contours and edges of the visible image as well as the luminance information of the infrared object, making the infrared target stand out against the background and easier to identify. Fusing infrared and visible images not only reduces the false-detection rate of infrared targets but also makes them easier to detect and track.
Further, the visible image A of any scene m is acquired by a visible-light camera and the infrared image B of that scene by an infrared camera; the physical positions of the two cameras are relatively fixed, their optical axes are parallel, and the distance between the two optical axes is 2 centimetres; m is a positive integer with m ≥ 1.
Further, the phase and resolution of the visible-light camera and the infrared camera are fixed, and the two cameras photograph the same scene simultaneously.
Further, the visible-light camera and the infrared camera are both fixed in a pod that is itself fixed on the UAV. The pod contains a power module, a digital-image transceiver module A, an information-analysis module and a data registration-and-fusion module. The power module supplies power to the visible-light camera, the infrared camera, transceiver module A, the information-analysis module and the registration-and-fusion module; the visible-light camera, infrared camera, transceiver module A, information-analysis module and registration-and-fusion module are electrically connected by data lines, and transceiver module A communicates with the ground control centre over a wireless link.
Further, the ground control centre comprises a digital-image transceiver module B, an information-processing module, a hover-command module, a display module and a power module; the power module supplies power to transceiver module B, the information-processing module, the hover-command module and the display module.
Transceiver module B exchanges data with transceiver module A over a wireless link;
the information-processing module is connected to transceiver module B by a data line for data transfer;
the hover-command module exchanges data wirelessly with transceiver module B and the information-processing module;
the display module exchanges data wirelessly with the hover-command module;
the hover-command module is used to hold the UAV's position stable in the air;
the display module is used to display the registered and fused image;
the information-processing module handles the positional relationship between the UAV and obstacles: it cooperates with the sensors on the UAV, processes their real-time information and prevents the UAV from colliding with nearby obstacles. The functions of these UAV modules are prior art and are not described further here.
The present invention performs adaptive mutual-information registration on the visible and infrared images, followed by fusion combining the HIS transform and the lifting wavelet transform, thereby achieving infrared target tracking and finally displaying the temperature state of key components such as fittings and insulators on the transmission line clearly; temperature abnormalities, combined with the visible image, are then used to judge whether fittings and insulators are faulty, greatly improving inspection efficiency.
The above are embodiments of the present invention and do not limit its scope; any equivalent structure or equivalent process derived from the description and drawings of the present invention, whether used directly or indirectly in other related technical fields, falls within the scope of protection of the present invention.
Claims (9)
1. A data registration and fusion method based on a UAV pod, characterized by comprising the following steps:
Step 1. Acquire the visible image A and the infrared image B of scene 1; taking the acquired visible image A as the reference image and the infrared image B as the floating image, compute the transformation parameters tform11 based on an affine transformation model;
Step 2. Apply a spatial transformation to infrared image B using tform11 to obtain infrared image B1, the rough-registration result;
Step 3. Perform mutual-information registration on the roughly registered images using a hybrid Powell and particle-swarm optimization algorithm, correcting tform11 to obtain the optimal transformation parameters tform12;
Step 4. Apply a spatial geometric transformation to the roughly registered infrared image B1 using tform12 to obtain the fine-registration result, i.e. infrared image B2;
Step 5. For the visible image A and the infrared image B of each scene n, apply the processing of steps 2 to 4 using the transformation parameters tform11 of step 1 to obtain the fine-registration result of the corresponding scene, where n ≥ 2 and n is an integer;
Step 6. For each scene, perform image fusion on the image obtained after fine registration based on the HIS transform and the lifting wavelet transform, with the following specific steps:
6a. Apply the HIS colour-space transformation to visible image A and extract the H, I and S components;
6b. Apply a greyscale transformation to infrared image B2 to obtain the greyscale infrared image B3;
6c. Apply a two-level lifting-wavelet decomposition to the I component extracted in step 6a and to infrared image B3, obtaining high-frequency and low-frequency coefficients; apply different fusion rules to the high-frequency and low-frequency coefficients and fuse them;
6d. Apply the inverse lifting-wavelet transform to the fused new low-frequency and high-frequency coefficients;
6e. Use the image obtained in step 6d as the I component of the new image and, together with the H and S components obtained in step 6a, apply the inverse HIS transform to obtain the fused image.
2. The data registration and fusion method based on a UAV pod according to claim 1, characterized in that the specific steps of step 3 are:
3a. Compute the mutual information of visible image A and the roughly registered infrared image B1; the mutual information is maximal when the joint entropy is minimal. The formula is:
I(A, B1) = ∑a ∑b PAB1(a, b) log [ PAB1(a, b) / (PA(a) PB1(b)) ]
where
I(A, B1) is the mutual information of visible image A and infrared image B1;
PA(a) is the probability distribution of visible image A;
PB1(b) is the probability distribution of infrared image B1;
PAB1(a, b) is the joint probability distribution of visible image A and infrared image B1;
a is a grey value of visible image A;
b is a grey value of infrared image B1;
A is the visible image A;
B1 is the infrared image B1;
3b. Taking the mutual information as the objective function, use the hybrid Powell and particle-swarm optimization algorithm, under the given objective function, to find the best extreme point at which the objective function attains its minimum; the optimal transformation parameters tform12 used for registration are obtained at that point.
The specific steps are:
(1) Set the number of iterations m, where 100 ≤ m ≤ 500 and m is an integer; select the search range; choose any point Pk in the search space as the initial point, where k ≥ 0 and k is an integer; initialise a particle swarm in the space centred on Pk, initialising the random positions and velocities of the particles;
(2) Update the particles of the swarm iteratively with the PSO algorithm and obtain the current global optimum point Pk+1;
(3) Substitute Pk and Pk+1 into the given objective function to compute the objective value TIk of Pk and the objective value TIk+1 of Pk+1, and compare the two; if TIk < TIk+1, go to step (4), otherwise go to step (6);
(4) Take the optimum point Pk+1 as the initial point of the Powell algorithm and perform an n-dimensional search to obtain the Powell-optimised point Pk+2, where n ≥ 1 and n is an integer;
(5) Take Pk+2 as the initial point of a new round, reinitialise the particle swarm, set k = k + 2, and go to step (2);
(6) Output the optimum point Pk+1 and the objective value TIk; the point Pk+1 gives the optimal transformation parameters tform12.
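The hybrid Powell and PSO scheme described above can be sketched in simplified form: one PSO pass finds a global candidate, which then seeds a Powell local search (rather than the full alternating loop of the claim). The inertia and acceleration coefficients 0.7/1.5/1.5 are common defaults and an assumption; in the patent the objective would be the negated mutual information over the transform parameters:

```python
import numpy as np
from scipy.optimize import minimize

def pso_powell(f, lo, hi, n_particles=30, iters=50, seed=0):
    """Minimise f over the box [lo, hi]: global PSO search, then a
    local Powell refinement of the swarm's best point."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))     # initialise swarm
    v = np.zeros_like(x)
    pbest = x.copy()
    pval = np.array([f(p) for p in x])
    g = pbest[pval.argmin()]                        # global best so far
    for _ in range(iters):                          # PSO velocity updates
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()]
    # Refine the PSO optimum with a derivative-free Powell search.
    res = minimize(f, g, method="Powell")
    return res.x, res.fun
```

The PSO stage guards against the local minima of a mutual-information surface, while Powell sharpens the final parameter estimate without gradients.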
3. The data registration and fusion method based on a UAV pod according to claim 1, characterized in that: in step 6a, the I component is the luminance component of visible image A, the H component is the hue component of visible image A, and the S component is the saturation component of visible image A.
4. The data registration and fusion method based on a UAV pod according to claim 1, characterized in that: the greyscale transformation in step 6b refers to greyscale preprocessing of the image.
5. The data registration and fusion method based on a UAV pod according to claim 1, characterized in that: in step 6c, different fusion rules are used to fuse the low-frequency and high-frequency coefficients, specifically:
1) Fusion rule for the low-frequency coefficients
The low-frequency coefficients after the lifting-wavelet decomposition represent the approximate part of the image and are fused with a region-energy-based rule. To retain the overall appearance of the fused image to the greatest extent, the region energy of a pixel p(i, j) is defined as:
E(i, j) = ∑ f²(m, n)
where E(i, j) is the region energy of pixel p(i, j) of the image, f(m, n) is the luminance of the image at position (m, n), and m and n range over the width and height of the window centred on (i, j).
The region-energy fusion rule for the low-frequency coefficients is: for the two images to be fused, if the region energy at pixel (i, j) of image M is greater than the region energy at pixel (i, j) of image N, the coefficient of image M is taken as the fused coefficient; otherwise the coefficient of image N is taken, i.e.:
Cp(i, j) = CM(i, j) if EM(i, j) > EN(i, j), and Cp(i, j) = CN(i, j) otherwise,
where Cp(i, j) is the fused low-frequency coefficient at (i, j), EM(i, j) is the region energy of image M at pixel (i, j), and EN(i, j) is the region energy of image N at pixel (i, j); the algorithm uses a 3 × 3 window;
2) Fusion rule for the high-frequency coefficients
The fusion rule for the high-frequency coefficients compares the high-frequency coefficients of image M and image N and takes the one with the larger absolute value as the coefficient of the fused image. The high-frequency fusion formulas are:
HLp(i, j) = max(|HLM(i, j)|, |HLN(i, j)|)
LHp(i, j) = max(|LHM(i, j)|, |LHN(i, j)|)
HHp(i, j) = max(|HHM(i, j)|, |HHN(i, j)|)
In the formulas, HL, LH and HH denote the horizontal, vertical and diagonal high-frequency coefficients respectively;
HLp(i, j) is the horizontal high-frequency coefficient of pixel p(i, j);
HLM(i, j) is the horizontal high-frequency coefficient of pixel (i, j) in image M;
HLN(i, j) is the horizontal high-frequency coefficient of pixel (i, j) in image N;
LHp(i, j) is the vertical high-frequency coefficient of pixel p(i, j);
LHM(i, j) is the vertical high-frequency coefficient of pixel (i, j) in image M;
LHN(i, j) is the vertical high-frequency coefficient of pixel (i, j) in image N;
HHp(i, j) is the diagonal high-frequency coefficient of pixel p(i, j);
HHM(i, j) is the diagonal high-frequency coefficient of pixel (i, j) in image M;
HHN(i, j) is the diagonal high-frequency coefficient of pixel (i, j) in image N.
6. The data registration and fusion method based on a UAV pod according to claim 1, characterized in that: the visible image A of any scene m is acquired by a visible-light camera and the infrared image B of that scene by an infrared camera; the physical positions of the visible-light camera and the infrared camera are relatively fixed, their optical axes are parallel, the distance between the two optical axes is 2 to 5 centimetres, and m is an integer with m ≥ 1.
7. The data registration and fusion method based on a UAV pod according to claim 6, characterized in that: the phase and resolution of the visible-light camera and the infrared camera are fixed, and the two cameras photograph the same scene simultaneously.
8. The data registration and fusion method based on a UAV pod according to claim 7, characterized in that: the visible-light camera and the infrared camera are both fixed in a pod that is itself fixed on the UAV; the pod contains a power module, a digital-image transceiver module A, an information-analysis module and a data registration-and-fusion module; the power module supplies power to the visible-light camera, the infrared camera, transceiver module A, the information-analysis module and the registration-and-fusion module; the visible-light camera, infrared camera, transceiver module A, information-analysis module and registration-and-fusion module are electrically connected by data lines, and transceiver module A communicates with the ground control centre over a wireless link.
9. The data registration and fusion method based on a UAV pod according to claim 8, characterized in that: the ground control centre comprises a digital-image transceiver module B, an information-processing module, a hover-command module, a display module and a power module; the power module supplies power to transceiver module B, the information-processing module, the hover-command module and the display module;
transceiver module B exchanges data with transceiver module A over a wireless link;
the information-processing module is connected to transceiver module B by a data line for data transfer;
the hover-command module exchanges data wirelessly with transceiver module B and the information-processing module;
the display module exchanges data wirelessly with the hover-command module;
the hover-command module is used to hold the UAV's position stable in the air;
the display module is used to display the registered and fused image;
the information-processing module handles the positional relationship between the UAV and obstacles.
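For reference, the mutual-information objective defined in claim 2 (step 3a) can be computed from a joint grey-level histogram. This NumPy sketch (the bin count is an assumed parameter) makes the probability terms PA, PB1 and PAB1 explicit:

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Mutual information I(A, B) of two images, estimated from a joint
    grey-level histogram: I = sum p_ab * log(p_ab / (p_a * p_b))."""
    pab, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pab /= pab.sum()                        # joint distribution P_AB1
    pa = pab.sum(axis=1, keepdims=True)     # marginal P_A
    pb = pab.sum(axis=0, keepdims=True)     # marginal P_B1
    mask = pab > 0                          # avoid log(0)
    return float((pab[mask] * np.log(pab[mask] / (pa @ pb)[mask])).sum())
```

During registration this quantity would be evaluated for each candidate transform and maximised (equivalently, its negation minimised by the hybrid optimiser).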
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910724461.6A CN110443776A (en) | 2019-08-07 | 2019-08-07 | A kind of Registration of Measuring Data fusion method based on unmanned plane gondola |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110443776A true CN110443776A (en) | 2019-11-12 |
Family
ID=68433658
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910724461.6A Pending CN110443776A (en) | 2019-08-07 | 2019-08-07 | A kind of Registration of Measuring Data fusion method based on unmanned plane gondola |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110443776A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110850109A (en) * | 2019-11-21 | 2020-02-28 | 中科智云科技有限公司 | Method for measuring vehicle speed based on fuzzy image |
CN111192229A (en) * | 2020-01-02 | 2020-05-22 | 中国航空工业集团公司西安航空计算技术研究所 | Airborne multi-mode video image enhancement display method and system |
CN111289860A (en) * | 2020-03-23 | 2020-06-16 | 云南电网有限责任公司电力科学研究院 | Method for detecting partial discharge position of electrical equipment |
CN111582296A (en) * | 2019-12-20 | 2020-08-25 | 珠海大横琴科技发展有限公司 | Remote sensing image comprehensive matching method and device, electronic equipment and storage medium |
CN111797903A (en) * | 2020-06-12 | 2020-10-20 | 武汉大学 | Multi-mode remote sensing image registration method based on data-driven particle swarm optimization |
CN112102217A (en) * | 2020-09-21 | 2020-12-18 | 四川轻化工大学 | Method and system for quickly fusing visible light image and infrared image |
CN113436362A (en) * | 2021-06-16 | 2021-09-24 | 国网河北省电力有限公司邯郸供电分公司 | Communication cable inspection method |
CN114360291A (en) * | 2021-12-23 | 2022-04-15 | 东风柳州汽车有限公司 | Driver danger early warning method, device, equipment and storage medium |
WO2022142570A1 (en) * | 2020-12-30 | 2022-07-07 | 杭州海康微影传感科技有限公司 | Image fusion method and apparatus, image processing device, and binocular system |
CN116109540A (en) * | 2023-03-22 | 2023-05-12 | 智洋创新科技股份有限公司 | Image registration fusion method and system based on particle swarm optimization gray curve matching |
CN116403057A (en) * | 2023-06-09 | 2023-07-07 | 山东瑞盈智能科技有限公司 | Power transmission line inspection method and system based on multi-source image fusion |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1545064A (en) * | 2003-11-27 | 2004-11-10 | 上海交通大学 | Infrared and visible light image merging method |
CN108364003A (en) * | 2018-04-28 | 2018-08-03 | 国网河南省电力公司郑州供电公司 | The electric inspection process method and device merged based on unmanned plane visible light and infrared image |
Non-Patent Citations (1)
Title |
---|
王秋: "基于无人机的红外图像和可见光图像配准及融合算法研究", 《中国优秀硕士学位论文全文数据库(信息科技辑)》 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110443776A (en) | A kind of Registration of Measuring Data fusion method based on unmanned plane gondola | |
US20210217212A1 (en) | Method and system for automatically colorizing night-vision images | |
US10395113B2 (en) | Polarization-based detection and mapping method and system | |
CN104504748B (en) | A kind of infrared 3-D imaging system of unmanned plane oblique photograph and modeling method | |
CN106127683B (en) | A kind of real-time joining method of unmanned aerial vehicle SAR image | |
US20130034266A1 (en) | Method and system for detection and tracking employing multi-view multi-spectral imaging | |
CN105139350A (en) | Ground real-time reconstruction processing system for unmanned aerial vehicle reconnaissance images | |
CN106096604A (en) | Multi-spectrum fusion detection method based on unmanned platform | |
CN104601953A (en) | Video image fusion-processing system | |
CN107453811B (en) | A method of the unmanned plane based on photopic vision communication cooperates with SLAM | |
CN106815826A (en) | Night vision image Color Fusion based on scene Recognition | |
CN106504363A (en) | A kind of airborne pair of light cruising inspection system stabilized platform automatic tracking method of intelligence | |
CN113643345A (en) | Multi-view road intelligent identification method based on double-light fusion | |
CN105551178A (en) | Power grid intelligent monitoring alarm method and device | |
US20220301303A1 (en) | Multispectral imaging for navigation systems and methods | |
CN114339185A (en) | Image colorization for vehicle camera images | |
CN105243653A (en) | Fast mosaic technology of remote sensing image of unmanned aerial vehicle on the basis of dynamic matching | |
WO2018165027A1 (en) | Polarization-based detection and mapping method and system | |
Martínez-de Dios et al. | Fire detection using autonomous aerial vehicles with infrared and visual cameras | |
CN113762161B (en) | Intelligent obstacle monitoring method and system | |
CN107147877A (en) | FX night fog day condition all-weather colorful video imaging system and its construction method | |
CN115127544A (en) | Thermal imaging system and method for navigation | |
CN113743286A (en) | Target monitoring system and method for multi-source signal fusion | |
CN208314563U (en) | A kind of visual identifying system for robotic tracking | |
Nguyen et al. | Characteristics of optical flow from aerial thermal imaging,“thermal flow” |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20191112 |