CN106683064A - Multi-focus image fusion method based on two-dimensional coupled convolution - Google Patents

Multi-focus image fusion method based on two-dimensional coupled convolution

Info

Publication number
CN106683064A
CN106683064A CN201611146184.8A CN201611146184A
Authority
CN
China
Prior art keywords
image
fusion
focus
matrix
fused
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201611146184.8A
Other languages
Chinese (zh)
Other versions
CN106683064B (en)
Inventor
梁军利
范文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201611146184.8A priority Critical patent/CN106683064B/en
Publication of CN106683064A publication Critical patent/CN106683064A/en
Application granted granted Critical
Publication of CN106683064B publication Critical patent/CN106683064B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a multi-focus image fusion method based on two-dimensional coupled convolution. The method comprises the following steps: inputting the images to be fused and dividing them into small blocks of size M×N with a sliding-window technique; and, for a single-input multiple-output system model, solving the imaging systems F1^j and F2^j of the model by eigenvalue decomposition, thereby selecting the clear image blocks. The method of the invention can accurately identify the clear image blocks, so that the fused image is highly robust; the salient clear details of the images are fused well and no artificial artifacts are introduced.

Description

Multi-focus image fusion method based on two-dimensional coupled convolution
Technical field
The invention belongs to the technical field of digital image processing, and in particular relates to a multi-focus image fusion method based on two-dimensional coupled convolution.
Background technology
With the rapid development of information technology, the sensors used to acquire images have evolved from the early visible-light sensors to hyperspectral, radar, multispectral and other sensors, and the amount of image data acquired has increased greatly. Owing to the limitations of imaging conditions and the differences among imaging principles, any single image can reflect only one aspect of the target object, so its range of application is very limited. Image fusion is the process of merging the effective features of two or more images of the same scene into a new image, so that the fused image reflects the characteristics of the target more comprehensively and greatly improves the accuracy and reliability of data analysis. For a visible-light imaging system, for example, in an image of a scene with varying depth, the target that is in focus appears sharp, while targets at other distances are blurred to different degrees.
As a main branch of data fusion, image fusion can reduce the uncertainty of a single image in describing a target and produce an image that is richer in information, less ambiguous and more reliable for human observation or computer processing. Its functions include image enhancement, feature extraction, recognition, tracking, classification, and so on. Image fusion is widely used in computer vision, medical imaging, remote sensing, military applications and other fields.
In order to make all targets in a scene appear sharp, existing image fusion techniques first focus the imaging system on one part of the targets to obtain an image in which that part is sharp, then refocus on another part of the targets to obtain an image in which the other part is sharp, and finally merge the two images with a fusion algorithm to obtain an image in which all targets are sharp (Wang Yajie, Wang Xiaoyan, Liu Xueping. Overview of multi-focus image fusion based on wavelet transform [J]. Journal of Shenyang Institute of Aeronautical Engineering, 2005, 04: 65-67).
Existing fusion methods include the discrete wavelet transform method and the discrete cosine transform method; in both, the edges of the fused image are blurred and false contours may appear.
Content of the invention
The object of the present invention is to provide a multi-focus image fusion method based on two-dimensional coupled convolution, which can obtain a fused image of higher quality and solves the problem that edge details are blurred in conventional image fusion techniques.
The technical solution adopted by the present invention is a multi-focus image fusion method based on two-dimensional coupled convolution, comprising the following steps:
Step 1: acquire several images of the same scene that focus on different targets, and perform image registration;
Step 2: input the images to be fused, I1, I2 ∈ R^(I×J), and divide them with a sliding-window technique into L image blocks of size M×N, where S_i^j denotes the j-th image block of the i-th image to be fused; initialize an empty matrix O ∈ R^(I×J) for storing the fused image;
Step 3: take j = 1, 2, ..., L in turn and perform steps 3-1 to 3-4 for each j;
Step 3-1: model the data of step 2 as a single-input multiple-output system, in which I^j denotes the input scene, F1^j and F2^j denote the focus functions, and S1^j and S2^j denote the outputs of the different imaging systems; build the two-dimensional coupled convolution model
min_{F1^j, F2^j} || S1^j * F2^j - S2^j * F1^j ||_F^2;
According to S1^j, the matrix X_j is constructed so that the convolution S1^j * F2^j can be written as the matrix product X_j · vec(F2^j); the matrix Y_j is constructed from S2^j in the same manner, so that S2^j * F1^j = Y_j · vec(F1^j) in vectorized form;
Step 3-2: compute R according to the following formula,
R = [X_j, -Y_j]^T [X_j, -Y_j]
where [X_j, -Y_j] denotes the column-wise concatenation of X_j and -Y_j;
Step 3-3: compute the imaging systems F1^j and F2^j of the model by eigenvalue decomposition, the optimal solution being the eigenvector corresponding to the smallest eigenvalue of R; then compare the variances of F1^j and F2^j and select the clear image block;
Step 3-4: add the selected clear image block into the matrix O at the position it occupies in its original image to be fused;
Step 4: divide each pixel of the matrix O by the number of times it was accumulated, thereby obtaining the final fused output image (a sketch of this accumulation and averaging is given below).
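For illustration only (this sketch is not part of the original disclosure): the accumulation of step 3-4 and the averaging of step 4 could be realised, for example, as in the minimal NumPy sketch below, in which the names place_block, average_overlaps, O and count are illustrative assumptions.

```python
import numpy as np

def place_block(O, count, block, top, left):
    """Step 3-4 (sketch): add a selected clear block into the accumulator O at
    its original position and record how often each pixel has been covered."""
    M, N = block.shape
    O[top:top + M, left:left + N] += block
    count[top:top + M, left:left + N] += 1
    return O, count

def average_overlaps(O, count):
    """Step 4 (sketch): divide every pixel by the number of times it was
    accumulated to obtain the fused output image."""
    return O / np.maximum(count, 1)  # guard against pixels never covered

# Illustrative use: a 64 x 64 output built from two overlapping 32 x 32 blocks.
O = np.zeros((64, 64))
count = np.zeros((64, 64))
O, count = place_block(O, count, np.ones((32, 32)), 0, 0)
O, count = place_block(O, count, np.ones((32, 32)), 16, 16)
fused = average_overlaps(O, count)
```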
The present invention is further characterized in that:
in step 2, p denotes the sliding-window step size, with p < M and p < N.
The beneficial effect of the invention is that, compared with conventional methods, the method of the present invention can accurately identify clear image blocks, so the fused image is highly robust; the salient clear details of the images are fused well and no artificial artifacts are introduced.
Description of the drawings
Fig. 1 is a schematic diagram of the imaging system of the present invention;
Fig. 2 is a schematic diagram of the extended imaging system of the present invention;
Fig. 3-1 shows one focused image of the first embodiment, Fig. 3-2 shows the other focused image, and Fig. 3-f shows the fused image;
Fig. 4-1 shows one focused image of the second embodiment, Fig. 4-2 shows the other focused image, and Fig. 4-f shows the fused image;
Fig. 5-1 shows one focused image of the third embodiment, Fig. 5-2 shows the other focused image, and Fig. 5-f shows the fused image;
Fig. 6-1 shows one focused image of the fourth embodiment, Fig. 6-2 shows the other focused image, and Fig. 6-f shows the fused image;
Fig. 7 compares the fusion details of the method of the present invention with those of the wavelet transform method and the discrete cosine transform method.
Specific embodiment
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments, but the present invention is not limited to these embodiments.
In the multi-focus image fusion method of the present invention, a CCD camera is first used to acquire several images of the same scene that focus on different targets, and image registration is then performed; registration may be carried out with any existing registration method.
The image of size I × J is divided by the sliding-window technique into image blocks of size M × N, where S_i^j denotes the j-th image block of the i-th image to be fused, and the sliding-window step size p is set (p < M and p < N).
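For illustration only (not part of the original disclosure), the sliding-window division can be sketched as follows; the formula for the number of blocks L is omitted in the text above, so this sketch assumes the common reading that the window slides in steps of p pixels in both directions, and the helper name sliding_blocks is illustrative.

```python
def sliding_blocks(I_rows, J_cols, M, N, p):
    """Return the top-left coordinates of the L blocks of size M x N obtained by
    sliding a window with step p (p < M, p < N) over an I x J image.
    Border handling is an assumption; the patent does not spell it out."""
    tops = range(0, I_rows - M + 1, p)
    lefts = range(0, J_cols - N + 1, p)
    return [(t, l) for t in tops for l in lefts]

coords = sliding_blocks(I_rows=64, J_cols=96, M=32, N=32, p=8)
print(len(coords))  # L, the number of blocks per input image
```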
Although the image blocks come from different images to be fused, they are images of the same targets in the same field of view (only the degree of focus differs), so the above analysis models them as a single-input multiple-output system. Taking two outputs as an example, as shown in Fig. 1: I^j denotes the input scene, F1^j and F2^j denote the focus functions, and S1^j and S2^j denote the outputs of the different systems.
The imaging systems F1^j and F2^j in the model are solved by eigenvalue decomposition.
According to Fig. 1, we have the following expressions:
S1^j = I^j * F1^j,  S2^j = I^j * F2^j
where * denotes the two-dimensional convolution operation. By the commutativity and associativity of two-dimensional convolution, the extended imaging system shown in Fig. 2 is obtained, i.e. formula (3):
S1^j * F2^j = I^j * F1^j * F2^j = S2^j * F1^j      (3)
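The extended system in formula (3) relies only on the commutativity and associativity of full two-dimensional convolution, which can be checked numerically; the short SciPy sketch below is for illustration only, with arbitrary array sizes.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
I = rng.standard_normal((32, 32))   # scene I^j
F1 = rng.standard_normal((5, 5))    # focus function of image 1
F2 = rng.standard_normal((5, 5))    # focus function of image 2

S1 = convolve2d(I, F1)              # S1^j = I^j * F1^j
S2 = convolve2d(I, F2)              # S2^j = I^j * F2^j

# Formula (3): S1^j * F2^j = I^j * F1^j * F2^j = S2^j * F1^j
lhs = convolve2d(S1, F2)
rhs = convolve2d(S2, F1)
print(np.allclose(lhs, rhs))        # True
```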
The following two-dimensional coupled convolution model is built:
min_{F1^j, F2^j} || S1^j * F2^j - S2^j * F1^j ||_F^2      (4)
Then, based on S1^j, the following matrix X_j is constructed so that the convolution form of the above model can be written as a matrix multiplication:
X_j · vec(F2^j) = vec(S1^j * F2^j)      (5)
where each row of X_j collects the entries of S1^j that contribute to the corresponding output sample of the convolution.
Similarly, S2^j * F1^j is written in matrix-multiplication form with a matrix Y_j:
Y_j · vec(F1^j) = vec(S2^j * F1^j)      (6)
Simplifying the above formulas, and adding a norm constraint in order to avoid the trivial all-zero solution for F1^j and F2^j, the unit-norm constraint || F1^j ||_F^2 + || F2^j ||_F^2 = 1 is imposed.
We then have
min_f f^T R f   subject to   || f ||_2 = 1
where f = [vec(F2^j); vec(F1^j)] and R = [X_j, -Y_j]^T [X_j, -Y_j], with [X_j, -Y_j] the column-wise concatenation of X_j and -Y_j. Obviously, the optimal solution f is the eigenvector corresponding to the smallest eigenvalue of R.
Without loss of generality, assume that the targets in S1^j were captured in focus (clear) and that S2^j was captured out of focus (blurred). Then the filter F2^j solved from the above formula is equivalent to a blur filter, and its variance is smaller than that of F1^j. The clarity of the blocks can therefore be judged by computing and comparing the variances of the solved filters.
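A minimal sketch of the per-block computation just described is given below for illustration only. It assumes valid-region convolution when building X_j and Y_j, that the stacked unknown is [vec(F2^j); vec(F1^j)], and that the block paired with the smaller-variance filter is selected as the clear one; the names conv_matrix and select_clear_block and the 5 x 5 filter size are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from scipy.signal import convolve2d

def conv_matrix(S, m, n):
    """Build X such that X @ F.ravel() equals the valid 2-D convolution S * F
    for any m x n filter F (each row of X is a flipped patch of S)."""
    M, N = S.shape
    rows = []
    for i in range(M - m + 1):
        for j in range(N - n + 1):
            patch = S[i:i + m, j:j + n]
            rows.append(patch[::-1, ::-1].ravel())  # flip for true convolution
    return np.array(rows)

def select_clear_block(S1, S2, m=5, n=5):
    """Steps 3-1 to 3-3 (sketch): estimate F1, F2 from the coupled model
    min ||S1 * F2 - S2 * F1||_F^2 under a unit-norm constraint, then keep the
    block paired with the smaller-variance (blur-like) filter as the clear one."""
    X = conv_matrix(S1, m, n)             # X @ vec(F2) = S1 * F2
    Y = conv_matrix(S2, m, n)             # Y @ vec(F1) = S2 * F1
    A = np.hstack([X, -Y])                # A @ [vec(F2); vec(F1)]
    R = A.T @ A
    eigvals, eigvecs = np.linalg.eigh(R)  # symmetric; eigenvalues ascending
    v = eigvecs[:, 0]                     # eigenvector of the smallest eigenvalue
    f2, f1 = v[:m * n], v[m * n:]
    # If S1 is the in-focus block, f2 behaves like the blur filter and has the
    # smaller variance, so S1 is selected (and vice versa).
    return S1 if np.var(f2) < np.var(f1) else S2

# Illustrative check: blur one copy of a random block and see which is chosen.
rng = np.random.default_rng(1)
sharp = rng.standard_normal((32, 32))
blurred = convolve2d(sharp, np.ones((5, 5)) / 25.0, mode='same')
picked = select_clear_block(sharp, blurred)
print(np.allclose(picked, sharp))         # expected: True
```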
Based on the above analysis, the method is carried out according to the following steps.
First step: determine the block size M, N, and set the sliding-window step size p (p < M, p < N).
Second step: input the images to be fused, I1, I2 ∈ R^(I×J), and divide them into L image blocks of size M × N, where S_i^j denotes the j-th block of the i-th image; initialize an empty matrix O ∈ R^(I×J) for storing the fused data.
Third step:
Loop over j = 1, 2, ..., L:
①: construct the matrices X_j and Y_j from S1^j and S2^j according to formulas (5) and (6);
②: compute R from X_j and Y_j;
③: compute F1^j and F2^j by eigenvalue decomposition, then compare their variances and select the clear image block (as analysed above, the block that is convolved with the smaller-variance filter in the coupled model is the clear block);
④: add the selected block into O at the position it occupies in its original image to be fused.
End of loop.
Fourth step: divide each pixel of O by the number of times it was accumulated, thereby obtaining the final fused output image.
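Tying the four steps together, a usage sketch is given below for illustration only; it reuses the illustrative helpers sliding_blocks, select_clear_block, place_block and average_overlaps from the sketches above, and the image size, block size and step are arbitrary assumptions.

```python
import numpy as np

def fuse_pair(I1, I2, M=32, N=32, p=8):
    """Sketch of the whole procedure for two registered images I1, I2."""
    rows, cols = I1.shape
    O = np.zeros((rows, cols))
    count = np.zeros((rows, cols))
    for top, left in sliding_blocks(rows, cols, M, N, p):      # second step
        S1 = I1[top:top + M, left:left + N]
        S2 = I2[top:top + M, left:left + N]
        clear = select_clear_block(S1, S2)                     # third step, 1-3
        O, count = place_block(O, count, clear, top, left)     # third step, 4
    return average_overlaps(O, count)                          # fourth step
```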
Using the method for the present invention multiple focussing image that six width have been aligned is merged respectively.The table in Fig. 3-1 left sides It is to focus on to shoot (imaging clearly), the table on Fig. 3-2 right sides is to focus on to shoot (imaging clearly), and Fig. 3-f are the inventive method Fusion results, it can be seen that the method for the present invention has merged well the focusing photographing section of Fig. 3-1, Fig. 3-2, after fusion The left side of image and all will be apparent that for right side.In the same manner, the table in Fig. 4-1 left sides is to focus on to shoot (imaging clearly), and Fig. 4-2 is right The people of side is to focus on to shoot (imaging clearly), and Fig. 4-f are using the clearly fusion results of the inventive method, the figure after fusion The details such as the table in the left side of picture and the people on right side all will be apparent that.The potted plant of Fig. 5-1 left sides is to focus on to shoot (imaging clearly), figure Clock and watch on the right side of 5-2 are to focus on to shoot (imaging clearly), and Fig. 5-f are, using the clearly fusion results of the inventive method, to melt The details such as potted plant and right side the clock and watch in the left side of the image after conjunction all will be apparent that.Fig. 6-1 flower leopard head be focus on shoot (into As clearly), the foot of Fig. 6-2 flower leopards is to focus on to shoot (imaging clearly), and Fig. 6-f are clearly melting using the inventive method Result is closed, the speckle of the colored leopard body of fused image is integrally all apparent from.The method of the present invention can fusion image well Clearly material particular, will not also introduce artificial vestige.
The fusion method of the present invention was compared with the traditional wavelet transform fusion method and the discrete cosine transform method, as shown in Fig. 7. The image on the left shows the fusion result of the wavelet transform method and the image in the middle shows the result of the discrete cosine transform method; in both, the detail at the edge of the person's hair is indistinct, whereas the result of the present method, shown on the right, provides very clear edge details and is clearly superior to the traditional methods.

Claims (2)

1. A multi-focus image fusion method based on two-dimensional coupled convolution, characterized by comprising the following steps:
step 1: acquiring several images of the same scene that focus on different targets, and performing image registration;
step 2: inputting the images to be fused, I1, I2 ∈ R^(I×J), dividing them with a sliding-window technique into L image blocks of size M × N, where S_i^j denotes the j-th image block of the i-th image to be fused, and initializing an empty matrix O ∈ R^(I×J) for storing the fused image;
step 3: taking j = 1, 2, ..., L in turn and performing steps 3-1 to 3-4 for each j;
step 3-1: modelling the data of step 2 as a single-input multiple-output system, in which I^j denotes the input scene, F1^j and F2^j denote the focus functions, and S1^j and S2^j denote the outputs of the different imaging systems, and building the two-dimensional coupled convolution model:
min_{F1^j, F2^j} || S1^j * F2^j - S2^j * F1^j ||_F^2      (4)
according to S1^j, constructing the matrix X_j so that the convolution S1^j * F2^j can be written as the matrix product X_j · vec(F2^j), and constructing the matrix Y_j from S2^j in the same manner so that S2^j * F1^j = Y_j · vec(F1^j) in vectorized form;
step 3-2: computing R according to the formula R = [X_j, -Y_j]^T [X_j, -Y_j], where [X_j, -Y_j] denotes the column-wise concatenation of X_j and -Y_j;
step 3-3: computing the imaging systems F1^j and F2^j of the model by eigenvalue decomposition, the optimal solution being the eigenvector corresponding to the smallest eigenvalue of R, then comparing the variances of F1^j and F2^j and selecting the clear image block;
step 3-4: adding the selected clear image block into the matrix O at the position it occupies in its original image to be fused;
step 4: dividing each pixel of the matrix O by the number of times it was accumulated, thereby obtaining the final fused output image.
2. The multi-focus image fusion method based on two-dimensional coupled convolution according to claim 1, characterized in that in step 2, p denotes the sliding-window step size, with p < M and p < N.
CN201611146184.8A 2016-12-13 2016-12-13 Multi-focus image fusion method based on two-dimensional coupled convolution Expired - Fee Related CN106683064B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611146184.8A CN106683064B (en) 2016-12-13 2016-12-13 Multi-focus image fusion method based on two-dimensional coupled convolution

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611146184.8A CN106683064B (en) 2016-12-13 2016-12-13 Multi-focus image fusion method based on two-dimensional coupled convolution

Publications (2)

Publication Number Publication Date
CN106683064A true CN106683064A (en) 2017-05-17
CN106683064B CN106683064B (en) 2019-07-30

Family

ID=58869463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611146184.8A Expired - Fee Related CN106683064B (en) 2016-12-13 2016-12-13 Multi-focus image fusion method based on two-dimensional coupled convolution

Country Status (1)

Country Link
CN (1) CN106683064B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109934831A (en) * 2019-03-18 2019-06-25 安徽紫薇帝星数字科技有限公司 A kind of surgical tumor operation real-time navigation method based on indocyanine green fluorescent imaging
CN110099207A (en) * 2018-01-31 2019-08-06 成都极米科技股份有限公司 A kind of effective image calculation method for overcoming camera unstable

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1402191A (en) * 2002-09-19 2003-03-12 上海交通大学 Multiple focussing image fusion method based on block dividing
CN103985104A (en) * 2014-02-20 2014-08-13 江南大学 Multi-focusing image fusion method based on higher-order singular value decomposition and fuzzy inference

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1402191A (en) * 2002-09-19 2003-03-12 上海交通大学 Multiple focussing image fusion method based on block dividing
CN103985104A (en) * 2014-02-20 2014-08-13 江南大学 Multi-focusing image fusion method based on higher-order singular value decomposition and fuzzy inference

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110099207A (en) * 2018-01-31 2019-08-06 成都极米科技股份有限公司 A kind of effective image calculation method for overcoming camera unstable
CN110099207B (en) * 2018-01-31 2020-12-01 成都极米科技股份有限公司 Effective image calculation method for overcoming camera instability
CN109934831A (en) * 2019-03-18 2019-06-25 安徽紫薇帝星数字科技有限公司 A kind of surgical tumor operation real-time navigation method based on indocyanine green fluorescent imaging

Also Published As

Publication number Publication date
CN106683064B (en) 2019-07-30

Similar Documents

Publication Publication Date Title
Fan et al. Dual refinement underwater object detection network
Kim et al. Multispectral transfer network: Unsupervised depth estimation for all-day vision
Zhu et al. Single view metrology in the wild
CN106952341A (en) The underwater scene three-dimensional point cloud method for reconstructing and its system of a kind of view-based access control model
CN103700099A (en) Rotation and dimension unchanged wide baseline stereo matching method
Xie et al. Semantics lead all: Towards unified image registration and fusion from a semantic perspective
CN107845145B (en) Three-dimensional reconstruction system and method under electron microscopic scene
CN112488970A (en) Infrared and visible light image fusion method based on coupling generation countermeasure network
Xing et al. Multi-level adaptive perception guidance based infrared and visible image fusion
Li et al. Self-supervised coarse-to-fine monocular depth estimation using a lightweight attention module
CN106683064A (en) Multi-focusing image fusion method based on two-dimensional coupling convolution
CN112070181B (en) Image stream-based cooperative detection method and device and storage medium
Li et al. Learning scribbles for dense depth: Weakly-supervised single underwater image depth estimation boosted by multi-task learning
Zhou et al. A matching algorithm for underwater acoustic and optical images based on image attribute transfer and local features
Liu et al. Siamese network with bidirectional feature pyramid for small target tracking
Zhang et al. Robust registration for ultra-field infrared and visible binocular images
CN109993782A (en) A kind of annular generates the heterologous remote sensing image registration method and device of confrontation network
CN112419387B (en) Unsupervised depth estimation method for solar greenhouse tomato plant image
Xu et al. Local feature matching using deep learning: A survey
Cai et al. Hyperspectral image classification using multi-branch-multi-scale residual fusion network
Yao et al. Matching wide-baseline stereo images with weak texture using the perspective invariant local feature transformer
Bodensteiner et al. Multispectral matching using conditional generative appearance modeling
Wei et al. Multiscale feature U-Net for remote sensing image segmentation
Huang et al. Low illumination soybean plant reconstruction and trait perception
Zhao et al. Small object detection of imbalanced traffic sign samples based on hierarchical feature fusion

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190730

Termination date: 20191213

CF01 Termination of patent right due to non-payment of annual fee