CN106651749B - Graph fusion method and system based on linear equation - Google Patents

Graph fusion method and system based on linear equation

Info

Publication number
CN106651749B
CN106651749B (application CN201510731351.4A)
Authority
CN
China
Prior art keywords
row
image
col
gradient
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510731351.4A
Other languages
Chinese (zh)
Other versions
CN106651749A (en)
Inventor
刘德建
陈宏展
陈建宽
胡铭
王兆安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian TQ Digital Co Ltd
Original Assignee
Fujian TQ Digital Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian TQ Digital Co Ltd filed Critical Fujian TQ Digital Co Ltd
Priority to CN201510731351.4A
Publication of CN106651749A
Application granted
Publication of CN106651749B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a graph fusion method and system based on a linear equation, comprising the following steps: S1, determining a fusion area of a source image and a fusion area of a target image, and calculating a source image gradient bs and a target gradient bt, respectively; S2, calculating a mixed gradient b according to the source image gradient bs and the target gradient bt; and S3, determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix. By mixing the gradient maps, the invention further optimizes the fusion effect and adds a high-quality, highly feasible graph fusion method to image processing technology.

Description

Graph fusion method and system based on linear equation
Technical Field
The invention relates to the field of image processing, and in particular to a method and a system for fusing images based on a linear equation.
Background
The 21st century is an information-filled era. Images serve as the visual basis for human perception of the world and are an important means for humans to acquire, express, and transmit information, so image processing has become a popular research field. Image processing refers to techniques in which an image is analyzed and manipulated by a computer to achieve a desired result, and generally means digital image processing. A digital image is a large two-dimensional array of elements called pixels, each with a value called a gray level, captured by devices such as industrial cameras, video cameras, and scanners.
Image synthesis is a fundamental problem in image processing: it creates a new image by embedding an object or region from a source image into a target image. For the synthesized image to look natural, the synthesis boundary should be seamless; however, if the source image and the target image have significantly different texture features, a directly synthesized image will show an obvious boundary.
Disclosure of the Invention
The technical problem to be solved by the invention is to provide a graph fusion method and system based on a linear equation that fuse images with a seamless boundary.
In order to solve the above technical problem, the invention adopts the following technical scheme: a graph fusion method based on a linear equation includes the following steps:
S1, determining a fusion area of the source image and a fusion area of the target image, and respectively calculating a source image gradient bs and a target gradient bt;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image.
S2, calculating a mixed gradient b according to the source image gradient bs and the target gradient bt;
S3, determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix.
The invention also relates to a graph fusion system based on a linear equation, comprising:
a fusion area gradient calculation module, used for determining a fusion area of a source image and a fusion area of a target image and calculating a source image gradient bs and a target gradient bt, respectively;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image.
a mixed gradient calculation module, used for calculating a mixed gradient b according to the source image gradient bs and the target gradient bt; and
an image output module, used for determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix.
The invention has the following beneficial effects: by mixing the gradient maps, the fusion effect is further optimized, and a high-quality, highly feasible graph fusion method is added to image processing technology.
Drawings
FIG. 1 is a flowchart of a method according to a first embodiment of the present invention;
Fig. 2 is a schematic system structure diagram according to a first embodiment of the present invention.
Detailed Description
In order to explain technical contents, objects and effects of the present invention in detail, the following detailed description is given with reference to the accompanying drawings in conjunction with the embodiments.
The key concept of the invention is as follows: for the fusion area of the output image, the source image gradient of the source image and the target gradient of the target image are mixed according to a weight; the non-fusion area of the output image is kept consistent with the non-fusion area of the target image.
Referring to FIG. 1, a graph fusion method based on a linear equation includes the following steps:
S1, determining a fusion area of the source image and a fusion area of the target image, and respectively calculating a source image gradient bs and a target gradient bt;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image.
S2, calculating a mixed gradient b according to the source image gradient bs and the target gradient bt;
S3, determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix.
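One step is worth making explicit (this reading is an interpretation consistent with the formulas above, not wording taken from the patent): the quotient o = b/A denotes solving the sparse linear system A·o = b over the fusion-area pixels, where A encodes the same five-point stencil used to define bs and bt, and neighbours that fall outside the fusion area are fixed to the target image and moved to the right-hand side:

\[
A\,o = b, \qquad
(A\,o)_p = 4\,o_p - \sum_{q \in N(p) \cap \Omega} o_q, \qquad
b_p = \alpha\, bs_p + (1-\alpha)\, bt_p + \sum_{q \in N(p) \setminus \Omega} y_q ,
\]

where \Omega is the fusion area, N(p) is the four-neighbourhood of pixel p, \alpha is the mixing weight introduced below, and y is the target image.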
From the above description, the beneficial effects of the present invention are that the fusion effect is further optimized and the method is highly feasible.
Further, in step S2, the mixed gradient b is calculated according to the formula b = alpha × bs + (1 - alpha) × bt, where alpha is the mixing weight of the source image gradient and the target gradient.
As can be seen from the above description, images with different fusion effects can be output according to different weight values.
Further, in step S3, the non-fusion area image o' of the output image is consistent with the non-fusion area of the target image.
The invention also relates to a graph fusion system based on a linear equation, comprising:
a fusion area gradient calculation module, used for determining a fusion area of a source image and a fusion area of a target image and calculating a source image gradient bs and a target gradient bt, respectively;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image.
a mixed gradient calculation module, used for calculating a mixed gradient b according to the source image gradient bs and the target gradient bt; and
an image output module, used for determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix.
Further, the mixed gradient b is calculated according to the formula b = alpha × bs + (1 - alpha) × bt, where alpha is the mixing weight of the source image gradient and the target gradient.
Further, the image output module is also used for determining the non-fusion area image o' of the output image, which is consistent with the non-fusion area of the target image.
Embodiment 1
Referring to FIG. 1, a first embodiment of the present invention is a graph fusion method based on a linear equation, comprising the following steps:
S1: dividing a source image and a target image each into a fusion area and a non-fusion area as required, and calculating a source image gradient bs of the source-image fusion area and a target gradient bt of the target-image fusion area, respectively;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where 4 is the intrinsic coefficient of the equation, x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where 4 is the intrinsic coefficient of the equation, y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image.
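For concreteness, the gradient calculation of step S1 can be sketched in Python/NumPy as follows. This is an illustrative sketch rather than code from the patent: x and y are assumed to be 2-D float arrays of equal size, and rows/cols are assumed to index fusion-area pixels lying strictly inside the image border (for example obtained with numpy.nonzero on a fusion-area mask).

import numpy as np

def five_point_gradient(img, rows, cols):
    # 4*I(r,c) - I(r+1,c) - I(r-1,c) - I(r,c+1) - I(r,c-1) at each fusion-area pixel
    return (4.0 * img[rows, cols]
            - img[rows + 1, cols] - img[rows - 1, cols]
            - img[rows, cols + 1] - img[rows, cols - 1])

# bs = five_point_gradient(x, rows, cols)   # source image gradient
# bt = five_point_gradient(y, rows, cols)   # target gradient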
S2: calculating a mixed gradient b according to the source image gradient bs and the target gradient bt, where the mixed gradient b is the gradient of the fusion area of the final output image. The mixed gradient b is calculated according to the formula b = alpha × bs + (1 - alpha) × bt, where alpha, which takes a value between 0 and 1, is the mixing weight of the source image gradient and the target gradient; the default value of alpha is 0.5, and the specific weight can be chosen according to the desired fusion effect.
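Continuing the sketch above (illustrative; alpha is assumed to be a scalar in [0, 1]):

alpha = 0.5                          # default mixing weight; adjust for the desired fusion effect
b = alpha * bs + (1.0 - alpha) * bt  # mixed gradient of the output fusion area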
S3: determining the fusion area image o of the output image according to the linear equation o = b/A, where b is the mixed gradient and A is a sparse coefficient matrix.
The non-fusion area image o' of the output image is consistent with the non-fusion area of the target image; that is, the non-fusion area of the target image can be output directly as the non-fusion area of the output image.
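Step S3 and the handling of the non-fusion area can be sketched as follows, continuing the same illustrative example. Building the sparse coefficient matrix A as a five-point Laplacian, with target-image values at the fusion-area boundary moved to the right-hand side, is an assumption consistent with the formulas above rather than an implementation taken from the patent.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_fusion(y, rows, cols, b):
    # Map each fusion-area pixel to an unknown index; -1 marks non-fusion pixels.
    index = -np.ones(y.shape, dtype=int)
    index[rows, cols] = np.arange(len(rows))

    A = sp.lil_matrix((len(rows), len(rows)))
    rhs = np.asarray(b, dtype=float).copy()
    for k, (r, c) in enumerate(zip(rows, cols)):
        A[k, k] = 4.0
        for rr, cc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            j = index[rr, cc]
            if j >= 0:
                A[k, j] = -1.0        # neighbour is another unknown inside the fusion area
            else:
                rhs[k] += y[rr, cc]   # neighbour is known from the target image

    o = spla.spsolve(A.tocsr(), rhs)  # o = b/A, i.e. solve A*o = b

    out = np.asarray(y, dtype=float).copy()   # non-fusion area copied from the target image
    out[rows, cols] = o
    return out

# fused = solve_fusion(y, rows, cols, b)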
Alternatively, the fusion may be performed on each channel of the image, i.e., the above steps S1-S3 are performed on each color channel, with the same sparse coefficient matrix A used for all channels; this can further improve the fusion effect. In this case, the mixed gradient b needs to be calculated separately for each color channel.
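A per-channel variant can be sketched as follows, reusing the helper functions from the sketches above (illustrative; x_rgb, y_rgb, mask, and alpha are assumed inputs, with the mask lying strictly inside the image border). Since A depends only on the fusion-area geometry, a real implementation could factorize it once (for example with scipy.sparse.linalg.factorized) and reuse the factorization for every channel; the simple sketch below just calls the same solver per channel.

rows, cols = np.nonzero(mask)                    # fusion-area coordinates, shared by all channels
out_rgb = np.asarray(y_rgb, dtype=float).copy()  # non-fusion area stays equal to the target
for ch in range(x_rgb.shape[2]):                 # run steps S1-S3 on each colour channel
    bs = five_point_gradient(x_rgb[:, :, ch], rows, cols)
    bt = five_point_gradient(y_rgb[:, :, ch], rows, cols)
    b = alpha * bs + (1.0 - alpha) * bt
    out_rgb[:, :, ch] = solve_fusion(y_rgb[:, :, ch], rows, cols, b)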
As shown in FIG. 2, a graph fusion system based on a linear equation corresponding to the above method comprises a fusion area gradient calculation module, a mixed gradient calculation module, and an image output module.
The fusion area gradient calculation module is used for determining the fusion area of the source image and the fusion area of the target image, calculating the source image gradient bs and the target gradient bt, respectively, and sending the calculated source image gradient bs and target gradient bt to the mixed gradient calculation module;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where 4 is the intrinsic coefficient of the equation, x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where 4 is the intrinsic coefficient of the equation, y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image.
The mixed gradient calculation module is used for calculating the mixed gradient b according to the source image gradient bs and the target gradient bt and sending the mixed gradient b to the image output module. The mixed gradient b is calculated according to the formula b = alpha × bs + (1 - alpha) × bt, where alpha, which takes a value between 0 and 1, is the mixing weight of the source image gradient and the target gradient; the default value of alpha is 0.5, and the specific weight can be chosen according to the desired fusion effect.
The image output module is used for determining the fusion area image o of the output image according to the linear equation o = b/A, where A is the sparse coefficient matrix.
The image output module is also used for determining the non-fusion area image o' of the output image, which is consistent with the non-fusion area of the target image.
In summary, the graph fusion method and system based on a linear equation provided by the invention further optimize the fusion effect through the mixing of gradient maps, and add a high-quality, highly feasible graph fusion method to image processing technology.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all equivalent changes made by using the contents of the present specification and the drawings, or applied directly or indirectly to the related technical fields, are included in the scope of the present invention.

Claims (4)

1. A graph fusion method based on a linear equation, characterized in that the method comprises the following steps:
S1, determining a fusion area of the source image and a fusion area of the target image, and calculating a source image gradient bs and a target gradient bt, respectively;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image;
S2, calculating a mixed gradient b according to the source image gradient bs and the target gradient bt;
S3, determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix;
In step S2, the mixed gradient b is calculated according to the formula b = alpha × bs + (1 - alpha) × bt, where alpha is the mixing weight of the source image gradient and the target gradient.
2. The graph fusion method based on a linear equation according to claim 1, wherein in step S3, the non-fusion area image o' of the output image is consistent with the non-fusion area of the target image.
3. A graph fusion system based on a linear equation, characterized in that the system comprises:
a fusion area gradient calculation module, used for determining a fusion area of a source image and a fusion area of a target image and calculating a source image gradient bs and a target gradient bt, respectively;
The source image gradient bs is calculated according to the formula bs = 4 × x(row, col) - x(row+1, col) - x(row-1, col) - x(row, col+1) - x(row, col-1), where x is the source image, row is the row of the fusion area of the source image, and col is the column of the fusion area of the source image;
The target gradient bt is calculated according to the formula bt = 4 × y(row, col) - y(row+1, col) - y(row-1, col) - y(row, col+1) - y(row, col-1), where y is the target image, row is the row of the fusion area of the target image, and col is the column of the fusion area of the target image;
a mixed gradient calculation module, used for calculating a mixed gradient b according to the source image gradient bs and the target gradient bt; and
an image output module, used for determining the fusion area image o of the output image according to the linear equation o = b/A, wherein A is a sparse coefficient matrix;
The mixed gradient b is calculated according to the formula b = alpha × bs + (1 - alpha) × bt, where alpha is the mixing weight of the source image gradient and the target gradient.
4. The graph fusion system based on a linear equation according to claim 3, wherein the image output module is further used for determining the non-fusion area image o' of the output image, which is consistent with the non-fusion area of the target image.
CN201510731351.4A 2015-11-02 2015-11-02 Graph fusion method and system based on linear equation Active CN106651749B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510731351.4A CN106651749B (en) 2015-11-02 2015-11-02 Graph fusion method and system based on linear equation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510731351.4A CN106651749B (en) 2015-11-02 2015-11-02 Graph fusion method and system based on linear equation

Publications (2)

Publication Number Publication Date
CN106651749A CN106651749A (en) 2017-05-10
CN106651749B true CN106651749B (en) 2019-12-13

Family

ID=58809763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510731351.4A Active CN106651749B (en) 2015-11-02 2015-11-02 Graph fusion method and system based on linear equation

Country Status (1)

Country Link
CN (1) CN106651749B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101551904A (en) * 2009-05-19 2009-10-07 清华大学 Image synthesis method and apparatus based on mixed gradient field and mixed boundary condition
CN102393958A (en) * 2011-07-16 2012-03-28 西安电子科技大学 Multi-focus image fusion method based on compressive sensing
CN103593833A (en) * 2013-10-25 2014-02-19 西安电子科技大学 Multi-focus image fusion method based on compressed sensing and energy rule
CN104504670A (en) * 2014-12-11 2015-04-08 上海理工大学 Multi-scale gradient domain image fusion algorithm

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1748389A1 (en) * 2005-07-28 2007-01-31 Microsoft Corporation Image blending


Also Published As

Publication number Publication date
CN106651749A (en) 2017-05-10

Similar Documents

Publication Publication Date Title
CN101593349B (en) Method for converting two-dimensional image into three-dimensional image
CN110288614B (en) Image processing method, device, equipment and storage medium
US9898837B2 (en) Image processing system
RU2017101484A (en) METHOD AND DEVICE FOR FORMING A THREE-DIMENSIONAL IMAGE
KR102351725B1 (en) Color gamut mapping method and color gamut mapping device
CN109643462B (en) Real-time image processing method based on rendering engine and display device
CN103688287A (en) Method of adapting a source image content to a target display
CN103646378A (en) High reduction degree spatial domain image zooming method based on FPGA platform
CN109934793A (en) A kind of Real-time image fusion method based on Integer DCT Transform
CN103198486A (en) Depth image enhancement method based on anisotropic diffusion
CN111787240B (en) Video generation method, apparatus and computer readable storage medium
CN104639834A (en) Method and system for transmitting camera image data
CN106651749B (en) Graph fusion method and system based on linear equation
US11494934B2 (en) Image processing device, image processing method, and monitoring system
KR101451236B1 (en) Method for converting three dimensional image and apparatus thereof
US8750648B2 (en) Image processing apparatus and method of processing image
Li Image super-resolution algorithm based on RRDB model
JP6866181B2 (en) An image processing device, a control method thereof, a display device including the image processing device, a program, and a storage medium.
CN103888752A (en) Image conversion method and image conversion device from two-dimensional image to three-dimensional image
KR101382227B1 (en) Method for classifying input image into window image and method and electronic device for converting window image into 3d image
KR101803065B1 (en) Method and apparatus for processing image
JP6283297B2 (en) Method, apparatus and system for resizing and restoring original depth frame
US20130038630A1 (en) Image drawing device and image drawing method
US8624909B2 (en) Image processing system and method thereof
KR102524223B1 (en) Data Processing Apparatus and Method for Infrared Thermography

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant