CN104933671B - Image color fusion method - Google Patents

Image color fusion method

Info

Publication number
CN104933671B
Authority
CN
China
Prior art keywords
overlap
pixel
image
overlapping region
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510271603.XA
Other languages
Chinese (zh)
Other versions
CN104933671A (en)
Inventor
李永
吴岳辛
余杭
乔伟
荆晶
金宏斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications
Priority to CN201510271603.XA
Publication of CN104933671A
Application granted
Publication of CN104933671B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/14 Transformations for image registration, e.g. adjusting or mapping for alignment of images

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides an image color fusion method, including: obtaining two images A and B; obtaining the overlapping regions overlap_a and overlap_b of images A and B; dividing each of overlap_a and overlap_b into at least two equal-sized regions; calculating the pixel average of each region in overlap_a and overlap_b; pairing the pixel averages of corresponding regions in overlap_a and overlap_b one to one to establish a mapping lookup table; and converting all pixels of either one of images A and B through the mapping lookup table to obtain a recolored image A' or B'. The image color fusion method provided by the invention performs color fusion over the entire color space and thus effectively reduces the color difference between images.

Description

Image color fusion method
Technical field
The present invention relates to image processing technology, and more particularly to an image color fusion method.
Background technology
Image stitching refers to the technique of combining several images that have overlapping parts into one large, seamless, high-resolution panoramic image; it expands the resolution of an image while compressing redundancy. The images to be stitched may be acquired at different times, from different viewing angles, or by different sensors. As a result, parameters such as exposure and contrast often differ between the images, so their colors differ, and visible stitching seams appear in the overlapping parts when the images are combined into a panorama. The colors of the images therefore need to be adjusted to reduce the color difference between them. Color fusion recolors the images in a certain color space. A color space expresses color in a generally accepted way under some standard, for example the RGB color space or the YUV color space. The RGB color space is the most basic, most commonly used, hardware-oriented color space in image processing and consists of the three components R (red), G (green) and B (blue). The YUV color space describes color by luminance and chrominance, where Y represents luminance and U and V represent chrominance.
At present, image color fusion mainly uses the method of exposure compensation (gain compensation). In this method, the two images are first converted from the RGB color space to the YUV color space, an error function is then defined to balance the brightness between the images, and the two images are fused by means of this error function. The error function sums the errors of the normalized pixel values of all points in the overlapping region.
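Purely as an illustration of this prior-art idea, such a brightness-balancing error term might be sketched as follows. The gains gain_a and gain_b, the normalization by 255 and the squared-error form are assumptions of this sketch; the exact prior-art formulation may differ.

```python
import numpy as np

def exposure_error(gain_a, gain_b, y_a, y_b):
    # y_a, y_b: luminance (Y) channels of the two overlapping regions,
    # assumed to be 8-bit arrays of identical shape.
    na = gain_a * (y_a.astype(np.float64) / 255.0)   # normalised, gain-scaled A
    nb = gain_b * (y_b.astype(np.float64) / 255.0)   # normalised, gain-scaled B
    return float(np.sum((na - nb) ** 2))             # summed error over the overlap
```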
However, the method for exposure compensating only considered the brightness in color of image space, entire color space is not accounted for, is led The image syncretizing effect for causing color distortion slightly larger is bad.
Summary of the invention
The present invention provides an image color fusion method that performs color fusion over the entire color space and thus effectively reduces the color difference between images.
The image color fusion method provided by the invention includes:
obtaining two images A and B;
obtaining the overlapping regions overlap_a and overlap_b of images A and B;
dividing each of the overlapping regions overlap_a and overlap_b of images A and B into at least two equal-sized regions;
calculating the pixel average of each region in the overlapping regions overlap_a and overlap_b;
pairing the pixel averages of corresponding regions in overlap_a and overlap_b one to one, and establishing a mapping lookup table;
converting all pixels of either one of images A and B through the mapping lookup table to obtain a recolored image A' or B'.
In an embodiment of the present invention, dividing the overlapping regions overlap_a and overlap_b of images A and B into at least two equal-sized regions includes:
each equal-sized region being a single pixel;
calculating the pixel average of each region in overlap_a and overlap_b then includes:
calculating the pixel value of each pixel in overlap_a and overlap_b;
and pairing the pixel averages of corresponding regions in overlap_a and overlap_b one to one to establish the mapping lookup table includes:
pairing the pixel values of corresponding pixels in overlap_a and overlap_b one to one to establish the mapping lookup table.
In an embodiment of the present invention, pairing the pixel values of corresponding pixels in overlap_a and overlap_b one to one to establish the mapping lookup table includes:
if a pixel value in overlap_a corresponds to at least two pixel values in overlap_b, sorting the corresponding pixel values of overlap_b by size and taking the middle value as the value to which the overlap_a pixel value is mapped; the overlap_a pixel value and the middle value taken from overlap_b are paired one to one to establish the mapping lookup table.
In an embodiment of the present invention, pairing the pixel values of corresponding pixels in overlap_a and overlap_b one to one to establish the mapping lookup table further includes:
if, for a pixel value in overlap_a, the number of corresponding pixels in overlap_b is less than or equal to 500, or the corresponding pixel value in overlap_b cannot be divided exactly by 15, performing linear interpolation on the corresponding pixel values of overlap_b and taking the interpolated value as the value to which the overlap_a pixel value is mapped; the overlap_a pixel values and the values obtained by linear interpolation are paired one to one to establish the mapping lookup table.
In an embodiment of the present invention, obtaining the overlapping regions overlap_a and overlap_b of images A and B includes:
finding the overlapping region of images A and B by image registration;
adjusting the overlapping regions of images A and B to the same size;
obtaining the resized overlapping regions overlap_a and overlap_b of images A and B.
In an embodiment of the present invention, after the two images A and B are obtained and before the overlapping regions overlap_a and overlap_b of images A and B are obtained, the method further includes:
pre-processing images A and B to remove interference signals from images A and B.
In the image color fusion method provided by the invention, the overlapping regions of the two images to be fused are divided into several equal-sized small regions, the pixel average of each small region is calculated, the pixel averages of corresponding small regions are paired one to one, and a mapping lookup table is established. By looking up the mapping table, either one of the two images to be fused can be color-converted to obtain a recolored image, so that the two images are fused over the entire color space. This effectively reduces the color difference between images, achieves a good fusion result even for two images with a large color difference, and thereby improves the quality of image stitching.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the image color fusion method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of the image color fusion method provided by Embodiment 2 of the present invention;
Fig. 3 is a flowchart of the image color fusion method provided by Embodiment 3 of the present invention;
Fig. 4 is a flowchart of the image color fusion method provided by Embodiment 4 of the present invention.
Specific embodiment
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Fig. 1 is a flowchart of the image color fusion method provided by Embodiment 1 of the present invention. As shown in Fig. 1, the image color fusion method provided by this embodiment includes:
S101: Obtain two images A and B.
Specifically, two images A and B are selected from the images with overlapping parts that were acquired at different times, from different viewing angles, or by different sensors.
S102: Obtain the overlapping regions overlap_a and overlap_b of images A and B.
Specifically, the overlapping regions overlap_a and overlap_b of images A and B are obtained by image registration. Image registration refers to using a certain matching strategy to find the position in a reference image that corresponds to a template or to feature points of the image to be stitched.
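The patent does not prescribe a particular registration algorithm. Purely as an illustration, a feature-based registration step could be sketched with OpenCV as follows; ORB features, brute-force matching and a RANSAC homography are assumptions of this sketch, not requirements of the method.

```python
import cv2
import numpy as np

def register(img_a, img_b, min_matches=10):
    # Estimate a homography that maps image A onto image B from ORB feature matches.
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(gray_a, None)
    kp_b, des_b = orb.detectAndCompute(gray_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise ValueError("not enough feature matches to register the images")
    src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return homography  # the overlapping regions of A and B can then be read off this mapping
```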
S103: Divide the overlapping regions overlap_a and overlap_b of images A and B into at least two equal-sized regions.
Specifically, the overlapping region overlap_a of image A is divided into several equal-sized small regions, and the overlapping region overlap_b of image B is divided into equal-sized small regions that correspond to the regions of overlap_a. For example, if each region of overlap_a contains four pixels, labelled Aa, Ab, Ac and Ad, then each region of overlap_b contains the four corresponding pixels, labelled Ba, Bb, Bc and Bd.
It should be noted that, when the overlapping regions overlap_a and overlap_b of images A and B are divided into equal-sized regions, the regions may not divide the overlap exactly. In that case, overlap_a and overlap_b are partitioned from left to right and from top to bottom into equal-sized regions, and whatever remains that is not large enough to form a full region is treated as one additional region.
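A minimal sketch of this partitioning step for one channel of an overlapping region; the helper name split_into_regions and the handling of leftover rows and columns as smaller edge tiles are illustrative assumptions.

```python
import numpy as np

def split_into_regions(channel, size):
    # channel: one colour channel of an overlapping region, as an H x W array.
    # Tiles of size x size pixels are taken left to right, top to bottom;
    # leftover rows/columns that cannot fill a whole tile form smaller regions.
    h, w = channel.shape
    regions = []
    for row in range(0, h, size):
        for col in range(0, w, size):
            regions.append(channel[row:row + size, col:col + size])
    return regions
```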
S104: Calculate the pixel average of each region in the overlapping regions overlap_a and overlap_b.
The pixel average of the pixels in each region is calculated with the ordinary averaging formula. Specifically, the pixel average of each region of overlap_a can be obtained by the following formula:
ā = (a1 + a2 + ... + an) / n    (1)
In formula (1), ā represents the pixel average of a region of overlap_a, a1, a2, ..., an represent the pixel values of the individual pixels in that region, and n represents the number of pixels in the region. For example, if each region of overlap_a contains the four pixels Aa, Ab, Ac and Ad, and their pixel values are 100, 110, 108 and 118 respectively, then the pixel average of that region of overlap_a is (100 + 110 + 108 + 118) / 4 = 109.
It should be noted that each pixel value lies between 0 and 255 (inclusive). If the calculated average is not an integer, it is rounded to the nearest integer. For example, if the calculated average is 109.3, the value 109 is used; if the calculated average is 109.7, the value 110 is used.
Similarly, the pixel average of each region of overlap_b is computed with the same formula and in the same way as for overlap_a, which is not repeated here. For example, if each region of overlap_b contains the four pixels Ba, Bb, Bc and Bd corresponding to the pixels Aa, Ab, Ac and Ad, and their pixel values are 90, 118, 120 and 116 respectively, then the pixel average of that region of overlap_b is (90 + 118 + 120 + 116) / 4 = 111.
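Continuing the partitioning sketch above, the per-region averages of formula (1), rounded as described, could be computed as follows. The round-half-up trick matches the 109.3 and 109.7 examples; this is an illustrative sketch, not code from the patent.

```python
import numpy as np

def region_means(regions):
    # Formula (1) applied to every region, e.g. (100 + 110 + 108 + 118) / 4 = 109.
    # Adding 0.5 and truncating rounds these non-negative means to the nearest integer.
    return [int(np.mean(region) + 0.5) for region in regions]
```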
S105: Pair the pixel averages of corresponding regions in the overlapping regions overlap_a and overlap_b one to one, and establish a mapping lookup table.
Specifically, the pixel average calculated for each region of overlap_a is paired with the pixel average calculated for the corresponding region of overlap_b, and the set of these one-to-one correspondences is stored as a pixel-value mapping lookup table. That is, the mapping lookup table records the correspondence between the pixel values of all pixels of the overlapping regions overlap_a and overlap_b of images A and B. For example, if the pixel average of the region of overlap_a containing the four pixels Aa, Ab, Ac and Ad is 109, the pixel average of the corresponding region of overlap_b containing Ba, Bb, Bc and Bd is 111, and Ba, Bb, Bc and Bd are the pixels corresponding to Aa, Ab, Ac and Ad, then the average 109 of overlap_a is mapped to the average 111 of overlap_b, recorded as 109:111, and the correspondence 109:111 is entered into the mapping lookup table.
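The lookup table itself can be sketched as a simple dictionary from overlap_a averages to overlap_b averages, assuming the two region lists are in corresponding order; handling of repeated overlap_a values is deferred to Embodiment 3 below.

```python
def build_lookup_table(means_a, means_b):
    # Pair the i-th region average of overlap_a with that of overlap_b,
    # e.g. 109 -> 111 from the worked example above.
    table = {}
    for mean_a, mean_b in zip(means_a, means_b):
        table[mean_a] = mean_b   # later duplicates overwrite earlier ones in this sketch
    return table
```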
It should be noted that S101 to S104 are carried out in the RGB color space, while S105 is carried out in the YUV color space. The components of the RGB color space are closely tied to luminance: whenever the luminance changes, all three components R, G and B change with it. RGB is therefore well suited to display systems but less suited to image processing. The YUV color space describes color by luminance and chrominance and separates the luminance signal Y from the chrominance signals U and V, which is why the YUV color space is preferable to other color spaces during color fusion. The conversion from the RGB color space to the YUV color space in this embodiment is the same as in the prior art and is not described here.
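The patent relies on the prior-art RGB-to-YUV conversion without fixing a particular matrix. For reference, one common definition (BT.601 luma with the analog U and V scale factors) is sketched below; the exact coefficients are an assumption of this sketch.

```python
import numpy as np

def rgb_to_yuv(rgb):
    # rgb: H x W x 3 array with float components in [0, 255].
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue-difference chrominance
    v = 0.877 * (r - y)                     # red-difference chrominance
    return np.stack([y, u, v], axis=-1)
```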
S106: Convert all pixels of either one of images A and B through the mapping lookup table to obtain a recolored image A' or B'.
Specifically, with the established mapping lookup table, all pixels of either one of images A and B can be converted. That is, to stitch image A onto image B, all pixels of image A are converted to obtain a recolored image A', and the recolored image A' is stitched onto image B, fusing images A and B. Alternatively, to stitch image B onto image A, all pixels of image B are converted to obtain a recolored image B', and the recolored image B' is stitched onto image A, fusing images B and A. For example, suppose the mapping lookup table contains the four mappings 90:110, 100:106, 110:112 and 112:118. To stitch image A onto image B, the pixel values 90, 100, 110 and 112 of the pixels of image A are converted into 110, 106, 112 and 118 respectively, yielding the converted image A', in which those pixels now have the values 110, 106, 112 and 118; the recolored image A' is then stitched onto image B. Stitching image B onto image A is analogous and is not repeated here.
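A compact way to apply the finished table to one 8-bit channel of image A is a 256-entry lookup array; values that never appear in the table are left unchanged in this sketch (Embodiment 4 below describes filling such gaps by interpolation). Pixel values in the range 0 to 255 are assumed.

```python
import numpy as np

def recolor_channel(channel_a, table):
    lut = np.arange(256, dtype=np.uint8)   # identity mapping for values not in the table
    for value_a, value_b in table.items():
        lut[value_a] = value_b             # e.g. 90 -> 110, 100 -> 106, 110 -> 112, 112 -> 118
    return lut[channel_a]                  # convert every pixel of the channel at once
```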
In the image color fusion method provided by this embodiment, the overlapping regions of the two images to be fused are divided into several equal-sized small regions, the pixel average of each small region is calculated, the pixel averages of corresponding small regions are paired one to one, and a mapping lookup table is established. By looking up the mapping table, either one of the two images to be fused can be color-converted to obtain a recolored image, so that the two images are fused over the entire color space. This effectively reduces the color difference between the images, achieves a good fusion result even for two images with a large color difference, and thereby improves the quality of image stitching.
Fig. 2 is a flowchart of the image color fusion method provided by Embodiment 2 of the present invention. When the overlapping regions overlap_a and overlap_b of images A and B are divided into at least two equal-sized regions, there is a special case in which each of the equal-sized regions is a single pixel; the image color fusion method of this embodiment can be used in that case. As shown in Fig. 2, the image color fusion method provided by this embodiment includes:
S201: Obtain two images A and B.
S202: Obtain the overlapping regions overlap_a and overlap_b of images A and B.
S203: Calculate the pixel value of each pixel in the overlapping regions overlap_a and overlap_b.
Specifically, dividing the overlapping regions overlap_a and overlap_b of images A and B into at least two equal-sized regions here means that each equal-sized region is a single pixel.
S204: Pair the pixel values of corresponding pixels in the overlapping regions overlap_a and overlap_b one to one, and establish a mapping lookup table.
Specifically, each pixel of overlap_a is obtained and its pixel value is calculated; each pixel of overlap_b is obtained and its pixel value is calculated; for each pixel of overlap_a, the corresponding pixel of overlap_b is obtained, and the pixel value of the overlap_a pixel is paired with the pixel value of the corresponding overlap_b pixel to establish the mapping lookup table. For example, suppose a pixel a of overlap_a has pixel value 100 and the pixel b of overlap_b corresponding to pixel a has pixel value 110. The pixel value 100 of pixel a is paired with the pixel value 110 of pixel b, recorded as 100:110, and the correspondence 100:110 is entered into the mapping lookup table.
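Embodiment 2 is simply the special case in which every region is one pixel, so the table pairs co-located pixel values of the two overlaps directly. A short sketch, assuming overlap_a and overlap_b are numpy arrays that have already been adjusted to the same size:

```python
def build_pixel_lookup_table(overlap_a_channel, overlap_b_channel):
    table = {}
    for value_a, value_b in zip(overlap_a_channel.ravel(), overlap_b_channel.ravel()):
        table[int(value_a)] = int(value_b)   # e.g. pixel a = 100 maps to pixel b = 110
    return table
```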
S205: Convert all pixels of either one of images A and B through the mapping lookup table to obtain a recolored image A' or B'.
In the image color fusion method provided by this embodiment, the pixel values of corresponding pixels in the overlapping regions of the two images to be fused are paired one to one, and a mapping lookup table is established. By looking up the mapping table, either one of the two images to be fused can be color-converted to obtain a recolored image, so that the two images are fused over the entire color space. This effectively reduces the color difference between the images, achieves a good fusion result even for two images with a large color difference, and thereby improves the quality of image stitching.
Fig. 3 is a flowchart of the image color fusion method provided by Embodiment 3 of the present invention. When the pixel values of corresponding pixels in the overlapping regions overlap_a and overlap_b are paired one to one, there is a special case in which a pixel value in overlap_a corresponds to at least two pixel values in overlap_b; the image color fusion method of this embodiment is used in that case. As shown in Fig. 3, on the basis of the embodiments shown in Fig. 1 and Fig. 2, the method further includes:
S301: A pixel value in the overlapping region overlap_a corresponds to at least two pixel values in overlap_b.
Specifically, a pixel value in overlap_a may correspond to several pixel values in overlap_b. For example, the pixels a of overlap_a whose pixel value is 100 may correspond to pixels b of overlap_b whose pixel values are 90, 110 and 120.
S302: Sort the at least two corresponding pixel values of overlap_b by size, and take the middle value as the value to which the overlap_a pixel value is mapped.
S303: Pair the overlap_a pixel value with the middle value taken from the corresponding overlap_b pixel values one to one, and establish the mapping lookup table.
Specifically, if the number of corresponding pixel values in overlap_b is odd, the values are sorted by size and the middle value is taken directly as the value to which the overlap_a pixel value is mapped. If the number of corresponding pixel values in overlap_b is even, the values are sorted by size and the average of the two middle values is taken as the value to which the overlap_a pixel value is mapped. For example, if the pixels of overlap_a with pixel value 100 correspond to pixels of overlap_b with pixel values 90, 110 and 120, then 110 is taken as the mapped value, i.e. the pixel value 100 of overlap_a is paired with the pixel value 110 of overlap_b to establish the mapping lookup table. If the pixels of overlap_a with pixel value 100 correspond to pixels of overlap_b with pixel values 90, 100, 110 and 120, then (100 + 110) / 2 = 105 is taken as the mapped value, i.e. the pixel value 100 of overlap_a is paired with the value 105 to establish the mapping lookup table.
It should be noted that the at least two corresponding pixel values of overlap_b may be sorted either from large to small or from small to large; this embodiment does not limit the order.
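A sketch of this median rule, applied when the raw pixel pairs contain repeated overlap_a values. The function name and the round-half-up average for even counts are assumptions consistent with the (100 + 110) / 2 = 105 example.

```python
from collections import defaultdict

def resolve_duplicates(pixel_pairs):
    # pixel_pairs: iterable of (value_in_overlap_a, value_in_overlap_b) tuples.
    groups = defaultdict(list)
    for value_a, value_b in pixel_pairs:
        groups[value_a].append(value_b)
    table = {}
    for value_a, values_b in groups.items():
        values_b.sort()                      # sorted ascending here; the patent allows either order
        n = len(values_b)
        if n % 2 == 1:                       # odd count: take the middle value, e.g. 110
            table[value_a] = values_b[n // 2]
        else:                                # even count: average the two middle values, e.g. 105
            table[value_a] = int((values_b[n // 2 - 1] + values_b[n // 2]) / 2 + 0.5)
    return table
```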
In the image color fusion method provided by this embodiment, on the basis of the above embodiments, when a pixel value in the overlapping region overlap_a corresponds to at least two pixel values in overlap_b, the corresponding pixel values of overlap_b are sorted by size and the middle value is taken as the value to which the overlap_a pixel value is mapped; the overlap_a pixel value and the middle value are paired one to one to establish the mapping lookup table. This optimizes the construction of the mapping lookup table, reduces error, and achieves a good fusion result for two images with a large color difference, making the image color fusion method more widely applicable.
Fig. 4 is a flowchart of the image color fusion method provided by Embodiment 4 of the present invention. When the pixel values of corresponding pixels in the overlapping regions overlap_a and overlap_b are paired one to one, there is another special case in which, for a pixel value in overlap_a, the number of corresponding pixels in overlap_b is less than or equal to 500, or the corresponding pixel value in overlap_b cannot be divided exactly by 15; the image color fusion method of this embodiment can be used in that case. As shown in Fig. 4, on the basis of the embodiments shown in Fig. 1 to Fig. 3, the method further includes:
S401: For a pixel value in the overlapping region overlap_a, the number of corresponding pixels in overlap_b is less than or equal to 500, or the corresponding pixel value cannot be divided exactly by 15.
Specifically, a pixel value in overlap_a may fail to satisfy the condition that the number of corresponding pixels in overlap_b is greater than 500, or may fail to satisfy the condition that the pixel value of the corresponding pixel in overlap_b can be divided exactly by 15. For example, the pixels a of overlap_a with pixel value 100 may have only 20 corresponding pixels in overlap_b, or the pixel of overlap_b corresponding to the overlap_a pixel with value 100 may have the pixel value 91.
It should be noted that 500 and 15 are merely a threshold and a reference value, respectively; this embodiment is not limited to 500 and 15.
S402: Perform linear interpolation on the corresponding pixel values of overlap_b, and take the interpolated value as the value to which the overlap_a pixel value is mapped.
S403: Pair the overlap_a pixel value with the value obtained by linear interpolation from the corresponding overlap_b pixel values one to one, and establish the mapping lookup table.
Specifically, if a pixel value in overlap_a does not satisfy the condition that the number of corresponding pixels in overlap_b is greater than 500 or that the corresponding pixel value can be divided exactly by 15, the mapped value for that overlap_a pixel value is obtained by linear interpolation. For example, suppose the pixel values of overlap_a strictly between 15 and 30 (not including 15 and 30) do not satisfy the condition, and the pixel values 15 and 30 of overlap_a correspond to the pixel values 30 and 60 of overlap_b respectively. Then the pixel values between 15 and 30 in overlap_a are mapped proportionally onto the pixel values between 30 and 60 in overlap_b. That is, pixel value 15 of overlap_a maps to pixel value 30 in overlap_b, 16 maps to 32, 17 maps to 34, 18 maps to 36, 19 maps to 38, and so on.
It should be noted that if the value obtained by linear interpolation is not an integer, it is rounded to the nearest integer and used as the interpolated pixel value. For example, if the interpolated value is 32.3, the value 32 is used; if the interpolated value is 32.7, the value 33 is used.
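A minimal sketch of this gap-filling step between two anchored table entries lo and hi, rounding half up as in the 32.3 and 32.7 examples; the function name and arguments are illustrative.

```python
def fill_gap_by_interpolation(table, lo, hi):
    # Anchors lo and hi must already be in the table, e.g. 15 -> 30 and 30 -> 60.
    y0, y1 = table[lo], table[hi]
    for x in range(lo + 1, hi):
        t = (x - lo) / (hi - lo)                  # position of x between the anchors
        table[x] = int(y0 + t * (y1 - y0) + 0.5)  # e.g. 16 -> 32, 17 -> 34, 18 -> 36
    return table
```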
In the image color fusion method provided by this embodiment, on the basis of the above embodiments, when the number of pixels in overlap_b corresponding to a pixel value in the overlapping region overlap_a is less than or equal to 500, or the corresponding pixel value cannot be divided exactly by 15, linear interpolation is performed on the corresponding pixel values of overlap_b, the interpolated value is taken as the value to which the overlap_a pixel value is mapped, and the overlap_a pixel values are paired one to one with the values obtained by linear interpolation to establish the mapping lookup table. This further optimizes the construction of the mapping lookup table, reduces error, and achieves a good fusion result for two images with a large color difference, making the image color fusion method more widely applicable.
In an embodiment of the present invention, obtaining the overlapping regions overlap_a and overlap_b of images A and B includes:
finding the overlapping region of images A and B by image registration;
adjusting the overlapping regions of images A and B to the same size;
obtaining the resized overlapping regions overlap_a and overlap_b of images A and B.
In an embodiment of the present invention, after the two images A and B are obtained and before the overlapping regions overlap_a and overlap_b of images A and B are obtained, the method further includes:
pre-processing images A and B to remove interference signals from images A and B.
Specifically, the principle and method of pre-processing images A and B to remove the interference signals in this embodiment are the same as in the prior art and are not repeated here.
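Since the pre-processing follows the prior art, only an illustrative placeholder is sketched here; a 3 x 3 median filter is one simple choice for suppressing impulsive interference, not something the patent prescribes.

```python
import cv2

def preprocess(img):
    return cv2.medianBlur(img, 3)   # small median filter to suppress impulse noise
```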
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (3)

1. An image color fusion method, characterized by comprising:
obtaining two images A and B;
obtaining the overlapping regions overlap_a and overlap_b of images A and B;
dividing each of the overlapping regions overlap_a and overlap_b of images A and B into at least two equal-sized regions;
calculating the pixel average of each region in the overlapping regions overlap_a and overlap_b;
pairing the pixel averages of corresponding regions in the overlapping regions overlap_a and overlap_b one to one, and establishing a mapping lookup table;
converting all pixels of either one of images A and B through the mapping lookup table to obtain a recolored image A' or B';
wherein dividing each of the overlapping regions overlap_a and overlap_b of images A and B into at least two equal-sized regions comprises:
each of said equal-sized regions being a single pixel;
calculating the pixel average of each region in the overlapping regions overlap_a and overlap_b comprises:
calculating the pixel value of each pixel in the overlapping regions overlap_a and overlap_b;
pairing the pixel averages of corresponding regions in the overlapping regions overlap_a and overlap_b one to one and establishing the mapping lookup table comprises:
pairing the pixel values of corresponding pixels in the overlapping regions overlap_a and overlap_b one to one, and establishing the mapping lookup table;
and pairing the pixel values of corresponding pixels in the overlapping regions overlap_a and overlap_b one to one and establishing the mapping lookup table comprises:
if, for a pixel value of a pixel in the overlapping region overlap_a, the number of corresponding pixels in overlap_b is less than or equal to 500, or the pixel value of the corresponding pixel in overlap_b cannot be divided exactly by 15, performing linear interpolation on the pixel values of the corresponding pixels in overlap_b, taking the value obtained by linear interpolation as the value to which the overlap_a pixel value is mapped, and pairing the pixel values of the pixels in overlap_a one to one with the values obtained by linear interpolation from the corresponding pixels in overlap_b, to establish the mapping lookup table.
2. The image color fusion method according to claim 1, characterized in that obtaining the overlapping regions overlap_a and overlap_b of images A and B comprises:
finding the overlapping region of images A and B by image registration;
adjusting the overlapping regions of images A and B to the same size;
obtaining the resized overlapping regions overlap_a and overlap_b of images A and B.
3. The image color fusion method according to claim 1 or 2, characterized in that, after the two images A and B are obtained and before the overlapping regions overlap_a and overlap_b of images A and B are obtained, the method further comprises:
pre-processing images A and B to remove interference signals from images A and B.
CN201510271603.XA 2015-05-25 2015-05-25 Image color fusion method Active CN104933671B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510271603.XA CN104933671B (en) 2015-05-25 2015-05-25 Image color fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510271603.XA CN104933671B (en) 2015-05-25 2015-05-25 Image color fusion method

Publications (2)

Publication Number Publication Date
CN104933671A CN104933671A (en) 2015-09-23
CN104933671B true CN104933671B (en) 2018-05-25

Family

ID=54120826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510271603.XA Active CN104933671B (en) 2015-05-25 2015-05-25 Image color fusion method

Country Status (1)

Country Link
CN (1) CN104933671B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107203986A (en) * 2017-05-26 2017-09-26 努比亚技术有限公司 Image fusion method, device and computer-readable storage medium
CN113096043B (en) * 2021-04-09 2023-02-17 杭州睿胜软件有限公司 Image processing method and device, electronic device and storage medium
CN113327193A (en) * 2021-05-27 2021-08-31 北京百度网讯科技有限公司 Image processing method, image processing apparatus, electronic device, and medium
CN113570537B (en) * 2021-09-26 2022-02-08 熵基科技股份有限公司 Security check image fusion method and device, storage medium and computer equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020938A (en) * 2012-12-14 2013-04-03 北京经纬恒润科技有限公司 Method and system for stitching spatial domain images based on weighted average method
CN103279939A (en) * 2013-04-27 2013-09-04 北京工业大学 Image stitching processing system
CN103729834A (en) * 2013-12-23 2014-04-16 西安华海盈泰医疗信息技术有限公司 Self-adaptation splicing method and system of X-ray images
CN103778599A (en) * 2012-10-23 2014-05-07 浙江大华技术股份有限公司 Image processing method and system thereof
CN103945087A (en) * 2013-01-22 2014-07-23 深圳市腾讯计算机***有限公司 Image tonal adjusting method and adjusting apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7260258B2 (en) * 2003-06-12 2007-08-21 Fuji Xerox Co., Ltd. Methods for multisource color normalization

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103778599A (en) * 2012-10-23 2014-05-07 浙江大华技术股份有限公司 Image processing method and system thereof
CN103020938A (en) * 2012-12-14 2013-04-03 北京经纬恒润科技有限公司 Method and system for stitching spatial domain images based on weighted average method
CN103945087A (en) * 2013-01-22 2014-07-23 深圳市腾讯计算机***有限公司 Image tonal adjusting method and adjusting apparatus
CN103279939A (en) * 2013-04-27 2013-09-04 北京工业大学 Image stitching processing system
CN103729834A (en) * 2013-12-23 2014-04-16 西安华海盈泰医疗信息技术有限公司 Self-adaptation splicing method and system of X-ray images

Also Published As

Publication number Publication date
CN104933671A (en) 2015-09-23

Similar Documents

Publication Publication Date Title
CN104933671B (en) Image color fusion method
US7474343B2 (en) Image processing apparatus having a digital image processing section including enhancement of edges in an image
CN104599636B (en) LED display bright chroma bearing calibration and bright chroma correction coefficient generating means
US8526719B2 (en) Method of converting color image into grayscale image and recording medium storing program for performing the same
US20060103615A1 (en) Color display
EP1801752A3 (en) Image processing circuit and image processing method for correcting color pixel defects
US20100166305A1 (en) Method for detecting and correcting chromatic aberration, and apparatus and method for processing image using the same
EP2493204B1 (en) Stereoscopic image registration and color balance evaluation display
CN106506900B Image processing apparatus for black-and-white conversion and image forming apparatus having the same
CN109300085B (en) Image stitching method and device, electronic equipment and storage medium
CN110784623A (en) Gamut mapping using luminance mapping also based on the luminance of cusp colors
US20090256928A1 (en) Method and device for detecting color temperature
TWI544785B (en) Image downsampling apparatus and method
JP4023518B2 (en) Color correction method and color correction apparatus
CN102946501B (en) Color distortion correction method and device in imaging system or image output system
KR20090097796A (en) Method for correcting chromatic aberration
TWI320912B (en)
CN103313066B (en) Interpolation method and device
JP4164944B2 (en) Color gamut compression apparatus and color gamut compression method
US20080316343A1 (en) Method and Apparatus For Allowing Access to Individual Memory
CN106791736B (en) Trapezoidal correction method and projector
JP2007324665A (en) Image correction apparatus and video display apparatus
CN103379253A (en) Color processing apparatus and method
KR100408508B1 (en) Method and apparatus for processing color, signal using color difference plane separation
CN105359517A (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant