CN103974043B - Image processor and image processing method - Google Patents


Publication number: CN103974043B
Application number: CN201310027214.3A
Authority: CN (China)
Prior art keywords: object pixel, color difference, edge, color information, image
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN103974043A
Inventors: 陈世泽, 黄文聪
Original and current assignee: Realtek Semiconductor Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Events: application filed by Realtek Semiconductor Corp; priority to CN201310027214.3A; publication of CN103974043A and CN103974043B; application granted; anticipated expiration
Classification: Image Processing (AREA)
Abstract

The invention discloses an image processor and an image processing method. The method includes: capturing image data from a frame buffer; for an object pixel in the image data, estimating four second color information values above, below, to the left of, and to the right of the object pixel according to a first color information value of the object pixel itself and the color information of neighboring pixels; calculating four color-difference gradient values above, below, to the left of, and to the right of the object pixel according to the four second color information values; determining the edge-texture feature of the object pixel according to the four color-difference gradient values; and determining the edge indicator of the object pixel according to the edge-texture feature, and determining by table lookup the weights to be used in the subsequent interpolation operation.

Description

Image processor and image processing method
Technical field
The present invention relates to an image processing method, and more particularly, to a method for performing color interpolation on image data captured by an image sensor, and to a related image processor.
Background technology
Consumer electronic products that perform image capture, such as digital cameras, video cameras, multimedia handsets, surveillance systems, and picture phones, often use a single image sensor (single sensor). By covering the single image sensor with a Bayer color filter array (Bayer CFA), red, green, and blue color information can be recorded, where the single sensor records only one color intensity value at each pixel location in order to reduce manufacturing cost. In addition, to restore the raw Bayer mosaic image (raw Bayer CFA image) captured by the single image sensor into a full-color image, color interpolation must be performed to recover the lost color data. Because a color interpolation algorithm involves analysis of image content structure and color, it has a quite critical impact on the quality of the final output image.
In IC design applications, the operation of color interpolation usually occupies considerable memory buffer space (line buffers), and its computational complexity is also high. Therefore, how to reduce these manufacturing costs while still avoiding distortions such as zipper effects, color aliasing, Moiré patterns, and false color at object boundaries and texture regions of the image has always been a research emphasis in the field. For example, Taiwan patent publication No. I274908 discloses pre-computing a minimum square error (MSE) to estimate the weights of neighboring same-color pixel values; another technique, Taiwan patent publication No. I296482, performs interpolation in multiple directions at the object pixel to detect image edges and produce multiple image edge gradient values, and then estimates the missing hue component of the object pixel with a normalized sum using a dynamic adjustment factor. However, the above techniques require larger memory buffer space and a considerable number of division operations, which raises manufacturing cost and makes hardware implementation relatively difficult, thereby reducing their applicability. In addition, because the above color interpolation methods consider too much directional information, they tend to produce blurring and reconstruction distortion in image regions with sharp edges and fine structures, thereby reducing image quality.
Summary of the invention
Therefore, an object of the present invention is to provide an image processing method that can produce high-quality images without requiring costly division operations or extra buffer memory in a hardware implementation, so as to solve the problems of the prior art.
According to one embodiment of the invention, an image processor includes an initial interpolation unit, a color-difference gradient estimation unit, an edge-texture feature determining unit, and an edge indicator recording unit. The initial interpolation unit captures image data from a frame buffer, where each pixel in the image data has only one kind of color information; for an object pixel in the image data, the initial interpolation unit estimates four second color information values above, below, to the left of, and to the right of the object pixel according to a first color information value of the object pixel itself and the color information of neighboring pixels, where the color corresponding to the first color information differs from the color corresponding to the four second color information values. The color-difference gradient estimation unit is coupled to the initial interpolation unit and calculates four color-difference gradient values above, below, to the left of, and to the right of the object pixel according to the four second color information values. The edge-texture feature determining unit is coupled to the color-difference gradient estimation unit and determines the edge-texture feature of the object pixel according to the four color-difference gradient values. The edge indicator recording unit is coupled to the edge-texture feature determining unit and determines, according to the edge-texture feature of the object pixel, whether to modify a bit value of the first color information of the object pixel stored in the frame buffer.
According to another embodiment of the invention, an image processing method includes: capturing image data from a frame buffer, where each pixel in the image data has only one kind of color information; for an object pixel in the image data, estimating four second color information values above, below, to the left of, and to the right of the object pixel according to a first color information value of the object pixel itself and the color information of neighboring pixels, where the color corresponding to the first color information differs from the color corresponding to the four second color information values; calculating four color-difference gradient values above, below, to the left of, and to the right of the object pixel according to the four second color information values; determining the edge-texture feature of the object pixel according to the four color-difference gradient values; and determining, according to the edge-texture feature of the object pixel, whether to modify a bit value of the first color information of the object pixel stored in the frame buffer.
According to another embodiment of the invention, an image processor includes an initial interpolation unit, a color-difference gradient estimation unit, an edge-texture feature determining unit, a dynamic weight quantization and distribution unit, and a weighted interpolation unit. The initial interpolation unit captures image data from a frame buffer, where each pixel in the image data has only one kind of color information; for an object pixel in the image data, the initial interpolation unit estimates four second color information values above, below, to the left of, and to the right of the object pixel according to a first color information value of the object pixel itself and the color information of neighboring pixels, where the color corresponding to the first color information differs from the color corresponding to the four second color information values. The color-difference gradient estimation unit is coupled to the initial interpolation unit and calculates four color-difference gradient values above, below, to the left of, and to the right of the object pixel according to the four second color information values. The edge-texture feature determining unit is coupled to the color-difference gradient estimation unit and determines the edge-texture feature of the object pixel according to the four color-difference gradient values. The dynamic weight quantization and distribution unit is coupled to the color-difference gradient estimation unit and the edge-texture feature determining unit, and determines multiple weights using a comparison table according to the four color-difference gradient values and the edge-texture feature of the object pixel. The weighted interpolation unit is coupled to the initial interpolation unit and the dynamic weight quantization and distribution unit, and uses the weights to perform a weighted addition of at least two of the four second color information values of the object pixel, to obtain a target second color information value of the object pixel.
According to another embodiment of the invention, an image processing method includes: capturing image data from a frame buffer, where each pixel in the image data has only one kind of color information; for an object pixel in the image data, estimating four second color information values above, below, to the left of, and to the right of the object pixel according to a first color information value of the object pixel itself and the color information of neighboring pixels, where the color corresponding to the first color information differs from the color corresponding to the four second color information values; calculating four color-difference gradient values above, below, to the left of, and to the right of the object pixel according to the four second color information values; determining the edge-texture feature of the object pixel according to the four color-difference gradient values; determining multiple weights using a comparison table according to the four color-difference gradient values and the edge-texture feature of the object pixel; and performing a weighted addition of at least two of the four second color information values of the object pixel using the weights, to obtain a target second color information value of the object pixel.
Accompanying drawing explanation
Fig. 1 is the schematic diagram of the image processor according to one embodiment of the invention.
Fig. 2 is the flow chart of the image treatment method according to one embodiment of the invention.
Fig. 3 is a schematic diagram of a Bayer mosaic image.
Fig. 4A is a schematic diagram of using the top mask to calculate the top color-difference gradient value.
Fig. 4B is a schematic diagram of using the bottom mask to calculate the bottom color-difference gradient value.
Fig. 4C is a schematic diagram of using the left mask to calculate the left color-difference gradient value.
Fig. 4D is a schematic diagram of using the right mask to calculate the right color-difference gradient value.
Fig. 5 is a schematic diagram of obtaining the edge indicators of neighboring pixels.
Fig. 6 is the schematic diagram of the computer readable media according to one embodiment of the invention.
Wherein, description of reference numerals is as follows:
100: image processor
102: image sensor
104: initial interpolation unit
106: color-difference gradient estimation unit
108: edge-texture feature determining unit
110: edge indicator recording unit
112: dynamic weight quantization and distribution unit
114: weighted interpolation unit
120: frame buffer
200 ~ 216: steps
402: top mask
404: bottom mask
406: left mask
408: right mask
600: host computer
610: processor
620: computer readable media
622: computer program
Embodiment
Certain terms are used throughout the specification and the following claims to refer to particular elements. Those of ordinary skill in the art should understand that hardware manufacturers may refer to the same element by different names. This specification and the following claims do not distinguish between elements by differences in name, but rather by differences in function. The term "comprising" used throughout the specification and the following claims is an open-ended term and should therefore be interpreted as "including but not limited to". In addition, the term "couple" here includes any direct and indirect means of electrical connection; therefore, if the text describes a first device as being coupled to a second device, the first device may be electrically connected to the second device directly, or electrically connected to the second device indirectly through other devices or connection means.
Please refer to Fig. 1, which is a schematic diagram of an image processor 100 according to one embodiment of the invention. As shown in Fig. 1, the image processor 100 is coupled to an image sensor 102 and a frame buffer 120, and includes an initial interpolation unit 104, a color-difference gradient estimation unit 106, an edge-texture feature determining unit 108, an edge indicator recording unit 110, a dynamic weight quantization and distribution unit 112, and a weighted interpolation unit 114. The image sensor 102 is a single image sensor covered with a Bayer color filter array, so each pixel it captures has only one kind of color information (red, green, or blue). In addition, the image processor 100 can be applied to any electronic device using a single image sensor, such as a digital camera, video camera, multimedia handset, surveillance system, or picture phone.
Please refer to Fig. 1 and Fig. 2 together, where Fig. 2 is a flow chart of an image processing method according to one embodiment of the invention. The flow is described below with reference to Fig. 2.
In step 200, the flow starts: the image sensor 102 temporarily stores produced image data in the frame buffer 120, where the image data is a Bayer mosaic image. A portion of the image data may be as shown in Fig. 3, where each pixel has only one color sample; that is, a pixel marked "R" in Fig. 3 has only red color information and has lost the green and blue color information; a pixel marked "G" has only green color information and has lost the red and blue color information; and a pixel marked "B" has only blue color information and has lost the green and red color information.
In step 202, the initial interpolation unit 104 performs initial interpolation on the received image data; that is, it estimates green color information in four directions at red and blue pixels, and estimates red and blue color information in four directions at green pixels. Specifically, with reference to the partial image data shown in Fig. 3, suppose i and j denote the current row and column positions respectively, and the object pixel currently to be processed is (i, j). For convenience, the following uses c(i, j) to denote an original pixel color and an overlined symbol to denote an estimated pixel color, where c can be R, G, or B. The green information estimated at a red pixel can then be obtained via the following formulas (1.1) ~ (1.4):
where the subscripts T, B, L, and R denote the top, bottom, left, and right directions respectively. Then, after the green information has been estimated at the red pixels, the green information estimated by the above formulas can further be used to help estimate the red information at green pixels, which can be obtained via the following formulas (2.1) ~ (2.4) (the following formulas assume the object pixel is a green pixel G(i, j)):
The green color values initially estimated in four directions at blue pixels, and the blue values initially estimated in four directions at green pixels, can be obtained in a similar manner using the above two groups of formulas (for example, by replacing R with B in formulas (2.1) ~ (2.4)).
It should be noted that the two groups of formulas (1.1) ~ (1.4) and (2.1) ~ (2.4) above merely illustrate estimating the color information of the object pixel in four directions; the detailed calculations are only an example and not a limitation of the invention. As long as green color information can be estimated in four directions at red and blue pixels, and red and blue color information in four directions at green pixels, the above formulas can be replaced by any other suitable computing formulas.
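Since formulas (1.1) ~ (1.4) are not reproduced in this text, the sketch below uses a commonly assumed gradient-corrected form for the four directional green estimates at a red pixel. The function name and the exact coefficients are illustrative assumptions, not the patent's formulas.

```python
# Hedged sketch of step 202: estimate green above/below/left/right of a
# red pixel (i, j) in a Bayer CFA image.  Each estimate is the nearest
# green neighbor corrected by half the local red gradient -- an assumed,
# commonly used form, not the patent's exact formulas (1.1)~(1.4).

def directional_green_estimates(img, i, j):
    """img: 2-D list/array of raw CFA samples, one value per pixel."""
    g_top    = img[i-1][j] + (img[i][j] - img[i-2][j]) / 2
    g_bottom = img[i+1][j] + (img[i][j] - img[i+2][j]) / 2
    g_left   = img[i][j-1] + (img[i][j] - img[i][j-2]) / 2
    g_right  = img[i][j+1] + (img[i][j] - img[i][j+2]) / 2
    return g_top, g_bottom, g_left, g_right
```

On a flat region all four estimates reduce to the neighboring green value, which matches the intuition that the correction term only activates where the red channel has a gradient.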
Next, in step 204, the color-difference gradient estimation unit 106 calculates the color-difference gradient values of the object pixel in four directions according to the original pixel colors and the color estimates calculated in step 202. In a natural image, the pixel color difference within an object (for example the difference between green and red values, or between green and blue values) has a smooth characteristic; in other words, the color-difference gradient value along an edge direction is smaller than that in the direction across the edge, so this characteristic can be used to judge the edge and texture features of the object pixel. Specifically, please refer to Fig. 4A ~ Fig. 4D, which are schematic diagrams of using the top, bottom, left, and right masks respectively to calculate color-difference gradient values. Taking the top-direction mask shown in Fig. 4A as an example, the top color-difference gradient value ΔCD_T(i, j) can be calculated as follows:
where (p, q) ∈ {(i+m, j+n) | m = 0, −1; n = 0, ±1}, ΔCD_GR(p, q) denotes the green–red color-difference value at the corresponding position, and ΔCD_GB(p, q) denotes the green–blue color-difference value; they can be defined as follows:
If (p, q) is a red pixel location, then ΔCD_GR(p, q) = Ḡ(p, q) − R(p, q); if (p, q) is a green pixel location, then ΔCD_GR(p, q) = G(p, q) − R̄(p, q); if (p, q) is a blue pixel location, then ΔCD_GB(p, q) = Ḡ(p, q) − B(p, q); and if (p, q) is a green pixel location, then ΔCD_GB(p, q) = G(p, q) − B̄(p, q).
The color-difference gradient values in the other three directions, namely the bottom color-difference gradient value ΔCD_B(i, j), the left color-difference gradient value ΔCD_L(i, j), and the right color-difference gradient value ΔCD_R(i, j), can all be obtained using formulas similar to the one above. Since one of ordinary skill in the art, after reading the calculation of the top color-difference gradient value above, should understand how to calculate the gradient values of the other three directions, the details are not repeated here.
The formula above for calculating the top color-difference gradient value is only an example and not a limitation of the invention. For example, if accuracy needs to be further enhanced, a simple 1 × 3 low-pass filter can additionally be used to filter the color-difference values to reduce the impact of noise interference, or the color-difference values in formula (3) can be weighted before being added or subtracted to obtain the top color-difference gradient value ΔCD_T(i, j).
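Formula (3) itself is not reproduced in this text, so the following sketch assumes one plausible form: the top gradient as the sum of absolute vertical differences of the color-difference values inside the 2 × 3 top mask {(m, n) | m = 0, −1; n = 0, ±1}. Both the formula shape and the function name are assumptions for illustration.

```python
# Hedged sketch of step 204 for the top mask of Fig. 4A.  `cd` maps a
# (row_offset, col_offset) within the mask to the color-difference value
# (green-red or green-blue) at that position, as defined in the text.

def top_color_difference_gradient(cd):
    """Assumed form of formula (3): sum of |CD(i-1, j+n) - CD(i, j+n)|
    over the three columns n = -1, 0, 1 of the top mask."""
    return sum(abs(cd[(-1, n)] - cd[(0, n)]) for n in (-1, 0, 1))
```

A flat color-difference field yields a zero gradient, consistent with the smoothness property of color differences inside an object described above.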
In step 206, the edge-texture feature determining unit 108 first obtains the edge indicators of neighboring pixels from the frame buffer 120, where an edge indicator indicates whether a neighboring pixel is located on a sharp vertical edge, on a sharp horizontal edge, or in a texture region (with no particularly obvious edge feature). For example, please refer to Fig. 5, a schematic diagram of obtaining the edge indicators of neighboring pixels. As shown in Fig. 5, suppose the object pixel currently to be processed is R(i, j); the neighboring pixels whose edge indicators are needed can include the pixels marked with triangles and diamonds in Fig. 5, where the triangle-marked pixels are important representative pixels and the diamond-marked pixels are general pixels of ordinary importance. In addition, in this embodiment, the edge indicator of each pixel is stored in the last bit of that pixel's value in the frame buffer; the calculation and recording of edge indicators are described in subsequent steps.
Next, in step 208, the edge-texture feature determining unit 108 judges the edge-texture feature of the object pixel according to the top, bottom, left, and right color-difference gradient values ΔCD_T(i, j), ΔCD_B(i, j), ΔCD_L(i, j), and ΔCD_R(i, j) determined in step 204 and the edge indicators of the neighboring pixels obtained in step 206; that is, it judges whether the object pixel is located on a sharp vertical edge, on a sharp horizontal edge, or in a texture region (i.e., with no particularly obvious edge feature). For example, the edge-texture feature of the object pixel can be judged by the following criteria:
If the following inequality (4.1) holds, the object pixel is judged to be located on a sharp vertical edge:

α × (ΔCD_L(i, j) + ΔCD_R(i, j) + ΔCD_CH(i, j) × (16 − ρ)/16) > (ΔCD_T(i, j) + ΔCD_B(i, j) + ΔCD_CV(i, j) × ρ/16) …… (4.1)

If the following inequality (4.2) holds, the object pixel is judged to be located on a sharp horizontal edge:

(ΔCD_L(i, j) + ΔCD_R(i, j) + ΔCD_CH(i, j) × (16 − ρ)/16) < α × (ΔCD_T(i, j) + ΔCD_B(i, j) + ΔCD_CV(i, j) × ρ/16) …… (4.2)
If neither (4.1) nor (4.2) holds, the object pixel is judged to be located in a texture region. In inequalities (4.1) and (4.2), α is a scaling factor used to control the decision thresholds between vertical edges, horizontal edges, and texture features in the image; ΔCD_CV(i, j) = |(Ḡ_T(i, j) − R(i, j)) − (Ḡ_B(i, j) − R(i, j))|, which can be simplified to ΔCD_CV(i, j) = |Ḡ_T(i, j) − Ḡ_B(i, j)|; ΔCD_CH(i, j) = |(Ḡ_L(i, j) − R(i, j)) − (Ḡ_R(i, j) − R(i, j))|, which can be simplified to ΔCD_CH(i, j) = |Ḡ_L(i, j) − Ḡ_R(i, j)|; and ρ is calculated according to the edge indicators of the neighboring pixels obtained in step 206.
In the calculating of ρ value, edge indicator due to each pixel is last bit value storing each pixel value in the frame buffer, therefore, if suppose that edge indicator " 0 " represents this pixel and is positioned at sharp keen vertical edge, edge indicator " 1 " represents this pixel and is positioned at sharp keen horizontal edge, then with reference to shown in figure 5, if indicate the edge indicator of leg-of-mutton representative neighborhood pixels for " 1 ", then ρ value being added 3(is also ρ=ρ+3), if and the edge indicator indicating the general neighborhood pixels of rhombus is " 1 ", then ρ value being added 1(is also ρ=ρ+1), if and the edge indicator of neighborhood pixels is " 0 ", then ρ is constant.Also namely, ρ value can by cumulative after the edge indicator of the neighborhood pixels shown in Fig. 5 being scaled corresponding numerical value and obtain.
After judging whether the object pixel is located on a sharp vertical edge, a sharp horizontal edge, or in a texture region, the flow enters step 210 to determine the edge indicator of the object pixel, and the edge indicator recording unit 110 embeds the edge indicator into the pixel value of the object pixel stored in the frame buffer. Specifically, when the object pixel is judged to have a vertical-edge feature, the least significant bit (LSB) of the binary value of the object pixel's original color value stored in the frame buffer is checked: if it is not 0, the value is incremented or decremented by 1; otherwise, if it is already 0, nothing is done. Similarly, when the object pixel is judged to have a horizontal-edge feature, the LSB of the object pixel's original color value is checked: if it is not 1, the value is incremented or decremented by 1; otherwise, if it is already 1, nothing is done. For a raw image with 10-bit color pixel values, a change in the LSB is not easily perceived by the human eye, so this approach records helpful information for image feature judgment (which can be referenced, for example, in step 206) without adding extra buffer memory.
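A minimal sketch of this LSB-embedding step follows. The text leaves open whether to add or subtract 1 when the LSB is wrong; the choice below (subtract only at the top of an assumed 10-bit range, to avoid overflow) is an assumption.

```python
# Hedged sketch of step 210: embed the edge indicator into the least
# significant bit of the stored pixel value.  LSB 0 marks a sharp
# vertical edge, LSB 1 a sharp horizontal edge.  The value is nudged by
# +/-1 only when the LSB is already wrong, so a 10-bit sample changes
# imperceptibly.  The 10-bit cap (1023) is an assumption.

def embed_edge_indicator(pixel_value, is_vertical):
    want = 0 if is_vertical else 1
    if (pixel_value & 1) != want:
        pixel_value += -1 if pixel_value == 1023 else 1
    return pixel_value
```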
Next, after determining whether the object pixel is located on a sharp vertical edge, a sharp horizontal edge, or in a texture region, the lost color information can be reconstructed using one of the interpolated reconstruction formulas (5.1) ~ (5.3) below (shown for interpolating green color information at a red pixel):
If the object pixel is located on a sharp vertical edge, formula (5.1) is used to reconstruct the lost (target) color information:
If the object pixel is located on a sharp horizontal edge, formula (5.2) is used to reconstruct the lost color information:
If the object pixel is located in a texture region, formula (5.3) is used to reconstruct the lost color information:
To determine the weights used in formulas (5.1) ~ (5.3), in step 212 the dynamic weight quantization and distribution unit 112 distributes the weights in two or four directions by table lookup in a quantized manner, according to the top, bottom, left, and right color-difference gradient values ΔCD_T(i, j), ΔCD_B(i, j), ΔCD_L(i, j), and ΔCD_R(i, j) calculated in step 204. For example, if the object pixel is located on a sharp vertical edge, table one below can be used to determine the weights W_T(i, j) and W_B(i, j):
If the object pixel is located on a sharp horizontal edge, table two below can be used to determine the weights W_L(i, j) and W_R(i, j):
And if the object pixel is located in a texture region, table three below can be used to determine the weights W_T(i, j), W_B(i, j), W_L(i, j), and W_R(i, j):
Table three uses the maximum and the second-highest of the four color-difference gradient values ΔCD_T(i, j), ΔCD_B(i, j), ΔCD_L(i, j), and ΔCD_R(i, j). For example, suppose the magnitude relationship of the object pixel's four color-difference gradient values is ΔCD_T(i, j) > ΔCD_L(i, j) > ΔCD_B(i, j) > ΔCD_R(i, j), and ΔCD_L(i, j) <= 0.25 × ΔCD_T(i, j); then W_T(i, j) can be set to (21/32), W_L(i, j) can be set to (3/32), and the other two weights W_B(i, j) and W_R(i, j) share the remaining (8/32).
It should be noted that tables one ~ three are only examples illustrating how the above weights may be determined, and are not a limitation of the invention. As long as the weight of each direction can be judged according to the color-difference gradient values, the comparison rules in tables one ~ three and the quantized weight values can be adjusted according to the designer's needs.
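Since tables one ~ three themselves are not reproduced in this text, the sketch below encodes only the single texture-region rule stated above: when the second-highest gradient is at most a quarter of the highest, the split is 21/32, 3/32, and the remaining 8/32 shared by the other two directions. The fallback row (12/32, 8/32) is an invented assumption for illustration, as is the function name. Note that all weights are fixed multiples of 1/32, so, as the description emphasizes, no division by image-dependent values is needed.

```python
# Hedged sketch of the step-212 table lookup for a texture region.
# `grads` maps the direction keys 'T', 'B', 'L', 'R' to the four
# color-difference gradient values.

def texture_weights(grads):
    order = sorted(grads, key=grads.get, reverse=True)
    first, second = order[0], order[1]
    rest = order[2:]
    if grads[second] <= 0.25 * grads[first]:
        w = {first: 21, second: 3}   # the one rule stated in the text
    else:
        w = {first: 12, second: 8}   # assumed fallback row, not from the patent
    remaining = 32 - sum(w.values())
    for d in rest:
        w[d] = remaining // 2        # remaining weight shared equally
    return {d: w[d] / 32 for d in 'TBLR'}
```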
After the weights are determined, in step 214 the weighted interpolation unit 114 performs weighted interpolation using one of formulas (5.1) ~ (5.3) to reconstruct the lost color information. After all the above steps are completed, the reconstructed green color information not only retains more image detail, but also requires no division operations and no extra buffer memory.
In addition, in step 214, after the green color information at the red pixels (or blue pixels) has been reconstructed, the weighted interpolation unit 114 then uses the rebuilt green color information to reconstruct the red and blue information at other neighboring pixels; that is, it can use the rebuilt green color information to reconstruct red or blue information at other green pixels, to reconstruct blue information at other red pixels, or to reconstruct red information at other blue pixels. Specifically, to use the rebuilt green color information to reconstruct the red color information at other green pixels, formula (6.1) or (6.2) below can be used:
If (i, j) is located on a green–red row:
If (i, j) is located on a green–blue row:
In addition, to use the rebuilt green color information to reconstruct the red information at other blue pixels, formula (6.3) below can be used:
The above are the formulas for reconstructing red color information at green and blue pixels. Since the formulas for reconstructing blue color information at green and red pixels are similar, one of ordinary skill in the art should understand how to slightly modify the above formulas to reconstruct blue color information, so the details are omitted.
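Formulas (6.1) ~ (6.3) are not reproduced in this text, so the following is a hedged sketch of the green–red row case under the assumption that the reconstruction averages in the color-difference (G − R) domain, a form consistent with the color-difference smoothness property used throughout the description. The function and variable names are illustrative.

```python
# Hedged sketch of an assumed form of formula (6.1): reconstruct red at
# a green pixel (i, j) on a green-red row by averaging G - R over the
# two horizontal red neighbors.  `img` holds the raw CFA samples;
# `green_est` holds the full-resolution rebuilt green plane.

def red_at_green_on_red_row(img, green_est, i, j):
    left_diff  = green_est[i][j-1] - img[i][j-1]   # G - R at left red neighbor
    right_diff = green_est[i][j+1] - img[i][j+1]   # G - R at right red neighbor
    # R = G - average(G - R); img[i][j] is the raw green sample here
    return img[i][j] - (left_diff + right_diff) / 2
```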
In step 214, the reconstruction of red color information and blue color information can be processed in parallel to reduce execution time; and since step 214 uses the more accurate green color information to help rebuild the red and blue color information, the image quality can be higher and the overall efficiency can also be improved.
Finally, in step 216, the flow ends: the weighted interpolation unit 114 outputs the reconstructed color information to a back-end processing unit, which processes it for display on a display screen.
In addition, although the image processor 100 shown in Fig. 1 is implemented with hardware circuits, the flow of the image processing method shown in Fig. 2 can also be implemented in software and is not limited to hardware circuits. Specifically, please refer to Fig. 6: a host computer 600 includes at least a processor 610 and a computer readable medium 620, where the computer readable medium 620 can be a hard disk or other storage device and stores a computer program 622. When the processor 610 executes the computer program 622, the host computer 600 can perform the steps shown in Fig. 2.
Regarding the image processor and image treatment method of the present invention: the prior art usually performs pixel color interpolation with a weight for each edge direction, but those techniques not only require expensive division operations, their weights are also quite complex to derive, which raises the hardware implementation cost accordingly. The image processor and image treatment method of the present invention achieve the benefits of edge-adaptive weighting without any division operations, using only a very simple table lookup. In addition, to better handle the fine edge structures that the human eye is most sensitive to, an embodiment of the present invention records the edge indicator without any extra buffer memory: the least significant bit (LSB) of the original color value of the target pixel is replaced with the value of the edge indicator, so that the step that determines the edge feature direction can consult the edge decisions already made for the neighboring pixels and thus judge the edge condition of the target pixel more accurately.
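As a concrete illustration of this LSB trick, the sketch below shows how an edge decision can be packed into, and later read back from, a stored color value without any extra buffer. The claims leave open which bit value ("1" or "0") denotes a vertical versus a horizontal edge, so the mapping chosen here is an assumption:

```python
EDGE_VERTICAL = 1    # assumed mapping: LSB = 1 marks a sharp vertical edge
EDGE_HORIZONTAL = 0  # assumed mapping: LSB = 0 marks a sharp horizontal edge

def record_edge_indicator(stored_value, edge_flag):
    """Overwrite the LSB of a stored color value with an edge flag.

    A pixel in a texture region (edge_flag is None) keeps its original
    value, matching the 'do not modify' case in the description.
    """
    if edge_flag is None:
        return stored_value
    return (stored_value & ~1) | edge_flag

def read_edge_indicator(stored_value):
    """Recover a neighboring pixel's edge flag from its stored color value."""
    return stored_value & 1

# Marking a pixel value of 200 as a vertical edge flips only its LSB.
marked = record_edge_indicator(200, EDGE_VERTICAL)   # 201
flag = read_edge_indicator(marked)                   # 1
```

The stored color value gives up its lowest bit of precision in exchange for edge-decision context that later pixels can consult when their own edge feature direction is determined.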

Claims (24)

1. An image processor, comprising:
an initial interpolation unit, for retrieving image data from a frame buffer, wherein each pixel in the image data has only one kind of color information; for a target pixel in the image data, the initial interpolation unit estimates four second color information values, located above, below, to the left of, and to the right of the target pixel, according to a first color information value of the target pixel itself and the color information of neighboring pixels, wherein the color corresponding to the first color information differs from the color corresponding to the four second color information values;
a color-difference gradient estimation unit, coupled to the initial interpolation unit, for calculating four color-difference gradient values, above, below, to the left of, and to the right of the target pixel, according to the four second color information values of the target pixel;
an edge texture feature determination unit, coupled to the color-difference gradient estimation unit, for determining the edge texture feature of the target pixel according to the four color-difference gradient values of the target pixel; and
an edge indicator recording unit, coupled to the edge texture feature determination unit, for determining, according to the edge texture feature of the target pixel, whether to modify a bit value of the first color information of the target pixel stored in the frame buffer.
2. The image processor of claim 1, wherein the image data is a Bayer mosaic image.
3. The image processor of claim 1, wherein the color-difference gradient estimation unit calculates a plurality of color-difference values corresponding to each second color information value, and calculates the color-difference gradient value corresponding to that second color information value according to the plurality of color-difference values.
4. The image processor of claim 3, wherein the color-difference gradient estimation unit further filters the plurality of color-difference values.
5. The image processor of claim 3, wherein each of the plurality of color-difference values is weighted and then added or subtracted to obtain the color-difference gradient value corresponding to the second color information value.
6. The image processor of claim 1, wherein the edge texture feature determination unit determines the edge texture feature of the target pixel according to the four color-difference gradient values of the target pixel and the edge indicator of at least one neighboring pixel.
7. The image processor of claim 6, wherein the edge indicator of the at least one neighboring pixel is the last bit value of the color information of the at least one neighboring pixel stored in the frame buffer.
8. The image processor of claim 1, wherein the edge indicator recording unit determines, according to the edge texture feature of the target pixel, whether to modify the last bit value of the first color information of the target pixel stored in the frame buffer.
9. The image processor of claim 1, wherein the edge texture feature determination unit determines, according to the four color-difference gradient values of the target pixel, whether the target pixel lies on a sharp vertical edge, on a sharp horizontal edge, or in a texture region.
10. The image processor of claim 9, wherein when the target pixel lies on a sharp vertical edge, the edge indicator recording unit sets the last bit value of the first color information of the target pixel in the frame buffer to one of "1" and "0"; when the target pixel lies on a sharp horizontal edge, the edge indicator recording unit sets the last bit value of the first color information of the target pixel in the frame buffer to the other of "1" and "0"; and when the target pixel lies in a texture region, the edge indicator recording unit does not modify the last bit value of the first color information of the target pixel in the frame buffer.
11. The image processor of claim 1, further comprising:
a dynamic weight quantization and distribution unit, coupled to the color-difference gradient estimation unit and the edge texture feature determination unit, for determining a plurality of weights by means of a lookup table according to the four color-difference gradient values of the target pixel and the edge texture feature of the target pixel; and
a weighted interpolation unit, coupled to the initial interpolation unit and the dynamic weight quantization and distribution unit, wherein the weighted interpolation unit uses the plurality of weights to perform a weighted addition on at least two of the four second color information values of the target pixel, to obtain a target second color information value of the target pixel.
12. The image processor of claim 11, wherein the image data is a Bayer mosaic image.
13. An image treatment method, comprising:
retrieving image data from a frame buffer, wherein each pixel in the image data has only one kind of color information;
for a target pixel in the image data, estimating four second color information values, located above, below, to the left of, and to the right of the target pixel, according to a first color information value of the target pixel itself and the color information of neighboring pixels, wherein the color corresponding to the first color information differs from the color corresponding to the four second color information values;
calculating four color-difference gradient values, above, below, to the left of, and to the right of the target pixel, according to the four second color information values of the target pixel;
determining the edge texture feature of the target pixel according to the four color-difference gradient values of the target pixel; and
determining, according to the edge texture feature of the target pixel, whether to modify a bit value of the first color information of the target pixel stored in the frame buffer.
14. The image treatment method of claim 13, wherein the image data is a Bayer mosaic image.
15. The image treatment method of claim 13, wherein a plurality of color-difference values are calculated corresponding to each second color information value, and the color-difference gradient value corresponding to that second color information value is calculated according to the plurality of color-difference values.
16. The image treatment method of claim 15, wherein the plurality of color-difference values are further filtered.
17. The image treatment method of claim 15, wherein each of the plurality of color-difference values is weighted and then added or subtracted to obtain the color-difference gradient value corresponding to the second color information value.
18. The image treatment method of claim 13, wherein the step of determining the edge texture feature of the target pixel comprises:
determining the edge texture feature of the target pixel according to the four color-difference gradient values of the target pixel and the edge indicator of at least one neighboring pixel.
19. The image treatment method of claim 18, wherein the edge indicator of the at least one neighboring pixel is the last bit value of the color information of the at least one neighboring pixel stored in the frame buffer.
20. The image treatment method of claim 13, wherein the step of determining whether to modify the bit value of the first color information of the target pixel stored in the frame buffer comprises:
determining, according to the edge texture feature of the target pixel, whether to modify the last bit value of the first color information of the target pixel stored in the frame buffer.
21. The image treatment method of claim 13, wherein the step of determining the edge texture feature of the target pixel comprises:
determining, according to the four color-difference gradient values of the target pixel, whether the target pixel lies on a sharp vertical edge, on a sharp horizontal edge, or in a texture region.
22. The image treatment method of claim 21, wherein the step of determining whether to modify the bit value of the first color information of the target pixel stored in the frame buffer comprises:
when the target pixel lies on a sharp vertical edge, setting the last bit value of the first color information of the target pixel in the frame buffer to one of "1" and "0";
when the target pixel lies on a sharp horizontal edge, setting the last bit value of the first color information of the target pixel in the frame buffer to the other of "1" and "0"; and
when the target pixel lies in a texture region, leaving the last bit value of the first color information of the target pixel in the frame buffer unmodified.
23. The image treatment method of claim 13, further comprising:
determining a plurality of weights by means of a lookup table according to the four color-difference gradient values of the target pixel and the edge texture feature of the target pixel; and
using the plurality of weights to perform a weighted addition on at least two of the four second color information values of the target pixel, to obtain a target second color information value of the target pixel.
24. The image treatment method of claim 23, wherein the image data is a Bayer mosaic image.
CN201310027214.3A 2013-01-24 2013-01-24 Image processor and image treatment method Active CN103974043B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310027214.3A CN103974043B (en) 2013-01-24 2013-01-24 Image processor and image treatment method


Publications (2)

Publication Number Publication Date
CN103974043A CN103974043A (en) 2014-08-06
CN103974043B true CN103974043B (en) 2016-02-10

Family

ID=51243022


Country Status (1)

Country Link
CN (1) CN103974043B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110490029B (en) * 2018-05-15 2022-04-15 瑞昱半导体股份有限公司 Image processing method capable of performing differentiation processing on face data
CN110858894B (en) * 2018-08-23 2021-11-26 瑞昱半导体股份有限公司 Color reconstruction device and method

Citations (4)

Publication number Priority date Publication date Assignee Title
TW200620149A (en) * 2004-12-03 2006-06-16 Altek Corp System and method applied to adaptive image transformation
CN1870048A (en) * 2005-05-25 2006-11-29 凌阳科技股份有限公司 Edge strengthening method and device of Bel image and color image pick-up system
TW200643820A (en) * 2005-06-03 2006-12-16 Ultramedia Inc Color interpolation method with directed weights
CN101815220A (en) * 2009-02-20 2010-08-25 华晶科技股份有限公司 Method for correcting image color distortion

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US7486844B2 (en) * 2005-11-17 2009-02-03 Avisonic Technology Corporation Color interpolation apparatus and color interpolation method utilizing edge indicators adjusted by stochastic adjustment factors to reconstruct missing colors for image pixels



Similar Documents

Publication Publication Date Title
Park et al. Single image dehazing with image entropy and information fidelity
CN110574025A (en) Convolution engine for merging interleaved channel data
US11645734B2 (en) Circuitry for image demosaicing and contrast enhancement and image-processing method
US20170053379A1 (en) Demosaicing methods and apparatuses using the same
US8587705B2 (en) Hardware and software partitioned image processing pipeline
US11941785B2 (en) Directional scaling systems and methods
US11551336B2 (en) Chrominance and luminance enhancing systems and methods
CN106651783A (en) Image filtering method and device
US9008421B2 (en) Image processing apparatus for performing color interpolation upon captured images and related method thereof
US20110211770A1 (en) Compressibility-aware media retargeting with structure preserving
CN103685858A (en) Real-time video processing method and equipment
CN109829890B (en) Safety evaluation method for JPEG image carrier
CN103974043B (en) Image processor and image treatment method
US9779486B2 (en) Image processing apparatus and image processing method
US20130300890A1 (en) Image processing apparatus, image processing method, and program
US10719916B2 (en) Statistical noise estimation systems and methods
US20200175647A1 (en) Methods and apparatus for enhanced downscaling
US10530996B2 (en) Electronic device
US11321813B2 (en) Angular detection using sum of absolute difference statistics systems and methods
US9508020B2 (en) Image processing system with artifact suppression mechanism and method of operation thereof
JP3955034B2 (en) Color temperature conversion apparatus and method
TWI410896B (en) Compressibility-aware media retargeting with structure preserving
CN107578020A (en) Mobile object detection method, device, medium and computing device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant