CN110060619A - Sub-pixel rendering method and device - Google Patents

Sub-pixel rendering method and device

Info

Publication number
CN110060619A
Authority
CN
China
Prior art keywords
pixel
sub
rendered
source
factor
Prior art date
Legal status
Granted
Application number
CN201910332550.6A
Other languages
Chinese (zh)
Other versions
CN110060619B (en)
Inventor
李响 (Li Xiang)
Current Assignee
Granfei Intelligent Technology Co ltd
Original Assignee
Shanghai Zhaoxin Integrated Circuit Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Zhaoxin Integrated Circuit Co Ltd
Priority to CN201910332550.6A
Publication of CN110060619A
Priority to US16/592,061
Priority to US17/318,126
Application granted
Publication of CN110060619B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2003 Display of colours
    • G09G3/2007 Display of intermediate tones
    • G09G3/2074 Display of intermediate tones using sub-pixels
    • G09G2300/00 Aspects of the constitution of display devices
    • G09G2300/04 Structural and physical details of display devices
    • G09G2300/0439 Pixel structures
    • G09G2300/0452 Details of colour pixel setup, e.g. pixel composed of a red, a blue and two green components
    • G09G2340/00 Aspects of display data processing
    • G09G2340/04 Changes in size, position or resolution of an image
    • G09G2340/0457 Improvement of perceived resolution by subpixel rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

A sub-pixel rendering method generates a target image from a source image. The method includes: obtaining the source image; determining a target pixel to be rendered in the target image; calculating an edge code of the source pixel in the source image that corresponds to a sub-pixel of the target pixel to be rendered; determining texture information around the sub-pixel of the target pixel to be rendered according to the edge code; and, when the edge code is not a special pattern, calculating the pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance.

Description

Sub-pixel rendering method and device
Technical field
The present invention relates to sub-pixel rendering methods and devices, and in particular to methods and devices that render sub-pixels according to texture information and based on distance and/or area.
Background art
In the prior art, when a display shows an image using a traditional sub-pixel driving method, one sub-pixel in the display corresponds to one color component of one source pixel in the source image. However, as the threshold of manufacturing technology rises, the number of sub-pixels on a display is limited; in other words, the resolution of displays will be difficult to increase further. Therefore, when an image of higher resolution is to be shown on a display of lower resolution, how to retain more source-image detail is a problem that currently needs to be solved.
Summary of the invention
The following disclosure is exemplary only and is not intended to be limiting in any way. Besides the aspects, embodiments, and features described herein, other aspects, embodiments, and features will also be apparent by reference to the accompanying drawings and the following detailed description. That is, the following disclosure is provided to introduce the concepts, highlights, benefits, and the novel and non-obvious technical advantages described herein. Selected, but not all, embodiments are further described below. Accordingly, the following disclosure is not intended to identify essential features of the claimed subject matter, nor is it intended to be used to determine the scope of the claimed subject matter.
In a preferred embodiment, the present invention provides a sub-pixel rendering method that generates a target image from a source image. The method includes: obtaining the source image; determining a target pixel to be rendered in the target image; calculating an edge code of the source pixel in the source image that corresponds to a sub-pixel of the target pixel to be rendered; determining texture information around the sub-pixel of the target pixel to be rendered according to the edge code; and, when the edge code is not a special pattern, calculating the pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance.
In a preferred embodiment, the present invention provides a sub-pixel rendering device, comprising: a storage device for storing a source image and a target image; and a processor for generating the target image from the source image. The processor obtains the source image from the storage device, determines a target pixel to be rendered in the target image, calculates an edge code of the source pixel in the source image that corresponds to a sub-pixel of the target pixel to be rendered, determines texture information around the sub-pixel of the target pixel to be rendered according to the edge code, and, when the edge code is not a special pattern, calculates the pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance.
Brief description of the drawings
The accompanying drawings are included to provide a further understanding of the disclosure and are incorporated into and form a part of this disclosure. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure. It should be understood that the drawings are not necessarily drawn to scale; some elements may be shown larger than in an actual implementation in order to clearly illustrate the concepts of the disclosure.
Fig. 1 is a block diagram of an electronic device that executes the sub-pixel rendering methods according to an embodiment of the present invention.
Fig. 2 is a flowchart of the distance-based sub-pixel rendering method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a notch and a rounded corner according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of anti-aliasing processing applied to several jagged segments of a source image according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of how mirror processing is performed on edge pixels according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of the four directions of a window according to an embodiment of the present invention: horizontal (h), upper-left to lower-right (l), vertical (v), and upper-right to lower-left (r).
Fig. 7 is a schematic diagram of the nine cases of the edge code corresponding to the horizontal direction (h) according to an embodiment of the present invention.
Fig. 8A is a schematic diagram of the texture information corresponding to the edge code 0x3030 according to an embodiment of the present invention.
Fig. 8B is a schematic diagram of the texture information corresponding to the edge code 0xC0C0 according to an embodiment of the present invention.
Fig. 9 is a schematic diagram of the size of a source pixel and the size of a sub-pixel according to an embodiment of the present invention.
Fig. 10 is a schematic diagram of the arrangement of source pixels in a source image according to an embodiment of the present invention.
Fig. 11 is a schematic diagram of the sub-pixel arrangement according to an embodiment of the present invention.
Fig. 12 is a schematic diagram in which the arrangement of the source pixels of the source image is overlaid with the arrangement of the sub-pixels corresponding to the target image according to an embodiment of the present invention.
Fig. 13 is a flowchart of the method of rendering sub-pixels according to texture information and based on area according to an embodiment of the present invention.
Fig. 14 is a schematic diagram in which the arrangement of the source pixels of the source image is overlaid with the arrangement of the sub-pixels corresponding to the target image according to another embodiment of the present invention.
Figs. 15A-15D are schematic diagrams of calculating the pixel value of the B channel sub-pixel of a target pixel according to texture information and based on area according to an embodiment of the present invention.
Fig. 16 is a schematic diagram of the texture information corresponding to the 12 edge codes that require sharpening according to an embodiment of the present invention.
Symbol description
100 electronic device
110 processor
120 storage device
S201, S205, S210, S215, S216, S218, S220, S225 steps
310 notch
320 rounded corner
420 jagged segment
421, 423 source pixels located on the rounded corner
421a, 423a sub-regions located inside the arc tangent line
421b sub-region located outside the arc tangent line
450 arc tangent line
h, l, v, r pixel directions
701~709 coding cases
S1301, S1305, S1310, S1315, S1316, S1318, S1320, S1325 steps
1501, 1521 target sub-pixels
1502~1505, 1522~1525 sub-pixels forming a diamond-shaped area
1511~1516, 1531~1535 source pixels
1550, 1560 diamond-shaped areas
1550a~1550f, 1560a~1560e sub-regions forming a diamond-shaped area
V0~V8 pixel values corresponding to a 3 × 3 window
Detailed description of the embodiments
The following description presents preferred embodiments for carrying out the invention. Its purpose is to describe the essential spirit of the invention, but it is not intended to limit the invention. The actual scope of the invention must be determined by reference to the claims that follow.
It should be understood that words such as "comprise" and "include" used in this specification indicate the existence of specific technical features, numerical values, method steps, operations, elements and/or components, but do not exclude the addition of further technical features, numerical values, method steps, operations, elements, components, or any combination of the above.
Fig. 1 is a block diagram of an electronic device that executes the sub-pixel rendering methods according to an embodiment of the present invention. The electronic device 100 includes at least one processor 110 and a storage device 120. The processor 110 can be implemented in various ways, for example with dedicated hardware circuits or with general-purpose hardware (such as a single processor, multiple processors with parallel processing capability, a graphics processor, or another processor with computing capability), to convert a source image into a target image suitable for a display having a specific sub-pixel arrangement. In an embodiment of the present invention, the height of a sub-pixel of the display (not shown) is 2/3 of the height of a source pixel of the source image, and its width is 3/4 of the width of a source pixel of the source image. The number of source-pixel rows of the source image is the same as the number of target-pixel rows of the target image, and in every row, every 3 adjacent source pixels of the source image are rendered by the processor 110 into 2 target pixels of the target image. Each target pixel of the target image includes 3 sub-pixels corresponding respectively to the R, G, and B channels, and each source pixel of the source image likewise includes 3 pixel values corresponding respectively to the R, G, and B channels; the pixel value of each channel's sub-pixel of a target pixel is calculated based on the pixel values of the corresponding channel of the source pixels. The storage device 120 can be a non-volatile memory (such as a ROM, flash memory, etc.) for storing at least one source image and the information needed to convert the source image into a target image suitable for a display having a specific sub-pixel arrangement. For example, the information needed for this conversion includes the algorithms for converting source pixels into sub-pixels, and the parameters used by the distance-based and area-based sub-pixel rendering methods.
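As an illustrative check of the arithmetic above (not part of the patent text), rendering every 3 adjacent source pixels into 2 target pixels, with 3 emitters per pixel, reduces the emitter count by one third:

```python
# Illustrative arithmetic only, using the ratios stated above.
source_pixels = 3            # every 3 adjacent source pixels ...
target_pixels = 2            # ... are rendered into 2 target pixels
emitters_per_pixel = 3       # R, G and B sub-pixels

saved = 1 - (target_pixels * emitters_per_pixel) / (source_pixels * emitters_per_pixel)
print(saved)   # 0.333..., i.e. 1/3 of the light emitters are saved
```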
In one embodiment, the electronic device 100 is a display panel controller coupled between a graphics processing unit (GPU) (not shown) and a display (not shown). The electronic device 100 receives source images from the GPU, converts each received source image into a target image, and then transmits the target image to the display for display.
The processor 110 can use the distance-based sub-pixel rendering method (described below) and/or the area-based sub-pixel rendering method (described below) to convert the source image into a target image suitable for a display having a specific sub-pixel arrangement. The distance-based sub-pixel rendering method is described first.
Fig. 2 is a flowchart of the distance-based sub-pixel rendering method according to an embodiment of the present invention. The distance-based sub-pixel rendering method shown in Fig. 2 is described in detail below with reference to Figs. 3-12.
First, in step S201, the processor 110 obtains a source image from the storage device 120. In one embodiment, the processor 110 first receives the source image from the GPU and stores the received source image into the storage device 120 before entering step S201. The processor 110 then executes step S205.
In step S205, when the display that will show the target image has a notch and/or a rounded corner, the processor 110 performs anti-aliasing processing on the source pixels in the source image that are located at the notch or rounded corner. Specifically, the processor 110 first determines whether the display has a notch or a rounded corner. As shown in Fig. 3, 310 is a notch located at the edge of the display and 320 is a rounded corner located at the edge of the display. In one embodiment, if the display has a notch and/or rounded corner, the storage device 120 can store the coordinate information of all source pixels in the source image that correspond to the notch and/or rounded corner. If the processor 110 can obtain this coordinate information from the storage device 120, the display has a notch and/or rounded corner. When the display has a notch and/or rounded corner, the processor 110 multiplies the pixel values of the source pixels located at the notch and/or rounded corner by an attenuation coefficient to perform the anti-aliasing processing. In one example, the processor 110 multiplies the pixel values of the source pixels located at the edge and at a notch and/or rounded corner by the attenuation coefficient to soften the jagged appearance of the edge pixels; subsequent steps can then calculate the pixel value of each sub-pixel of a target pixel of the target image from the pixel values of the softened source pixels. The attenuation coefficient is related to the area of the edge pixel that is cut by the arc, and can be obtained by the following formula:
Area_arch = (2*offset - 1) / (2*step)
where Area_arch is the attenuation coefficient, offset is the position index of the source pixel within the jagged segment, and step is the width of the jagged segment.
For example, as shown in Fig. 4, region 410 is a region with no light emitters, region 420 (drawn with a dotted line) is one of the multiple jagged segments of the rounded corner or notch, and solid line 450 is the line in region 420 closest to the ideal arc tangent. For source pixel 421, region 421a is the part inside the arc tangent line and region 421b is the part outside the arc tangent line. From Fig. 4 it can be seen that the width of region 420 is 5 source pixels and that source pixel 421 is the 1st source pixel of region 420 (i.e., its offset is 1). Therefore, the attenuation coefficient corresponding to source pixel 421 can be calculated from the formula above as Area_arch = (2*1 - 1)/(2*5) = 1/10. In other words, the area corresponding to region 421a is 1/10 of the entire area of source pixel 421, and the pixel value of source pixel 421 after softening is 1/10 of its original value. In another example, the attenuation coefficient corresponding to source pixel 423 is Area_arch = (2*3 - 1)/(2*5) = 5/10, i.e., the pixel value of source pixel 423 after softening is 5/10 of its original value; in other words, the area corresponding to region 423a is 5/10 of the entire area of source pixel 423, and the pixel value of source pixel 423 after softening is 5/10 of the original. And so on.
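A minimal sketch of this anti-aliasing step is given below. It assumes a horizontal jagged segment described, as in the text, by a starting coordinate and a width in source pixels; the function and variable names are illustrative, not taken from the patent.

```python
def arch_attenuation(offset: int, step: int) -> float:
    """Attenuation coefficient Area_arch = (2*offset - 1) / (2*step).

    offset: 1-based position index of the source pixel within the jagged segment.
    step:   width of the jagged segment in source pixels.
    """
    return (2 * offset - 1) / (2 * step)

def soften_segment(image, y, x_start, step):
    """Soften one horizontal jagged segment starting at (x_start, y).

    'image' is assumed to be indexable as image[y][x] with per-channel values;
    a real implementation would also handle segments offset along the y direction.
    """
    for i in range(step):
        coeff = arch_attenuation(i + 1, step)   # 1st pixel -> 1/10 when step == 5
        image[y][x_start + i] = [c * coeff for c in image[y][x_start + i]]

# Worked check against the example above: offsets 1 and 3 with a width of 5 pixels.
assert arch_attenuation(1, 5) == 0.1   # source pixel 421
assert arch_attenuation(3, 5) == 0.5   # source pixel 423
```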
In one embodiment, the processor 110 sets to 0 the pixel values of the source pixels in the source image that have no corresponding sub-pixel in the target image; that is, the pixel values of the source pixels corresponding to the regions of the display that have no light emitters (region 410 in Fig. 4) are set to 0.
In addition, in one embodiment of the present invention, when the storage device 120 stores the information about the source pixels corresponding to a jagged region, it may store only the coordinate of the starting point of the jagged region, the offset direction, and the offset (number of source pixels) along the x direction or the y direction. For example, for region 420 shown in Fig. 4, it is only necessary to store the coordinate of source pixel 421, the offset direction corresponding to the x direction, and an offset of 5 source pixels.
After the processor 110 has performed the anti-aliasing processing on the source pixels located at the notch and/or rounded corner, step S210 is entered. In step S210, the processor 110 determines the coordinate (x_spr, y) of a target pixel to be rendered in the target image, and then enters step S215.
In step S215, the processor 110 calculates, for each sub-pixel of the target pixel (x_spr, y), the edge code of the corresponding source pixel in the source image, in order to determine the texture information around each sub-pixel of the target pixel (x_spr, y) to be rendered. Specifically, in order to render each sub-pixel of the target pixel (x_spr, y) correctly, the processor 110 performs edge detection on a window centered on the source pixel in the source image corresponding to each sub-pixel of the target pixel (x_spr, y) to be rendered, so as to obtain the edge code of the source pixel corresponding to that sub-pixel; it determines the texture information of each sub-pixel of the target pixel (x_spr, y) according to the obtained edge code, and renders each sub-pixel of the target pixel (x_spr, y) with a different rendering method for different textures, so as to obtain the pixel value of each sub-pixel of the target pixel (x_spr, y). When calculating the x coordinate of the source pixel in the source image corresponding to each sub-pixel of the target pixel (x_spr, y), the calculation differs depending on whether the target pixel (x_spr, y) to be rendered is located in an odd row or an even row of the target image.
To better describe the calculation of the pixel values of the sub-pixels of the target pixel (x_spr, y) below, the definitions of even row, odd row, even column, and odd column are explained first. Taking the target pixel (x_spr, y) as an example, x_spr % 2 = 0 indicates that the target pixel (x_spr, y) is located in an even column of the target image and x_spr % 2 = 1 indicates that it is located in an odd column, where % 2 denotes the remainder after division by 2; x_spr = 0 when the target pixel (x_spr, y) is located in the 1st column, x_spr = 1 when it is located in the 2nd column, and so on. Likewise, y % 2 = 0 indicates that the target pixel (x_spr, y) is located in an even row of the target image and y % 2 = 1 indicates that it is located in an odd row, where y = 0 when the target pixel (x_spr, y) is located in the 1st row, y = 1 when it is located in the 2nd row, and so on. This way of determining even/odd rows and columns applies equally to the source pixels in the source image and is not repeated here.
When the target pixel (x_spr, y) to be rendered is located in an even row of the target image, the x coordinate of the corresponding source pixel in the source image is obtained by one mapping formula (the source pixels corresponding to the R and G channel sub-pixels of the target pixel differ from the source pixel corresponding to the B channel sub-pixel). When the target pixel (x_spr, y) to be rendered is located in an odd row of the target image, the x coordinate of the corresponding source pixel is obtained by another mapping formula (again, the source pixels corresponding to the R and G channel sub-pixels differ from the source pixel corresponding to the B channel sub-pixel). In these formulas, floor() denotes rounding down to an integer. Let the source pixel in the source image corresponding to a sub-pixel of the target pixel (x_spr, y) be (x, y). With source pixel (x, y) as the center pixel, the coordinates of all source pixels in the window can be found from the coordinate of source pixel (x, y). Taking a 3 × 3 window as an example, the coordinate of the source pixel above source pixel (x, y) is (x, y-1), the coordinate of the source pixel to the left of source pixel (x, y) is (x-1, y), and so on; the coordinates of all 8 source pixels around source pixel (x, y) can be obtained in this way, and the pixel value of each source pixel can then be read from the source image according to its coordinate. When the processor 110 calculates the pixel value of a sub-pixel located at the edge, the processor 110 first performs mirror processing on the source pixels located at the edge, so as to obtain the pixel values of the virtual source pixels located outside the edge. For example, as shown in Fig. 5, the source pixels 0~15 in the shaded region at the lower right corner are source pixels in the source image. When the processor 110 calculates a target pixel located at the edge, virtual source pixels located outside the source image may be needed; that is, when calculating the pixel value of the sub-pixel corresponding to source pixel 4 of the source image, the pixel value of the virtual source pixel to the left of source pixel 4 may be used. The pixel value of that virtual source pixel is therefore mapped to the pixel value of source pixel 5, which is to the right of source pixel 4, and so on, so that when the processor 110 calculates the pixel value of a sub-pixel corresponding to the edge, it can perform the related operations using the pixel values of the virtual source pixels.
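A small sketch of the edge mirroring used when building a window is shown below. The reflect-style padding is an assumption based on the Fig. 5 description (the virtual pixel to the left of source pixel 4 takes the value of source pixel 5, its right-hand neighbour); NumPy is used purely for illustration.

```python
import numpy as np

def window3x3_with_mirroring(src: np.ndarray, x: int, y: int) -> np.ndarray:
    """Return the 3x3 window of 'src' centred on source pixel (x, y).

    Pixels that fall outside the source image are mirrored across the edge
    pixel, e.g. the virtual pixel left of the leftmost column takes the value
    of its right-hand neighbour, matching the Fig. 5 description.
    """
    padded = np.pad(src, 1, mode="reflect")   # reflect-style mirroring (assumed)
    # The +0/+3 indices already account for the one-pixel padding border.
    return padded[y:y + 3, x:x + 3]

# Example: a window centred on a pixel in the first column of a 4x4 image.
src = np.arange(16, dtype=float).reshape(4, 4)
print(window3x3_with_mirroring(src, 0, 1))
```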
After obtaining, from the source image, the window corresponding to a sub-pixel of the target pixel (x_spr, y), the processor 110 starts to calculate the edge code. Multiple first differences are obtained by subtracting the pixel value of the source pixel corresponding to the sub-pixel of the target pixel (x_spr, y) from the pixel values of the multiple neighbouring source pixels along one of the multiple directions of the window; multiple second differences are obtained by subtracting the pixel values of those neighbouring source pixels from the pixel value of that source pixel; a first code is obtained from the comparison of the first differences with a first threshold; a second code is obtained from the comparison of the second differences with a second threshold; the first code and the second code are combined to obtain the code of that direction; and finally, the codes of the multiple directions are combined to obtain the edge code.
The 3 × 3 window corresponding to the R channel sub-pixel of the target pixel (x_spr, y) is used below as an example. The edge code can consist of four hexadecimal digits; from left to right, the digits of the edge code correspond to the codes of the horizontal (h) direction, the upper-left to lower-right (l) direction, the vertical (v) direction, and the upper-right to lower-left (r) direction of the 3 × 3 window, and each digit characterizes the texture information of one direction. It is worth noting that although four digits are used as an example in this embodiment, the present invention is not limited thereto; the number of digits is determined by the number of directions the edge code needs to characterize. For example, as shown in Fig. 6, the code of the horizontal (h) direction is calculated from the 3rd, 4th, and 5th source pixels in the nine-grid window (V3~V5 in the figure), the code of the upper-left to lower-right (l) direction is calculated from the 0th, 4th, and 8th source pixels (V0, V4, V8 in the figure), the code of the vertical (v) direction is calculated from the 1st, 4th, and 7th source pixels (V1, V4, V7 in the figure), and the code of the upper-right to lower-left (r) direction is calculated from the 2nd, 4th, and 6th source pixels (V2, V4, V6 in the figure). The first two bits of each direction's code are produced by subtracting the pixel value of the center pixel from the pixel values of the surrounding pixels, and the last two bits are produced by subtracting the pixel values of the surrounding pixels from the pixel value of the center pixel. For example, taking the horizontal (h) direction, the code of the horizontal (h) direction is H(f(V3-V4), f(V5-V4), f(V4-V3), f(V4-V5)), where f() is a function that outputs 1 when the value in the parentheses is greater than a given threshold and outputs 0 when the value is less than the threshold, and H() is a function that converts the four binary bits in the parentheses into one hexadecimal digit. For example, assume the threshold is 10: when V3 is 151, V4 is 148, and V5 is 150, V3-V4 equals 3, which is less than 10, so f(V3-V4) outputs 0; V5-V4 equals 2, which is less than 10, so f(V5-V4) outputs 0; V4-V3 equals -3, which is less than 10, so the output is 0; and V4-V5 equals -2, which is less than 10, so the output is 0. The code of the horizontal (h) direction is therefore 0x0 (binary 0000). Refer now to Fig. 7, which shows the nine cases of the code corresponding to the horizontal direction (h) according to an embodiment of the present invention. As shown at 701 in Fig. 7, 0x0 indicates that the brightness values (i.e., pixel values, likewise hereinafter) of V3, V4, and V5 do not differ much; V3, V4, and V5 are all illustrated with white fill. V3, V4, and V5 could equally be illustrated with black fill (not shown); identical fill colors of the V3, V4, and V5 squares indicate that their brightness values do not differ much. When V3 is 151, V4 is 120, and V5 is 150, V3-V4 equals 31, which is greater than 10, so f(V3-V4) outputs 1; V5-V4 equals 30, which is greater than 10, so f(V5-V4) outputs 1; V4-V3 equals -31, which is less than 10, so the output is 0; and V4-V5 equals -30, which is less than 10, so the output is 0. The code of the horizontal (h) direction is therefore 0xC (binary 1100); as shown at 703 in Fig. 7, the code 0xC indicates that the brightness values of V3 and V5 are both greater than the brightness value of V4. Similarly, as shown at 702 in Fig. 7, the code 0x3 indicates that the brightness values of V3 and V5 are both less than that of V4; as shown at 704, the code 0x1 indicates that the brightness values of V3 and V4 are both greater than that of V5; as shown at 705, the code 0x4 indicates that the brightness values of V3 and V4 are both less than that of V5; as shown at 706, the code 0x6 indicates that the brightness value of V3 is less than that of V4 and the brightness value of V4 is less than that of V5; as shown at 707, the code 0x2 indicates that the brightness values of V4 and V5 are both greater than that of V3; as shown at 708, the code 0x8 indicates that the brightness values of V4 and V5 are both less than that of V3; and as shown at 709, the code 0x9 indicates that the brightness value of V3 is greater than that of V4 and the brightness value of V4 is greater than that of V5.
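The direction coding just described can be written compactly as below. This is a sketch under stated assumptions (a single threshold of 10 for both comparisons, and the digit order h, l, v, r); it reproduces the two worked examples above.

```python
THRESHOLD = 10  # example threshold from the text

def f(d: int) -> int:
    """1 if the difference exceeds the threshold, otherwise 0."""
    return 1 if d > THRESHOLD else 0

def direction_code(a: int, center: int, b: int) -> int:
    """4-bit code H(f(a-c), f(b-c), f(c-a), f(c-b)) for one direction."""
    return (f(a - center) << 3) | (f(b - center) << 2) | (f(center - a) << 1) | f(center - b)

def edge_code(v: list[int]) -> int:
    """Combine the h, l, v, r direction codes of a 3x3 window (V0..V8) into
    a four-hex-digit edge code, most significant digit first."""
    h = direction_code(v[3], v[4], v[5])   # horizontal
    l = direction_code(v[0], v[4], v[8])   # upper-left to lower-right
    vv = direction_code(v[1], v[4], v[7])  # vertical
    r = direction_code(v[2], v[4], v[6])   # upper-right to lower-left
    return (h << 12) | (l << 8) | (vv << 4) | r

# Worked examples from the text (horizontal direction only):
assert direction_code(151, 148, 150) == 0x0
assert direction_code(151, 120, 150) == 0xC
```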
The codes of the upper-left to lower-right (l) direction, the vertical (v) direction, and the upper-right to lower-left (r) direction can be obtained in the same way. Arranging the codes of the horizontal (h), upper-left to lower-right (l), vertical (v), and upper-right to lower-left (r) directions in order from left to right gives an edge code consisting of four hexadecimal digits, and the texture information around the R channel sub-pixel of the target pixel (x_spr, y) can be learned from the resulting edge code. In one embodiment, when the edge code of the horizontal (h) direction is 0x4 or 0x8, the texture around the sub-pixel of the target pixel (x_spr, y) is weak. When the edge code is 0x0111, 0x0222, 0x0333, 0x0444, 0x0CCC, 0xCC0C, 0x1102, 0x2201, 0x3303, 0x4408, or 0x8804 (as shown in Fig. 16), the texture around the sub-pixel of the target pixel (x_spr, y) is strong. And when the edge code is 0x3030 (as shown in Fig. 8A) or 0xC0C0 (as shown in Fig. 8B), the texture information around the sub-pixel of the target pixel (x_spr, y) is a special pattern.
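Under the classification above, a sketch of the texture test might look like the following; the code lists are copied from the text, and whether they are exhaustive is not guaranteed.

```python
SPECIAL_CODES = {0x3030, 0xC0C0}
STRONG_CODES = {0x0111, 0x0222, 0x0333, 0x0444, 0x0CCC, 0xCC0C,
                0x1102, 0x2201, 0x3303, 0x4408, 0x8804}

def classify_texture(edge_code: int) -> str:
    """Map an edge code to the texture category used by the rendering steps."""
    if edge_code in SPECIAL_CODES:
        return "special"        # handled by direct interpolation (step S218/S1318)
    if edge_code in STRONG_CODES:
        return "strong"         # candidate for sharpening in the area-based method
    if (edge_code >> 12) in (0x4, 0x8):   # horizontal-direction digit
        return "weak"           # smoothed in the distance-based method
    return "normal"
```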
After the processor 110 has calculated the edge code of each sub-pixel of the target pixel (x_spr, y) and determined the texture information around each sub-pixel, step S216 is entered. In step S216, the processor 110 determines, for each sub-pixel of the target pixel (x_spr, y), whether the texture information around the sub-pixel is a special pattern. In one embodiment, this means determining whether the edge code of the sub-pixel of the target pixel (x_spr, y) is 0x3030 or 0xC0C0. If the answer is no, step S220 is entered (described in detail below); if the answer is yes, step S218 is entered.
To better describe the calculation of the pixel values of the sub-pixels of the target pixel (x_spr, y) below, the positional relationship between the source pixels in the source image and the sub-pixels of the target pixels in the target image is explained first. As shown in Fig. 10, each "○" in Fig. 10 represents a source pixel in the source image. As shown in Fig. 11, each "△" in Fig. 11 represents the R channel sub-pixel of a target pixel in the target image, each "◇" represents the G channel sub-pixel of a target pixel, and the remaining symbol represents the B channel sub-pixel of a target pixel. According to Fig. 9, in an embodiment of the present invention, the height of a sub-pixel of a target pixel in the target image is 2/3 of the height of a source pixel of the source image (the number of target pixel rows is the same as the number of source pixel rows), and the width of each channel's sub-pixel of a target pixel in the target image is 3/4 of the width of a source pixel channel of the source image (the number of target pixels in each row is 2/3 of the number of source pixels). In other words, as shown in Fig. 12, when the source image is displayed as the target image, the positions of the source pixels of the source image do not coincide with the positions of the sub-pixels of the target pixels of the target image. Therefore, when calculating the pixel value of the R channel, G channel, or B channel sub-pixel of a target pixel of the target image, the processor 110 interpolates the pixel values of the two left and right source pixels nearest to the sub-pixel of the target pixel to obtain the pixel value of the sub-pixel of the target pixel. Step S218 is described below.
In step S218, the processor 110 directly interpolates the pixel value of the sub-pixel of the target pixel (x_spr, y). Specifically, the processor 110 calculates the pixel value of the sub-pixel of the target pixel (x_spr, y) by formulas that distinguish two cases. When the target pixel (x_spr, y) of the target image is located in an even row of the target image, the pixel values of the R, G, and B channel sub-pixels of the target pixel (x_spr, y) are calculated by weighting the pixel values of the corresponding source pixels with factor_ave or factor_kep, depending on the case. When the target pixel (x_spr, y) of the target image is located in an odd row of the target image, the pixel values of the R, G, and B channel sub-pixels of the target pixel (x_spr, y) are likewise calculated from the corresponding source pixel values, with one formula used when edgecode = 0x3030 and another when edgecode = 0xC0C0. In these formulas, the left-hand side denotes the pixel value of the R channel or G channel sub-pixel, or of the B channel sub-pixel, of the target pixel whose coordinate in the target image is (x_spr, y); R'(G')_{x,y} denotes the pixel value of the R channel or G channel of the source pixel whose coordinate in the source image is (x, y); B'_{x,y} denotes the pixel value of the B channel of the source pixel whose coordinate in the source image is (x, y); each coordinate expression in the formulas involves rounding down to an integer; factor_kep is a preset value; factor_ave is a preset value; and edgecode denotes the edge code. In the embodiment of the present invention, the value of factor_kep is 1.0 and the value of factor_ave is 0.5. It should be understood, however, that the values of factor_kep and factor_ave can be adjusted according to user requirements and are not limited by the present invention. In one example, when the x_spr coordinate of the R channel sub-pixel of the target pixel is 5, the processor 110 takes the pixel value of the source pixel whose x coordinate is 7 and multiplies it by the value of factor_kep to obtain the pixel value of the corresponding sub-pixel of the target pixel.
In step S220, the processor 110 calculates the pixel value of the sub-pixel of the target pixel (x_spr, y) to be rendered according to the texture information and based on distance. Specifically, the pixel values of the R channel, G channel, and B channel sub-pixels of a target pixel (x_spr, y) to be rendered located in an even row are obtained by one set of formulas, and the pixel values of the R channel, G channel, and B channel sub-pixels of a target pixel (x_spr, y) to be rendered located in an odd row are obtained by another set of formulas. In these formulas, the left-hand side denotes the pixel value of the R channel or G channel sub-pixel, or of the B channel sub-pixel, of the target pixel (x_spr, y); R'(G')_{(x,y)} denotes the pixel value of the R channel or G channel of the source pixel whose coordinate is (x, y); B'_{(x,y)} denotes the pixel value of the B channel of the source pixel whose coordinate is (x, y); each coordinate expression in the formulas involves rounding down to an integer (for example, 3 × 3/2 rounds down to 4); and % 2 denotes the remainder after division by 2, so x_spr % 2 = 0 indicates an even column and x_spr % 2 = 1 indicates an odd column. In one embodiment, when the texture information corresponding to the sub-pixel of the target pixel (x_spr, y) is weak, i.e., when the code of the horizontal (h) direction of the edge code is 0x8 or 0x4, a smoothing process is applied when calculating the pixel value of the sub-pixel of the target pixel (x_spr, y). Specifically, when the horizontal (h) direction is coded 0x8 and x_spr % 2 = 0, or when the horizontal (h) direction is coded 0x4 and x_spr % 2 = 1, factor_smooth is used in place of factor_rg(b)**, where factor_smooth is a preset value and factor_rg(b)** denotes factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10, or factor_b11; factor_smooth, factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10, and factor_b11 are all preset values.
For example, when the processor 110 calculates the pixel value of the R channel sub-pixel of the target pixel located at (3, 1) in the target image, it can be obtained from the pixel values of the R channel of the source pixels located at (3, 1) and (4, 1) in the source image, and so on.
In one embodiment of the present invention, the values of factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10, and factor_b11 are all 0.7. Alternatively, according to another embodiment of the present invention, the values of factor_rg00, factor_rg10, factor_rg11, factor_b00, factor_b01, and factor_b10 are 1.0, while the values of factor_rg01 and factor_b11 are 0.7. In other words, the values of the factors applied to the R channel, G channel, and B channel sub-pixels of target pixels in different rows/columns can be varied according to the user's requirements for color display. In addition, when the texture around the sub-pixel of the target pixel is gentle or there is no texture, a value of 0.5 can be applied directly to the factors.
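The exact interpolation formulas of steps S218/S220 are given as equations in the patent figures and are not reproduced in this text, so the following sketch only illustrates their overall shape: each sub-pixel is blended from the two nearest source pixels in the same row with a parity-dependent factor, and the factor is replaced by factor_smooth when the horizontal edge code indicates weak texture. The source-column mapping `nearest_source_columns`, the (row parity, column parity) indexing of the factors, the blend form, and the value of FACTOR_SMOOTH are all assumptions, chosen to be consistent with the worked example in which the R sub-pixel at (3, 1) is computed from source pixels (3, 1) and (4, 1).

```python
# Assumed indexing of factor_rg00..factor_rg11 by (row parity, column parity).
FACTOR_RG = {(0, 0): 0.7, (0, 1): 0.7, (1, 0): 0.7, (1, 1): 0.7}
FACTOR_SMOOTH = 0.5   # preset value; its number is not stated in the text

def nearest_source_columns(x_spr: int) -> tuple[int, int]:
    # Assumed mapping: 3 source pixels cover 2 target pixels per row, so the
    # right-hand neighbour is floor(3 * x_spr / 2); e.g. x_spr = 3 -> columns (3, 4).
    right = (3 * x_spr) // 2
    return right - 1, right

def render_rg_subpixel(src_rg, x_spr: int, y: int, h_code: int) -> float:
    """Distance-based blend for an R or G sub-pixel (illustrative only)."""
    x_left, x_right = nearest_source_columns(x_spr)
    w = FACTOR_RG[(y % 2, x_spr % 2)]
    # Weak texture: horizontal code 0x8 with an even column, or 0x4 with an odd column.
    if (h_code == 0x8 and x_spr % 2 == 0) or (h_code == 0x4 and x_spr % 2 == 1):
        w = FACTOR_SMOOTH
    return w * src_rg[y][x_left] + (1 - w) * src_rg[y][x_right]
```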
After the processor 110 has calculated the pixel values of all the sub-pixels of the target pixel (x_spr, y), step S225 is entered. In step S225, the processor 110 checks whether there are still target pixels in the target image that have not been rendered. If the answer is no, the sub-pixels of all the target pixels of the target image have been rendered and the processing ends; the processor 110 can then send the rendered target image to the display for display. Otherwise, the flow returns to step S210 to continue rendering the next target pixel that has not yet been rendered.
The area-based sub-pixel rendering method is described next. Fig. 13 is a flowchart of the method of rendering sub-pixels according to texture information and based on area according to an embodiment of the present invention. Fig. 13 is described below with reference to Figs. 14-16.
Fig. 14 is a schematic diagram in which the arrangement of the source pixels of the source image is overlaid with the arrangement of the sub-pixels corresponding to the target image according to another embodiment of the present invention. Figs. 15A-15D are schematic diagrams of calculating the pixel value of the B channel sub-pixel of a target pixel according to texture information and based on area according to an embodiment of the present invention. Fig. 16 is a schematic diagram of the texture information corresponding to the 12 edge codes that require sharpening according to an embodiment of the present invention.
As shown in Fig. 13, steps S1301, S1305, S1310, S1316, S1318, and S1325 are identical in operation to steps S201, S205, S210, S216, S218, and S225 in Fig. 2, respectively, and their description is not repeated here. Steps S1315 and S1320 are described separately below. In step S1320, Fig. 13 uses an area-based calculation for the pixel value of the sub-pixel of the target pixel (x_spr, y) to be rendered, whereas in step S220, Fig. 2 uses a distance-based calculation. For this reason, step S1315 in Fig. 13 is also not identical to step S215 in Fig. 2; that is, the embodiments of Fig. 13 and Fig. 2 use different formulas when calculating the coordinate of the source pixel in the source image corresponding to a sub-pixel of the target pixel (x_spr, y), as described below.
When the target pixel (x_spr, y) to be rendered is located in an even row of the target image, the x coordinate of the corresponding source pixel in the source image is obtained by one mapping formula (the source pixels corresponding to the R and G channel sub-pixels of the target pixel and the source pixel corresponding to the B channel sub-pixel may differ). When the target pixel (x_spr, y) to be rendered is located in an odd row of the target image, the x coordinate of the corresponding source pixel in the source image is obtained by another mapping formula (again, the source pixels corresponding to the R and G channel sub-pixels and the B channel sub-pixel may differ). In these formulas, floor() denotes rounding down to an integer, the source pixel in the source image corresponding to a sub-pixel of the target pixel (x_spr, y) to be rendered is (x, y), and % 2 denotes the remainder after division by 2, so x_spr % 2 = 0 indicates an even column and x_spr % 2 = 1 indicates an odd column.
Step S1320 is described next. In step S1320, the processor 110 calculates the pixel value of the sub-pixel of the target pixel (x_spr, y) to be rendered according to the texture information and based on area. Specifically, as shown in Fig. 14, each "△" represents the R channel or G channel sub-pixel of a target pixel to be rendered, the other symbol represents the B channel sub-pixel of a target pixel to be rendered, and the center of each small dotted square is the position of one source pixel of the source image. When calculating the pixel value of a sub-pixel of a target pixel to be rendered in the target image, the processor 110 first obtains the window in the source image corresponding to that sub-pixel, centered on the sub-pixel of the target pixel to be rendered. It is worth noting that this window is different from the window used to calculate the edge code in step S1315 described above. A 3 × 3 window is used as an example in the detailed description below.
When the sub-pixel of the target pixel to be rendered is the R channel or G channel sub-pixel of the target pixel, the source pixels included in the corresponding window in the source image for a sub-pixel located in an even row and even column, an even row and odd column, or an odd row and even column of the target image are given by one arrangement, and the source pixels included in the corresponding window in the source image for a sub-pixel located in an odd row and odd column of the target image are given by another arrangement, where R'(G')_{(x,y)} denotes the pixel value of the R channel or G channel of the source pixel whose coordinate is (x, y). In addition, when the sub-pixel of the target pixel to be rendered is the B channel sub-pixel of the target pixel, the source pixels included in the corresponding window in the source image for a sub-pixel located in an even row and even column, an odd row and even column, or an odd row and odd column of the target image are given by one arrangement, and the source pixels included in the corresponding window in the source image for a sub-pixel located in an even row and odd column of the target image are given by another arrangement, where B'_{(x,y)} denotes the pixel value of the B channel of the source pixel whose coordinate is (x, y).
After obtaining the source pixels included in the window in the source image corresponding to the sub-pixel of the target pixel to be rendered (the small squares within the 3 × 3 dotted frames in Fig. 14), the processor 110 forms a diamond-shaped area based on the sub-pixels of the target pixels to be rendered located above, below, to the left of, and to the right of the sub-pixel of the target pixel to be rendered. As shown in Figs. 15A-15D, "△" represents the R channel/G channel sub-pixel of a target pixel to be rendered, the other symbol represents the B channel sub-pixel of a target pixel to be rendered, and the 9 small squares represent the source pixels included in the window in the source image corresponding to the sub-pixel of the target pixel to be rendered. There are 4 different types of diamond-shaped area: the diamond 1550 in Fig. 15A is the diamond-shaped area obtained when the B channel sub-pixel of the target pixel to be rendered is located in an even row and even column of the target image; the diamond 1560 in Fig. 15B is the diamond-shaped area obtained when the B channel sub-pixel is located in an even row and odd column; the diamond in Fig. 15C is the diamond-shaped area obtained when the B channel sub-pixel is located in an odd row and even column; and the diamond in Fig. 15D is the diamond-shaped area obtained when the B channel sub-pixel is located in an odd row and odd column.
Next, the processor 110 calculates the pixel value of the sub-pixel of the target pixel to be rendered of the corresponding target image according to the area proportions of the surrounding source pixels occupied by the diamond-shaped area. That is, the processor 110 determines the area proportion of each surrounding source pixel occupied by the diamond-shaped area, multiplies the area proportion that each sub-region occupies within its corresponding source pixel by the pixel value of that source pixel, and accumulates the results to obtain the pixel value of the sub-pixel of the target pixel to be rendered.
As shown in Fig. 15A, when the processor 110 is to obtain the pixel value of the B channel sub-pixel 1501 of the target pixel to be rendered, the processor 110 first forms a diamond-shaped area 1550 based on the R channel/G channel sub-pixels 1502~1505 of the target pixels located above, below, to the left of, and to the right of the B channel sub-pixel 1501 of the target pixel to be rendered, so as to obtain the pixel value of the sub-pixel of the target pixel to be rendered from the areas that the diamond-shaped area occupies within the source pixels of the source image. The diamond-shaped area 1550 is composed of sub-regions 1550a~1550f, and sub-regions 1550a~1550f each occupy a part of the two right-hand columns of source pixels included in the 3 × 3 window (source pixels 1511~1516 of the source image shown in the figure). The processor 110 then finds the area proportion of sub-region 1550a within source pixel 1511 of the source image, the area proportion of sub-region 1550b within source pixel 1512, the area proportion of sub-region 1550c within source pixel 1513, the area proportion of sub-region 1550d within source pixel 1514, the area proportion of sub-region 1550e within source pixel 1515, and the area proportion of sub-region 1550f within source pixel 1516; it then multiplies the area proportion corresponding to each sub-region by the B channel pixel value of the corresponding source pixel among the 3 × 3 source pixels obtained previously and accumulates the values corresponding to the sub-regions, finally obtaining the pixel value of the B channel sub-pixel 1501 of the target pixel to be rendered. For example, the area proportion of sub-region 1550d within source pixel 1514 of the source image is 54/144; if the B channel pixel value of source pixel 1514 of the source image is 144, the value corresponding to sub-region 1550d is 54, and so on.
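The area-weighted accumulation just described can be sketched as follows. In the actual device the area fractions for each diamond configuration are stored in the storage device; the table below contains only the single fraction given in the example (sub-region 1550d covering 54/144 of source pixel 1514) plus placeholder entries and placeholder window indices, so it is illustrative rather than the real configuration.

```python
# (sub-region id) -> (source-pixel index within the 3x3 window, area fraction).
# Only the 1550d fraction comes from the text; everything else is a placeholder.
DIAMOND_EVEN_ROW_EVEN_COL = {
    "1550a": (2, 10 / 144),   # placeholder
    "1550b": (5, 20 / 144),   # placeholder
    "1550c": (8, 10 / 144),   # placeholder
    "1550d": (4, 54 / 144),   # from the example above
    "1550e": (1, 25 / 144),   # placeholder
    "1550f": (7, 25 / 144),   # placeholder
}

def area_based_value(window_b: list[float], diamond) -> float:
    """Accumulate (area fraction) * (B value of the covered source pixel)."""
    return sum(frac * window_b[idx] for idx, frac in diamond.values())

# With a B value of 144 in the source pixel covered by 1550d, that sub-region
# alone contributes 144 * 54/144 = 54, matching the example.
```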
As shown in Fig. 15B, when the processor 110 is to obtain the pixel value of the B channel sub-pixel 1521 of the target pixel, the processor 110 first forms a diamond-shaped area 1560 based on the R channel/G channel sub-pixels 1522~1525 of the target pixels located above, below, to the left of, and to the right of the B channel sub-pixel 1521 of the target pixel to be rendered, so as to obtain the pixel value of the B channel sub-pixel 1521 of the target pixel to be rendered from the areas that the diamond-shaped area occupies within the source pixels of the source image. The diamond-shaped area 1560 is composed of sub-regions 1560a~1560e, and sub-regions 1560a~1560e each occupy a part of the 2nd, 4th~6th, and 8th pixels of the 3 × 3 source pixels of the source image (source pixels 1531~1535 of the source image shown in the figure). The processor 110 then finds the area proportion of sub-region 1560a within source pixel 1531 of the source image, the area proportion of sub-region 1560b within source pixel 1532, the area proportion of sub-region 1560c within source pixel 1533, the area proportion of sub-region 1560d within source pixel 1534, and the area proportion of sub-region 1560e within source pixel 1535; it then multiplies the area proportion corresponding to each sub-region by the B channel pixel value of the corresponding source pixel among the 3 × 3 source pixels obtained previously and accumulates the values corresponding to the sub-regions, finally obtaining the pixel value of the B channel sub-pixel 1521 of the target pixel to be rendered.
The way the pixel values of the sub-pixels of the target pixels to be rendered are calculated in Figs. 15C and 15D is similar to that of Figs. 15A and 15B; the difference is that the areas of the source pixels of the source image occupied by each region of the diamond are different, so a detailed explanation is not repeated here. It is worth noting that the area configuration information of the sub-pixels of each channel of the target pixels to be rendered with respect to the source pixels of the source image is stored in the storage device 120 in advance; the processor 110 can access the corresponding area configuration information in the storage device 120 according to the row and column corresponding to the sub-pixel of the target pixel and substitute in the pixel values of the 3 × 3 source pixels of the source image corresponding to each sub-region, so as to obtain the pixel value of the corresponding sub-pixel of the target pixel to be rendered. It is also worth noting that the configuration of the diamond-shaped area of the R channel or G channel sub-pixel of a target pixel to be rendered within the 3 × 3 source pixels is the opposite of the configuration of the diamond-shaped area of the B channel sub-pixel of a target pixel to be rendered within the 3 × 3 source pixels. In other words, when "△" and the B channel symbol are exchanged, i.e., when the other symbol corresponds to the R channel/G channel sub-pixel of the target pixel to be rendered and "△" corresponds to the B channel sub-pixel of the target pixel to be rendered, Fig. 15A shows the situation when the R channel or G channel sub-pixel of the target pixel to be rendered is in an odd row and odd column, Fig. 15B shows the situation when it is in an odd row and even column, Fig. 15C shows the situation when it is in an even row and odd column, and Fig. 15D shows the situation when it is in an even row and even column.
For example, the source pixels of the source image around the R channel sub-pixel of the target pixel to be rendered at (1, 1) form the corresponding 3 × 3 window, and according to the content above, (1, 1) corresponds to an odd row and an odd column. Therefore, the processor 110 multiplies the pixel values of the source pixels obtained above by the area configuration of the diamond-shaped area of Fig. 15A to calculate the pixel value of the R channel sub-pixel of the corresponding target pixel to be rendered.
In addition, according to an embodiment of the present invention, in the area-based sub-pixel rendering method, when the texture information corresponding to the sub-pixel of the target pixel to be rendered is any of the 12 patterns shown in Fig. 16 (where white in the figure represents "1" and black represents "0"), i.e., when the texture corresponding to the sub-pixel of the target pixel to be rendered is strong, the processor 110 further performs sharpening processing on the sub-pixel of the target pixel to be rendered, so that the target image is clearer. Specifically, when the texture corresponding to the sub-pixel of the target pixel to be rendered is strong, the processor 110 uses a diamond filter to perform a convolution operation on the 3 × 3 source pixels in the source image corresponding to the sub-pixel of the target pixel to be rendered, obtaining a sharpening parameter. The pixel value of the sub-pixel of the target pixel to be rendered obtained from the area-based calculation is then added to the sharpening parameter, and the resulting value is the pixel value of the sub-pixel of the target pixel to be rendered after sharpening. In one embodiment of the invention, the diamond filter is a small convolution kernel whose non-zero coefficients are arranged in a diamond shape.
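The specific filter coefficients are not reproduced in this text, so the kernel below is only a stand-in with diamond-shaped support (the center plus its four edge-adjacent neighbours, summing to zero so that flat regions are unchanged); the surrounding logic follows the sharpening step described above.

```python
import numpy as np

# Stand-in diamond kernel: non-zero taps on the centre and its 4 neighbours.
# The actual coefficients used by the patent embodiment are not reproduced here.
DIAMOND_KERNEL = np.array([[0.0, -0.25, 0.0],
                           [-0.25, 1.0, -0.25],
                           [0.0, -0.25, 0.0]])

def sharpen(area_based_value: float, window: np.ndarray, strong_texture: bool) -> float:
    """Add a convolution-based sharpening parameter to the area-based value
    when the texture around the sub-pixel is strong (one of the 12 Fig. 16 patterns)."""
    if not strong_texture:
        return area_based_value
    sharpening_parameter = float(np.sum(window * DIAMOND_KERNEL))
    return area_based_value + sharpening_parameter
```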
According to another embodiment of the present invention, after the processor 110 has obtained the pixel values of the R channel, G channel, and B channel sub-pixels of the target pixel to be rendered according to both the distance-based sub-pixel rendering method and the area-based sub-pixel rendering method, the processor 110 can merge the two calculation results according to set weights to obtain the final pixel value. For example, the processor 110 can give each of the two calculation results a weight of 0.5, i.e., average the pixel values obtained by the distance-based sub-pixel rendering method and the area-based sub-pixel rendering method.
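As a sketch, with the equal weights mentioned above the merge is simply a weighted average of the two results:

```python
def merge(distance_value: float, area_value: float, w: float = 0.5) -> float:
    # Equal weights (0.5 each) average the two rendering results; other weights
    # can be configured, as described in the embodiment above.
    return w * distance_value + (1.0 - w) * area_value
```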
In conclusion sub-pixel rendering method according to the present invention and device, in the situation for not changing image quality Under, the source pixel that interpolation is only source images in the hope of quantity is carried out by two source pixels to source images or multiple source pixels 2/3 object pixel of quantity, will can so save the photophore of 1/3 quantity.In addition, in acquiring each object pixel to be rendered Sub-pixel pixel value when, method of the present invention according to the texture information around the sub-pixel of object pixel to have spy The pixel of different texture (such as edge code described in aforementioned specification is special style) carries out specially treated, and using different When the pixel value for the sub-pixel that method (such as based on distance and/or area) calculates object pixel to be rendered, according to different texture Information (such as texture described in aforementioned specification weaker or stronger situation) is accordingly to calculating resulting target picture to be rendered The pixel value of the sub-pixel of element carries out smooth or Edge contrast, to obtain optimal image conversion effect.Furthermore in response to display Device have fillet perhaps recess when method of the present invention more can be in advance to the source pixel for corresponding to fillet or notched region Reverse sawtooth processing is carried out, so that the image quality finally exported is more preferably.
Although the present invention has been described by way of the above embodiments, it should be noted that these descriptions are not intended to limit the present invention. On the contrary, the present invention covers modifications and similar arrangements that are obvious to those skilled in the art. Therefore, the scope of the appended claims should be given the broadest interpretation so as to encompass all obvious modifications and similar arrangements.

Claims (20)

1. A sub-pixel rendering method for generating a target image from a source image, the method comprising:
obtaining the source image;
determining a target pixel to be rendered in the target image;
calculating an edge code of a source pixel in the source image corresponding to a sub-pixel of the target pixel to be rendered;
determining texture information around the sub-pixel of the target pixel to be rendered according to the edge code; and
when the edge code is not a special pattern, calculating a pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance.
2. The sub-pixel rendering method as claimed in claim 1, wherein the step of calculating the edge code of the source pixel in the source image corresponding to the sub-pixel of the target pixel to be rendered comprises:
obtaining a source coordinate of the source pixel corresponding to the sub-pixel of the target pixel to be rendered;
obtaining a plurality of pixel values of a window in the source image centered on the source coordinate; and
calculating the edge code according to the plurality of pixel values of the window.
3. The sub-pixel rendering method as claimed in claim 2, wherein when the target pixel to be rendered is located in an even row of the target image, the source coordinate corresponding to the sub-pixel of the R channel and/or the G channel of the target pixel to be rendered is (formula omitted), and the source coordinate corresponding to the sub-pixel of the B channel of the target pixel to be rendered is (formula omitted), wherein x_spr indicates the x coordinate of the target pixel to be rendered, y indicates the y coordinate of the target pixel to be rendered, and floor() indicates rounding down to an integer.
4. The sub-pixel rendering method as claimed in claim 2, wherein when the target pixel to be rendered is located in an odd row of the target image, the source coordinate corresponding to the sub-pixel of the R channel and/or the G channel of the target pixel to be rendered is (formula omitted), and the source coordinate corresponding to the sub-pixel of the B channel of the target pixel to be rendered is (formula omitted), wherein x_spr indicates the x coordinate of the target pixel to be rendered, y indicates the y coordinate of the target pixel to be rendered, and floor() indicates rounding down to an integer.
5. The sub-pixel rendering method as claimed in claim 1, wherein the edge code comprises a plurality of bits, each of which characterizes the texture information of one direction.
6. The sub-pixel rendering method as claimed in claim 1, wherein the edge code comprises four hexadecimal digits, and the edge code of the special pattern is 0x3030 or 0xC0C0.
7. The sub-pixel rendering method as claimed in claim 1, wherein the step of calculating the pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance comprises:
when the texture information around the sub-pixel of the target pixel to be rendered is weak, performing smoothing when calculating the pixel value of the sub-pixel of the target pixel to be rendered.
8. The sub-pixel rendering method as claimed in claim 7, wherein when the code of the horizontal direction in the edge code is 0x8 or 0x4, it is determined that the texture information around the sub-pixel of the target pixel to be rendered is weak.
9. The sub-pixel rendering method as claimed in claim 7, wherein when the sub-pixel of the target pixel to be rendered is located in an even row of the target image, the formula used to perform smoothing when calculating the pixel value of the sub-pixel of the target pixel to be rendered is as follows:
(formula omitted)
wherein, when the code of the horizontal direction is 0x8 and x_spr % 2 = 0, or when the code of the horizontal direction is 0x4 and x_spr % 2 = 1, factor_smooth is used in place of factor_rg(b)**, where factor_rg(b)** denotes factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10 or factor_b11;
wherein (symbol omitted) refers to the pixel value of the sub-pixel of the R channel or the G channel of the target pixel to be rendered, (symbol omitted) refers to the pixel value of the sub-pixel of the B channel of the target pixel to be rendered, R'(G')_(x,y) refers to the pixel value of the R channel or the G channel of the source pixel whose coordinate is (x, y), B'_(x,y) refers to the pixel value of the B channel of the source pixel whose coordinate is (x, y), each coordinate expression in the formula involves a rounding-down operation, x_spr % 2 = 0 indicates that the target pixel to be rendered is located in an even column of the target image, x_spr % 2 = 1 indicates that the target pixel to be rendered is located in an odd column of the target image, and factor_smooth, factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10 and factor_b11 are preset values.
10. The sub-pixel rendering method as claimed in claim 7, wherein when the target pixel to be rendered is located in an odd row of the target image, the formula used to perform smoothing when calculating the pixel value of the sub-pixel of the target pixel to be rendered is as follows:
(formula omitted)
wherein, when the code of the horizontal direction is 0x8 and x_spr % 2 = 0, or when the code of the horizontal direction is 0x4 and x_spr % 2 = 1, factor_smooth is used in place of factor_rg(b)**, where factor_rg(b)** denotes factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10 or factor_b11;
wherein (symbol omitted) refers to the pixel value of the sub-pixel of the R channel or the G channel of the target pixel to be rendered, (symbol omitted) refers to the pixel value of the sub-pixel of the B channel of the target pixel to be rendered, R'(G')_(x,y) refers to the pixel value of the R channel or the G channel of the source pixel whose coordinate is (x, y), B'_(x,y) refers to the pixel value of the B channel of the source pixel whose coordinate is (x, y), each coordinate expression in the formula involves a rounding-down operation, x_spr % 2 = 0 indicates that the target pixel to be rendered is located in an even column of the target image, x_spr % 2 = 1 indicates that the target pixel to be rendered is located in an odd column of the target image, and factor_smooth, factor_rg00, factor_rg01, factor_rg10, factor_rg11, factor_b00, factor_b01, factor_b10 and factor_b11 are preset values.
11. The sub-pixel rendering method as claimed in claim 1, further comprising:
when the edge code is the special pattern and the target pixel to be rendered is located in an even row of the target image, calculating the pixel value of the sub-pixel of the target pixel to be rendered by the following formulas:
when (condition omitted): (formula omitted)
when (condition omitted): (formula omitted)
when (condition omitted): (formula omitted)
when (condition omitted): (formula omitted)
wherein, (symbol omitted) refers to the pixel value of the sub-pixel of the R channel or the G channel of the target pixel to be rendered whose coordinate is (x_spr, y), (symbol omitted) refers to the pixel value of the sub-pixel of the B channel of the target pixel to be rendered whose coordinate is (x_spr, y), R'(G')_(x,y) refers to the pixel value of the R channel or the G channel of the source pixel whose coordinate is (x, y) in the source image, B'_(x,y) refers to the pixel value of the B channel of the source pixel whose coordinate is (x, y) in the source image, each coordinate expression in the formulas involves a rounding-down operation, factor_kep is a preset value, factor_ave is a preset value, and edgecode refers to the edge code.
12. The sub-pixel rendering method as claimed in claim 1, further comprising:
when the edge code is the special pattern and the target pixel to be rendered is located in an odd row of the target image, calculating the pixel value of the sub-pixel of the target pixel to be rendered by the following formulas:
when edgecode = 0x3030: (formula omitted)
when edgecode = 0xC0C0: (formula omitted)
wherein, (symbol omitted) refers to the pixel value of the sub-pixel of the R channel or the G channel of the target pixel to be rendered whose coordinate is (x_spr, y), (symbol omitted) refers to the pixel value of the sub-pixel of the B channel of the target pixel to be rendered whose coordinate is (x_spr, y), R'(G')_(x,y) refers to the pixel value of the R channel or the G channel of the source pixel whose coordinate is (x, y) in the source image, B'_(x,y) refers to the pixel value of the B channel of the source pixel whose coordinate is (x, y) in the source image, each coordinate expression in the formulas involves a rounding-down operation, factor_kep is a preset value, and edgecode refers to the edge code.
13. The sub-pixel rendering method as claimed in claim 1, wherein when the target image has at least one notch and/or at least one rounded corner, the pixel values of the source pixels in the source image corresponding to the notch and/or the rounded corner are multiplied by an attenuation coefficient.
14. The sub-pixel rendering method as claimed in claim 13, wherein the attenuation coefficient is related to the area cut off by a tangent line of the circular arc.
15. The sub-pixel rendering method as claimed in claim 13, wherein the attenuation coefficient can be obtained by the following formula:
Area_arch = (2 * offset - 1) / (2 * step)
wherein Area_arch is the attenuation coefficient, offset is the position index of the source pixel within the sawtooth, and step is the width of the sawtooth.
16. The sub-pixel rendering method as claimed in claim 13, wherein coordinate information of the notch and/or the rounded corner is stored in a register, and the coordinate information can be used to determine the positions of the source pixels in the source image that correspond to the notch and/or the rounded corner.
17. The sub-pixel rendering method as claimed in claim 13, wherein if a source pixel in the source image has no corresponding sub-pixel of the target pixel to be rendered in the target image, the pixel value of that source pixel in the source image is set to 0.
18. A sub-pixel rendering device, comprising:
a storage device, configured to store a source image and a target image; and
a processor, configured to generate the target image according to the source image;
wherein the processor obtains the source image from the storage device, determines a target pixel to be rendered in the target image, calculates an edge code of a source pixel in the source image corresponding to a sub-pixel of the target pixel to be rendered, determines texture information around the sub-pixel of the target pixel to be rendered according to the edge code, and, when the edge code is not a special pattern, calculates a pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance.
19. The sub-pixel rendering device as claimed in claim 18, wherein, when the processor calculates the pixel value of the sub-pixel of the target pixel to be rendered according to the texture information and based on distance, if the texture information around the sub-pixel of the target pixel to be rendered is weak, smoothing is performed when calculating the pixel value of the sub-pixel of the target pixel to be rendered.
20. The sub-pixel rendering device as claimed in claim 18, wherein, when the code of the horizontal direction in the edge code is 0x8 or 0x4, the processor determines that the texture information around the sub-pixel of the target pixel to be rendered is weak.
CN201910332550.6A 2019-04-24 2019-04-24 Sub-pixel rendering method and device Active CN110060619B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910332550.6A CN110060619B (en) 2019-04-24 2019-04-24 Sub-pixel rendering method and device
US16/592,061 US11030937B2 (en) 2019-04-24 2019-10-03 Sub-pixel rendering method and device
US17/318,126 US11158236B2 (en) 2019-04-24 2021-05-12 Sub-pixel rendering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910332550.6A CN110060619B (en) 2019-04-24 2019-04-24 Sub-pixel rendering method and device

Publications (2)

Publication Number Publication Date
CN110060619A true CN110060619A (en) 2019-07-26
CN110060619B CN110060619B (en) 2022-05-10

Family

ID=67320423

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910332550.6A Active CN110060619B (en) 2019-04-24 2019-04-24 Sub-pixel rendering method and device

Country Status (1)

Country Link
CN (1) CN110060619B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101140750A (en) * 2006-09-05 2008-03-12 三星电子株式会社 Edge-based image enhancement
US20140232757A1 (en) * 2013-02-15 2014-08-21 Sony Corporation Display device and electronic apparatus
CN104766548A (en) * 2015-03-17 2015-07-08 京东方科技集团股份有限公司 Display device and display method thereof
US20160307487A1 (en) * 2015-04-15 2016-10-20 Japan Display Inc. Display device and electronic apparatus
US20170091903A1 (en) * 2015-09-30 2017-03-30 Lg Display Co., Ltd. Image-processing circuit and display device having the same
US20180137602A1 (en) * 2016-11-14 2018-05-17 Google Inc. Low resolution rgb rendering for efficient transmission
CN108417177A (en) * 2017-02-10 2018-08-17 深圳云英谷科技有限公司 Display pixel arrangement and its driving circuit
CN109559650A (en) * 2019-01-16 2019-04-02 京东方科技集团股份有限公司 A kind of pixel rendering method and device, image rendering method and device, display device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419147A (en) * 2020-04-14 2021-02-26 上海哔哩哔哩科技有限公司 Image rendering method and device
CN112419147B (en) * 2020-04-14 2023-07-04 上海哔哩哔哩科技有限公司 Image rendering method and device

Also Published As

Publication number Publication date
CN110060619B (en) 2022-05-10

Similar Documents

Publication Publication Date Title
CN110047417A (en) Sub-pixel rendering method and device
US9589367B2 (en) Reconstruction of missing data point from sparse samples during graphics processing using cubic spline polynomials
CN102842116B (en) Illumination equalization processing method for quick-response matrix code in image
US11030937B2 (en) Sub-pixel rendering method and device
US7348996B2 (en) Method of and system for pixel sampling
JP5106735B2 (en) Shape processor
US9774761B2 (en) Image processing system and method
JP2022511319A (en) Distance field color palette
CN101452573A (en) Image edge enhancing method
US11350015B2 (en) Image processing system and method
US8582902B2 (en) Pixel block processing
CN101770759B (en) Method and device for downsampling based on sub-pixel
CN105930891A (en) Method for generating two-dimensional code
CN106233334A (en) A kind of apparatus and method that video block Fractionation regimen is associated with Video coding block
CN113347416B (en) Chroma intra prediction method and device, and computer storage medium
CN110060619A (en) Sub-pixel rendering method and device
JPH0728993A (en) Image doubling device
CN108182666B (en) Parallax correction method, device and terminal
CN104270624A (en) Region-partitioning 3D video mapping method
CN104461441A (en) Rendering method, rendering device and display device
US7050066B2 (en) Image processing apparatus and image processing program
RU2003130966A (en) METHOD FOR SEALING AND UNPACKING IMAGE DATA
JP2010541094A (en) Method, compressor, decompressor and signal representation for lossless compression of pixel block values using row tilt codewords and column tilt codewords
CN110310235A (en) Method for processing fundus images, device and equipment and storage medium
CN109218636A (en) The binaryzation data output method of imaging sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210311

Address after: 201203 3rd floor, building 2, No. 200, zhangheng Road, Pudong New Area pilot Free Trade Zone, Shanghai

Applicant after: Gryfield Intelligent Technology Co.,Ltd.

Address before: Room 301, 2537 Jinke Road, Zhangjiang High Tech Park, Pudong New Area, Shanghai 201203

Applicant before: VIA ALLIANCE SEMICONDUCTOR Co.,Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: 201203, 11th Floor, Building 3, No. 889 Bibo Road, China (Shanghai) Pilot Free Trade Zone, Shanghai

Patentee after: Granfei Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 201203 3rd floor, building 2, No. 200, zhangheng Road, Pudong New Area pilot Free Trade Zone, Shanghai

Patentee before: Gryfield Intelligent Technology Co.,Ltd.

Country or region before: China
