WO2013114643A1 - Video display device and television receiver - Google Patents


Info

Publication number
WO2013114643A1
WO2013114643A1 (PCT/JP2012/067815)
Authority
WO
WIPO (PCT)
Prior art keywords
luminance
unit
video signal
input
video
Prior art date
Application number
PCT/JP2012/067815
Other languages
English (en)
Japanese (ja)
Inventor
Toshiyuki Fujine
Yoji Shiratani
Original Assignee
Sharp Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Kabushiki Kaisha
Publication of WO2013114643A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/66 Transforming electric information into light information
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406 Control of illumination source
    • G09G3/342 Control of illumination source using several illumination sources separately controlled corresponding to different display panel areas, e.g. along one dimension such as lines
    • G09G3/3426 Control of illumination source using several illumination sources separately controlled corresponding to different display panel areas, e.g. along one dimension such as lines the different display panel areas being distributed in two dimensions, e.g. matrix
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/36 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals
    • G09G3/3611 Control of matrices with row and column drivers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/57 Control of contrast or brightness
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2310/00 Command of the display device
    • G09G2310/02 Addressing, scanning or driving the display screen or processing steps related thereto
    • G09G2310/0237 Switching ON and OFF the backlight within one frame
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/02 Improving the quality of display appearance
    • G09G2320/0271 Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0613 The adjustment depending on the type of the information to be displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness
    • G09G2320/064 Adjustment of display parameters for control of overall brightness by time modulation of the brightness of the illumination source
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness
    • G09G2320/0646 Modulation of illumination source brightness and image signal correlated to each other
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/16 Calculation or use of calculated indices related to luminance levels in display data
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 Aspects of data communication
    • G09G2370/20 Details of the management of multiple sources of image data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4436 Power management, e.g. shutting down unused components of the receiver

Definitions

  • The present invention relates to a video display device and a television receiver, and more particularly to a video display device and a television receiver having an enhancement function for improving the quality of displayed video.
  • Images acquired from websites or PCs are often still images. If such a still image is excessively brightened or its contrast is excessively enhanced, it becomes glaring and may, on the contrary, become difficult to view.
  • If the light emission partial enhancement processing described above is applied to a still image, the sense of contrast and of brightness increases; for still images it is therefore desirable to suppress this processing.
  • A first technical means of the present invention includes a display unit that displays an input video signal, a light source that illuminates the display unit, and a control unit that controls the display unit and the light source.
  • It further includes a video type determination unit that determines the type of the input video signal, and the control unit executes or stops the light emission partial enhancement process according to the determination result of the video type determination unit.
  • The light emission partial enhancement process generates a histogram in which the number of pixels is integrated with respect to a predetermined feature amount related to the brightness of the input video signal, and treats the upper region of a predetermined range of the histogram as the light emitting portion.
  • The second technical means, in the first technical means, includes a first input unit for inputting a video signal from a network and a second input unit for inputting a video signal from an information processing device. The video type determination unit determines, as the type of the input video signal, whether the input video signal was input from the first input unit or from the second input unit, and the control unit stops the light emission partial enhancement process when the video type determination unit determines that the input video signal was input from the first input unit or the second input unit.
  • The third technical means, in the first technical means, includes the first input unit for inputting a video signal from a network and the second input unit for inputting a video signal from an information processing device. When the video type determination unit determines, as the type of the input video signal, that the input video signal was input from the first input unit or the second input unit, it further determines whether the input video signal is a still image, and the control unit stops the light emission partial enhancement process when the video type determination unit determines that the input video signal is a still image.
  • The control unit divides an image based on the input video signal into a plurality of regions and, based on the gradation values of the video signal in each divided region, changes the lighting rate of the light source region corresponding to that divided region. An average lighting rate is obtained by averaging the lighting rates of the light source regions over all the light source regions, and the luminance of the light source is stretched based on the maximum display luminance that the screen of the display unit can take, which is associated in advance with the average lighting rate.
  • the above pixel is a light emitting portion.
  • The input video signal is, for example, a video signal separated from a broadcast signal received by the tuner 9.
  • Alternatively, it is a video signal reproduced by a playback device such as a recorder or player connected to the HDMI terminal 10, for example when an HDD or an external recording medium (BD, DVD, etc.) is played back.
  • The area active control / luminance stretch unit 4 divides the image based on the input video signal into predetermined areas and extracts the maximum gradation value of the video signal for each divided area. Based on the extracted value, the lighting rate of the backlight unit 6 is calculated. The lighting rate is determined for each region of the backlight unit 6 corresponding to a divided region of the video; since the lighting rate referred to here is actually changed later as described below, it can be said to be a provisional value.
  • the backlight unit 6 is an example of a light source for illuminating the display unit 8, and is configured by a plurality of LEDs, and brightness can be controlled for each region.
  • The lighting rate for each area of the backlight unit 6 is determined by a predetermined arithmetic expression: basically, the LED luminance is maintained without reduction in areas whose maximum gradation value is bright (high gradation) and is reduced in areas whose maximum gradation value is dark (low gradation).
  • The lighting rate may also be calculated from another feature amount related to the brightness of the input video signal, such as the average gradation value; in that case, areas with bright or dark average gradation values take the place of areas with bright or dark maximum gradation values.
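The per-area calculation above can be sketched as follows. The patent's actual arithmetic expression is not given in this text, so the mapping from the feature amount to the lighting rate (proportional, with a small floor) is a placeholder assumption, as are the function and parameter names.

```python
def provisional_lighting_rates(frame, rows, cols, feature="max"):
    """Divide a grayscale frame (list of lists, values 0-255) into
    rows x cols regions and derive a provisional lighting rate
    (0.0-1.0) for the backlight LED of each region.

    The feature amount is the maximum (or average) gradation value of
    the region: a bright region keeps its LED near full luminance,
    a dark region has its LED luminance reduced.
    """
    h, w = len(frame), len(frame[0])
    rh, rw = h // rows, w // cols
    rates = []
    for r in range(rows):
        row_rates = []
        for c in range(cols):
            block = [frame[y][x]
                     for y in range(r * rh, (r + 1) * rh)
                     for x in range(c * rw, (c + 1) * rw)]
            feat = max(block) if feature == "max" else sum(block) / len(block)
            # Placeholder mapping: proportional dimming with a small floor
            # so a dark area is dimmed but never fully switched off.
            row_rates.append(max(feat / 255.0, 0.05))
        rates.append(row_rates)
    return rates
```

Averaging all entries of the returned grid gives the average lighting rate used later for the luminance stretch.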
  • the area active control / luminance stretch unit 4 outputs Max luminance determined according to the average lighting rate to the mapping unit 3 of the signal processing unit 1 for feedback.
  • The multiplier applies the tone mapping to the input video signal: each pixel value of frame f_N+1 of the video signal is multiplied by the gain coefficient indicated by the tone mapping for frame f_N+1, and the result is output to the area active control / luminance stretch unit 4.
  • After the lighting rate of each area of the backlight unit 6 is calculated based on the predetermined arithmetic expression described above, the final value is obtained by changing it by stretching.
  • The luminance of the LEDs of the backlight unit 6 is controlled by PWM (Pulse Width Modulation); it can also be controlled to a desired value by current control or by a combination of the two.
  • the area active control / luminance stretch unit 4 stretches the backlight luminance in accordance with the average lighting rate to increase the luminance of the LED of the backlight unit 6, and information on the luminance stretch (the above Max luminance). Is returned to the signal processing unit 1 to reduce the luminance corresponding to the luminance stretch of the backlight unit 6 with respect to the video signal.
  • the luminance stretch is applied to the entire backlight unit 6, and the luminance reduction due to the video signal processing is performed on a portion (non-light emitting portion) that is regarded as not emitting light except the light emitting portion.
  • the backlight luminance is stretched to increase the luminance of the LED of the backlight unit 6, and the luminance of the video signal of the non-light emitting portion of the input video signal is decreased.
  • Through such video signal processing and backlight luminance control, the screen luminance of only the light emitting portion can be increased, enabling high-contrast video expression and improved image quality.
  • Reducing the luminance of the non-light emitting portion by the amount corresponding to the luminance stretch of the backlight unit 6 keeps the screen luminance of the non-light emitting portion substantially unchanged.
  • In other words, it is preferable that the area active control / luminance stretch unit 4 cancel, in the non-light emitting portion (that is, the region whose predetermined feature amount is low), the increase in display luminance of the display unit 8 caused by the luminance stretch of the light source by reducing the luminance of the video signal.
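A minimal sketch of this signal-side compensation, assuming the common display model L = Lmax · (v/255)^γ with γ = 2.2 (the text does not state the model; names are illustrative): the backlight peak is stretched from Ls to Lm, and non-emitting pixels are scaled down so their on-screen luminance is unchanged while emitting pixels get brighter.

```python
GAMMA = 2.2  # assumed display gamma (not stated in the text)

def compensate_non_emitting(pixels, emit_mask, Ls, Lm):
    """Video-signal side of the light emission partial enhancement.

    The backlight is stretched so the peak screen luminance rises from
    Ls to Lm.  Non-emitting pixels are multiplied by (Ls/Lm)**(1/GAMMA),
    so their screen luminance Lm * (v/255)**GAMMA stays where it was;
    emitting pixels are left untouched and therefore appear brighter.
    """
    gain = (Ls / Lm) ** (1.0 / GAMMA)
    return [v if emit else v * gain for v, emit in zip(pixels, emit_mask)]
```

With Ls = 550 and Lm = 1500, a non-emitting pixel keeps exactly the screen luminance it had before the stretch, which is the compensation the bullet above describes.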
  • The video type determination unit 14 determines, as the type of the input video signal, from which input source the signal was input: the tuner 9, the HDMI terminal 10, the LAN terminal 11, or the DVI terminal 12. Since a video signal input from the tuner 9 or the HDMI terminal 10 is mainly video content, a determination result including an instruction to execute the light emission partial enhancement process is output to the signal processing unit 1 and the area active control / luminance stretch unit 4. Since a video signal input from the LAN terminal 11 or the DVI terminal 12 is mainly still image content, a determination result including an instruction to stop the light emission partial enhancement process is output to the signal processing unit 1 and the area active control / luminance stretch unit 4.
  • FIG. 3 is a diagram showing an example of still image content displayed on the video display device.
  • FIG. 3(A) shows still image content acquired from a website on the Internet, and
  • FIG. 3B shows still image content acquired from a PC.
  • The signal processing unit 1 and the area active control / luminance stretch unit 4 stop the light emission partial enhancement process when the video type determination unit 14 determines that the input video signal was input from the LAN terminal 11 or the DVI terminal 12. This is because a video signal input from the LAN terminal 11 or the DVI terminal 12 mainly corresponds to still image content.
  • Suppose the tuner 9 is selected as the input source and the user is watching a digital broadcast program.
  • Since the tuner 9 is selected, the light emission partial enhancement process is executed.
  • When the user operates the remote control R to switch the input to the Internet (LAN terminal 11), a remote control signal is transmitted to the video display device.
  • the remote control signal processing unit 15 of the video display device includes a remote control light receiving unit (not shown), analyzes the remote control signal received from the remote control R, and instructs the selector 13 to switch the input to the LAN terminal 11.
  • the selector 13 selects the LAN terminal 11 in accordance with an instruction from the remote control signal processing unit 15 and switches the input to the selected LAN terminal 11.
  • As a result, the light emission partial enhancement process is stopped, so that an image with the sense of brightness and contrast suppressed can be displayed.
  • Similarly, when the user operates the remote control R to switch the input to the PC (DVI terminal 12) in order to view PC video, the light emission partial enhancement processing by the signal processing unit 1 and the area active control / luminance stretch unit 4 is stopped.
  • The video display device may further determine whether the input video signal is a still image. Specifically, when the video type determination unit 14 determines that the input video signal was input from the LAN terminal 11 or the DVI terminal 12, it further determines whether the input video signal is a still image.
  • the signal processing unit 1 and the area active control / luminance stretch unit 4 stop the light emission partial enhancement processing when the video type determination unit 14 determines that the input video signal is a still image.
  • Still image determination makes it possible to discriminate between a moving image and a still image more reliably, so that the light emission partial enhancement process can be stopped specifically for still images.
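The text does not specify how the still-image determination itself is made; a purely hypothetical sketch using a mean frame difference (function name and threshold are assumptions) might look like:

```python
def is_still_image(prev_frame, cur_frame, diff_threshold=2.0):
    """Hypothetical still-image check: if the mean absolute difference
    between two consecutive frames (flat lists of pixel values) stays
    below a small threshold, treat the input as a still image."""
    total = 0
    n = 0
    for pv, cv in zip(prev_frame, cur_frame):
        total += abs(pv - cv)
        n += 1
    return (total / n) <= diff_threshold
```

A determination like this would let the device stop the enhancement only when the content actually does not move, rather than for every LAN/DVI input.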
  • FIG. 4(C) shows an example of the maximum gradation value extracted from each divided area of one frame and the lighting rate corresponding to that maximum gradation value.
  • FIG. 4D shows the lighting rate of each area and the average lighting rate of the entire screen.
  • In FIGS. 4(C) and 4(D), an example in which the screen of one frame is divided into eight areas (area Nos. 1 to 8) is given for simplicity; as shown in FIG. 4(B), the processing can be performed with a larger number of divided regions, up to as many regions as there are LEDs provided.
  • The provisional lighting rate of the backlight LEDs in each area is calculated from the maximum gradation value in that area.
  • the provisional lighting rate can be indicated by, for example, LED drive duty (hereinafter, LED duty).
  • the maximum value of the lighting rate is 100%.
  • the brightness of the LED is controlled to be a desired value by PWM and / or current control.
  • PWM: pulse width modulation
  • For simplicity of description, an example employing only PWM control is given.
  • A predetermined luminance may also be obtained by increasing the current value in combination with current control.
  • the luminance of the backlight is lowered by lowering the lighting rate in a dark region where the maximum gradation value is low.
  • The actual lighting rate of each area is determined so as to display the intended gradation accurately while making the LED duty as low as possible.
  • The display unit 8 here is an LCD panel.
  • The gradation value of the video is expressed by 8-bit data of 0 to 255.
  • The gradation values of a plurality of pixels in one area of FIG. 4(C) are shown in FIG. 5 and will be described.
  • the maximum gradation value is 128.
  • The area active control / luminance stretch unit 4 determines such a provisional lighting rate, and recalculates the gradation value for each pixel of the display unit 8 in consideration of the provisional lighting rate of the region containing that pixel.
  • The display control unit 7 then performs display control of the display unit 8 using display control data with the gradation values illustrated in FIG. 5(C) for the corresponding pixel group.
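The provisional-duty example above (an area whose maximum gradation value is 128) can be sketched as follows, again assuming the display model L = duty · (g/255)^γ with γ = 2.2; the patent's exact expression is not given here, so this is an illustrative model only.

```python
GAMMA = 2.2  # assumed display gamma (not stated in the text)

def area_duty_and_gradations(grads, max_grad=255):
    """For one backlight area: derive the provisional LED duty from the
    area's maximum gradation value, then recompute each pixel gradation
    so that duty * (g'/255)**GAMMA equals (g/255)**GAMMA at full duty.
    The brightest pixel maps to full gradation while the LED is dimmed
    to exactly what that pixel needs."""
    peak = max(grads)
    if peak == 0:
        return 0.0, list(grads)           # all-black area: LED fully dimmed
    duty = (peak / max_grad) ** GAMMA     # dim the LED to what the peak needs
    scale = (1.0 / duty) ** (1.0 / GAMMA)  # ideally max_grad / peak
    new_grads = [min(max_grad, round(g * scale)) for g in grads]
    return duty, new_grads
```

For a peak of 128, the duty drops to (128/255)^γ of full drive while the 128-level pixel is rewritten to 255, so its displayed luminance is preserved.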
  • This Max luminance is the maximum value the screen luminance can take, and is determined based on a relationship such as that shown in FIG. 2.
  • The horizontal axis of the graph of FIG. 2 is the average lighting rate (window size) of the backlight, which can be expressed as the ratio between a lighting region (window region) with a lighting rate of 100% and a non-lighting region with a lighting rate of 0%.
  • the average lighting rate is zero when there is no lighting region, the average lighting rate increases as the window of the lighting region increases, and the average lighting rate becomes 100% for all lighting.
  • The Max luminance when the backlight is fully lit is, for example, 550 cd/m²; this is the reference luminance before stretching.
  • the Max luminance is increased as the average lighting rate decreases from 100%.
  • A pixel having the gradation value 255 has the highest screen luminance in the screen, namely the maximum possible screen luminance (Max luminance). It follows that, even at the same average lighting rate, the screen luminance does not rise all the way to the Max luminance depending on the gradation value of the pixel.
  • When the average lighting rate is P, the value of the Max luminance is largest; the maximum screen luminance at this time is 1500 cd/m².
  • That is, the maximum possible screen luminance is stretched to 1500 cd/m², compared with 550 cd/m² when the backlight is fully lit.
  • P is set at a position where the average lighting rate is relatively low.
  • The luminance of the backlight is stretched to a maximum of 1500 cd/m² when the screen as a whole is dark, with a low average lighting rate and a partial high-gradation peak.
  • The luminance stretch of the backlight is made smaller at higher average lighting rates because, on an originally bright screen, an excessively high backlight luminance may feel dazzling.
  • The range with a low average lighting rate corresponds to a dark image; rather than raising the screen luminance by stretching the backlight luminance, it is preferable to lower the backlight luminance to improve contrast and suppress black float, thereby maintaining display quality. Therefore, in the example of FIG. 2, such a setting for suppressing black float at low average lighting rates is adopted, and the value of the Max luminance is gradually decreased from the average lighting rate P toward the average lighting rate 0 (all black).
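A sketch of the FIG. 2 relationship. Only the anchor points come from the text (550 cd/m² at full lighting, a 1500 cd/m² peak at the average lighting rate P, and a roll-off toward all black); the linear shape of both segments and the value P = 0.2 are assumptions.

```python
def max_luminance(avg_rate, P=0.2, ref=550.0, peak=1500.0):
    """Max luminance (cd/m^2) as a function of the average lighting
    rate (0.0-1.0): rises from ref at full lighting to peak at rate P,
    then falls toward 0 below P to suppress black float."""
    if avg_rate >= P:
        # Linear ramp from the 1500 cd/m^2 peak at P down to the
        # 550 cd/m^2 reference at 100% lighting.
        t = (avg_rate - P) / (1.0 - P)
        return peak + (ref - peak) * t
    # Below P, roll off toward the all-black end so dark scenes keep
    # their blacks dark instead of being stretched.
    return peak * (avg_rate / P)
```

The unit 4 would look up this value from the average lighting rate each frame and feed it back to the mapping unit as the Max luminance.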
  • FIG. 6 is a diagram illustrating an example of a Y histogram generated from the luminance signal Y of the input video signal.
  • the light emission detection unit 2 adds up the number of pixels for each luminance gradation for each frame of the input video signal to generate a Y histogram.
  • the horizontal axis represents the gradation value of luminance Y
  • the vertical axis represents the number of pixels (frequency) integrated for each gradation value.
  • a light emitting portion is detected for luminance Y.
  • the luminance Y is an example of a feature amount of an image for creating a histogram for detecting a light emitting portion, and another example of the feature amount will be described later.
  • the second threshold value Th2 defines a light emission boundary, and processing is performed assuming that pixels having the threshold value Th2 or more in the Y histogram are light emitting portions.
  • The second threshold Th2 can be expressed by the following Expression (1), where Ave is the average value of the luminance Y, N is a predetermined constant, and σ is the standard deviation. That is, the light emission detection unit 2 detects pixels at or above Th2 of Expression (1) as the light emitting portion.
  • Th2 = Ave + N × σ … Expression (1)
  • the values of the first and second threshold values Th1 and Th2 detected by the light emission detection unit 2 are output to the mapping unit 3 and used to generate tone mapping.
  • FIG. 7 is a diagram illustrating an example of tone mapping generated by the mapping unit 3.
  • the horizontal axis represents the input gradation of the luminance value of the video
  • the vertical axis represents the output gradation.
  • Pixels at or above the second threshold Th2 detected by the light emission detection unit 2 constitute the light emitting portion of the image, and a compression gain is applied to reduce the output everywhere except the light emitting portion.
  • The light emission detection unit 2 also sets and detects the first threshold Th1; tone mapping is performed by setting the first gain G1 for the region smaller than Th1 and setting the second gain G2 so as to connect Th1 and Th2 linearly.
  • the mapping unit 3 receives the Max luminance value from the area active control / luminance stretch unit 4.
  • the Max luminance indicates the maximum screen luminance determined from the average lighting rate of the backlight.
  • Instead of the Max luminance, a backlight duty value (LED duty) indicating the maximum light emission luminance can be input.
  • The first gain G1 is applied to the region smaller than the first threshold Th1 and is set by the following Expression (3).
  • G1 = (Ls / Lm)^(1/γ) … Expression (3)
  • Here, Ls is the reference luminance (the luminance when the backlight luminance is not stretched; for example, the luminance when the maximum screen luminance is 550 cd/m²), and Lm is the Max luminance output from the area active control / luminance stretch unit 4. The first gain G1 applied to the region smaller than Th1 therefore lowers the output gradation of the video signal so as to cancel the increase in screen luminance caused by the luminance stretch of the backlight.
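The resulting gain curve of FIG. 7 can be written directly: G1 follows Expression (3), G2 interpolates linearly between (Th1, G1) and (Th2, 1), and the light emitting portion at or above Th2 passes through unchanged.

```python
def tone_map_gain(v, th1, th2, Ls, Lm, gamma=2.2):
    """Gain applied to input gradation v (FIG. 7 tone mapping):
    - below Th1: compression gain G1 = (Ls/Lm)**(1/gamma), Expression (3)
    - Th1..Th2:  gain G2 rising linearly from G1 back to 1
    - Th2 and above (light emitting portion): unity, i.e. unmodified."""
    g1 = (Ls / Lm) ** (1.0 / gamma)
    if v < th1:
        return g1
    if v < th2:
        # Linear interpolation between (th1, g1) and (th2, 1.0).
        return g1 + (1.0 - g1) * (v - th1) / (th2 - th1)
    return 1.0
```

Multiplying each pixel value by `tone_map_gain(v, …)` reproduces the behavior described above: the low-gradation output is suppressed by exactly the backlight stretch, while the light emitting portion keeps its full output and is therefore enhanced on screen.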
  • The tone mapping generated by the mapping unit 3 is applied to the input video signal, and the video signal, whose low-gradation output has been suppressed according to the luminance stretch amount of the backlight, is input to the area active control / luminance stretch unit 4.
  • The area active control / luminance stretch unit 4 receives the video signal to which the tone mapping generated by the mapping unit 3 has been applied, performs area active control based on that signal, and also determines the Max luminance based on the average lighting rate.
  • Let the current frame be frame f_N.
  • The Max luminance value for frame f_N is output to the mapping unit 3.
  • The Max luminance based on the average lighting rate of the area active control is thus fed back and used for the tone mapping of the next frame.
  • Based on the Max luminance determined for frame f_N, the mapping unit 3 applies a gain (the first gain G1) that reduces the video output in the region smaller than the first threshold Th1, as described with reference to FIG. 7.
  • The second gain G2, which connects Th1 and Th2 linearly, is applied to the region between Th1 and Th2, reducing the video output there.
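The one-frame-delayed feedback between the two units can be sketched as below; the callback names stand in for the area active control / luminance stretch unit 4 and the mapping unit 3 of the block diagram and are illustrative only.

```python
def process_sequence(frames, area_active, build_gain_lut):
    """One-frame-delayed feedback loop.

    frames:         iterable of frames (flat lists of pixel values 0-255)
    area_active:    frame -> (output_frame, max_luminance); stands in for
                    the area active control / luminance stretch unit 4
    build_gain_lut: max_luminance -> 256-entry gain table; stands in for
                    the mapping unit 3

    The Max luminance computed for frame f_N builds the tone-mapping
    table that is multiplied into frame f_N+1.
    """
    lut = [1.0] * 256          # unity mapping for the very first frame
    outputs = []
    for f in frames:
        mapped = [min(255, round(v * lut[v])) for v in f]
        out, lm = area_active(mapped)   # area active control on f_N
        lut = build_gain_lut(lm)        # fed back, used for f_N+1
        outputs.append(out)
    return outputs
```

With stub callbacks one can see the delay: the first frame passes through unmodified, and the table derived from it only affects the second frame.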
  • FIG. 9 is a diagram illustrating a state in which the screen luminance is enhanced by the processing of the area active control / luminance stretch unit 4.
  • the horizontal axis is the gradation value of the input video signal
  • the vertical axis is the screen luminance (cd/m²) of the display unit 8, and
  • S2 and S3 correspond to the positions of the gradation values of the first and second thresholds Th1 and Th2.
  • the input video signal is enhanced and displayed with a γ curve according to the Max luminance determined by the area active control.
  • S4 indicates the screen luminance when the input video signal has the maximum gradation value (255); for example, when the Max luminance is 1500 cd/m², the screen luminance at the maximum gradation is 1500 cd/m².
  • Up to S2, the first gain G1 is applied to the video signal so as to cancel the screen luminance component that increases due to the luminance stretch of the backlight; the screen is therefore displayed with a γ curve based on the reference luminance. This is because the mapping unit 3 suppresses the output value of the video signal in the range smaller than the threshold Th1 (corresponding to S2) by an amount corresponding to the luminance stretch, in accordance with the Max luminance determined by the area active control / luminance stretch unit 4.
  • the screen brightness changes according to the tone mapping of Th1 to Th2.
  • The curve based on the reference luminance is the γ curve for which the screen luminance at the maximum gradation value equals the reference luminance obtained when the backlight luminance is not stretched (for example, a screen luminance of 550 cd/m² at the maximum gradation value).
  • The curve based on the Max luminance is the γ curve for which the screen luminance at the maximum gradation value equals the Max luminance determined by the area active control / luminance stretch unit 4.
  • the range where the input video signal is greater than or equal to S3 is a range that is considered to emit light
  • the video signal is maintained without being suppressed while the backlight is stretched by luminance stretching. Thereby, the screen brightness is enhanced, and a high-quality image display with a more lustrous feeling can be performed.
  • the γ curve from S1 to S2 does not need to match the reference luminance; it can be set by appropriately adjusting the gain G1, as long as its level differs from that of the enhancement region of the light emitting portion.
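The two-segment tone mapping described above (gain G1 below Th1, a connecting gain G2 between Th1 and Th2, pass-through at Th2 and above) can be sketched as follows. This is a minimal illustration, not the embodiment's implementation; the function name and the sample threshold and gain values are assumptions.

```python
def tone_mapping(x, th1, th2, g1):
    """Two-segment tone curve (illustrative sketch).

    Below th1 the first gain g1 (< 1) suppresses the video signal so
    the screen follows the reference-luminance gamma curve despite the
    backlight stretch; between th1 and th2 a second gain g2 connects
    (th1, g1*th1) and (th2, th2) linearly; at th2 and above the signal
    passes through unchanged, so the light emitting portion keeps the
    full benefit of the luminance stretch.
    """
    if x < th1:
        return g1 * x
    if x < th2:
        # slope of the straight line joining the two segments
        g2 = (th2 - g1 * th1) / (th2 - th1)
        return g1 * th1 + g2 * (x - th1)
    return float(x)
```

With, say, th1=180, th2=220 and g1=0.8, the curve is continuous at both thresholds and leaves gradations of 220 and above untouched.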
  • the second embodiment has the same configuration as that of the first embodiment, but unlike the first embodiment, the Max luminance value used for tone mapping is not determined by the area active control / luminance stretch unit 4.
  • the light emission detection unit 2 determines the luminance stretch amount based on the detection result of the light emitting portion, and the mapping unit 3 executes tone mapping based on the determined luminance stretch amount. Therefore, unlike the first embodiment, the area active control / luminance stretch unit 4 does not need to output the Max luminance value obtained by luminance stretching to the mapping unit 3 of the signal processing unit 1.
  • the light emission detection unit 2 may only detect the light emission part, and the mapping unit 3 may be configured to calculate the luminance stretch amount from the detection result of the light emission part.
  • a third threshold Th3 is further set.
  • the third threshold value Th3 is between Th1 and Th2, and is provided for detecting the state of the pixel in the light emitting portion.
  • the threshold value Th3 may be the same value as Th2, but it is set with a wider margin below the light emitting portion (equal to or greater than Th2) in order to simplify processing. Accordingly, Th3 is expressed by the following equation (4):
  • Th3 = Ave + Qσ (M > Q > N) … (4)
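Assuming, as in the thresholds of the first embodiment, that Th1 = Ave + Nσ and Th2 = Ave + Mσ are derived from the mean and standard deviation of the feature-quantity histogram, equation (4) places Th3 between them (M > Q > N). A minimal sketch under that assumption:

```python
import statistics

def emission_thresholds(values, n, q, m):
    """Th1, Th2, Th3 from histogram statistics (sketch).

    Assumes Th1 = Ave + N*sigma and Th2 = Ave + M*sigma as in the
    first embodiment, and Th3 = Ave + Q*sigma with M > Q > N per
    equation (4), so that Th1 < Th3 < Th2.
    """
    ave = statistics.mean(values)
    sigma = statistics.pstdev(values)  # population standard deviation
    return ave + n * sigma, ave + m * sigma, ave + q * sigma
```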
  • FIG. 12 is a diagram illustrating a setting example of the luminance stretch according to the pixels at or above the third threshold Th3.
  • the horizontal axis represents the score of the pixels having values equal to or greater than the threshold Th3, and the vertical axis represents the luminance stretch amount according to the score.
  • the score is defined as [ratio of pixels with luminance greater than a certain threshold] × [distance from the threshold (luminance difference)]: the number of pixels having a gradation value larger than the third threshold Th3 is counted,
  • with each counted pixel weighted by its distance from the threshold Th3, so that the score indicates the degree of brightness. It is calculated, for example, by the following equation (5):
  • score = Σ_{i > Th3} count(i) × (i − Th3) / (total number of pixels) … (5)
  • the total number of pixels is not limited to i > Th3, but is the value obtained by counting all pixels.
  • the score increases when the light emitting portion contains many high-gradation pixels far above Th3. Even if the number of pixels larger than Th3 is constant, the score is higher when more of those pixels have higher gradations.
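Under this reading of equation (5), with each pixel above Th3 weighted by its distance from Th3 and the sum normalised by the total pixel count, the score can be sketched as follows (the histogram representation is an assumption for illustration):

```python
def emission_score(histogram, th3):
    """Score of equation (5) (sketch): pixels above Th3, each weighted
    by its distance (i - th3) from the threshold, divided by the total
    number of pixels over all bins (not only those above th3).

    `histogram` maps gradation value -> pixel count.
    """
    total = sum(histogram.values())
    if total == 0:
        return 0.0
    weighted = sum(count * (i - th3)
                   for i, count in histogram.items() if i > th3)
    return weighted / total
```

With a fixed number of pixels above Th3, moving them to higher gradations raises the score, matching the behaviour described above.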
  • the luminance stretch amount is the same as that described in the first embodiment, and is indicated by, for example, a backlight duty value, similarly to the Max luminance.
  • the luminance stretch amount determined according to the values of the first and second threshold values Th1 and Th2 detected by the light emission detection unit 2 and the score of the pixel equal to or greater than Th3 is output to the mapping unit 3 and used for generation of tone mapping.
  • the tone mapping process in the mapping unit 3 is the same as in the first embodiment. That is, as shown in FIG. 7, the first gain G1 is set for the region smaller than Th1 detected by the light emission detection unit 2, and the second gain G2 is set so as to linearly connect Th1 and Th2. When setting the gain G1, the luminance stretch amount detected by the light emission detection unit 2 is used, and the luminance is reduced by video signal processing in accordance with the luminance stretch amount of the backlight. The resulting tone mapping is applied to the input video signal, which is then input to the area active control / luminance stretch unit 4.
  • since the luminance corresponding to the luminance stretch is reduced by video signal processing in the non-light-emitting portion, as a result only the luminance of the light-emitting portion is increased on the screen, and a high-contrast, high-quality image can be displayed.
  • the relationship between the input video signal and the screen luminance is the same as that in FIG. 9 shown in the first embodiment.
  • the signal processing unit 1 and the area active control / luminance stretch unit 4 execute or stop the light emission partial enhancement processing of the second embodiment according to the determination result of the video type determination unit 14. Since the configuration of the video type determination unit 14 according to the present invention is the same as that of the first embodiment, repeated description thereof is omitted here.
  • the above area active control may be executed in the same manner as in the first embodiment.
  • the maximum gradation value of the video signal is extracted for each divided region, and the lighting rate (drive duty) of the LEDs for each region is determined according to the extracted maximum gradation value.
  • the process of stretching the backlight luminance according to the Max luminance obtained from the average lighting rate is stopped.
  • FIG. 13 is a view for explaining still another embodiment (Embodiment 3) of the video display apparatus according to the present invention, and shows still another configuration example of the main part of the video display apparatus.
  • the luminance stretch unit 4 a receives the video signal to which the tone mapping generated by the mapping unit 3 is applied, and outputs display control data for displaying the video signal to the display control unit 7. At this time, processing by area active control is not performed. On the other hand, the entire backlight unit 6 is uniformly stretched based on the luminance stretch amount output from the mapping unit 3.
  • the bright, light-emitting part of the image is thereby displayed even brighter.
  • since the luminance corresponding to the luminance stretch is reduced by video signal processing in the non-light-emitting portion, as a result the luminance of the light-emitting portion is increased on the screen, and a high-contrast, high-quality image can be displayed.
  • the signal processing unit 1 and the luminance stretch unit 4a execute or stop the light emission partial enhancement processing according to the third embodiment according to the determination result of the video type determination unit 14. Since the configuration of the video type determination unit 14 according to the present invention is the same as that of the first and second embodiments, repeated description thereof is omitted here.
  • the signal processing unit 1 stops the process in the light emission detection unit 2, so the luminance stretch amount (N) is not calculated and is not output to the mapping unit 3 or the luminance stretch unit 4a.
  • the mapping unit 3 outputs, for example, a default tone mapping (for example, a tone curve corresponding to one-to-one input / output) to the multiplier.
  • the luminance stretch unit 4a stops the process related to the luminance stretch, outputs to the backlight control unit 5 the control data for controlling the backlight unit 6 in accordance with the input video signal, and outputs to the display control unit 7 the display control data for controlling the display unit 8.
  • as these control data and display control data, for example, default setting data can be used.
  • instead of the area active control / luminance stretch unit 4 of FIG. 1, a luminance stretch unit 4a that does not execute the area active control may be provided.
  • Max luminance is obtained in the luminance stretch unit 4a from the average lighting rate (in this example, the tentative lighting rate itself serves as the tentative average lighting rate of the entire screen), the light emission luminance of the LEDs is set based on it, and the Max luminance may be fed back to the mapping unit 3.
  • L* is the lightness of the color of interest
  • L*modeboundary is the lightness of the boundary at which a color having the same chromaticity as the color of interest appears to emit light
  • L*modeboundary ≈ lightness of the brightest color (the lightest color among the object colors)
  • the broadcast video signal is standardized and transmitted based on the BT.709 standard. Accordingly, the RGB data of the broadcast video signal is first converted into tristimulus XYZ data using the conversion matrix for BT.709, and the lightness L* is calculated from Y using a conversion formula. Assume that the L* of the color of interest is at position PL1 in FIG. 14. Next, the chromaticity is calculated from the converted XYZ, and the L* (L*modeboundary) of the brightest color having the same chromaticity as the color of interest is looked up from the already known brightest-color data; its position in FIG. 14 is PL2.
  • CMI is calculated using equation (6) above:
  • CMI = (L* / L*modeboundary) × 100 … (6)
  • that is, CMI is the ratio of the L* of the pixel of interest to the L* (L*modeboundary) of the optimal (brightest) color having the same chromaticity.
  • the CMI is obtained for each pixel of the video signal by the above method. Since the signal is a standardized broadcast signal, every pixel has a CMI in the range 0 to 100. Then, for one frame image, a CMI histogram is created with CMI on the horizontal axis and frequency on the vertical axis; the average value Ave. and the standard deviation σ are calculated from it, and the thresholds for detecting the light emitting portion are set.
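The per-pixel pipeline above (BT.709 RGB to XYZ, Y to L*, then the ratio of equation (6)) can be sketched as follows. The BT.709-to-XYZ matrix and the CIE 1976 L* formula are standard; the lookup of L*modeboundary from the brightest-color data is application-specific and is left as an input here.

```python
def bt709_to_xyz(r, g, b):
    """Linear BT.709 RGB (0..1) to CIE XYZ tristimulus values (D65)."""
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z

def lightness(y, y_white=1.0):
    """CIE 1976 lightness L* from relative luminance Y."""
    t = y / y_white
    if t > (6 / 29) ** 3:
        return 116 * t ** (1 / 3) - 16
    return (29 / 3) ** 3 * t  # linear segment near black

def cmi(l_star, l_star_modeboundary):
    """Equation (6): CMI = (L* / L*modeboundary) * 100."""
    return l_star / l_star_modeboundary * 100
```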
  • Max RGB is data having the maximum gradation value among RGB data.
  • two colors having the same chromaticity is synonymous with their RGB ratio being unchanged. That is, the process of finding the brightest color of the same chromaticity in the CMI calculation is the process of obtaining the combination of R, G and B at the point where one of the RGB gradations reaches its maximum when the RGB data is multiplied by a constant without changing the ratio.
  • consider a pixel having RGB gradation data; when the RGB data of the pixel of interest is multiplied by a certain factor, as shown in FIG. 15B, the color at the point where one of R, G and B first saturates is the brightest color with the same chromaticity as the original pixel.
  • letting r1 be the gradation, in the pixel of interest, of the color that saturates first (R in this case),
  • and r2 be the gradation of R in the brightest color,
  • a value similar to CMI can be obtained by the following equation (7):
  • (r1 / r2) × 100 … (7)
  • the color that saturates first when RGB is multiplied by a constant is the color having the maximum gradation among the R, G and B of the pixel of interest.
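Because scaling RGB by a constant preserves chromaticity and the channel with the maximum gradation saturates first, the simplified measure of equation (7) reduces to the maximum of R, G, B over the saturation value. A sketch, assuming 8-bit gradations (the function name is illustrative):

```python
def cmi_like(r, g, b, max_grad=255):
    """Simplified CMI substitute from Max RGB, equation (7).

    r1 is the gradation, in the pixel of interest, of the channel that
    saturates first (the maximum of R, G, B); r2 is that channel's
    gradation in the brightest colour of the same chromaticity, i.e.
    the saturation value. The result is (r1 / r2) * 100.
    """
    r1 = max(r, g, b)
    r2 = max_grad
    return r1 / r2 * 100
```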
  • 7 is a display control unit, 8 is a display unit, 9 is a tuner, 10 is an HDMI terminal, 11 is a LAN terminal, 12 is a DVI terminal, 13 is a selector, 14 is a video type determination unit, and 15 is a remote control signal processing unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Liquid Crystal Display Device Control (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Liquid Crystal (AREA)

Abstract

The present invention achieves video expression with an enhanced sense of glossiness and contrast for video signals, while suppressing that sense of glossiness and contrast when displaying video images obtained from websites or the like. This video display device comprises: a display unit (8); a backlight unit (6); a control unit; a signal processing unit (1); an area active control / luminance stretch unit (4) that controls the display unit (8) and the backlight unit (6); and a video type determination unit (14). The control unit executes or stops a light-emitting-portion enhancement process according to the determination result of the video type determination unit (14). In the light-emitting-portion enhancement process, the display luminance of a light emitting portion is enhanced by: creating a histogram in which the number of pixels is integrated for a predetermined feature quantity related to the brightness of an input video signal; detecting an upper region within a predetermined range of the histogram as the light emitting portion; stretching and increasing the luminance of the backlight unit (6); and reducing the video-signal luminance of the non-light-emitting portions of the input video signal other than the light emitting portion.
PCT/JP2012/067815 2012-01-31 2012-07-12 Video display device and television receiver WO2013114643A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-017806 2012-01-31
JP2012017806A JP5330552B2 (ja) Video display device and television receiver

Publications (1)

Publication Number Publication Date
WO2013114643A1 true WO2013114643A1 (fr) 2013-08-08

Family

ID=48904716

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/067815 WO2013114643A1 (fr) 2012-01-31 2012-07-12 Dispositif d'affichage vidéo et récepteur de télévision

Country Status (2)

Country Link
JP (1) JP5330552B2 (fr)
WO (1) WO2013114643A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013161092A (ja) * 2013-03-05 2013-08-19 Sharp Corp Video display device and television receiver

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004272156A (ja) * 2003-03-12 2004-09-30 Sharp Corp Image display device
JP2005148710A (ja) * 2003-11-17 2005-06-09 Lg Philips Lcd Co Ltd Driving method and driving device for liquid crystal display device
JP2005227694A (ja) * 2004-02-16 2005-08-25 Canon Inc Image display device and image display method
JP2006301049A (ja) * 2005-04-18 2006-11-02 Seiko Epson Corp Display device and display method
WO2008001512A1 (fr) * 2006-06-28 2008-01-03 Sharp Kabushiki Kaisha Image display device
JP2008099207A (ja) * 2006-10-16 2008-04-24 Toshiba Microelectronics Corp Video luminance correction circuit and luminance correction method using the same
JP2009063694A (ja) * 2007-09-05 2009-03-26 Seiko Epson Corp Image processing device, image display device, image processing method, and program
JP2010262288A (ja) * 2009-04-30 2010-11-18 Samsung Electronics Co Ltd Light source driving method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07334146A (ja) * 1994-06-08 1995-12-22 N F Kairo Sekkei Block:Kk Graph display method
JP2005165387A (ja) * 2003-11-28 2005-06-23 Seiko Epson Corp Method and apparatus for detecting streak defects on a screen, and display device


Also Published As

Publication number Publication date
JP2013156484A (ja) 2013-08-15
JP5330552B2 (ja) 2013-10-30

Similar Documents

Publication Publication Date Title
US9495921B2 (en) Video display device and television receiving device with luminance stretching
JP5085793B1 (ja) Video display device and television receiver
JP4991949B1 (ja) Video display device and television receiver
JP5197858B1 (ja) Video display device and television receiver
US8964124B2 (en) Video display device that stretches a video signal and a signal of the light source and television receiving device
JP5221780B1 (ja) Video display device and television receiver
JP5165802B1 (ja) Video display device and television receiver
WO2012105117A1 (fr) Video display device
WO2014002712A1 (fr) Image display device
JP5092057B1 (ja) Video display device and television receiver
JP5143959B1 (ja) Video display device and television receiver
JP5330552B2 (ja) Video display device and television receiver
JP5174982B1 (ja) Video display device and television receiver
JP5303062B2 (ja) Video display device and television receiver
JP5244251B1 (ja) Video display device and television receiver
JP2013161092A (ja) Video display device and television receiver
JP6532103B2 (ja) Video display device and television receiver
JP2013167876A (ja) Video display device and television receiver

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12867119

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12867119

Country of ref document: EP

Kind code of ref document: A1