WO2013014791A1 - Video signal processing device and video signal processing method - Google Patents

Video signal processing device and video signal processing method Download PDF

Info

Publication number
WO2013014791A1
WO2013014791A1 PCT/JP2011/067312 JP2011067312W
Authority
WO
WIPO (PCT)
Prior art keywords
video
unit
signals
signal processing
signal
Prior art date
Application number
PCT/JP2011/067312
Other languages
French (fr)
Japanese (ja)
Inventor
浪岡 利幸
Original Assignee
株式会社 東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社 東芝 filed Critical 株式会社 東芝
Priority to PCT/JP2011/067312 priority Critical patent/WO2013014791A1/en
Publication of WO2013014791A1 publication Critical patent/WO2013014791A1/en

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/21 Circuitry for suppressing or minimising disturbance, e.g. moiré or halo
    • H04N5/213 Circuitry for suppressing or minimising impulsive noise
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20004 Adaptive image processing
    • G06T2207/20012 Locally adaptive

Definitions

  • Embodiments of the present invention relate to a video signal processing device and a video signal processing method used for a video display device such as a digital television broadcast receiver.
  • Video signals are often supplied in the form of three primary color signals, that is, in the form of R (red), G (green), and B (blue) signals.
  • the video signal processing LSI outputs R, G, and B signals, each with a bit depth matching the number of input bits of the video display panel, and the video display panel displays an image on its panel surface based on the R, G, and B signals supplied from the LSI.
  • when a new function is added, a device such as an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit) may be inserted between the video signal processing LSI and the video display panel.
  • in this case, depending on the inserted device, the R, G, and B signals output from the video signal processing LSI are temporarily converted into a luminance signal Y and color difference signals Cb and Cr, and the necessary signal processing is performed.
  • the processed signals Y, Cb, and Cr may then be reconverted into R, G, and B signals and supplied to the video display panel; in such a case, the quality of the displayed video may deteriorate.
  • An object of the present invention is to provide a video signal processing device and a video signal processing method that allow predetermined signal processing to be applied to each of the R, G, and B signals without conversion to another signal form, so that necessary functions can be added while deterioration of the displayed video is prevented.
  • the video signal processing apparatus includes a generation unit, an extraction unit, and a processing unit.
  • the generation unit generates a luminance signal based on the R, G, and B signals.
  • the extraction unit extracts a feature amount of the video based on the luminance signal.
  • the processing unit performs predetermined signal processing corresponding to the feature amount on the R, G, and B signals.
  • Predetermined signal processing can be performed on each of the R, G, and B signals without conversion to another signal form, so a video signal processing apparatus and a video signal processing method can be provided in which necessary functions are added and deterioration of the quality of the displayed video is prevented.
  • the video signal processing apparatus includes a generation unit, an extraction unit, and a processing unit.
  • the generation unit generates a luminance signal based on the R, G, and B signals.
  • the extraction unit extracts a feature amount of the video based on the luminance signal.
  • the processing unit performs predetermined signal processing corresponding to the feature amount on the R, G, and B signals.
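The generation step above can be sketched in code. The patent does not specify the matrix coefficients used to derive Y from R, G, and B, so the BT.601 luma weights below are an assumption for illustration, as are the function name and the 8-bit clamping.

```python
def rgb_to_luma(r, g, b):
    """Derive an 8-bit luminance sample Y from 8-bit R, G, B samples.

    The BT.601 weights are an assumption; the patent only states that
    a luminance signal is generated from the R, G, and B signals.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    # Keep the result in the same 8-bit range as the inputs.
    return max(0, min(255, round(y)))
```

Because the three weights sum to 1, a neutral gray such as `rgb_to_luma(100, 100, 100)` maps to the same value, 100, preserving the signal range.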
  • FIG. 1 schematically shows a signal processing system of a digital television broadcast receiving apparatus 11 described in this embodiment. That is, a digital television broadcast signal received by the antenna 12 is supplied to the tuner unit 14 via the input terminal 13, so that a broadcast signal of a desired channel is selected.
  • the broadcast signal selected by the tuner unit 14 is supplied to the demodulation / decoding unit 15 and restored to a digital video signal, audio signal, etc., and then output to the signal processing unit 16.
  • the signal processing unit 16 performs predetermined digital signal processing on the digital video signal and audio signal supplied from the demodulation / decoding unit 15.
  • the signal processing unit 16 outputs a digital video signal to the video processing unit 17 and outputs a digital audio signal to the audio processing unit 18.
  • the video processing unit 17 converts the digital video signal supplied from the signal processing unit 16 into R, G, and B signals, each having a number of bits (for example, 8 bits) that can be input to the video display panel 20 in the subsequent stage, and outputs them to the noise reduction processing unit 19.
  • as will be described in detail later, the noise reduction processing unit 19 applies noise reduction processing to the R, G, and B signals supplied from the video processing unit 17 without changing their signal form or the number of bits of each signal (for example, 8 bits).
  • the R, G, and B signals output from the noise reduction processing unit 19 are supplied to the video display panel 20 and used for video display.
  • the audio processing unit 18 converts the input digital audio signal into an analog audio signal in a format that can be reproduced by the speaker 21 at the subsequent stage.
  • the analog audio signal output from the audio processing unit 18 is supplied to the speaker 21 for audio reproduction.
  • the control unit 22 incorporates a CPU (central processing unit) 22a and controls each unit so that the operation content is reflected, in response to operation information received from the operation unit 23 installed on the main body of the digital television broadcast receiver 11, or to operation information sent from the remote controller 24 and received by the receiving unit 25.
  • the control unit 22 uses the memory unit 22b.
  • the memory unit 22b mainly includes a ROM (read only memory) storing the control program executed by the CPU 22a, a RAM (random access memory) providing a work area to the CPU 22a, and a nonvolatile memory storing various setting information, control information, and the like.
  • an HDD (hard disk drive) 26 is connected to the control unit 22.
  • based on user operation of the operation unit 23 or the remote controller 24, the control unit 22 can cause the recording/playback processing unit 27 to encrypt the digital video and audio signals obtained from the demodulation/decoding unit 15 and convert them into a predetermined recording format, after which they are supplied to the HDD 26 and recorded on the hard disk 26a.
  • likewise, based on user operation of the operation unit 23 or the remote controller 24, the control unit 22 can cause the HDD 26 to read digital video and audio signals from the hard disk 26a, have them decoded by the recording/playback processing unit 27, and then supply them to the signal processing unit 16 so that they are used for the video display and audio reproduction described above.
  • an input terminal 28 is connected to the recording / playback processing unit 27.
  • the input terminal 28 is for directly inputting digital video signals and audio signals from the outside of the digital television broadcast receiver 11.
  • the digital video and audio signals input through the input terminal 28 are supplied, under the control of the control unit 22, to the signal processing unit 16 via the recording/playback processing unit 27, and are thereafter used for video display and audio reproduction.
  • the digital video and audio signals input via the input terminal 28 can also, under the control of the control unit 22, pass through the recording/playback processing unit 27 and be recorded on and reproduced from the hard disk 26a by the HDD 26.
  • a network interface 29 is connected to the control unit 22.
  • This network interface 29 is connected to an external network 30.
  • the network 30 is connected to a network server 31 for providing various services using a communication function via the network 30.
  • based on user operation of the operation unit 23 or the remote controller 24, the control unit 22 can access the network server 31 through the network interface 29 and the network 30 and perform information communication, thereby using the services that the server provides.
  • FIG. 2 shows an example of the noise reduction processing unit 19 described above. That is, the noise reduction processing unit 19 includes an input terminal 19a to which the R signal output from the video processing unit 17 is supplied, an input terminal 19b to which the G signal output from the video processing unit 17 is supplied, and an input terminal 19c to which the B signal output from the video processing unit 17 is supplied.
  • the R, G, and B signals supplied to the input terminals 19a, 19b, and 19c each have the same number of bits (for example, 8 bits) that can be input to the video display panel 20, and are input to both the matrix processing unit 32 and the adaptive filter unit 33.
  • the matrix processing unit 32 generates a luminance signal Y having the same number of bits (for example, 8 bits) based on the input R, G, and B signals and outputs the luminance signal Y to the video feature extraction unit 34.
  • the video feature extraction unit 34 generates a video feature signal corresponding to the feature amount (complexity) of the video constituting one screen, based on the input luminance signal Y, and outputs it to the adaptive filter unit 33.
  • the adaptive filter unit 33 adaptively performs filtering on the input R, G, and B signals without changing the number of bits, based on the video feature signal supplied from the video feature extraction unit 34.
  • in this way, the noise component of the video displayed on the video display panel 20 is reduced.
  • where the video is flat, stronger filter processing is applied to enhance the noise reduction effect.
  • where the video is complex, the filter processing is weakened so that the fineness of the displayed video is not impaired.
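The strong/weak behaviour in the bullets above amounts to a mapping from video complexity to filter strength. The linear ramp and the 0-to-1 normalisation in this sketch are assumptions; the embodiment only requires that flatter (less complex) video receive stronger filtering.

```python
def filter_strength(complexity, max_strength=1.0):
    """Map normalised video complexity (0.0 = flat, 1.0 = highly
    detailed) to a filter strength: flat regions are smoothed strongly
    to enhance noise reduction, detailed regions weakly so that the
    fineness of the video is not impaired. The linear mapping is an
    assumed illustration, not taken from the patent."""
    complexity = max(0.0, min(1.0, complexity))
    return max_strength * (1.0 - complexity)
```

Any monotonically decreasing mapping would satisfy the described behaviour; the linear one is simply the smallest example.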
  • the R, G, and B signals filtered by the adaptive filter unit 33 are supplied to the video display panel 20 via the output terminals 19d, 19e, and 19f, while retaining the number of bits (for example, 8 bits) they had when input to the adaptive filter unit 33, and are used for video display.
  • FIG. 3 is a flowchart summarizing the main processing operations of the noise reduction processing unit 19 described above. When processing starts (step S1), in step S2 the matrix processing unit 32 generates a luminance signal Y with the same number of bits (for example, 8 bits) from the R, G, and B signals supplied to the input terminals 19a, 19b, and 19c.
  • in step S3, the video feature extraction unit 34 generates a video feature signal corresponding to the complexity of the video constituting one screen, based on the luminance signal Y generated by the matrix processing unit 32.
  • in step S4, the adaptive filter unit 33 adaptively filters the R, G, and B signals supplied to the input terminals 19a, 19b, and 19c with a strength based on the video feature signal generated by the video feature extraction unit 34, thereby reducing the noise component of the displayed video, and the processing ends (step S5).
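Steps S2 to S4 of the flowchart form a simple pipeline, which might be expressed as follows. The callables passed in are hypothetical stand-ins for the matrix processing unit 32, the video feature extraction unit 34, and the adaptive filter unit 33; all names are assumptions.

```python
def noise_reduction_pass(rgb_frame, make_luma, extract_feature, adaptive_filter):
    """One pass of the Fig. 3 flow (function names are assumptions):
    S2: generate the luminance signal Y from the R, G, B frame;
    S3: derive a video feature signal from Y;
    S4: filter R, G, B with a strength based on the feature signal,
        leaving their signal form and bit depth unchanged."""
    luma = make_luma(rgb_frame)                  # step S2
    feature = extract_feature(luma)              # step S3
    return adaptive_filter(rgb_frame, feature)   # step S4
```

Note that only the luminance signal feeds the feature extraction, while the R, G, and B samples themselves pass straight to the filter stage, mirroring the structure of Fig. 2.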
  • as described above, the input R, G, and B signals are filtered with a strength based on the video feature signal, without changing their number of bits. That is, predetermined signal processing is applied to the input R, G, and B signals as they are, without converting them into other signal forms such as the luminance signal Y and the color difference signals Cb and Cr. For this reason, necessary functions can be added while deterioration of the quality of the displayed video is prevented.
  • the video feature extraction unit 34 can thus extract the video feature quantity with the required accuracy.
  • FIG. 4 shows an example of the video feature extraction unit 34 and the adaptive filter unit 33 that constitute the noise reduction processing unit 19.
  • the video feature extraction unit 34 includes an input terminal 34a to which the luminance signal Y output from the matrix processing unit 32 is supplied.
  • the luminance signal Y supplied to the input terminal 34a is supplied to the surrounding pixel similarity determination unit 34b and the flatness determination unit 34c, respectively.
  • the surrounding pixel similarity determination unit 34b determines the similarity between a pixel to be processed and its surrounding pixels, and outputs a signal indicating the surrounding pixel similarity, as the determination result, to the surrounding pixel filter coefficient generation unit 34e.
  • specifically, the surrounding pixel similarity determination unit 34b compares a block of 7 pixels horizontally by 7 pixels vertically, centered on the pixel to be processed, with blocks of the same 7-by-7 size at positions shifted from it by a given number of pixels horizontally and vertically, and calculates the similarity between the two blocks.
  • the similarities obtained over a region of 5 pixels horizontally by 5 pixels vertically, centered on the pixel to be processed, are used as the surrounding pixel similarity.
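The block comparison described above can be sketched with a sum of absolute differences (SAD). The SAD metric itself is an assumption, since the embodiment does not name its similarity measure, and the shift amounts are left as parameters because the text does not fix them.

```python
def block_sad(luma, cx, cy, dx, dy, half=3):
    """Compare the 7x7 block (half=3) centred on (cx, cy) with the
    7x7 block centred on (cx+dx, cy+dy) in a 2-D list of luminance
    samples. A smaller sum of absolute differences means the two
    blocks are more similar (SAD is an assumed metric)."""
    sad = 0
    for oy in range(-half, half + 1):
        for ox in range(-half, half + 1):
            sad += abs(luma[cy + oy][cx + ox]
                       - luma[cy + dy + oy][cx + dx + ox])
    return sad
```

Evaluating this for every shift in a 5-by-5 neighbourhood of the processed pixel would yield the set of similarities the text calls the surrounding pixel similarity.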
  • the flatness determination unit 34c obtains the amount of change within a block of 7 pixels horizontally by 7 pixels vertically, centered on the pixel to be processed, and outputs a flatness detection signal corresponding to the complexity of the video to the surrounding pixel filter coefficient generation unit 34e.
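The "amount of change" behind the flatness detection signal is not specified in the text; one simple stand-in is the peak-to-peak variation of the 7x7 block, as in the following sketch (both the measure and the function name are assumptions).

```python
def flatness_change(luma, cx, cy, half=3):
    """Amount of change in the 7x7 block (half=3) centred on (cx, cy),
    taken here as max - min of the luminance samples; the patent does
    not define the measure, so peak-to-peak range is an assumption.
    Smaller values indicate a flatter, less complex region."""
    vals = [luma[cy + oy][cx + ox]
            for oy in range(-half, half + 1)
            for ox in range(-half, half + 1)]
    return max(vals) - min(vals)
```

A local variance or gradient sum would serve equally well; all that matters downstream is that flatter regions score lower.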
  • based on the signal indicating the surrounding pixel similarity and the flatness detection signal, the surrounding pixel filter coefficient generation unit 34e generates surrounding pixel weighting coefficients such that the higher the similarity of the surrounding pixels and the higher the flatness (that is, the lower the complexity of the video), the stronger the filter processing performed by the adaptive filter unit 33 in the subsequent stage.
  • the surrounding pixel weighting coefficients are output from the output terminal 34e as the video feature signal.
  • the adaptive filter unit 33 includes an input terminal 33a to which the surrounding pixel weighting coefficients output from the video feature extraction unit 34 are supplied, and input terminals 33b, 33c, and 33d corresponding to the R, G, and B signals output from the video processing unit 17.
  • the R, G, and B signals supplied to the input terminals 33b, 33c, and 33d are respectively input to surrounding pixel weighting addition units 33e, 33f, and 33g.
  • the surrounding pixel weighting coefficients supplied to the input terminal 33a are input to the surrounding pixel weighting addition units 33e, 33f, and 33g and to the weighting coefficient sum calculation unit 33h.
  • the surrounding pixel weighting addition unit 33e performs weighted addition on the input R signal based on the input surrounding pixel weighting coefficients, and outputs a signal indicating the addition result to the division processing unit 33i.
  • the surrounding pixel weighting addition unit 33f likewise performs weighted addition on the input G signal and outputs the addition result to the division processing unit 33j.
  • the surrounding pixel weighting addition unit 33g likewise performs weighted addition on the input B signal and outputs the addition result to the division processing unit 33k.
  • the weighting coefficient sum calculation unit 33h calculates the sum of the input surrounding pixel weighting coefficients and outputs the calculation result to each of the division processing units 33i, 33j, and 33k.
  • the division processing unit 33i divides the addition result of the surrounding pixel weighting addition unit 33e by the output of the weighting coefficient sum calculation unit 33h, thereby generating a weighted-average R signal, which is output via the output terminal 33l.
  • the division processing unit 33j divides the addition result of the surrounding pixel weighting addition unit 33f by the output of the weighting coefficient sum calculation unit 33h, thereby generating a weighted-average G signal, which is output via the output terminal 33m.
  • the division processing unit 33k divides the addition result of the surrounding pixel weighting addition unit 33g by the output of the weighting coefficient sum calculation unit 33h, thereby generating a weighted-average B signal, which is output via the output terminal 33n. The weighted-average R, G, and B signals are supplied to the video display panel 20.
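The weighting addition (33e to 33g), coefficient sum (33h), and division (33i to 33k) stages amount to a weighted average per colour channel. A minimal per-sample sketch follows; the rounding and 8-bit clamping are assumptions, since the patent keeps the bit depth but does not describe rounding.

```python
def weighted_average_sample(samples, weights):
    """Weighted average of one channel's neighbourhood samples:
    weighted addition (as in units 33e/33f/33g), a single sum of the
    weighting coefficients (unit 33h), and a division (units
    33i/33j/33k). Rounding and clamping to 8 bits are assumptions."""
    total_weight = sum(weights)                         # unit 33h
    acc = sum(s * w for s, w in zip(samples, weights))  # units 33e-33g
    return max(0, min(255, round(acc / total_weight)))  # units 33i-33k
```

Because the coefficient sum is computed once and shared by the three division units, the same weights are applied identically to R, G, and B, which is what keeps the three channels consistent after filtering.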
  • without converting the input R, G, and B signals into other signal forms such as the luminance signal Y and the color difference signals Cb and Cr, the feature amount is extracted from the luminance signal Y generated from those signals.
  • R, G, and B signals with reduced noise can be obtained by adaptively filtering according to the complexity of the video.
  • the R, G, and B signals can likewise be used for various other kinds of signal processing.
  • the present invention is not limited to the above-described embodiment as it is; in the implementation stage, the constituent elements can be variously modified and embodied without departing from the scope of the invention.
  • Various inventions can be formed by appropriately combining a plurality of the constituent elements disclosed in the above-described embodiment. For example, some components may be deleted from the full set of components shown in the embodiment. Furthermore, constituent elements from different embodiments may be combined as appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Picture Signal Circuits (AREA)

Abstract

In a preferred embodiment, a video signal processing device is equipped with a generation section, an extraction section, and a processing section. The generation section generates a luminance signal on the basis of the R, G, and B signals. The extraction section extracts a feature amount of the video on the basis of the luminance signal. The processing section performs predetermined signal processing on the R, G, and B signals according to the feature amount.

Description

Video signal processing apparatus and video signal processing method
 Embodiments of the present invention relate to a video signal processing device and a video signal processing method used for a video display device such as a digital television broadcast receiver.
 As is well known, in a thin-profile digital television broadcast receiver equipped with a video display panel using, for example, liquid crystal or plasma, video signals are often supplied from the video signal processing LSI (large scale integration) to the video display panel in the form of the three primary color signals, that is, R (red), G (green), and B (blue) signals.
 That is, the video signal processing LSI outputs R, G, and B signals, each with a bit depth matching the number of input bits of the video display panel, and the video display panel displays an image on its panel surface based on the R, G, and B signals supplied from the LSI.
 In such a digital television broadcast receiver, when a new function is added, a device such as an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit) may be inserted between the video signal processing LSI and the video display panel.
 In this case, depending on the inserted device, the R, G, and B signals output from the video signal processing LSI are temporarily converted into a luminance signal Y and color difference signals Cb and Cr, the necessary signal processing is performed, and the processed signals Y, Cb, and Cr are reconverted into R, G, and B signals and supplied to the video display panel. In such a case, the quality of the displayed video may deteriorate.
JP 2007-110303 A
 An object of the embodiments is to provide a video signal processing device and a video signal processing method that make it possible to apply predetermined signal processing to each of the R, G, and B signals without converting them into another signal form, thereby allowing necessary functions to be added while preventing deterioration of the quality of the displayed video.
 According to an embodiment, the video signal processing apparatus includes a generation unit, an extraction unit, and a processing unit. The generation unit generates a luminance signal based on the R, G, and B signals. The extraction unit extracts a feature amount of the video based on the luminance signal. The processing unit performs predetermined signal processing corresponding to the feature amount on the R, G, and B signals.
 Since predetermined signal processing can be performed on each of the R, G, and B signals without conversion to another signal form, a video signal processing apparatus and a video signal processing method can be provided in which necessary functions are added while deterioration of the quality of the displayed video is prevented.
FIG. 1 is a block diagram schematically showing an example of the signal processing system of a digital television broadcast receiver according to the embodiment. FIG. 2 is a block diagram showing an example of the noise reduction processing unit used in the digital television broadcast receiver of the embodiment. FIG. 3 is a flowchart showing an example of the main processing operations of the noise reduction processing unit of the embodiment. FIG. 4 is a block diagram showing an example of the video feature extraction unit and the adaptive filter unit of the noise reduction processing unit of the embodiment.
 Hereinafter, the embodiment will be described in detail with reference to the drawings. According to the embodiment, the video signal processing apparatus includes a generation unit, an extraction unit, and a processing unit. The generation unit generates a luminance signal based on the R, G, and B signals. The extraction unit extracts a feature amount of the video based on the luminance signal. The processing unit performs predetermined signal processing corresponding to the feature amount on the R, G, and B signals.
 FIG. 1 schematically shows the signal processing system of a digital television broadcast receiver 11 described in this embodiment. A digital television broadcast signal received by the antenna 12 is supplied to the tuner unit 14 via the input terminal 13, and a broadcast signal of the desired channel is selected.
 The broadcast signal selected by the tuner unit 14 is supplied to the demodulation/decoding unit 15, restored to a digital video signal, an audio signal, and so on, and then output to the signal processing unit 16. The signal processing unit 16 performs predetermined digital signal processing on the digital video signal and audio signal supplied from the demodulation/decoding unit 15.
 The signal processing unit 16 outputs the digital video signal to the video processing unit 17 and the digital audio signal to the audio processing unit 18. The video processing unit 17 converts the digital video signal supplied from the signal processing unit 16 into R, G, and B signals, each having a number of bits (for example, 8 bits) that can be input to the video display panel 20 in the subsequent stage, and outputs them to the noise reduction processing unit 19.
 As will be described in detail later, the noise reduction processing unit 19 applies noise reduction processing to the R, G, and B signals supplied from the video processing unit 17 without changing their signal form or the number of bits of each signal (for example, 8 bits). The R, G, and B signals output from the noise reduction processing unit 19 are supplied to the video display panel 20 and used for video display.
 The audio processing unit 18 converts the input digital audio signal into an analog audio signal in a format that can be reproduced by the speaker 21 in the subsequent stage. The analog audio signal output from the audio processing unit 18 is supplied to the speaker 21 for audio reproduction.
 All operations of the digital television broadcast receiver 11, including the various receiving operations described above, are comprehensively controlled by the control unit 22. The control unit 22 incorporates a CPU (central processing unit) 22a and controls each unit so that the operation content is reflected, in response to operation information received from the operation unit 23 installed on the main body of the digital television broadcast receiver 11, or to operation information sent from the remote controller 24 and received by the receiving unit 25.
 In this case, the control unit 22 uses the memory unit 22b. The memory unit 22b mainly includes a ROM (read only memory) storing the control program executed by the CPU 22a, a RAM (random access memory) providing a work area to the CPU 22a, and a nonvolatile memory storing various setting information, control information, and the like.
 An HDD (hard disk drive) 26 is connected to the control unit 22. Based on user operation of the operation unit 23 or the remote controller 24, the control unit 22 can cause the recording/playback processing unit 27 to encrypt the digital video and audio signals obtained from the demodulation/decoding unit 15 and convert them into a predetermined recording format, after which they are supplied to the HDD 26 and recorded on the hard disk 26a.
 Furthermore, based on user operation of the operation unit 23 or the remote controller 24, the control unit 22 can cause the HDD 26 to read digital video and audio signals from the hard disk 26a, have them decoded by the recording/playback processing unit 27, and then supply them to the signal processing unit 16 so that they are used for the video display and audio reproduction described above.
 また、上記記録再生処理部27には、入力端子28が接続されている。この入力端子28は、デジタルテレビジョン放送受信装置11の外部からデジタルの映像信号及び音声信号を直接入力するためのものである。この入力端子28を介して入力されたデジタルの映像信号及び音声信号は、制御部22の制御に基づいて、記録再生処理部27を介した後、信号処理部16に供給されて、以後、上記した映像表示及び音声再生に供される。 Also, an input terminal 28 is connected to the recording / playback processing unit 27. The input terminal 28 is for directly inputting digital video signals and audio signals from the outside of the digital television broadcast receiver 11. The digital video signal and audio signal input through the input terminal 28 are supplied to the signal processing unit 16 through the recording / playback processing unit 27 based on the control of the control unit 22, and thereafter For video display and audio playback.
 Digital video and audio signals input via the input terminal 28 can also, under the control of the control unit 22, pass through the recording/playback processing unit 27 and be recorded on or reproduced from the hard disk 26a by the HDD 26.
 A network interface 29 is also connected to the control unit 22. The network interface 29 is connected to an external network 30, to which a network server 31 is connected for providing various services using communication over the network 30.
 Accordingly, based on user operation of the operation unit 23 or the remote controller 24, the control unit 22 can access the network server 31 via the network interface 29 and the network 30 to exchange information and thereby use the services provided there.
 FIG. 2 shows an example of the noise reduction processing unit 19 described above. The noise reduction processing unit 19 includes an input terminal 19a supplied with the R signal output from the video processing unit 17, an input terminal 19b supplied with the G signal output from the video processing unit 17, and an input terminal 19c supplied with the B signal output from the video processing unit 17.
 The R, G, and B signals supplied to the input terminals 19a, 19b, and 19c each have the same bit depth (for example, 8 bits) that can be input to the video display panel 20, and are input to a matrix processing unit 32 and an adaptive filter unit 33.
 The matrix processing unit 32 generates, from the input R, G, and B signals, a luminance signal Y of the same bit depth (for example, 8 bits) and outputs it to a video feature extraction unit 34. Based on the input luminance signal Y, the video feature extraction unit 34 generates a video feature signal corresponding to the feature amount (complexity) of the video making up one screen, and outputs it to the adaptive filter unit 33.
 Based on the video feature signal supplied from the video feature extraction unit 34, the adaptive filter unit 33 adaptively filters the input R, G, and B signals without changing their bit depth, thereby reducing noise components in the video displayed on the video display panel 20.
 In this case, the lower the video complexity indicated by the video feature signal, the stronger the filtering applied, enhancing the noise reduction effect. Conversely, the higher the indicated complexity, the weaker the filtering, so that the fineness of the displayed video is not impaired.
 The R, G, and B signals filtered by the adaptive filter unit 33 retain the bit depth (for example, 8 bits) they had when input to the adaptive filter unit 33, and are supplied to the video display panel 20 via output terminals 19d, 19e, and 19f for video display.
 FIG. 3 is a flowchart summarizing the main processing operations of the noise reduction processing unit 19 described above. When processing starts (step S1), in step S2 the matrix processing unit 32 generates, from the R, G, and B signals supplied to the input terminals 19a, 19b, and 19c, a luminance signal Y of the same bit depth (for example, 8 bits).
 In step S3, the video feature extraction unit 34 generates, based on the luminance signal Y generated by the matrix processing unit 32, a video feature signal corresponding to the complexity of the video making up one screen.
 Then, in step S4, the adaptive filter unit 33 adaptively filters the R, G, and B signals supplied to the input terminals 19a, 19b, and 19c, with a strength based on the video feature signal generated by the video feature extraction unit 34, thereby reducing noise components in the displayed video, and processing ends (step S5).
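 The flow of steps S2 through S4 can be sketched as a toy per-frame routine. Everything here beyond the Y = R + 2*G + B calculation is an illustrative stand-in: the spread-based complexity measure, the blend-toward-mean filter, and all function names are assumptions for demonstration, not taken from the patent.

```python
def make_luma(r, g, b):
    # Step S2: simplified luminance from co-located R, G, B samples
    # (divided by 4 with a shift to stay in the 8-bit range)
    return (r + 2 * g + b) >> 2

def complexity(luma_values):
    # Step S3 stand-in: spread of luminance as a crude complexity measure
    return max(luma_values) - min(luma_values)

def noise_reduce(rgb_pixels):
    # Step S4 stand-in: blend each pixel toward the frame mean, more
    # strongly when the video is flat (low complexity). The patent's
    # actual adaptive filter (FIG. 4) is neighborhood-based.
    luma = [make_luma(r, g, b) for r, g, b in rgb_pixels]
    strength = 1.0 - complexity(luma) / 255.0  # low complexity -> strong
    means = [sum(ch) / len(rgb_pixels) for ch in zip(*rgb_pixels)]
    return [tuple(round(v + strength * (m - v)) for v, m in zip(px, means))
            for px in rgb_pixels]
```

Note that a maximally complex frame (strength 0) passes through unchanged, while a perfectly flat frame is filtered fully, mirroring the strong/weak behavior described above.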
 According to the noise reduction processing unit 19 of the embodiment described above, the input R, G, and B signals are filtered with a strength based on the video feature signal, without changing their bit depth. That is, predetermined signal processing is applied directly to the input R, G, and B signals, without converting them into other signal forms such as a luminance signal Y and color difference signals Cb and Cr. This makes it possible to add a necessary function while also preventing degradation of displayed video quality.
 When a luminance signal Y is generated from R, G, and B signals, the calculation

   Y = 0.299*R + 0.587*G + 0.114*B

is normally required. In the noise reduction processing unit 19 described above, however, even when the matrix processing unit 32 generates the luminance signal Y from the R, G, and B signals by the simpler calculation

   Y = R + 2*G + B

the resulting approximate luminance signal Y allows the video feature extraction unit 34 to extract the video feature amount with the required accuracy.
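 As a rough check of this approximation, the following sketch compares the two calculations. Dividing the simplified sum by 4 (a single bit shift) keeps the result in the 8-bit range, giving effective weights of 0.25, 0.5, and 0.25 against the standard 0.299, 0.587, and 0.114. The function names are illustrative, not from the patent.

```python
def luma_standard(r, g, b):
    """The usual luminance calculation (floating point multiplies)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_approx(r, g, b):
    """The patent's simplified luminance: integer adds and one shift."""
    return (r + 2 * g + b) >> 2  # divide by 4 to stay within 8 bits
```

The approximation needs no multipliers, which matters if the matrix processing unit 32 is implemented in hardware, and its error is small enough for complexity estimation even though it is too coarse for display-quality color conversion.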
 FIG. 4 shows an example of the video feature extraction unit 34 and the adaptive filter unit 33 that make up the noise reduction processing unit 19. First, the video feature extraction unit 34 includes an input terminal 34a supplied with the luminance signal Y output from the matrix processing unit 32. The luminance signal Y supplied to the input terminal 34a is supplied to a surrounding pixel similarity determination unit 34b and a flatness determination unit 34c.
 The surrounding pixel similarity determination unit 34b determines the similarity between a pixel to be processed and its surrounding pixels, and outputs a signal indicating the surrounding pixel similarity, which is the determination result, to a surrounding pixel filter coefficient generation unit 34e. In this case, the surrounding pixel similarity determination unit 34b computes the similarity between a block of 7 horizontal × 7 vertical pixels centered on the pixel to be processed and a block of 7 horizontal × 7 vertical pixels shifted from it by α pixels horizontally and β pixels vertically.
 The surrounding pixel similarity determination unit 34b computes the similarity between the block centered on the pixel to be processed and each of the 24 blocks obtained from the combinations of α ∈ {-2, -1, 0, +1, +2} and β ∈ {-2, -1, 0, +1, +2} (the combination α = 0, β = 0 is excluded, since it is the block centered on the pixel itself). This yields similarities over a block of 5 horizontal × 5 vertical pixels centered on the pixel to be processed, which are used as the peripheral pixel similarity.
 The flatness determination unit 34c generates a flatness detection signal corresponding to the complexity of the video by obtaining the amount of variation within the block of 7 horizontal × 7 vertical pixels centered on the pixel to be processed, and outputs it to the surrounding pixel filter coefficient generation unit 34e.
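 The two determinations can be sketched as follows. The patent does not specify the similarity or variation metric, so this sketch assumes a sum-of-absolute-differences (SAD) similarity and a max-minus-min variation; the luminance frame y is assumed to be a list of rows, and all names are illustrative.

```python
def block(y, cx, cy, half=3):
    """Extract a 7x7 patch of luminance values centered on (cx, cy)."""
    return [y[cy + dy][cx + dx] for dy in range(-half, half + 1)
                                for dx in range(-half, half + 1)]

def block_similarity(y, cx, cy, alpha, beta):
    """Similarity between the center block and the block shifted by
    (alpha, beta): smaller SAD means more similar. The mapping of SAD
    to a similarity score is an assumption (1.0 = identical blocks)."""
    ref = block(y, cx, cy)
    cmp_ = block(y, cx + alpha, cy + beta)
    sad = sum(abs(a - b) for a, b in zip(ref, cmp_))
    return 1.0 / (1.0 + sad)

def surrounding_similarities(y, cx, cy):
    """Similarities for the 24 offsets alpha, beta in {-2..+2}, excluding
    (0, 0), i.e. over the 5x5 neighborhood of the processed pixel."""
    return {(a, b): block_similarity(y, cx, cy, a, b)
            for a in range(-2, 3) for b in range(-2, 3) if (a, b) != (0, 0)}

def flatness(y, cx, cy):
    """Variation of the 7x7 block; a smaller spread means flatter video."""
    patch = block(y, cx, cy)
    return max(patch) - min(patch)
```

On a perfectly flat frame every offset block is identical, so all 24 similarities are 1.0 and the flatness variation is 0, which would drive the coefficient generation toward the strongest filtering.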
 Based on the signal indicating the surrounding pixel similarity and the flatness detection signal, the surrounding pixel filter coefficient generation unit 34e generates surrounding pixel weighting coefficients such that the higher the similarity of the surrounding pixels and the higher the flatness (that is, the lower the video complexity), the stronger the filtering applied by the adaptive filter unit 33 in the subsequent stage. These surrounding pixel weighting coefficients are output from an output terminal 34e as the video feature signal.
 Meanwhile, the adaptive filter unit 33 includes an input terminal 33a supplied with the surrounding pixel weighting coefficients output from the video feature extraction unit 34, and input terminals 33b, 33c, and 33d correspondingly supplied with the R, G, and B signals output from the video processing unit 17. The R, G, and B signals supplied to the input terminals 33b, 33c, and 33d are input to correspondingly provided surrounding pixel weighted addition units 33e, 33f, and 33g, respectively.
 The surrounding pixel weighting coefficients supplied to the input terminal 33a are input to each of the surrounding pixel weighted addition units 33e, 33f, and 33g and to a weighting coefficient sum calculation unit 33h.
 The surrounding pixel weighted addition unit 33e applies weighted addition to the input R signal based on the input surrounding pixel weighting coefficients, and outputs a signal indicating the addition result to a division processing unit 33i. Similarly, the surrounding pixel weighted addition unit 33f applies weighted addition to the input G signal and outputs a signal indicating the addition result to a division processing unit 33j, and the surrounding pixel weighted addition unit 33g applies weighted addition to the input B signal and outputs a signal indicating the addition result to a division processing unit 33k.
 The weighting coefficient sum calculation unit 33h calculates the sum of the input surrounding pixel weighting coefficients and outputs the calculation result to each of the division processing units 33i, 33j, and 33k.
 The division processing unit 33i divides the addition result of the surrounding pixel weighted addition unit 33e by the output of the weighting coefficient sum calculation unit 33h, generating a weighted-average R signal that is supplied to the video display panel 20 via an output terminal 33l. Likewise, the division processing unit 33j divides the addition result of the surrounding pixel weighted addition unit 33f by the same sum, generating a weighted-average G signal supplied via an output terminal 33m, and the division processing unit 33k divides the addition result of the surrounding pixel weighted addition unit 33g by the same sum, generating a weighted-average B signal supplied via an output terminal 33n.
 In this way, R, G, and B signals with reduced noise can be obtained by adaptively filtering the input R, G, and B signals according to the video complexity obtained from the luminance signal Y, without converting them into other signal forms such as a luminance signal Y and color difference signals Cb and Cr.
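 Taken together, the weighted addition units 33e to 33g, the coefficient sum unit 33h, and the division units 33i to 33k compute a per-channel weighted average over the neighborhood. A minimal sketch, assuming the weighting coefficients and the co-located R, G, B neighborhood samples are already available as flat lists (the function names are illustrative):

```python
def weighted_average_channel(samples, weights):
    """One channel's path: weighted addition (33e/f/g), coefficient sum
    (33h), then division (33i/j/k)."""
    num = sum(s * w for s, w in zip(samples, weights))
    den = sum(weights)
    return num / den

def filter_pixel(r_samples, g_samples, b_samples, weights):
    """Apply the same weights to each of R, G, and B, so the bit depth and
    the color balance of the pixel are preserved."""
    return (round(weighted_average_channel(r_samples, weights)),
            round(weighted_average_channel(g_samples, weights)),
            round(weighted_average_channel(b_samples, weights)))
```

With all weights equal this reduces to a box blur (strong filtering); with the weight concentrated on the center sample the output approaches the input, matching the weak filtering applied to complex video.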
 In the embodiment described above, the adaptive filter unit 33 applies noise reduction processing to the R, G, and B signals; however, the invention is not limited to this, and can be used to apply various kinds of signal processing to the R, G, and B signals in order to realize a necessary function.
 The invention is not limited to the embodiment described above as it is; at the implementation stage, the constituent elements can be variously modified and embodied without departing from the gist of the invention. Various inventions can also be formed by appropriately combining the plurality of constituent elements disclosed in the embodiment. For example, some constituent elements may be deleted from all of the constituent elements shown in the embodiment, and constituent elements of different embodiments may be combined as appropriate.

Claims (9)

  1.  A video signal processing device comprising:
      a generation unit that generates a luminance signal based on R, G, and B signals;
      an extraction unit that extracts a feature amount of a video based on the luminance signal; and
      a processing unit that applies, to the R, G, and B signals, predetermined signal processing corresponding to the feature amount.
  2.  The video signal processing device according to claim 1, wherein the generation unit generates the luminance signal Y from the R, G, and B signals by performing the calculation
        Y = R + 2*G + B.
  3.  The video signal processing device according to claim 1, wherein the extraction unit generates, based on the luminance signal, a signal corresponding to the complexity of the video.
  4.  The video signal processing device according to claim 1, wherein the processing unit is a filter that applies filtering for noise reduction to the R, G, and B signals.
  5.  The video signal processing device according to claim 1, wherein the extraction unit generates, based on the luminance signal, a signal corresponding to the complexity of the video, and the processing unit applies, to the R, G, and B signals, filtering of a strength corresponding to the complexity.
  6.  The video signal processing device according to claim 5, wherein the processing unit applies stronger filtering as the complexity of the video is lower.
  7.  The video signal processing device according to claim 5, wherein the extraction unit comprises:
      a first determination unit that determines the similarity between a pixel to be processed and peripheral pixels of the pixel;
      a second determination unit that determines the flatness of a block of a plurality of pixels centered on the pixel to be processed; and
      a coefficient generation unit that generates coefficients that cause the processing unit to apply stronger filtering as the similarity is higher and as the flatness is higher.
  8.  The video signal processing device according to claim 7, wherein the processing unit comprises:
      an addition unit that applies weighted addition with the coefficients to each of the R, G, and B signals; and
      a division unit that divides each of the R, G, and B signals output from the addition unit by the sum of the coefficients.
  9.  A video signal processing method comprising:
      generating a luminance signal based on R, G, and B signals;
      extracting a feature amount of a video based on the luminance signal; and
      applying, to the R, G, and B signals, predetermined signal processing corresponding to the feature amount.
PCT/JP2011/067312 2011-07-28 2011-07-28 Video signal processing device and video signal processing method WO2013014791A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/067312 WO2013014791A1 (en) 2011-07-28 2011-07-28 Video signal processing device and video signal processing method


Publications (1)

Publication Number Publication Date
WO2013014791A1 true WO2013014791A1 (en) 2013-01-31

Family

ID=47600677

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/067312 WO2013014791A1 (en) 2011-07-28 2011-07-28 Video signal processing device and video signal processing method

Country Status (1)

Country Link
WO (1) WO2013014791A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003101815A (en) * 2001-09-26 2003-04-04 Fuji Photo Film Co Ltd Signal processor and method for processing signal
JP2005026962A (en) * 2003-07-01 2005-01-27 Nikon Corp Signal processing apparatus, signal processing program, and electronic camera
JP2010114667A (en) * 2008-11-06 2010-05-20 Toshiba Corp Automatic white balance adjustment system
JP2010153969A (en) * 2008-12-24 2010-07-08 Sanyo Electric Co Ltd Imaging apparatus



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11870043

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11870043

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP