CN101668149B - Image processing apparatus, image processing method and image display system - Google Patents


Info

Publication number
CN101668149B
Authority
CN
China
Prior art keywords
display
image
signal
frame
section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200910168321
Other languages
Chinese (zh)
Other versions
CN101668149A (en)
Inventor
Yoshihiko Kuroki (黑木义彦)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2004244641A external-priority patent/JP2006078505A/en
Application filed by Sony Corp filed Critical Sony Corp
Publication of CN101668149A publication Critical patent/CN101668149A/en
Application granted granted Critical
Publication of CN101668149B publication Critical patent/CN101668149B/en

Landscapes

  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Liquid Crystal Display Device Control (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The present invention relates to a display apparatus and method for presenting a moving image with less degradation to an observer (a person viewing the displayed moving image), on the basis of human visual characteristics and without unnecessarily increasing the frame rate. A control signal generation section 125 and data line driving circuits 133-1 to 133-4 control display so that a moving image of 105 or more frames/sec is displayed on an LCD 131. The LCD 131 displays the moving image of 105 or more frames/sec under the control of the control signal generation section 125 and the data line driving circuits 133-1 to 133-4. In the LCD 131, the display of each pixel on the screen is maintained during each frame period. The present invention can be applied to image display systems.

Description

Image processing apparatus, image processing method, and display apparatus
This application is a divisional of the invention patent application filed on June 9, 2005, with application number 200580001091.X and entitled "Display Apparatus and Method".
Technical Field
The present invention relates to a display apparatus and method, and more particularly, to a display apparatus and method suitable for displaying moving images. In particular, the present invention relates to an image processing apparatus, an image processing method, and a display apparatus.
Background
There is a need to improve the quality of displayed images by improving signal processing techniques and driving techniques for image display devices.
In general, the image quality of an image can be improved by increasing its resolution and thereby smoothing its texture. The amount of information in an image is expressed in units of pixels, the points (dots) constituting the image. The number of pixels of an image is expressed in terms of its horizontal and vertical point counts, e.g., 800 × 600 or 1024 × 768. The larger the number of pixels (dots), the smoother the image texture and the larger the amount of information constituting the image.
To display an image at high resolution, there is a technique (see, for example, Patent Document 1) of displaying an image at twice the resolution in a multiple mode by using two displays 1 and 2: in a normal single mode, display 1 alone displays the image, while in the multiple mode displays 1 and 2 display the left half and the right half of the image, respectively.
[Patent Document 1] Japanese Patent Application Publication No. Hei 10-124024
If an image is displayed at an increased resolution, the amount of information constituting the image increases, so the amount of data to be transmitted to displays 1 and 2 increases and the data transmission rate would need to be raised. The system is therefore configured to transmit the image data without increasing the data transmission rate, by reducing the data amount of each point of displays 1 and 2 and converting the reduced data through signal processing.
In addition, in particular, the image quality of a moving image can be improved by increasing the frame rate, which is the number of times the screen is updated per second.
For example, when a moving image is projected and displayed on a screen by a projector, the projector displays each frame image line by line by horizontal scanning, and after all lines of one frame have been scanned, it starts scanning the image data of the subsequent frame, thereby displaying the moving image.
Disclosure of Invention
[ problems to be solved by the invention ]
As described above, in particular, the image quality of a moving image can be improved by increasing the frame rate. However, in order to perform display processing according to a high frame rate, it is necessary to increase the processing speed of a driving circuit for driving the display device, and further, it is necessary to increase the reaction speed of a light amount modulation element that determines the image intensity. This method is technically difficult and leads to increased costs.
Although it is known that the image quality of a moving image can be improved by increasing the frame rate, the relationship between the frame rate and moving image quality at higher frame rates has scarcely been studied in practice. It is therefore not clear whether moving image quality can be improved without limit by increasing the frame rate without limit.
Naturally, the relationship between the frame rate and moving image quality at higher frame rates has not been understood quantitatively either.
Therefore, the present inventors paid attention to the frame rate of the next-generation digital cinema format, and studied the requirements imposed on it by human visual characteristics.
It was previously thought that the speed of the tracking eye movement known as smooth pursuit coincides with the speed of the visual target. Westheimer stated that the eye moves at the same speed as a visual target moving at up to 30 degrees/sec (Westheimer, G., A.M.A. Arch. Ophthal. 52, pp. 932-941, 1954).
However, later studies demonstrated that the speed of the tracking eye movement is in almost all cases lower than that of the visual target. Meyer et al. stated that the tracking speed of the eye is about 0.87 times the speed of the visual target (Meyer, C. H. et al., Vision Res., Vol. 25, No. 4, pp. 561-563, 1985).
Although Meyer reported a maximum tracking speed of 100 degrees/sec, he notes that this figure was obtained from skilled test subjects and that ordinary subjects cannot reach it. Moreover, the experimental condition was a visual distance of 80 cm, which is very different from the visual environment of a movie theater; the visual target was a spot of light moved by a galvanometer; and Meyer does not discuss the spatial frequency of the visual target.
In Japan, there is an NHK report discussing frame rates (Yasushi Tadokoro et al., NHK Technical Report, September (1968), pp. 422-426, 1968), but the report assumes a visual distance of 7H (H: screen height) and a 14-inch monitor with a maximum luminance of 30 fL (102.78 cd/m²), and again does not consider theater conditions. In addition, the report concludes that a field frequency of 60 Hz or higher is unnecessary, since large motion does not occur in typical content. The experimental conditions of Miyahara's study of dynamic visual acuity with a vibrating visual target were a 14-inch monitor, a visual distance of 4H, and a maximum luminance of 400 cd/m². Experiments on visual characteristics have thus mainly been conducted under visual environments with relatively short distances and relatively high luminances.
The inventors therefore experimentally studied the dynamic spatial frequency characteristics of the eye under the visual environment of a movie theater, i.e., a maximum luminance of 40 cd/m² and a visual distance of 5 to 15 m. Studying the moving image quality that depends on such dynamic spatial frequency characteristics is important, because such studies prompt a reconsideration of the frame rates adopted by conventional formats.
In the course of this study, the present inventors actually examined the relationship between the frame rate and moving image quality at higher frame rates, and confirmed the corresponding human visual characteristics.
The present invention has been made in view of the above circumstances, and aims to make it possible to present a less degraded moving image to an observer, who is a person viewing a displayed moving image, on the basis of human visual characteristics and without unnecessarily increasing the frame rate.
[ means for solving the problem ]
In one aspect of the present invention, there is provided an image processing apparatus comprising: a control section for controlling n data line driving circuits included in a display device, the data line driving circuits driving n sub-pixels; wherein the control section causes a moving image to be displayed such that frame information of a moving image having a first frame rate is displayed at a second frame rate that is 1/n of the first frame rate, and the n sub-pixels included in each pixel are sequentially driven with a time shift of 1/n of the display period of a single frame at the second frame rate.
In another aspect of the present invention, there is provided an image processing method comprising: controlling n data line driving circuits included in a display device to drive n sub-pixels so as to display a moving image such that frame information of a moving image having a first frame rate is displayed at a second frame rate that is 1/n of the first frame rate, and the n sub-pixels included in each pixel are sequentially driven with a time shift of 1/n of the display period of a single frame at the second frame rate.
In another aspect of the present invention, there is provided a software product for processing an image, the software product causing a processor to execute processing comprising: controlling n data line driving circuits included in a display device to drive n sub-pixels so as to display a moving image such that frame information of a moving image having a first frame rate is displayed at a second frame rate that is 1/n of the first frame rate, and the n sub-pixels included in each pixel are sequentially driven with a time shift of 1/n of the display period of a single frame at the second frame rate.
In another aspect of the present invention, there is provided a display device comprising: a display section for displaying a moving image, the display section including n data line driving circuits for driving the respective sub-pixels; and a control section for controlling the n data line driving circuits included in the display section; wherein the control section causes the moving image to be displayed such that frame information of a moving image having a first frame rate is displayed at a second frame rate that is 1/n of the first frame rate, and the n sub-pixels included in each pixel are sequentially driven with a time shift of 1/n of the display period of a single frame at the second frame rate.
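As a rough sketch of the timing this driving scheme implies (not the patent's own implementation; the function name and numeric example are hypothetical), the following Python fragment computes when each of the n sub-pixels of one pixel is driven: each driving circuit runs at the second frame rate, and the circuits are staggered by 1/n of one second-rate frame period, so together the n sub-pixels carry the first-rate frame information.

```python
# Sketch of the staggered sub-pixel driving described above (hypothetical names).

def subpixel_schedule(num_input_frames: int, first_rate: float, n: int):
    """Yield (input_frame, circuit, drive_time): input frame j*n + k is shown
    by sub-pixel/circuit k, shifted by k/n of one second-rate frame period."""
    second_period = n / first_rate            # display period at the second rate
    for j in range(num_input_frames // n):    # j indexes second-rate frames
        for k in range(n):                    # k indexes sub-pixels / circuits
            yield j * n + k, k, j * second_period + k * (second_period / n)

for frame, circuit, t in subpixel_schedule(8, 240.0, 4):
    print(f"input frame {frame} -> circuit {circuit} at t = {t:.6f} s")
# Each circuit runs at 60 frames/sec; the four staggered circuits together
# present 240 frames/sec of temporal information.
```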
A first display device of the present invention is characterized by comprising: display control means for controlling display so as to cause the display means to display a moving image made up of not less than 105 frames/sec; and a display section for displaying a moving image composed of not less than 105 frames/second based on the control of the display control section, wherein the display of each pixel on the screen is maintained during each frame period.
The display control means controls the display so as to cause the display means to display the moving image made up of not less than 230 frames/sec, and the display means is capable of displaying the moving image made up of not less than 230 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display a moving image made up of not more than 480 frames/sec, and the display means is capable of displaying the moving image made up of not more than 480 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 120 frames/sec, and the display means is capable of displaying the moving image made up of 120 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 240 frames/sec, and the display means is capable of displaying the moving image made up of 240 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 250 frames/sec, and the display means is capable of displaying the moving image made up of 250 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 360 frames/sec, and the display means is capable of displaying the moving image made up of 360 frames/sec based on the control of the display control means.
A first display method of the present invention is a display method for a display device equipped with a display section in which display of each pixel on a screen is maintained during each frame period, and is characterized by comprising a display control step of: the display is controlled so that the display section displays a moving image made up of not less than 105 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of not less than 230 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of not more than 480 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 120 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 240 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 250 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 360 frames/sec.
The second display device of the present invention is characterized by comprising: display control means for controlling display so as to cause the display means to display a moving image made up of not less than 105 frames/sec; and a display section for displaying a moving image composed of not less than 105 frames/sec based on control by the display control section, the display section being matrix-driven.
The display control means controls the display so as to cause the display means to display the moving image made up of not less than 230 frames/sec, and the display means is capable of displaying the moving image made up of not less than 230 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display a moving image made up of not more than 480 frames/sec, and the display means is capable of displaying the moving image made up of not more than 480 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 120 frames/sec, and the display means is capable of displaying the moving image made up of 120 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 240 frames/sec, and the display means is capable of displaying the moving image made up of 240 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 250 frames/sec, and the display means is capable of displaying the moving image made up of 250 frames/sec based on the control of the display control means.
The display control means controls the display so as to cause the display means to display the moving image made up of 360 frames/sec, and the display means is capable of displaying the moving image made up of 360 frames/sec based on the control of the display control means.
A second display method of the present invention is a display method for a display device equipped with a matrix-driven display section, and is characterized by comprising a display control step of: the display is controlled so that the display section displays a moving image made up of not less than 105 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of not less than 230 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of not more than 480 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 120 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 240 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 250 frames/sec.
In the display control step, the display is controlled so that the display section displays a moving image composed of 360 frames/sec.
In the first display apparatus and the first display method according to the present invention, the display is controlled so as to cause the display section in which the display of each pixel on the screen is maintained during each frame period to display a moving image composed of not less than 105 frames/sec.
In the second display apparatus and the second display method according to the present invention, the display is controlled so that the matrix-driven display section displays a moving image composed of not less than 105 frames/sec.
[ advantages of the invention ]
As described above, according to the present invention, it is possible to present a less deteriorated moving image to an observer, who is a person viewing a displayed moving image, on the basis of human visual characteristics and without unnecessarily increasing the frame rate.
Drawings
Fig. 1 is a block diagram for explaining a first configuration example of an image display system to which the present invention is applied.
Fig. 2 is a diagram for explaining timings of an input video signal and an output video signal.
Fig. 3 is a view for explaining a configuration example of the image display device shown in fig. 1.
Fig. 4 is a diagram for explaining an update rate of an edge portion of a moving image displayed on the image display apparatus shown in fig. 3.
Fig. 5 is a flowchart for explaining display control processing 1 to be executed by the image display system shown in fig. 1.
Fig. 6 is a block diagram for explaining a second configuration example of an image display system to which the present invention is applied.
Fig. 7 is a flowchart for explaining display control processing 2 to be executed by the image display system shown in fig. 6.
Fig. 8 is a diagram for explaining timings of an input video signal and an output video signal.
Fig. 9 is a view showing an example of an actual scene in which a moving object and a stationary object coexist together.
Fig. 10 is a view for explaining a gaze condition.
Fig. 11 is a view for explaining a tracking condition.
Fig. 12 is a view for explaining the identification of the observer during tracking and gaze fixation.
Fig. 13 is a view for explaining recognition of an observer under an image capturing condition, under a display condition, and under an observation condition.
Fig. 14 is a view for explaining a strobe artifact.
Fig. 15 is a view for explaining viewer recognition at a high frame rate under image capturing conditions, under display conditions, and under observation conditions.
Fig. 16 is a view for explaining a result of evaluating moving image quality in terms of judder (jitter).
Fig. 17 is a view for explaining a result of evaluating moving image quality in accordance with motion blur.
Fig. 18 is a view for explaining a configuration example of a projector and a screen, in which n is a number other than 2.
Fig. 19 is a view for explaining timings of an input video signal and an output video signal, where m is 240 and n is 4.
Fig. 20 is a view for explaining timings of an input video signal and an output video signal, where m is 250 and n is 5.
Fig. 21 is a diagram for explaining the configuration of an image display system 101 using an LCD.
[ description of reference numerals ]
An image display system, 11.. image signal conversion means, 12.. image display means, 21.. A/D conversion section, 22.. synchronous signal detection section, 23.. frame memory, 24.. controller, 25.. D/A conversion section, 27.. display control section, 41.. scan control section, 43.. display section, 71.. image display system, 81.. image signal conversion section, 91.. data separation section, 92.. data storage section, 94.. controller, 93.. frame memory, 101.. image display system, 111.. signal processing section, 112.. clock/sampling pulse generation section, 113.. image display means, 121.. Y/C separation/chroma decoding section, A/D conversion section, 124.. sync signal detection section, 125.. control signal generation section, 131.. LCD, 133-1 to 133-4.. data line driving circuit, 134.. gate line driving section, 141.. liquid crystal element, 142.. TFT, 143.. capacitor.
Detailed Description
Embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a block diagram showing the configuration of an image display system 1 to which the present invention is applied. The image display system 1 includes an image signal conversion apparatus 11 and an image display apparatus 12. The image display system 1 is configured to be supplied with an analog image signal corresponding to a moving image, process the image signal in the image signal conversion device 11, and supply the processed image signal to the image display device 12 so as to display the moving image.
The analog image signal supplied to the image signal conversion device 11 is supplied to the A/D conversion section 21 and the synchronization signal detection section 22.
The A/D conversion section 21 converts the analog image signal having a frame rate m into a digital image signal, and supplies the digital image signal to the frame memory 23. The synchronization signal detection section 22 detects the frame rate and dot clock of the image signal, generates a vertical synchronization signal and a dot clock signal, and supplies them to the controller 24. The dot clock is the inverse of the time required to display one dot on the display.
The controller 24 is supplied with the vertical synchronization signal and the dot clock signal from the synchronization signal detection section 22, controls the video signal output from the frame memory 23, and supplies information associated with that output to the display control section 27. The frame memory 23 outputs the supplied digital image signal to the D/A conversion section 25-1 or the D/A conversion section 25-2 under the control of the controller 24.
The video signal input and output of the frame memory 23 under the control of the controller 24 will be described below with reference to fig. 2.
Let m denote the frame rate of the input video signal S1 input to the frame memory 23, and assume that the frames sequentially input to the frame memory 23 are the α frame, the α+1 frame, the α+2 frame, and so on. When the α frame and the α+1 frame are sequentially input to the frame memory 23, the controller 24 controls the frame memory 23 so as to output the α frame to the D/A conversion section 25-1 as the output video signal S2 at a frame rate equal to 1/2 the frame rate of the input video signal S1, and to output the α+1 frame to the D/A conversion section 25-2 as the output video signal S3 at a supply start time b delayed by 1/m from the supply start time a of the α frame.
The time period taken to supply the α frame to the D/A conversion section 25-1 is 2/m, and the supply end time c is 1/m later than the supply start time b of the α+1 frame to the D/A conversion section 25-2. After the α+1 frame, the α+2 frame and the α+3 frame are sequentially input to the frame memory 23, and the controller 24 controls the frame memory 23 so that the α+2 frame is supplied as the output video signal S2 to the D/A conversion section 25-1, continuing from the supply of the α frame (i.e., at the supply start time c), at a frame rate equal to 1/2 the frame rate of the input video signal S1. Similarly, the controller 24 supplies the α+3 frame as the output video signal S3 to the D/A conversion section 25-2 at the supply start time d, which is delayed by 1/m from the supply start time c of the α+2 frame and is equal to the supply end time of the α+1 frame.
The supply timing offset between the output video signal S2 and the output video signal S3 is determined by the vertical synchronization signal of the input video signal S1. More specifically, as shown in fig. 2, the period between the supply start time a of the output video signal S2 and the supply start time b of the output video signal S3 is equal to one frame period of the input video signal S1. Based on the vertical synchronization signal supplied from the synchronization signal detection section 22, the controller 24 controls the supply timing of the output video signal S2 to the D/A conversion section 25-1 and the supply timing of the output video signal S3 to the D/A conversion section 25-2.
Accordingly, the controller 24 controls the frame memory 23 so as to supply the output video signal S2 and the output video signal S3 alternately, frame by frame, to the D/A conversion section 25-1 and the D/A conversion section 25-2, respectively, at a frame rate m/2 equal to 1/2 the frame rate m of the input video signal S1, in such a manner that the supply start time of each frame of one output video signal is shifted by half (1/m) of the one-frame supply time (2/m) with respect to the supply start time of the corresponding frame of the other.
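The timing just described can be summarized in a short sketch. The following Python fragment is a minimal model of the Fig. 2 schedule (hypothetical names, not the controller's actual logic): even input frames go to S2 and odd frames to S3, each is held for 2/m seconds, and consecutive frames start 1/m apart.

```python
# Minimal model of the Fig. 2 demultiplexing (hypothetical names).

def supply_schedule(num_frames: int, m: float):
    """Yield (frame, output, start, end) for input frames 0, 1, 2, ...
    Each output frame occupies 2/m seconds; starts are staggered by 1/m."""
    out_period = 2.0 / m
    for i in range(num_frames):
        output = "S2" if i % 2 == 0 else "S3"     # even -> S2, odd -> S3
        start = (i // 2) * out_period + (i % 2) / m
        yield i, output, start, start + out_period

for frame, out, start, end in supply_schedule(4, 300.0):
    print(f"frame {frame}: {out} from {start:.6f} to {end:.6f} s")
# frame 0 on S2 spans [0, 2/300); frame 1 on S3 starts 1/300 s later, so
# the display receives new frame information every 1/300 s.
```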
Returning to the description of the image display system 1 shown in fig. 1.
The D/A conversion section 25-1 converts the supplied digital image signal into an analog image signal, and supplies the analog image signal to the scan control section 41-1 of the image display apparatus 12. The D/A conversion section 25-2 likewise converts the supplied digital image signal into an analog image signal, and supplies it to the scan control section 41-2 of the image display apparatus 12.
Based on the information supplied from the controller 24, the display control section 27 controls the moving image display of the image display apparatus 12 so as to display frame images corresponding to the output video signals S2 and S3 at a timing similar to that described above with reference to fig. 2.
As described above with reference to fig. 2, the frame rate of each of the output video signals S2 and S3 is equal to 1/2 the frame rate of the input video signal S1; more specifically, the dot clock of each of the output video signals S2 and S3 is equal to 1/2 the dot clock of the input video signal S1. Based on the information associated with the video signal output from the frame memory 23 and supplied from the controller 24, the display control section 27 performs control so that the dot clock of each of the output video signals S2 and S3 displayed on the image display device 12 becomes equal to 1/2 the dot clock of the input video signal S1.
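As a quick numerical illustration of the dot-clock relationship (assumed resolution and rate; a real dot clock would also cover blanking intervals, which are ignored here):

```python
# Illustrative dot-clock arithmetic for the two-way split (assumed numbers;
# real dot clocks also include horizontal/vertical blanking).
width, height, m = 1024, 768, 300            # input S1: 300 frames/sec
input_dot_clock = width * height * m         # dots drawn per second on S1
output_dot_clock = input_dot_clock // 2      # S2 and S3 each run at m/2
print(input_dot_clock, output_dot_clock)     # 235929600 117964800
```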
The drive 28 may be connected to the controller 24 as needed. A magnetic disk 31, an optical disk 32, a magneto-optical disk 33, or a semiconductor memory 34 is mounted in the drive 28 so as to transmit and receive information.
The image display apparatus 12 is supplied with the two lines of analog video signals converted by the image signal conversion apparatus 11, and displays a moving image on the display section 43, under the control of the display control section 27, by using the scan control section 41-1 and the scan control section 41-2.
The scan control section 41-1 is supplied with the analog video signal corresponding to the output video signal S2, which is read from the frame memory 23 at the timing described above with reference to fig. 2 and converted into an analog signal by the D/A conversion section 25-1. Similarly, the scan control section 41-2 is supplied with the analog video signal corresponding to the output video signal S3, which is read from the frame memory 23 at the timing described above with reference to fig. 2 and converted into an analog signal by the D/A conversion section 25-2.
The scan control section 41-1 and the scan control section 41-2 display the respective supplied analog video signals on the display section 43 by a dot-sequential or line-sequential scanning method. By alternately scanning consecutive frames while shifting the scan start time of each frame by 1/2 of a frame period with respect to the preceding frame, the two scan control sections can display images on the display section 43 at a frame rate twice as high as the frame rate at which either scan control section alone draws images.
The image display device 12 may be configured not only as a single device but also as an image display system constituted by a plurality of devices. If the image display device 12 is configured as an image display system, for example, as shown in fig. 3, the image display system may be constituted by a projector 51-1, a projector 51-2, and a screen 52.
The specific operation of the image display apparatus 12 will be described below with reference to an example using the projector 51-1, the projector 51-2, and the screen 52 shown in fig. 3. The projector 51-1 corresponds to the scan control section 41-1 in fig. 1, the projector 51-2 corresponds to the scan control section 41-2 in fig. 1, and the screen 52 corresponds to the display section 43 in fig. 1.
For example, the projector 51-1 is supplied with the analog video signal corresponding to the output video signal S2, which is read from the frame memory 23 at the timing described above with reference to fig. 2 and converted into an analog signal by the D/A conversion section 25-1. Similarly, the projector 51-2 is supplied with the analog video signal corresponding to the output video signal S3, which is read from the frame memory 23 at the timing described above with reference to fig. 2 and converted into an analog signal by the D/A conversion section 25-2.
At a timing based on the control of the display control section 27, each of the projector 51-1 and the projector 51-2 displays a frame image corresponding to the supplied video signal by scanning the screen 52 in the horizontal direction from the pixel (X, Y) = (0, 0) to the pixel (X, Y) = (p, q), where the range from (0, 0) to (p, q) forms the image to be displayed. The frame rate of the frame images displayed by each of the projector 51-1 and the projector 51-2 is m/2. As with the output video signals S2 and S3 described above with reference to fig. 2, the scanning start time of each frame displayed by one of the projectors 51-1 and 51-2 is shifted by 1/2 of a display frame with respect to the other, so the phase difference between their scans is 1/m.
For example, when the projector 51-2 scans a line on the screen 52 corresponding to the α+1 frame at the position represented by scan B, the projector 51-1 scans a line corresponding to the α+2 frame at the position represented by scan A. The line represented by scan B is offset from the line represented by scan A by 1/2 of the number of lines in one frame. More specifically, the moving image displayed on the screen 52 is alternately rewritten by scan A and scan B at time intervals of 1/m.
For example, if the frame rate of the display image output from each of the projector 51-1 and the projector 51-2 is 150Hz, the frame rate of the moving image displayed on the screen becomes substantially 300 Hz.
In addition, in order to prevent a deviation from occurring between scanning lines formed at the same position by scan A and scan B, the scanning positions of the pixels can be corrected by using a technique similar to the optical image position correction used in the conventional so-called double-stack (twin stack) technique. The double-stack technique displays a bright image by displaying the same image at the same time and at the same position using two projectors. When an image is displayed by using the double-stack technique, the luminance of the displayed image is doubled, so that sharp projection can be achieved even in a bright environment or over a long projection distance.
The use of the double stack technique causes a problem that image blur occurs due to a deviation between pixel positions of two projected images, but a so-called screen shift function capable of finely adjusting the pixel positions of optical projected images is widely used to solve such a problem. According to the screen shift function, the positions of images projected from the two projectors can be made to accurately coincide with each other.
For example, a technique of correcting a deviation between pixel positions of two projection images is disclosed in japanese patent application No. hei 10-058291.
By adjusting the image display device 12 so that the deviation between the scanning lines formed by scan A and scan B is not more than one pixel (one dot), the image display device 12 becomes capable of displaying a moving image without image blur caused by the overlapping of images shifted from each other by one frame.
As described above, when the projector 51-1 and the projector 51-2 alternately draw frame images frame by frame, each shifted by 1/2 frame with respect to the next, one projector starts scanning a new frame before the other projector has finished scanning and drawing the previous frame. At this time, when an object C displayed on the screen 52 of fig. 3 moves, for example, from left to right across the display screen, the smooth movement of its edge portion β is perceived by the user observing the moving image as smoothness of the displayed image.
The display of the edge portion β of the object C on the screen 52 will be described below with reference to fig. 4.
The object C of the α frame is displayed by the projector 51-1, and after a period of 1/m seconds, the object C of the α+1 frame is displayed by the projector 51-2; at this point, 1/m after the α frame was displayed, the position of the edge portion β of the object C is rewritten. Then, after another 1/m period, the object C of the α+2 frame is displayed by the projector 51-1, and the edge portion β of the object C is rewritten again, 1/m after the α+1 frame was displayed.
For example, when the frame rate of the display image output from each of the projector 51-1 and the projector 51-2 is 150Hz, frames of the moving image displayed by each individual projector of the projector 51-1 and the projector 51-2 are rewritten at an interval of 1/150 (seconds). However, the edge portion β of the object C, which is displayed on the screen 52 by alternately displaying frame images on a frame-by-frame basis by means of the projector 51-1 and the projector 51-2, is refreshed at an interval of 1/300 (seconds). Therefore, the movement of the edge portion β of the object C observed by the user becomes very smooth.
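A small sketch makes the edge-refresh arithmetic concrete (a toy model under the stated 150 Hz assumption, not the display hardware):

```python
# Toy check of the edge update interval: two 150 frames/sec projectors
# staggered by half a frame period refresh the moving edge every 1/300 s.
rate = 150.0
period = 1.0 / rate                                     # each projector: 1/150 s
a_times = [k * period for k in range(4)]                # projector 51-1 updates
b_times = [k * period + period / 2 for k in range(4)]   # projector 51-2, +1/300 s
edge = sorted(a_times + b_times)                        # combined edge rewrites
print([round(t2 - t1, 6) for t1, t2 in zip(edge, edge[1:])])
# every interval is 1/300 s (~0.003333), twice each projector's own rate
```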
The image display device 12 has been described as controlling the display of images under the control of the display control section 27. However, in order to control the operations of the projector 51-1 and the projector 51-2 as described above with reference to fig. 3, the image display apparatus 12 may internally include the display control section 27 so as to be supplied with the control signals necessary for image display from the controller 24, or may internally include a control section different from the display control section 27 so as to be supplied with the vertical synchronization signal and the dot clock signal from the display control section 27.
The operation of image display device 12 is described above with illustrative reference to a projection display system consisting of projector 51-1, projector 51-2, and screen 52. However, the image display apparatus 12 may use any other display system capable of drawing an image by a dot-sequential or line-sequential method as long as the display system can cause two display apparatuses to alternately scan consecutive frames at an offset of 1/2 frames and perform display of a moving image at a frame rate twice as high as that of each individual display apparatus of the two display apparatuses.
The image display device 12 may use a device that performs image drawing by a dot sequential or line sequential method, such as a direct-view display or a projector using a CRT (cathode ray tube), an LCD (liquid crystal display), a GLV (grating light valve), an LED (light emitting diode), or an FED (field emission display).
For example, GLV is an image display technology using a micro ribbon array: a projection device that controls the direction, color, and so on of light by exploiting light diffraction. The micro ribbon array consists of minute light-diffracting elements arranged in rows, and the GLV projects an image by irradiating laser light onto the array. The ribbons can be driven independently by electrical signals, and the drive amount of each ribbon can be adjusted so as to change the amount of light diffraction and produce light and dark areas in the image. The GLV can therefore achieve smooth gray-scale representation and high contrast.
An LED is a device formed by a junction of two semiconductor segments, and emits light when a current is applied.
The FED is a device that obtains an image by an emission principle similar to that of a CRT: electrons are extracted from a cathode and collide with a fluorescent material coated on an anode, causing it to emit light. However, whereas the cathode of a CRT is a point electron source, the cathode of an FED is a flat electron source.
The display control process 1 to be executed by the image display system 1 shown in fig. 1 will be described below with reference to a flowchart shown in fig. 5.
In step S1, the synchronization signal detection section 22 detects a synchronization signal and a dot clock from the supplied analog video signal, and supplies the vertical synchronization signal and the dot clock signal to the controller 24.
In step S2, the A/D conversion section 21 performs A/D conversion of the supplied analog video signal, and supplies the resulting digital video signal to the frame memory 23.
In step S3, the frame memory 23 sequentially stores the supplied digital video signals.
At step S4, as described above with reference to fig. 2, under the control of the controller 24 the frame memory 23 alternately outputs the video signal, frame by frame, to the two D/A conversion sections 25-1 and 25-2 at a frame rate corresponding to an output dot clock that is half the dot clock of the input video signal S1, while shifting each frame by half the scanning period required for displaying one frame with respect to the subsequent frame. The video signal output to the D/A conversion section 25-1 is the output video signal S2, and the video signal output to the D/A conversion section 25-2 is the output video signal S3.
In other words, the controller 24 controls the frame memory 23 so as to separate the frames stored in the frame memory 23 into odd frames and even frames, and to output them alternately to the D/A conversion section 25-1 and the D/A conversion section 25-2 while shifting each frame by half the scanning period required for displaying one frame with respect to the subsequent frame.
In step S5, the D/A conversion section 25-1 and the D/A conversion section 25-2 perform D/A conversion of the supplied video signals, and supply the analog video signals to the image display apparatus 12.
At step S6, at a timing similar to that used for the output video signals S2 and S3 described above with reference to fig. 2, the display control section 27 controls the scan control section 41-1 and the scan control section 41-2 of the image display apparatus 12 (in fig. 3, the projector 51-1 and the projector 51-2), shifting the scanning start time of each frame from that of the subsequent frame by half the scanning period required for displaying one frame, so that the frames of the video signal are alternately scanned frame by frame and a video image is displayed on the display section 43 (in fig. 3, the screen 52) at a display frame rate substantially twice as high as the frame rate of each of the scan control sections 41-1 and 41-2. The display control section 27 controls the image display apparatus 12 in this manner, and the processing is completed.
Through the above-described processing, a moving image to be displayed is separated into odd and even frames, and the odd and even frames are supplied to the two display devices, respectively. Then, each display apparatus scans odd and even frames at a frame rate half the frame rate of the moving image to be displayed at an offset of 1/2 frames so that the moving image can be displayed at a frame rate twice as high as the capability of the display apparatus.
In addition, by adjusting the scanning position accuracy of two scanning lines so that the positional deviation is not more than one point (one pixel), it is possible to clearly display a moving image without image blurring due to overlapping of images shifted from each other by one frame.
In addition, if each of the projector 51-1 and the projector 51-2 is a so-called liquid crystal projector, a shutter may be provided in front of the projection lens of each projector. For example, the shutter in front of the projector 51-1 passes the light displaying the image projected by the projector 51-1 between the supply start times a and b, between c and d, and between e and f in fig. 2, and blocks it between b and c and between d and e. Likewise, the shutter in front of the projection lens of the projector 51-2 passes the light displaying the image projected by the projector 51-2 between the supply start times b and c and between d and e in fig. 2, and blocks it between a and b, between c and d, and between e and f.
More specifically, the shutter provided in front of the projection lens of the projector 51-1 passes or blocks the light so that only the frames synchronized with the input video signal S1 shown in fig. 2 are displayed: the α frame, the α+2 frame, the α+4 frame, and so on, i.e., the (α+2×n) frames (n an integer). The shutter provided in front of the projection lens of the projector 51-2 likewise passes or blocks the light so that only the (α+2×n+1) frames (n an integer) synchronized with the input video signal S1 are displayed: the α+1 frame, the α+3 frame, and so on.
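The gating rule can be captured in a few lines (a schematic of the schedule just described, with hypothetical names; not a shutter driver): time is divided into 1/m slots of the input signal S1, and each projector's shutter is open only during its own alternating slots.

```python
# Schematic of the alternating shutter gating (hypothetical names).

def shutter_open(t: float, m: float, projector: int) -> bool:
    """Projector 0 (51-1) passes light during even 1/m slots of the input
    S1 (a-b, c-d, e-f in fig. 2); projector 1 (51-2) during odd slots."""
    slot = int(t * m)           # index of the 1/m interval containing time t
    return slot % 2 == projector

m = 300.0
print(shutter_open(0.5 / m, m, 0))   # True: 51-1 open in [a, b)
print(shutter_open(1.5 / m, m, 0))   # False: 51-1 blocked in [b, c)
print(shutter_open(1.5 / m, m, 1))   # True: 51-2 open while showing α+1
```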
In addition, each of the shutters may be a liquid crystal shutter or a mechanical shutter, and need only be capable of transmitting or blocking light at intervals of a predetermined cycle.
In addition, each of the shutters may be provided in the projector 51-1 or the projector 51-2, for example, between the light source and the liquid crystal device or behind the liquid crystal device.
Fig. 6 is a block diagram showing the configuration of an image display system 71 to which the present invention is applied and which has a different configuration from the image display system 1 shown in fig. 1.
The same reference numerals are used to designate portions corresponding to those shown in fig. 1, and the description of the same portions is omitted.
The image display system 71 shown in fig. 6 causes the image display apparatus 12 similar to that used in the image display system 1 shown in fig. 1 to display a moving image, but converts an image signal by using an image signal conversion apparatus 81 different from the image signal conversion apparatus 11 shown in fig. 1.
The analog image signal supplied to the image signal conversion device 81 is supplied to the A/D conversion section 21 and the synchronization signal detection section 22.
The A/D conversion section 21 converts the analog image signal having the frame rate m into a digital image signal, and supplies the digital image signal to the data separation section 91. The synchronization signal detection section 22 detects the frame rate and dot clock of the image signal, generates a vertical synchronization signal and a dot clock signal, and supplies them to the data separation section 91, the data holding section 92-1, the data holding section 92-2, and the controller 94.
The data separating section 91 separates the supplied digital image signal into individual frames based on the vertical synchronizing signal supplied from the synchronizing signal detecting section 22, and alternately supplies the frames to the data holding section 92-1 or the data holding section 92-2 on a frame-by-frame basis. For example, the data separating section 91 supplies the odd frames to the data holding section 92-1 and supplies the even frames to the data holding section 92-2.
The data holding section 92-1 serves as an interface between the data separation section 91 and the frame memory 93-1, and the data holding section 92-2 serves as an interface between the data separation section 91 and the frame memory 93-2. Each of the data holding sections 92-1 and 92-2 supplies the supplied image signal, frame by frame, to the frame memory 93-1 or the frame memory 93-2 based on the vertical synchronization signal supplied from the synchronization signal detection section 22.
The controller 94 is supplied with the vertical synchronization signal and the dot clock signal from the synchronization signal detection section 22, and controls the output timing of the video signals of the frame memory 93-1 and the frame memory 93-2.
The frame memory 93-1 supplies the video signal to the D/A conversion section 25-1 under the control of the controller 94. The frame memory 93-2 supplies the video signal to the D/A conversion section 25-2 under the control of the controller 94.
If it is assumed here that the signal supplied to the data separation section 91 is the input video signal S1, the signal output from the frame memory 93-1 is the output video signal S2, and the signal output from the frame memory 93-2 is the output video signal S3, the input-output relationship between these signals is similar to that described above with reference to fig. 2.
Fig. 2 does not take into account signal delays caused by the data separation processing in the data separation section 91 and the like, but the controller 94 may adjust the signal delay and the timing deviation between the two lines of video signals, for example by means of the signal output timings of the data holding section 92-1 and the data holding section 92-2.
The D/A conversion section 25-1 converts the supplied digital image signal into an analog image signal, and supplies the analog image signal to the image display device 12. The D/A conversion section 25-2 likewise converts the supplied digital image signal into an analog image signal, and supplies it to the image display device 12.
The display control section 27 controls the moving image display on the image display apparatus 12 based on the information supplied from the controller 94, and displays frame images corresponding to the output video signal S2 and the output video signal S3 at a timing similar to that described above with reference to fig. 2.
The drive 28 may be connected to the controller 94 as needed. A magnetic disk 31, an optical disk 32, a magneto-optical disk 33, or a semiconductor memory 34 is mounted in the drive 28 so as to transmit and receive information.
The display control process 2 to be executed by the image display system 71 shown in fig. 6 will be described below with reference to the flowchart shown in fig. 7.
In step S21, the synchronization signal detection section 22 detects the synchronization signal and the dot clock from the supplied analog image signal, and supplies the vertical synchronization signal and the dot clock signal to the data separation section 91, the data holding section 92-1, the data holding section 92-2, and the controller 94.
In step S22, the A/D conversion section 21 performs A/D conversion of the supplied analog video signal, and supplies the digital video signal to the data separation section 91.
At step S23, based on the vertical synchronization signal supplied from the synchronization signal detection section 22, the data separation section 91 separates the supplied digital video signal into individual frames and alternately supplies the frames, frame by frame, to the data holding section 92-1 or the data holding section 92-2. For example, the data separation section 91 supplies odd frames to the data holding section 92-1 and even frames to the data holding section 92-2.
At step S24, each of the data holding part 92-1 and the data holding part 92-2 supplies the supplied video signal to the frame memory 93-1 or the frame memory 93-2, and causes the frame memory 93-1 or the frame memory 93-2 to store the supplied video signal.
At step S25, the controller 94 controls the frame memory 93-1 and the frame memory 93-2 so as to alternately output the video signal, frame by frame, from the frame memory 93-1 to the D/A conversion section 25-1 and from the frame memory 93-2 to the D/A conversion section 25-2 at a frame rate corresponding to an output dot clock equal to half the dot clock of the input video signal S1, while shifting each frame relative to the subsequent frame by half the scanning period required for displaying one frame. More specifically, if the signal supplied to the data separation section 91 is taken as the input video signal S1, the signal output from the frame memory 93-1 as the output video signal S2, and the signal output from the frame memory 93-2 as the output video signal S3, the input-output relationship between these signals is as described above with reference to fig. 2.
In step S26, each of the D/A conversion sections 25-1 and 25-2 performs D/A conversion of the supplied video signal, and supplies the analog video signal to the image display apparatus 12.
At step S27, the display control section 27 controls the scan control section 41-1 and the scan control section 41-2 (in fig. 3, the projector 51-1 and the projector 51-2) of the image display apparatus 12 at a timing similar to the timing used in the case of the output video signal S2 and the output video signal S3 described above with reference to fig. 2, and shifts the scan start time of each frame relative to the scan start time of the succeeding frame by a period of half the scan period required for displaying one frame, thereby alternately scanning the respective frames of the video signal on a frame-by-frame basis to display the video image on the display section 43 (in fig. 3, the screen 52) at a display frame rate that is substantially twice as high as the frame rate of each of the scan control section 41-1 and the scan control section 41-2. The display control section 27 controls the image display device 12 in this manner, and completes the processing.
Even in the image display system 71 shown in fig. 6, as in the case of the image display system shown in fig. 1, by the above-described processing, the moving image to be displayed is separated into odd and even frames, and the odd and even frames are supplied to the two display devices, i.e., the scan control section 41-1 and the scan control section 41-2, respectively. Then, each display apparatus scans the frame image at a frame rate half the frame rate of the moving image to be displayed, at an offset of 1/2 frames, so that the moving image can be displayed at a frame rate substantially twice as high as the capability of the display apparatus.
In the above description of the embodiments of the present invention, reference is made to the case where the supplied image signal is separated into two lines of image signals and the image is drawn by the two scan control sections, but the separation number of the image signals may be any number not less than two.
If the number of separations of the image signal is three, for example, the image signal output from the frame memory is sequentially supplied to three D/A conversion sections, or the frames separated into three streams by the data separation section are sequentially supplied to and stored in three frame memories, respectively. Thus, as shown in fig. 8, the input video signal S1 is separated into three output video signals S2, S3, and S4, which are supplied to three scan control sections, respectively.
The first scan control section controls the display of the α frame, the α+3 frame, the α+6 frame, and so on, all of which correspond to the output video signal S2. The second scan control section controls the display of the α+1 frame, the α+4 frame, the α+7 frame, and so on, all of which correspond to the output video signal S3. The third scan control section controls the display of the α+2 frame, the α+5 frame, the α+8 frame, and so on, all of which correspond to the output video signal S4. The frame rate of the frames displayed by each of the first, second, and third scan control sections is 1/3 of the frame rate of the input video signal, and the scan start times of the frames scanned by the first, second, and third scan control sections are offset from one another by 1/3 of the scan period required for displaying one frame of each of the output video signals S2 to S4.
If the input video signal S1 is 180 Hz, for example, it is separated into three output video signals S2, S3, and S4, which are supplied to the three scan control sections, respectively, and each scan control section scans and displays its output video signal at a frame rate of 60 Hz. If the input video signal S1 is, for example, 150 Hz, it is separated into three output video signals S2, S3, and S4, which are supplied to the three scan control sections, respectively, and each scan control section scans and displays its output video signal at a frame rate of 50 Hz. In this way, scan control sections of the types most widely used at present, operating at 50 Hz (PAL: phase alternating line) or 60 Hz (NTSC: National Television System Committee, or HD (high definition) video signals), can be employed to display moving images at a much higher frame rate.
While the NTSC frame rate is more correctly 59.94 frames/second, the NTSC frame rate referred to herein is defined as 60 frames/second, as is customary among those skilled in the art. Similarly, multiples of 59.94 are referred to as multiples of 60. More specifically, 59.94, 119.88, 179.82, 239.76, 299.70, 359.64, 419.58, and 479.52 are referred to herein as 60, 120, 180, 240, 300, 360, 420, and 480, respectively.
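The convention can be checked numerically. The sketch below is illustrative and assumes the standard NTSC relationship in which the exact rates are nominal 60 Hz multiples scaled by 1000/1001; it reproduces the values listed above.

```python
# Illustrative: exact NTSC-family rates are nominal multiples of 60 Hz
# scaled by 1000/1001; the text rounds them to the nominal values.
for mult in range(1, 9):
    nominal = 60 * mult
    exact = nominal * 1000.0 / 1001.0
    print(f"{exact:.2f} is referred to herein as {nominal}")
```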
Therefore, if the number of divisions of the input video signal is n, for example, n scan control sections are provided, and the frame rate of the frames displayed by each of the first to nth scan control sections is 1/n of the frame rate of the input video signal. The scan start times of the frames scanned by the first to nth scan control sections are shifted from each other by 1/n of the display period of one frame of the respective output video signals, whereby moving images can be displayed at a frame rate substantially n times as high as when each scan control section displays moving images individually.
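Restated in formula form (the symbols f_in, f_out, t_k, and T are chosen here for illustration and do not appear in the patent):

```latex
% f_in : frame rate of the input video signal
% n    : separation number (number of scan control sections used)
% T    : scan period required for displaying one frame of an output signal
f_{\mathrm{out}} = \frac{f_{\mathrm{in}}}{n}, \qquad
t_k = t_1 + \frac{(k-1)\,T}{n} \quad (k = 1, \dots, n)
```

where f_out is the frame rate of each output video signal and t_k is the scan start time of the k-th scan control section; the displayed moving image is thus refreshed at substantially n × f_out = f_in.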
In addition, the number of scan control sections may be set to s, and the separation number of video signals may be set to n smaller than s, so as to display a moving image by using n scan control sections among the s scan control sections.
In the above description made in conjunction with figs. 1 and 6, reference is made to image display systems each constituted by an image signal conversion device and an image display device, but, of course, the constituent elements may be implemented as a single device.
The image signal conversion apparatus 11 shown in fig. 1 is described on the assumption that the controller 24 controls the frame memory 23 and the display control section 27 controls the image display apparatus 12, while the image signal conversion apparatus 81 shown in fig. 6 is described on the assumption that the controller 94 controls the frame memory 93-1 and the frame memory 93-2 and the display control section 27 controls the image display apparatus 12. However, a frame memory for storing a video signal and an image display apparatus for displaying an image may be controlled by the same controller. The display control section 27 may be provided not in the image signal conversion apparatus 11 or 81 but in the image display apparatus 12.
Moving images are accompanied by characteristic image quality degradation that does not occur in still images. In the displays of the types most widely used at present, operating at 50 Hz (PAL) and 60 Hz (NTSC and HD video signals), reproduction in the temporal direction is defective, and under certain conditions this defect in the temporal direction is converted into a defect in the spatial direction. Deterioration of moving image quality therefore depends on, for example, the shutter period used to acquire the moving image data, the emission period of the display device during display, and the line-of-sight behavior of the individual observer.
Fig. 9 shows an example of an actual scene in which a stationary object and a moving object coexist. The scene assumes that the car is moving to the right and the tree is stationary on the ground. Figs. 10 and 11 illustrate the recognition of an observer viewing the scene illustrated in fig. 9.
Fig. 10 is a view showing the video image recognition of an observer who gazes at the tree. In this case, the car moving to the right is hazily visible to the observer. Fig. 11 is a view showing the video image recognition of an observer whose line of sight follows the car. In this case, the stationary tree is hazily visible to the observer.
In the following description, the case where the observer fixes his/her line of sight on a fixed object in the observation plane coordinates is referred to as the gaze condition, and the case where the observer's line of sight tracks a moving object in the observation plane coordinates is referred to as the tracking condition. More specifically, the case described in connection with fig. 10 corresponds to the gaze condition, and the case described in connection with fig. 11 corresponds to the tracking condition. In both the gaze and tracking conditions, the object gazed at by the observer is clearly visible, while an object that changes its relative position with respect to the gazed object is hazily visible.
The reason for this is that the human visual system integrates light incident on the retina over a certain period. An object moving in the coordinates of the retina undergoes a position change that is integrated in the time direction, so that the moving object is perceived as a blurred image. The blur is proportional to the speed of motion in the retinal coordinates, and the speed of motion in the retinal coordinates corresponds not to the actual velocity of the object but to its angular velocity (degrees/second).
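This relation can be summarized roughly as follows (an illustrative restatement, not a formula given in the patent):

```latex
% omega : angular velocity of the object in retinal coordinates (deg/s)
% T     : temporal integration period
\theta_{\mathrm{blur}} \approx \omega \, T
```

For a hold-type display observed under tracking (discussed below), T corresponds to the frame period 1/f, so the perceived blur width scales as ω/f and is halved when the frame rate f is doubled.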
As described above, an object that is stationary in the retinal coordinates is clearly visible, and an object that is moving in the retinal coordinates is hazily visible. In order to display a moving image having reality, that is, a high-quality moving image that appears to move smoothly, it is important that the displayed video image accord with this actual pattern of recognition.
The difference between the observer recognition described above with reference to fig. 10 and that described above with reference to fig. 11 will be described below with reference to fig. 12. The upper part of fig. 12 shows the actual motion in the outside world. The vertical axis represents time and the horizontal axis represents the horizontal direction, and the upper part of fig. 12 shows the position, at each time, of a fixed point (corresponding to the tree in figs. 9 to 11 and represented by x in fig. 12) and of a point moving at a constant speed (corresponding to the car in figs. 9 to 11 and represented by y in fig. 12). The lower part of fig. 12 shows the observer's recognition of the motion in the outside world during gaze and tracking. The arrows shown in dotted lines indicate the movement of the observer's viewpoint, i.e., the direction in which the video image is integrated on the retina. The arrow extending in the vertical direction indicates the direction of integration during gaze, and the arrow extending in the oblique direction indicates the direction of integration during tracking. More specifically, when the observer tracks, the fixed point (tree) is hazily visible, but the moving point (car) is clearly visible. On the other hand, when the observer gazes, the fixed point (tree) is clearly visible, but the moving point (car) is hazily visible.
How the observer recognizes the motion in the outside world shown in fig. 9 when that motion, captured by a fixed camera, is reproduced as a moving image will be described below with reference to fig. 13 in terms of the image capturing conditions, display conditions, and observation conditions. The upper part of fig. 13 shows the temporal change of the moving image display. The observer recognition shown in the lower part of fig. 13 is the result of integrating the light displayed as the moving image along the direction of line-of-sight movement during gaze and tracking, i.e., along the integration axis.
Fig. 13A shows the identification of the viewer in the case of an image captured by the open shutter (open shutter) technique and displayed by the impulse type display. Fig. 13B shows the identification of the viewer in the case of an image captured by the open shutter technique and displayed by a hold-type display. Fig. 13C shows the identification of the observer in the case of an image captured by the high-speed shutter technique and displayed by the impulse-type display. Fig. 13D shows the identification of the observer in the case of an image captured by the high-speed shutter technique and displayed by the hold-type display.
The hold type as used herein refers to a display type that holds the display of each pixel on the screen during each frame period, and the hold type display is, for example, an LCD. A display device using an LED or a display device using EL (electroluminescence) can be used as the hold type display.
The impulse type display is, for example, a CRT or FED.
In addition to the classification into hold type and impulse type, displays can also be classified into pixel-type displays (for example, displays using an LCD or LEDs and displays using EL), in which a display element is arranged at each pixel, and so-called matrix-drive displays, which are driven by a voltage, current, or the like applied individually to vertical positions and horizontal positions arranged on the screen in units of a predetermined length.
As can be seen from figs. 13A to 13D, moving image quality deterioration differs under different conditions. For example, the moving object observed by tracking in fig. 13B or 13D is hazily visible compared with the moving object observed by tracking in fig. 13A or 13C. This phenomenon is referred to as "motion blur" and is characteristic of displays operating under hold-type emission conditions. "Motion blur" is a blur that occurs in the object at which the observer is gazing, and is therefore a deterioration that is easily perceived by the observer.
In addition, deterioration occurs in the form of stroboscopic artifacts (shake), due to gaze in fig. 13D and due to tracking in figs. 13A and 13C. A stroboscopic artifact is a deterioration of the moving image that causes the viewer to see multiple copies of a moving object (such as the car) at the same time or, as shown in fig. 14, makes the moving object appear to undergo uneven, discrete motion when the viewer gazes at a fixed object (e.g., the tree) on the display. In many cases, a stroboscopic artifact appearing in a moving object during gaze, or in a fixed object during tracking, is a deterioration that appears in a part of the image other than the object being gazed at, and is therefore less noticeable than "motion blur". However, when the line of sight does not track completely, the relationship between the gazed object and the line of sight becomes the same as the relationship between a moving object and the line of sight during gaze, or between a fixed object and the line of sight during tracking. In this case, stroboscopic artifacts appear in the object being gazed at, so that very significant deterioration occurs. This phenomenon is pronounced in video sources in which the motion is too fast for the next motion to be easily predicted, such as sports broadcasts and action films. During the capture of moving images for movies and the like, various techniques are employed to prevent such deterioration of moving image quality: for example, a moving object is tracked by the camera during capture so that it appears as a fixed object on the display screen, or a blur called motion blur is deliberately introduced to suppress stroboscopic artifacts. However, the limitations imposed by these techniques restrict the manner of representation. In addition, these approaches cannot be used for sports and the like, because the motion of the object of interest is unpredictable.
The above-described moving image quality deterioration increases with the angular velocity of the moving object. Therefore, if a moving image of the same video scene is displayed on a display occupying a larger angle of view, the quality of the moving image deteriorates more noticeably. In addition, attempts to increase the resolution hardly improve the above-described moving image quality deterioration; on the contrary, a higher resolution improves still image quality further, so that the moving image quality deterioration becomes more noticeable. As displays with larger screen sizes and higher resolutions are developed, the above-described moving image quality deterioration is expected to become a greater problem in the future.
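As a worked illustration of this angle-of-view dependence (the numbers are chosen for this example and are not from the patent), consider the same scene, with the object crossing the full screen width in one second, shown on two displays:

```latex
% Display A subtends a 30 degree field of view: omega_A = 30 deg/s.
% Display B subtends a 60 degree field of view: omega_B = 60 deg/s.
% With theta_blur ~ omega / f (see the earlier relation):
\theta_{\mathrm{blur},B} \approx \frac{\omega_B}{f}
  = \frac{2\,\omega_A}{f} = 2\,\theta_{\mathrm{blur},A}
```

so doubling the angle of view doubles the perceived blur at the same frame rate.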
The reason for the deterioration of the moving image quality is lack of temporal reproducibility. Therefore, the basic solution is to improve temporal reproducibility. More specifically, a useful solution is to increase the frame rate for both image capture and display.
The relationship between the moving image quality degradation and the display type will be described in more detail.
For example, as can be seen from a comparison between figs. 13A and 13B, the moving object image visually recognized by tracking in fig. 13B is longer than that in fig. 13A, so that the motion blur felt by an observer tracking the moving object displayed on the hold-type display is larger than in the case of the impulse-type display. On the other hand, from the fact that the fixed object is visually recognized as separate images by tracking in fig. 13A, but as a spatially continuous image by tracking in fig. 13B, it can be seen that the fixed object displayed on the hold-type display appears more natural during tracking than in the case of the impulse-type display.

Similarly, as can be seen from a comparison between figs. 13C and 13D, the moving object image visually recognized by tracking in fig. 13D is longer than that in fig. 13C, so that the motion blur perceived by an observer tracking the moving object displayed on the hold-type display is larger than in the case of the impulse-type display. On the other hand, from the fact that the fixed object is visually recognized as separate images by tracking in fig. 13C, but as a spatially continuous image by tracking in fig. 13D, it can be seen that the fixed object displayed on the hold-type display appears more natural during tracking than in the case of the impulse-type display.
During gaze, the recognition of the moving object and the fixed object shown in fig. 13A is the same as that shown in fig. 13B, and the recognition shown in fig. 13C is the same as that shown in fig. 13D, but the recognition shown in figs. 13A and 13B differs from that shown in figs. 13C and 13D. From this fact it can be seen that the recognition of moving and stationary objects during gaze (the way motion blur and stroboscopic artifacts (shake) occur) is the same regardless of whether the display is of the impulse type or the hold type. In addition, it can be seen that when a moving object captured by the open shutter technique is observed by gaze, no stroboscopic artifact (shake) is perceived, whereas when a moving object captured by the high-speed shutter technique is observed by gaze, a stroboscopic artifact (shake) is perceived.
Fig. 15 shows the degree of improvement in moving image degradation achieved when the moving image data described above with reference to fig. 13 is captured at a frame rate twice as high and displayed at a frame rate twice as high.
Fig. 15A shows the identification of a moving image by an observer, which is captured by the open shutter technique and displayed by the impulse type display at a frame rate twice as high as that in the case described above with reference to fig. 13. Fig. 15B shows the identification of a moving image by the observer, which is captured by the open shutter technique and displayed by the hold-type display at a frame rate twice as high as that in the case described above with reference to fig. 13. Fig. 15C shows the identification of a moving image by the observer, which is captured by the high-speed shutter technique and displayed by the impulse type display at a frame rate twice as high as that in the case described above with reference to fig. 13. Fig. 15D shows the identification of a moving image by the observer, which is captured by the high-speed shutter technique and displayed by the hold-type display at a frame rate twice as high as that in the case described above with reference to fig. 13.
As shown in figs. 15A to 15D, for each combination of capturing and display method, the amount of blurring due to blur artifacts in the recognition of the displayed image is reduced by half. In addition, image degradation due to stroboscopic artifacts is improved because the number of discrete strobe positions is doubled. More specifically, blur artifacts and stroboscopic artifacts improve linearly with an increase in frame rate. In addition, as the frame rate increases, the difference in moving image quality deterioration attributable to the shutter period and the emission period is reduced. It can therefore be considered that increasing the frame rate is a very useful way of improving moving image quality.
From a comparison of fig. 13A and 15A and a comparison of fig. 13B and 15B, it is apparent that the ratio of the length of the moving object visually perceived by tracking in fig. 15B to the length of the moving object visually perceived by tracking in fig. 13B is smaller than the ratio of the length of the moving object visually perceived by tracking in fig. 15A to the length of the moving object visually perceived by tracking in fig. 13A. Similarly, from a comparison of fig. 13C and 15C and a comparison of fig. 13D and 15D, it is apparent that the ratio of the length of the moving object visually perceived by tracking in fig. 15D to the length of the moving object visually perceived by tracking in fig. 13D is smaller than the ratio of the length of the moving object visually perceived by tracking in fig. 15C to the length of the moving object visually perceived by tracking in fig. 13C.
More specifically, it can be considered that if the frame rates of the impulse type and hold type displays are increased in the same way, the motion blur reduction effect is greater in the hold type display than in the impulse type display. In particular, in a hold-type display, the effect of a frame rate increase on reducing the motion blur that occurs during tracking is significant.
On the other hand, with respect to the stroboscopic artifact (shake), since the interval between the separately displayed images of the fixed object becomes shorter, the stroboscopic artifact (shake) generally becomes less perceptible.
For the display of moving images captured with an open shutter, the moving image quality was evaluated under tracking conditions, in terms of shake and motion blur, by visual psychophysical experiments.
Fig. 16 shows the results evaluated in terms of shake, and fig. 17 shows the results evaluated in terms of motion blur. For this evaluation, various moving images were prepared, such as natural moving images captured with an open shutter, CG motion sequences, and video footage. Scores were given according to the following degradation scale: evaluation value 5 = "degradation is imperceptible", evaluation value 4 = "degradation is perceptible but not objectionable", evaluation value 3 = "degradation is slightly objectionable", evaluation value 2 = "degradation is objectionable", and evaluation value 1 = "degradation is very objectionable". In addition, evaluation scores were given according to the following quality scale: evaluation value 5 = "very good", evaluation value 4 = "good", evaluation value 3 = "medium", evaluation value 2 = "poor", and evaluation value 1 = "very poor". In this experiment, the evaluation was performed with a sufficient number of test subjects to allow a general assessment of moving image quality. In figs. 16 and 17, the mean and standard deviation of the evaluation values given by all test subjects over all scenes are plotted.
The evaluation values for motion blur shown in fig. 17 vary more widely than those for shake shown in fig. 16, but for both shake and motion blur a common tendency is observed: the evaluation value of moving image quality rises as the frame rate becomes higher. In particular, the evaluation value for motion blur follows a dogleg-like curve, reaching the vicinity of evaluation value 4.5, the perception limit, near 250 fps, and leveling off at a value not lower than 4.5 at much higher frame rates. The evaluation value for shake also shows this dogleg-like tendency, reaching the vicinity of evaluation value 4.5 near 250 fps and showing a substantially flat value not lower than 4.5 at much higher frame rates.
Therefore, motion blur during tracking, which causes particularly significant deterioration of moving image quality, can be satisfactorily improved at a frame rate close to 250 fps. This fact suggests that a frequency around 250 fps is ideal when the usability of currently widespread video resources is taken into account. Specifically, as described earlier, a large number of widely used video resources have a frame rate of 50 Hz or 60 Hz, which implies that 240 Hz or 250 Hz, integer multiples of these frequencies, are ideal frequencies in view of the usability of those video resources.
This evaluation will be described in more detail below. In the EBU (European Broadcasting Union) method, the evaluation value 4.5 is a sensory limit: in any region corresponding to evaluation values above 4.5, differences are substantially imperceptible. The evaluation value 3.5 is an allowable limit: in any region corresponding to evaluation values below 3.5, the degradation is generally considered unacceptable.
In the result of evaluation focusing on motion blur, the frame rate corresponding to the allowable limit of the evaluation value 3.5 is 105. At a frame rate of 105, the average user begins to perceive an improvement in motion blur. More specifically, at a frame rate of 105 or more, the general user can feel improvement in motion blur.
In the results of the evaluation focusing on motion blur, the frame rate corresponding to the perceptual limit of the evaluation value 4.5 is 230. At a frame rate of 230 or higher, the general user perceives a satisfactory improvement in motion blur; in other words, the improvement in motion blur is perceived up to its peak.
In the results of the evaluation focusing on shake, the evaluation value at the frame rate of 480 is 5.0, with a very small standard deviation. Therefore, at a frame rate of 480, the general user cannot recognize shake; that is, image degradation due to shake can be suppressed to an extent that the user cannot perceive it.
Therefore, moving image quality deterioration can be reduced at a frame rate of 150, 200, 250, 300, 350, 400, 450, or 500, i.e., a frame rate not lower than 105 and equal to an integer multiple of the PAL frame rate of 50. At a frame rate not lower than 150 and equal to an integer multiple of the PAL frame rate of 50, the general user can perceive the improvement in motion blur; at a frame rate not lower than 250 and equal to an integer multiple of the PAL frame rate of 50, the general user can perceive the improvement in motion blur satisfactorily.

Similarly, moving image quality deterioration can be reduced at a frame rate of 120, 180, 240, 300, 360, 420, or 480, i.e., a frame rate not lower than 105 and equal to an integer multiple of the NTSC frame rate of 60. At a frame rate not lower than 120 and equal to an integer multiple of the NTSC frame rate of 60, the general user can perceive the improvement in motion blur; at a frame rate not lower than 240 and equal to an integer multiple of the NTSC frame rate of 60, the improvement in motion blur can be perceived satisfactorily.
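A small sketch (illustrative, not from the patent) that enumerates these candidate rates against the two thresholds read off the evaluation curves:

```python
# Candidate display rates: integer multiples of the PAL (50 Hz) and
# NTSC (60 Hz) base rates, checked against the 105 fps (perceptible
# improvement) and 230 fps (satisfactory improvement) thresholds.
PERCEPTIBLE, SATISFACTORY = 105, 230

for base in (50, 60):
    for k in range(1, 11):
        rate = base * k
        if not PERCEPTIBLE <= rate <= 500:
            continue
        label = "satisfactory" if rate >= SATISFACTORY else "perceptible"
        print(f"{rate} fps (= {k} x {base}): {label} improvement in motion blur")
```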
Image processing can easily be performed at a frame rate equal to an integer multiple of the frequency of a common broadcast format such as NTSC or PAL. In addition, the use of three-panel type prisms during the capture of video images is common. Accordingly, image processing can easily be performed on a video signal having a frame rate of 180, a rate at which the evaluation value is not lower than 3.5, so that, under the EBU method, the general user can perceive the improvement in motion blur. Such a video signal can easily be obtained by capturing three video images, each at a frame rate of 60, shifted from one another by 1/180 second by means of a three-panel type prism.
In addition, it has been found from a series of experiments that a frame rate of 350 or 360 is particularly preferable when computer graphics images are to be displayed. This is because computer graphics images typically contain high-frequency components, for example at their edges, so that deterioration of image quality due to shake is easily perceived; even a general user can perceive the improvement in image quality when the frame rate is raised from 240 or 250 to 350 or 360.
The image display system according to the present invention, described above with reference to fig. 1 or fig. 6, may be used to display moving images at 240 Hz or 250 Hz, frequencies equal to integer multiples of 60 Hz or 50 Hz. For example, as shown in fig. 18, such moving image display may be realized using two or more projectors 51-1 to 51-n together with the image display system according to the present invention.
Each of the projectors 51-1 to 51-n displays a frame image corresponding to the supplied video signal on the screen 52 by scanning, in the horizontal direction, the pixels (X, Y) forming the display image from (0, 0) to (p, q), at a timing based on the control of the display control section 27. When the frame rate of the moving image supplied to the image display system is m Hz, the frame rate of the frame images displayed on the screen 52 by each of the projectors 51-1 to 51-n is m/n Hz, but the frame rate of the moving image displayed by the projectors 51-1 to 51-n together is m Hz. The scan start timing of each frame displayed by each of the projectors 51-1 to 51-n is shifted by a 1/n phase, i.e., by 1/m second, relative to one display frame period of each projector.
For example, when the projector 51-2 scans a line on the screen 52 corresponding to frame α+1 at the position indicated by scan B, the projector 51-3 scans a line on the screen 52 corresponding to frame α+2 at the position indicated by scan A. The line indicated by scan B is shifted by 1/n of the number of lines of one frame with respect to the line indicated by scan A. More specifically, at time intervals of 1/m second, the moving image displayed on the screen 52 is alternately rewritten by a plurality of scans including scan A and scan B.
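A short sketch of this timing arithmetic (illustrative; the function name is not from the patent):

```python
# With an input frame rate of m Hz split across n projectors, each
# projector refreshes at m/n Hz and successive projectors begin their
# scans 1/m second apart (a 1/n phase of one projector frame period).
def projector_timing(m_hz, n_projectors):
    per_projector_hz = m_hz / n_projectors
    phase_shift_s = 1.0 / m_hz
    return per_projector_hz, phase_shift_s

rate, shift = projector_timing(240.0, 4)
print(f"each projector: {rate:.0f} Hz, scan start offset: {shift * 1e3:.3f} ms")
# each projector: 60 Hz, scan start offset: 4.167 ms
```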
If the frame rate of the input image signal is 240 Hz and the number of separations of the image signal is four, for example, the image signal output from the frame memory is sequentially supplied to four D/A conversion sections, or the frames separated into four by the data separation section are sequentially supplied to and stored in four frame memories, respectively. Thus, as shown in fig. 19, the input video signal S1 is separated into four output video signals S2, S3, S4, and S5, which are supplied to the four scan control sections, respectively.
The first scan control section controls display of frames α, α+4, ..., all of which correspond to the output video signal S2. The second scan control section controls display of frames α+1, α+5, ..., all of which correspond to the output video signal S3. The third scan control section controls display of frames α+2, α+6, ..., all of which correspond to the output video signal S4. The fourth scan control section controls display of frames α+3, α+7, ..., all of which correspond to the output video signal S5. The frame rate of the frames displayed by each of the first to fourth scan control sections is 1/4 of the frame rate of the input video signal, and the scan start times of the frames scanned by the first to fourth scan control sections are shifted from each other by 1/4 of the scan period required for displaying one frame of each of the output video signals S2 to S5.
If the frame rate of the input image signal is 250 Hz and the number of separations of the image signal is five, for example, the image signal output from the frame memory is sequentially supplied to five D/A conversion sections, or the frames separated into five by the data separation section are sequentially supplied to and stored in five frame memories, respectively. Thus, as shown in fig. 20, the input video signal S1 is separated into five output video signals S2, S3, S4, S5, and S6, which are supplied to the five scan control sections, respectively.
The first scan control section controls display of frames α, α+5, ..., all of which correspond to the output video signal S2. The second scan control section controls display of frames α+1, α+6, ..., all of which correspond to the output video signal S3. The third scan control section controls display of frames α+2, α+7, ..., all of which correspond to the output video signal S4. The fourth scan control section controls display of frames α+3, α+8, ..., all of which correspond to the output video signal S5. The fifth scan control section controls display of frames α+4, α+9, ..., all of which correspond to the output video signal S6. The frame rate of the frames displayed by each of the first to fifth scan control sections is 1/5 of the frame rate of the input video signal, and the scan start times of the frames scanned by the first to fifth scan control sections are shifted from each other by 1/5 of the scan period required for displaying one frame of each of the output video signals S2 to S6.
In the moving image display at 50 Hz or 60 Hz that is most widely used at present, deterioration of moving image quality such as blur and shake is significant. On the other hand, when, for example, 4 or 5 is used as the separation number n of the video signal according to the present invention, it is possible to display a moving image at a high frame rate by using widely available conventional display apparatuses (e.g., projectors) operating at a frame rate of 50 Hz or 60 Hz. For example, when the separation number of the input video signal is n = 4 and the frame rate of the display image output from each of the projectors 51-1 to 51-4 is 60 Hz, the frame rate of the moving image displayed on the screen 52 becomes substantially 240 Hz. Further, for example, when the separation number of the input video signal is n = 5 and the frame rate of the display image output from each of the projectors 51-1 to 51-5 is 50 Hz, the frame rate of the moving image displayed on the screen 52 becomes substantially 250 Hz.
As described above, a large number of video resources widely used at present have a frame rate of 50Hz or 60Hz, so that 240Hz or 250Hz, which is an integral multiple of this frequency, becomes an ideal frequency in consideration of the effectiveness of the video resources.
Also in this case, of course, the number of scan control sections may be set to s, and the separation number of the input video signal may be set to n smaller than s, so that a moving image may be displayed by using n scan control sections among the s scan control sections.
An image display system 101 having another configuration according to an embodiment of the present invention will be described below. Fig. 21 is a block diagram showing the configuration of an image display system 101 using an LCD.
The image display system 101 shown in fig. 21 includes a signal processing section 111, a clock/sampling pulse generating section 112, and an image display device 113. The signal processing section 111 acquires an image signal as an input signal, applies signal processing to it, and supplies a digital RGB (red, green, blue) signal to the image display device 113. The clock/sampling pulse generating section 112 acquires the image signal as an input signal, detects a horizontal synchronization signal and a vertical synchronization signal from it, and generates control signals based on the detected horizontal and vertical synchronization signals. The clock/sampling pulse generating section 112 supplies the generated control signals to the signal processing section 111 and the image display device 113.
The image display device 113 is equipped with an LCD, and displays an image based on signals supplied from the signal processing section 111 and the clock/sampling pulse generating section 112.
The signal processing section 111 is constituted by a Y/C separation/chroma decoding section 121, an A/D conversion section 122, and a frame memory 123. The Y/C separation/chroma decoding section 121 separates the acquired image signal into a luminance signal (Y) and a color signal (C), decodes the color signal, and generates an analog RGB signal. The Y/C separation/chroma decoding section 121 supplies the generated analog RGB signal to the A/D conversion section 122.
The A/D conversion section 122 performs analog/digital conversion on the analog RGB signals supplied from the Y/C separation/chroma decoding section 121 based on the control signal supplied from the clock/sampling pulse generating section 112, and supplies the generated digital RGB signals to the frame memory 123. The frame memory 123 temporarily stores the digital RGB signals sequentially supplied from the A/D conversion section 122, and supplies the stored digital RGB signals to the image display device 113.
The clock/sampling pulse generating section 112 includes a synchronization signal detecting section 124 and a control signal generating section 125. The synchronization signal detecting section 124 detects a horizontal synchronization signal and a vertical synchronization signal from the acquired image signal, and supplies the detected horizontal and vertical synchronization signals to the control signal generating section 125. The control signal generating section 125 generates a control signal for controlling the analog/digital conversion in the A/D conversion section 122 and a control signal for controlling the display on the image display device 113, based on the horizontal and vertical synchronization signals supplied from the synchronization signal detecting section 124, and supplies the generated control signals to the A/D conversion section 122 and the image display device 113.
The image display device 113 includes an LCD 131, a backlight 132, data line driving circuits 133-1 to 133-4, a gate line driving section 134, and a backlight driving circuit 135. The LCD 131 is a hold-type display having matrix-driven pixels, formed of liquid crystal elements arranged in a screen; it displays an image by controlling the orientation of the liquid crystal inside each pixel so as to change the amount of transmitted light.
The backlight 132 is a light source that emits light into the LCD 131 from its back. Based on the control signals supplied from the control signal generating section 125, the data line driving circuits 133-1 to 133-4 and the gate line driving section 134 perform matrix driving of each pixel of the LCD 131 according to the digital RGB signals supplied from the signal processing section 111. The backlight driving circuit 135 drives the backlight 132 so that it emits light.
More specifically, in the LCD 131, groups each consisting of a liquid crystal element, a TFT (thin film transistor), and a capacitor are arranged from the liquid crystal element 141-1-1, TFT 142-1-1, and capacitor 143-1-1 in the first row, first column, to the liquid crystal element 141-n-m (not shown), TFT 142-n-m (not shown), and capacitor 143-n-m (not shown) in the n-th row, m-th column.
The liquid crystal elements 141-1-1 to 141-n-m will hereinafter be referred to simply as liquid crystal elements 141 unless they need to be identified individually. Likewise, the TFTs 142-1-1 to 142-n-m will be referred to simply as TFTs 142, and the capacitors 143-1-1 to 143-n-m will be referred to simply as capacitors 143, unless they need to be identified individually.
One liquid crystal element 141, one TFT 142, and one capacitor 143 are arranged as one group, thereby constituting a sub-pixel. The liquid crystal element 141 contains liquid crystal and changes the amount of light transmitted from the backlight 132 according to the voltage applied by the TFT 142. The TFT 142 drives the liquid crystal element 141 by applying a voltage to it. The capacitor 143 is provided in parallel with the liquid crystal element 141 and holds the voltage applied to the liquid crystal element 141 during the period of each frame.
In the LCD 131, the liquid crystal element 141-1-1, the TFT 142-1-1, and the capacitor 143-1-1, which together constitute one sub-pixel, are arranged in the leftmost column of the top row. In the LCD 131, the liquid crystal element 141-1-2, the TFT 142-1-2, and the capacitor 143-1-2, which together constitute one sub-pixel, are arranged to the right of the liquid crystal element 141-1-1, the TFT 142-1-1, and the capacitor 143-1-1. Further, in the LCD 131, the liquid crystal element 141-1-3, the TFT 142-1-3, and the capacitor 143-1-3, which constitute one sub-pixel, and the liquid crystal element 141-1-4, the TFT 142-1-4, and the capacitor 143-1-4, which constitute one sub-pixel, are arranged in this order to the right.

In the LCD 131, four sub-pixels arranged side by side on a horizontal line constitute one pixel. More specifically, the liquid crystal element 141-1-1 to the capacitor 143-1-4 constitute one pixel.

Similarly, in the LCD 131, the liquid crystal element 141-2-1, the TFT 142-2-1, and the capacitor 143-2-1, which together constitute one sub-pixel, are arranged in the leftmost column of the second row from the top. In the LCD 131, the liquid crystal element 141-2-2, the TFT 142-2-2, and the capacitor 143-2-2, which together constitute one sub-pixel, are arranged to the right of the liquid crystal element 141-2-1, the TFT 142-2-1, and the capacitor 143-2-1. Further, in the LCD 131, the liquid crystal element 141-2-3, the TFT 142-2-3, and the capacitor 143-2-3, which constitute one sub-pixel, and the liquid crystal element 141-2-4, the TFT 142-2-4, and the capacitor 143-2-4, which constitute one sub-pixel, are arranged in this order to the right.
The liquid crystal element 141-2-1 to the capacitor 143-2-4 constitute one pixel.
For example, if an image signal of 240 frames/sec is supplied, the control signal generating section 125 supplies a control signal to the data line driving circuit 133-1 so that frame 1, the first frame, is displayed on the leftmost sub-pixel of one pixel.

The data line driving circuit 133-1 reads the digital RGB signals of frame 1 from the frame memory 123 and, based on them, supplies drive signals to the LCD 131 so as to display frame 1 on the leftmost sub-pixel among the four sub-pixels arranged side by side on the horizontal line of one pixel, for example the sub-pixel formed of the liquid crystal element 141-1-1, the TFT 142-1-1, and the capacitor 143-1-1, or the sub-pixel formed of the liquid crystal element 141-2-1, the TFT 142-2-1, and the capacitor 143-2-1.

Then, the control signal generating section 125 supplies a control signal to the data line driving circuit 133-2 so that frame 2, the second frame of the moving image of 240 frames/sec, is displayed on the second sub-pixel from the left of this pixel.

The data line driving circuit 133-2 reads the digital RGB signals of frame 2 from the frame memory 123 and, based on them, supplies drive signals to the LCD 131 so as to display frame 2 on the second sub-pixel from the left among the four sub-pixels arranged side by side on the horizontal line of this pixel, for example the sub-pixel formed of the liquid crystal element 141-1-2, the TFT 142-1-2, and the capacitor 143-1-2, or the sub-pixel formed of the liquid crystal element 141-2-2, the TFT 142-2-2, and the capacitor 143-2-2.

Further, the control signal generating section 125 supplies a control signal to the data line driving circuit 133-3 so that frame 3, the third frame of the moving image of 240 frames/sec, is displayed on the third sub-pixel from the left of this pixel.

The data line driving circuit 133-3 reads the digital RGB signals of frame 3 from the frame memory 123 and, based on them, supplies drive signals to the LCD 131 so as to display frame 3 on the third sub-pixel from the left among the four sub-pixels arranged side by side on the horizontal line of this pixel, for example the sub-pixel formed of the liquid crystal element 141-1-3, the TFT 142-1-3, and the capacitor 143-1-3, or the sub-pixel formed of the liquid crystal element 141-2-3, the TFT 142-2-3, and the capacitor 143-2-3.

Further, the control signal generating section 125 supplies a control signal to the data line driving circuit 133-4 so that frame 4, the fourth frame of the moving image of 240 frames/sec, is displayed on the rightmost sub-pixel of this pixel.

The data line driving circuit 133-4 reads the digital RGB signals of frame 4 from the frame memory 123 and, based on them, supplies drive signals to the LCD 131 so as to display frame 4 on the rightmost sub-pixel among the four sub-pixels arranged side by side on the horizontal line of this pixel, for example the sub-pixel formed of the liquid crystal element 141-1-4, the TFT 142-1-4, and the capacitor 143-1-4, or the sub-pixel formed of the liquid crystal element 141-2-4, the TFT 142-2-4, and the capacitor 143-2-4.

Then, the control signal generating section 125 supplies a control signal to the data line driving circuit 133-1 so that frame 5, the fifth frame of the moving image of 240 frames/sec, is displayed on the leftmost sub-pixel of this pixel.

The data line driving circuit 133-1 reads the digital RGB signals of frame 5 from the frame memory 123 and, based on them, supplies drive signals to the LCD 131 so as to display frame 5 on the leftmost sub-pixel among the four sub-pixels arranged side by side on the horizontal line of this pixel, for example the sub-pixel formed of the liquid crystal element 141-1-1, the TFT 142-1-1, and the capacitor 143-1-1, or the sub-pixel formed of the liquid crystal element 141-2-1, the TFT 142-2-1, and the capacitor 143-2-1.
In this way, the four sub-pixels arranged side by side on the horizontal line of one pixel sequentially display the images of successive frames.
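The round-robin assignment just described can be sketched as follows (illustrative; the printed labels reuse the reference numerals of the data line driving circuits, and the function name is not from the patent):

```python
# Frames of a 240 frame/s image signal are distributed round-robin over
# the four sub-pixels of one pixel: frame k goes to circuit 133-((k-1) % 4 + 1).
def assign_frames(num_frames, n_subpixels=4):
    for frame in range(1, num_frames + 1):
        circuit = (frame - 1) % n_subpixels + 1
        print(f"frame {frame} -> data line driving circuit 133-{circuit} "
              f"(sub-pixel {circuit} from the left)")

assign_frames(8)
# frame 1 -> 133-1, frame 2 -> 133-2, ..., frame 5 -> 133-1 again
```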
In this case, it is preferable that each frame be displayed for a period of 1/240 second, but each frame may also be displayed for a longer period of, for example, 1/60 second.
According to this configuration, even if the response time of the liquid crystal is long, it is possible to display a moving image composed of a large number of frames per second. For example, it is possible to display a moving image of 240 frames/sec.
Although an LCD is used in the above configuration, any type of matrix driving display may be used instead of an LCD. For example, a display using an LED or an organic EL display may be used.
As described above, in a hold-type display that holds the display of each pixel on the screen during each frame period, the display is controlled so that a moving image of 105 or more frames/sec is displayed; when a moving image of 105 or more frames/sec is displayed based on such control, a less degraded moving image can be presented to the observer, i.e., the person viewing the displayed moving image, on the basis of human visual characteristics, without unnecessarily increasing the frame rate.
Likewise, in a matrix-drive type display, the display is controlled so that a moving image of 105 or more frames/sec is displayed; when a moving image of 105 or more frames/sec is displayed based on such control, a less degraded moving image can be presented to the observer, i.e., the person viewing the displayed moving image, on the basis of human visual characteristics, without unnecessarily increasing the frame rate.
All the above-described processes may also be performed by software. In that case, a program constituting the software is installed from a recording medium onto a computer incorporating dedicated hardware, or onto a general-purpose computer capable of executing various functions by means of various programs installed on it.
The recording medium is formed of a package medium or the like on which the program is recorded and which is distributed to users separately from the computer, and includes, as shown in fig. 1 or 6, a magnetic disk 31 (e.g., a flexible disk), an optical disk 32 (e.g., a CD-ROM (compact disc read-only memory) or a DVD (digital versatile disc)), a magneto-optical disk 33 (e.g., an MD (Mini Disc (trademark))), a semiconductor memory 34, and the like.
In this specification, the steps describing the program recorded on the recording medium include, of course, not only processes performed in time series in the stated order, but also processes that are performed in parallel or individually and not necessarily in time series.
In this specification, the term "system" means an entire apparatus constituted by a plurality of devices.

Claims (2)

1. An image display system comprising:
a signal processing section for acquiring an image signal as an input signal and applying signal processing to the acquired image signal to generate a digital RGB signal;
a control section for detecting a horizontal synchronization signal and a vertical synchronization signal from the acquired image signal, and generating a control signal based on the detected horizontal synchronization signal and vertical synchronization signal;
an image display section for performing display based on the digital RGB signals supplied from the signal processing section and the control signals supplied from the control section;
wherein,
the image display part further comprises
A plurality of pixels each composed of n sub-pixels arranged side by side on a horizontal line and adjacently arranged;
n data line driving circuits, each of the n data line driving circuits driving sub-pixels in the same column within each pixel, respectively; wherein n has a value of 4;
wherein the control section sequentially supplies control signals to the n data line drive circuits so that digital RGB signals of frames of a moving image having a first frame rate, which is 240 frames/sec, are sequentially displayed on the n sub-pixels.
2. An image processing method comprising:
acquiring an image signal as an input signal and applying signal processing to the acquired image signal to generate a digital RGB signal;
detecting a horizontal synchronization signal and a vertical synchronization signal from the acquired image signal, and generating a control signal based on the detected horizontal synchronization signal and vertical synchronization signal; and
performing display based on the digital RGB signals and the control signals,
wherein performing the display includes
controlling n data line driving circuits in a display section that further includes a plurality of pixels, each composed of n sub-pixels arranged side by side and adjacent to one another on a horizontal line, each of the n data line driving circuits respectively driving the sub-pixels in a same column within each pixel, wherein n has a value of 4; and sequentially supplying control signals to the n data line driving circuits so that digital RGB signals of frames of a moving image having a first frame rate, which is 240 frames/sec, are sequentially displayed on the n sub-pixels.
CN 200910168321 2004-08-10 2005-06-09 Image processing apparatus, image processing method and image display system Expired - Fee Related CN101668149B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2004233280 2004-08-10
JP233280/04 2004-08-10
JP2004244641A JP2006078505A (en) 2004-08-10 2004-08-25 Display apparatus and method
JP244641/04 2004-08-25

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CNA200580001091XA Division CN1860521A (en) 2004-08-10 2005-06-09 Display device and method thereof

Publications (2)

Publication Number Publication Date
CN101668149A CN101668149A (en) 2010-03-10
CN101668149B true CN101668149B (en) 2013-02-27

Family

ID=37298752

Family Applications (4)

Application Number Title Priority Date Filing Date
CNA200580001091XA Pending CN1860521A (en) 2004-08-10 2005-06-09 Display device and method thereof
CN 200910168321 Expired - Fee Related CN101668149B (en) 2004-08-10 2005-06-09 Image processing apparatus, image processing method and image display system
CN 200810161983 Expired - Fee Related CN101415093B (en) 2004-08-10 2005-06-09 Image processing apparatus, image processing method and image display system
CNA2008101619827A Pending CN101437128A (en) 2004-08-10 2005-06-09 Display apparatus and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CNA200580001091XA Pending CN1860521A (en) 2004-08-10 2005-06-09 Display device and method thereof

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN 200810161983 Expired - Fee Related CN101415093B (en) 2004-08-10 2005-06-09 Image processing apparatus, image processing method and image display system
CNA2008101619827A Pending CN101437128A (en) 2004-08-10 2005-06-09 Display apparatus and method

Country Status (2)

Country Link
JP (1) JP4826602B2 (en)
CN (4) CN1860521A (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009093371A1 (en) * 2008-01-22 2009-07-30 Sharp Kabushiki Kaisha Display system, display control device, image display device
US8933988B2 (en) 2009-01-28 2015-01-13 Nec Corporation Picture transmission system and picture transmission method
JP5343714B2 (en) * 2009-06-05 2013-11-13 ソニー株式会社 Video processing device, display device, and display system
CN101847393B (en) * 2010-04-23 2014-08-20 中国电子科技集团公司第五十四研究所 Method for processing remote sensing image
CN102262519A (en) * 2010-05-26 2011-11-30 图诚科技股份有限公司 Image processing device and image signal processing system
JP5317023B2 (en) * 2010-09-16 2013-10-16 カシオ計算機株式会社 Camera shake correction apparatus, camera shake correction method, and program
CN103426386B (en) * 2012-05-24 2017-02-15 群康科技(深圳)有限公司 Display device and control method thereof
CN103426387B (en) * 2012-05-25 2017-02-15 群康科技(深圳)有限公司 Display device and control method thereof
JP6019332B2 (en) * 2012-06-04 2016-11-02 株式会社Joled Display device, image processing device, and display method
US8797340B2 (en) * 2012-10-02 2014-08-05 Nvidia Corporation System, method, and computer program product for modifying a pixel value as a function of a display duration estimate
JP6743732B2 (en) * 2017-03-14 2020-08-19 トヨタ自動車株式会社 Image recording system, image recording method, image recording program
CN107589618B (en) * 2017-10-23 2020-08-25 杭州光粒科技有限公司 Micro projection system with high refresh rate and method for improving micro display refresh rate
CN111199697B (en) * 2018-11-16 2023-06-30 瑞昱半导体股份有限公司 Display device and display method for reducing dynamic blurring
US20220223104A1 (en) * 2021-01-13 2022-07-14 Nvidia Corporation Pixel degradation tracking and compensation for display technologies

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1211868A (en) * 1997-08-27 1999-03-24 德国汤姆逊-布朗特公司 Method for obtaining line synchronization information items from video signal, and apparatus for carrying out the method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07199149A (en) * 1993-12-28 1995-08-04 Sharp Corp Picture display device and its driving method
JPH10254390A (en) * 1997-03-10 1998-09-25 Canon Inc Liquid crystal device
EP1758088A3 (en) * 1998-12-01 2008-02-27 Seiko Epson Corporation Color display device and color display method
JP2002041002A (en) * 2000-07-28 2002-02-08 Toshiba Corp Liquid-crystal display device and driving method thereof
FR2817992B1 (en) * 2000-12-12 2003-04-18 Philippe Charles Gab Guillemot DIGITAL VIDEO SCREEN DEVICE
JP3890926B2 (en) * 2001-07-17 2007-03-07 セイコーエプソン株式会社 Projection type liquid crystal display device
JP3749147B2 (en) * 2001-07-27 2006-02-22 シャープ株式会社 Display device
JP2004177575A (en) * 2002-11-26 2004-06-24 Sharp Corp Liquid crystal display device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1211868A (en) * 1997-08-27 1999-03-24 德国汤姆逊-布朗特公司 Method for obtaining line synchronization information items from video signal, and apparatus for carrying out the method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JP Laid-Open Patent Publication No. 2002-41002 A, 2002.02.08
JP Laid-Open Patent Publication No. 2003-29238 A, 2003.01.29
JP Laid-Open Patent Publication No. 2004-177575 A, 2004.06.24

Also Published As

Publication number Publication date
JP2008268968A (en) 2008-11-06
CN101668149A (en) 2010-03-10
CN101415093A (en) 2009-04-22
CN101415093B (en) 2013-03-06
CN101437128A (en) 2009-05-20
CN1860521A (en) 2006-11-08
JP4826602B2 (en) 2011-11-30

Similar Documents

Publication Publication Date Title
CN101668149B (en) Image processing apparatus, image processing method and image display system
EP2157563B1 (en) Image processing apparatus and method with frame rate conversion
US10621934B2 (en) Display and display method
US9355488B2 (en) 3D visualization
US8077172B2 (en) Method and apparatus for processing an image, image display system, storage medium, and program
US9071800B2 (en) Display unit and displaying method for enhancing display image quality
EP1524862A2 (en) Display system with scrolling color and wobble device
JP2003069961A (en) Frame rate conversion
CN109493787B (en) Method for adjusting dynamic fuzzy effect and display system
JP4030336B2 (en) Video display device
US5764202A (en) Suppressing image breakup in helmut mounted displays which use temporally separated bit planes to achieve grey scale
US20130010206A1 (en) Liquid crystal display device and television receiving apparatus
CN111751982A (en) Scanning display method and device
US20080024467A1 (en) Determining sequence of frames delineated into sub-frames for displaying on display device
JP2004266808A (en) Image processing apparatus and image processing method, image display system, recording media, and program
JP2008072300A (en) Image display device
JP2006518480A (en) Image signal processing for color sequential displays

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130227