WO2018126677A1 - Display device and display method - Google Patents

Display device and display method Download PDF

Info

Publication number
WO2018126677A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
light
field
depth
sub
Prior art date
Application number
PCT/CN2017/096225
Other languages
English (en)
French (fr)
Inventor
谭纪风
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 filed Critical 京东方科技集团股份有限公司
Priority to US15/749,572 priority Critical patent/US10534178B2/en
Publication of WO2018126677A1 publication Critical patent/WO2018126677A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0938Using specific optical elements
    • G02B27/095Refractive optical elements
    • G02B27/0955Lenses
    • G02B27/0961Lens arrays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/30Collimators
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/40Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/29Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B19/00Condensers, e.g. light collectors or similar non-imaging optics
    • G02B19/0004Condensers, e.g. light collectors or similar non-imaging optics characterised by the optical means employed
    • G02B19/0028Condensers, e.g. light collectors or similar non-imaging optics characterised by the optical means employed refractive and reflective surfaces, e.g. non-imaging catadioptric systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • G02B2027/0174Head mounted characterised by optical features holographic
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors

Definitions

  • The present disclosure relates to the field of display technology, and in particular to a display device and a display method.
  • Virtual reality display technology virtualizes a real scene to generate a virtual image, fuses it with other content, and presents the combined image to the viewer through a display device.
  • However, the lenses used in conventional virtual reality display technology are thick and heavy, and a single lens introduces optical aberrations, causing visual discomfort to the viewer.
  • Moreover, such lens-based imaging cannot provide a virtual image with adjustable depth of field.
  • At least one embodiment of the present disclosure provides a display device and a display method to improve visual comfort of a viewer or to provide a depth-adjustable virtual image.
  • a display device comprising:
  • a display unit having a plurality of pixels
  • a collimating unit for collimating light in a light emitting direction of the display unit to obtain collimated light
  • the adjusting unit is configured to deflect the collimated light so that pixels at different positions in the display unit are imaged at different depths of field.
  • The plurality of pixels in the display unit are arranged in an array, and the light emitted by each of the plurality of pixels and collimated by the collimating unit includes at least a first partial light and a second partial light;
  • the adjustment unit includes a plurality of sub-adjustment units, wherein a first sub-adjustment unit and a second sub-adjustment unit are provided for the first partial light and the second partial light respectively, so that a first refraction angle of the first partial light differs from a second refraction angle of the second partial light.
  • For odd-row pixels of the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light and the second partial light converge at a first depth of field;
  • for even-row pixels of the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light and the second partial light converge at a second depth of field different from the first depth of field.
  • For adjacent odd-row and even-row pixels in the display unit, the sub-adjustment units are arranged so that the first partial light emitted by one of the rows and collimated by the collimating unit converges with the second partial light emitted by the other row and collimated by the collimating unit at a third depth of field, the third depth of field lying between the first depth of field and the second depth of field.
  • The sub-adjustment unit for the second partial light emitted by the one row of pixels and collimated by the collimating unit and the sub-adjustment unit for the first partial light emitted by the other row of pixels and collimated by the collimating unit are arranged so that the collimated light incident on the respective sub-adjustment units is totally reflected.
  • One of the first depth of field and the second depth of field lies within a range of 0.20 m to 0.30 m in the direction opposite to the light-exit direction of the display unit, and the other lies within a range of 3 m to 5 m in that direction.
  • A display method is also provided for a display device comprising: a display unit having a plurality of pixels; a collimating unit for collimating light in the light-exit direction of the display unit to obtain collimated light; and an adjustment unit for deflecting the collimated light so that light emitted from pixels at different positions in the display unit is imaged at different depths of field. The display method includes: in a first time period, displaying a first image frame at odd-row or even-row pixels of the display unit, the first image frame being deflected by the adjustment unit and imaged at a first depth of field; and in a second time period, displaying, at the even-row or odd-row pixels, a second image frame different from the first image frame, the second image frame being deflected by the adjustment unit and imaged at a second depth of field different from the first depth of field.
  • The first image frame may comprise a foreground image and the second image frame a background image.
  • The method may further comprise a third time period in which at least a third image frame is displayed on the display unit, the third image frame being deflected by the adjustment unit and imaged at a third depth of field, the third depth of field lying between the first depth of field and the second depth of field.
  • FIG. 1 is a schematic structural view of a virtual display device
  • FIG. 2 shows a schematic cross-sectional view of a display device in accordance with an embodiment of the present disclosure
  • FIG. 3 illustrates a schematic light path diagram of a display device in accordance with an embodiment of the present disclosure
  • FIG. 4 shows a schematic diagram of a first example collimating unit in accordance with an embodiment of the present disclosure
  • FIG. 5A shows a schematic diagram of a second example collimating unit in accordance with an embodiment of the present disclosure
  • Figure 5B shows an optical path diagram of the collimating unit of Figure 5A
  • FIG. 6 shows a schematic light path diagram of an adjustment unit in accordance with an embodiment of the present disclosure
  • FIG. 7 illustrates an optical path diagram of a refracting ray of an adjustment unit in accordance with an embodiment of the present disclosure
  • FIG. 8 shows a flow chart of a display method in accordance with an embodiment of the present disclosure.
  • a virtual reality display scheme is shown in FIG. 1 and includes a display screen 101, a first lens 111, and a second lens 112.
  • the display screen 101 is used to display an image.
  • the display screen 101 may be a single display screen divided into two display areas for respectively displaying a left eye image and a right eye image, or two display screens respectively displaying a left eye image and a right eye image.
  • The first lens 111 and the second lens 112 are provided for the left-eye image and the right-eye image, respectively, with the focal length of each lens greater than the distance from the display screen to the lens, so that each lens acts as a magnifier and each eye sees a magnified, upright virtual image;
  • the two virtual images are fused in the brain to produce stereoscopic vision.
  • Embodiments of the present disclosure provide a display device and a display method, which are described in detail below with reference to the accompanying drawings.
  • FIG. 2 shows a schematic cross-sectional view of a display device 200 in accordance with an embodiment of the present disclosure.
  • the display device 200 includes a display unit 210, a collimation unit 220, and an adjustment unit 230.
  • the display unit 210 has a plurality of pixels.
  • the collimation unit 220 is disposed in the light-emitting direction of the display unit 210 to collimate the light in the light-emitting direction of the display unit 210 to obtain collimated light.
  • The adjustment unit 230 deflects the collimated light so that light emitted by pixels at different positions in the display unit is imaged at different depths of field.
  • FIG. 3 shows a schematic optical path diagram of a display device 200 in accordance with an embodiment of the present disclosure.
  • the display unit 210 has a plurality of pixels arranged in an array, and each of the pixels can emit light such as red (R), green (G), and blue (B).
  • For ease of illustration, FIG. 3 shows one pixel 210_i in a column of pixels of the display unit as comprising three portions, represented by different hatching, that emit red, green, and blue light respectively; those skilled in the art will understand that this structure is not required, as long as one pixel can emit light of the corresponding colors.
  • For ease of distinction, red light is drawn as a solid line, green light as a dashed line, and blue light as a dot-dash line.
  • the light in the light outgoing direction of the display unit 210 is incident on the collimation unit 220, and the collimation unit 220 collimates the incident light to obtain collimated light.
  • the collimated light is incident on the adjustment unit 230, and the adjustment unit 230 deflects the incident collimated light, thereby imaging the light emitted by the pixels at different positions in the display unit at different depths of field.
  • To the human eye, light emitted by different pixels in the display unit 210 is thus visually seen as imaged at different depths of field.
  • According to embodiments of the present disclosure, the term "deflect" means that light deviates from its original direction after being refracted or reflected by an optical element.
  • FIG. 4 shows a schematic diagram of a first example collimation unit 420 in accordance with an embodiment of the present disclosure.
  • the collimating unit 420 may include a blocking structure 423 and a lens 425.
  • Each pixel 210_i of the display unit 210 can be regarded as a point light source; a corresponding lens 425 is provided for each pixel 210_i, and the pixel 210_i is placed at the focal point of the lens 425, where H is the focal length of the lens 425 and D is the diameter of the pixel aperture.
  • Light within the aperture of the lens 425 is refracted by the lens 425 into parallel light, so that the divergent light emitted by the pixel 210_i is collimated.
  • The blocking structure 423 blocks any light outside the lens aperture, further improving the collimation of the light leaving the collimating unit.
  • For example, the blocking structure 423 may be composed of a plurality of stacked black light-shielding layers.
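  • As an illustration only (not part of the original disclosure), the sketch below estimates the residual divergence of the collimated beam from this first example collimating unit using the standard paraxial relation for an emitter of finite aperture D placed at the focal plane of a lens of focal length H; the numerical values are assumptions chosen for illustration. The smaller the pixel aperture is relative to the focal length, the closer the output is to ideal collimated light; the blocking structure 423 additionally removes rays that fall outside the lens aperture and would not be collimated at all.

```python
import math

def residual_divergence_deg(pixel_aperture_d: float, focal_length_h: float) -> float:
    """Paraxial estimate of the residual half-angle divergence (degrees) of light
    collimated by a lens of focal length H when the emitter has a finite
    aperture D centred on the lens focal point."""
    return math.degrees(math.atan((pixel_aperture_d / 2.0) / focal_length_h))

# Assumed example values (not taken from the patent): a 10 um pixel aperture
# collimated by a lens with a 2 mm focal length.
print(residual_divergence_deg(10e-6, 2e-3))  # ~0.14 degrees of residual divergence
```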
  • FIG. 5A shows a schematic structural view of a second example collimating unit 520
  • FIG. 5B shows a light path diagram of the collimating unit 520 of FIG. 5A
  • the collimating unit 520 may include a convex lens 521 and a parabolic mirror 523.
  • the pixel 210_i included in the display unit 210 can be regarded as a point light source, a corresponding convex lens 521 and a parabolic mirror 523 are provided for each pixel 210_i and the pixel 210_i is placed at the focus F of the convex lens 521 and the parabolic mirror 523.
  • the light in the lens aperture of the convex lens 521 is refracted by the convex lens 521 to become parallel light, so that the divergent light emitted from the pixel 210_i is collimated into collimated light.
  • The parabolic mirror 523 collimates the light falling outside the aperture of the convex lens 521; at the same time, the parabolic mirror 523 blocks stray light from adjacent pixels.
  • The light emitted by each pixel of the display unit 210 and collimated by the collimating unit may include at least a first partial light and a second partial light, and the adjustment unit 230 is arranged to deflect the first partial light and the second partial light at different angles.
  • FIG. 6 shows a schematic diagram of the adjustment unit 230 adjusting light according to an embodiment of the present disclosure.
  • Four pixels 2101, 2102, 2103, and 2104 in one column of pixels of the display unit are taken as an example to describe how the adjustment unit 230 adjusts the light.
  • For clarity, the collimating unit between the display unit 210 and the adjustment unit 230 is omitted in FIG. 6.
  • For the first pixel 2101, the light obtained after processing by the collimating unit may include a first partial light L11 and a second partial light L12;
  • similarly, the light from the second pixel 2102 after processing by the collimating unit may include a first partial light L21 and a second partial light L22,
  • the light from the third pixel 2103 may include a first partial light L31 and a second partial light L32,
  • and the light from the fourth pixel 2104 may include a first partial light L41 and a second partial light L42.
  • A first sub-adjustment unit 2301_1 and a second sub-adjustment unit 2301_2 are provided for the first partial light L11 and the second partial light L12 of the first pixel 2101, respectively. Similarly, sub-adjustment units are provided for the respective first and second partial lights of the second pixel 2102 through the fourth pixel 2104.
  • Although FIG. 6 illustrates the adjustment unit 230 as comprising a plurality of sub-adjustment units formed by prisms, those skilled in the art will appreciate that the adjustment unit 230 may also be implemented by other means such as convex lenses, Fresnel lenses, gratings, or liquid crystal lenses, as long as different portions of the light emitted from the same pixel can be deflected at different angles.
  • Taking the first pixel 2101 as an example, the first sub-adjustment unit 2301_1 and the second sub-adjustment unit 2301_2 are provided for the first partial light L11 and the second partial light L12 of the first pixel 2101, respectively.
  • FIG. 7 shows a refracted optical path diagram of the first partial light and the second partial light of the first pixel 2101.
  • the principle of refraction of the sub-regulating unit according to an embodiment of the present disclosure will be described in detail below with reference to FIGS. 6 and 7.
  • According to embodiments of the present disclosure, the first partial light and the second partial light may each be mixed light including, for example, R, G, and B components; for clarity of illustration, the mixed light is shown as a single beam in FIGS. 6 and 7,
  • and the adjustment unit is shown equivalently as a prism.
  • The light emitted by the first pixel 2101 and collimated by the collimating unit includes a first partial light L11 and a second partial light L12, and the first sub-adjustment unit 2301_1 and the second sub-adjustment unit 2301_2 are provided for the first partial light L11 and the second partial light L12, respectively.
  • θ1 is the angle of incidence of the first partial light L11 on the first sub-adjustment unit 2301_1,
  • and θ2 is the angle of incidence of the second partial light L12 on the second sub-adjustment unit 2301_2; that is, θ1 and θ2 are the angles between the first partial light L11 and the second partial light L12, respectively, and the normal of the inclined prism surface.
  • After refraction, the refracted ray L11' of the first partial light L11 makes an angle θ'1 with the normal,
  • and the refracted ray L12' of the second partial light L12 makes an angle θ'2 with the normal. When the refracted rays L11' and L12' enter the human eye, they are imaged by the eye so as to converge at a point A in space; that is, the eye sees the image of the first pixel 2101 at point A. α is the angle (divergence angle) subtended at the eye by the light from point A, with α = α1 + α2, as shown in FIG. 7.
  • The magnitude of the divergence angle α determines the perpendicular distance from point A (i.e., the image of the first pixel 2101) to the human eye:
  • the larger α is, the smaller the perpendicular distance from point A to the eye, i.e., the closer the first pixel 2101 appears; conversely, the smaller α is, the farther away the first pixel 2101 appears.
  • By the law of refraction, n1·sin θ1 = n2·sin θ'1 and n1·sin θ2 = n2·sin θ'2 (Formula 1), where n1 is the refractive index of the prism and n2 is the refractive index of the external medium (for example, air). From the geometry, α1 = θ'1 − θ1 and α2 = θ'2 − θ2, with θ'1 ≥ θ1 and θ'2 ≥ θ2, so α = α1 + α2 = θ'1 + θ'2 − θ1 − θ2 (Formula 2).
  • Assuming that point A lies in a plane parallel to the plane of the display unit, the perpendicular distance D from that plane to the human eye is called the depth of field; to image at a given depth of field, the corresponding divergence angle α can be determined from the size of the first pixel 2101 (related to the resolution of the display unit) and the actual perpendicular distance from the first pixel 2101 to the eye, and the equivalent prism angles then follow from Formulas (1) and (2).
  • For example, if the desired perpendicular distance D (depth of field) from that plane to the human eye is 20 cm, the corresponding divergence angle is 0.86°, i.e., α = 0.86°; assuming n1 = 1.5, n2 = 1.0, and θ1 = 0.5°, Formulas (1) and (2) give
  • θ'1 = 0.75°, θ2 = 1.2°, and θ'2 = 1.8°.
  • the sub-adjustment unit can be flexibly designed for each part of the light emitted from each pixel of the display unit to obtain a predetermined depth of field.
  • The correspondence between a predetermined depth of field and the divergence angle α can be calculated in advance according to the actual situation.
  • Table 1 in the description below shows an example of this correspondence (for instance, D = 20 cm corresponds to α = 0.86°, D = 3 m to α = 0.057°, and infinity to α = 0°).
  • A depth of field D between 20 cm and 50 cm is a comfortable distance range for near vision of the human eye,
  • and a depth of field D between 1 m and 3 m is a comfortable distance range for distance vision of the human eye.
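  • As a further illustration (an inference, not something stated in the disclosure), the depth-of-field/divergence-angle pairs of Table 1 are consistent with the two refracted partial beams reaching the eye roughly 3 mm apart and subtending the angle α at the image point A; the 3 mm baseline used below is back-calculated from the table values.

```python
import math

EYE_BASELINE_M = 0.003  # assumed separation of the two partial beams at the eye (inferred from Table 1)

def divergence_angle_deg(depth_of_field_m: float) -> float:
    """Divergence angle alpha subtended at the image point by the assumed beam baseline."""
    return math.degrees(math.atan(EYE_BASELINE_M / depth_of_field_m))

for d in (0.20, 0.35, 0.50, 1.0, 2.0, 3.0):
    # Prints ~0.86, 0.49, 0.34, 0.17, 0.086, 0.057 degrees, matching Table 1.
    print(f"D = {d:4.2f} m  ->  alpha = {divergence_angle_deg(d):.3f} deg")
```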
  • As described above, using the first sub-adjustment unit 2301_1 and the second sub-adjustment unit 2301_2, the image of the first pixel 2101 can be formed at the first depth of field, with D = 20 cm.
  • For the third pixel 2103, the corresponding first sub-adjustment unit 2303_1 and second sub-adjustment unit 2303_2 can be designed in the same way as for the first pixel 2101, so that the image of the third pixel 2103 is also formed at the first depth of field with D = 20 cm.
  • Similarly, the corresponding sub-adjustment units may be designed identically for the other odd-row pixels of the display unit 210, such as the fifth pixel, the seventh pixel, and so on, so that all odd-row pixels are likewise imaged at the first depth of field with D = 20 cm.
  • Thus, when the odd-row pixels of the display unit 210 display a first image frame, the image of the first image frame is formed at the first depth of field.
  • For example, the first image frame may be a foreground image such as a person.
  • For the second pixel 2102, the corresponding first sub-adjustment unit 2302_1 and second sub-adjustment unit 2302_2 may be designed so that the image of the second pixel 2102 is formed at a second depth of field with D of, for example, 3 m.
  • Similarly, the corresponding sub-adjustment units can be designed identically for the even-row pixels, such as the fourth pixel 2104, the sixth pixel, the eighth pixel, and so on, so that all even-row pixels are also imaged at the second depth of field with D = 3 m; thus, when the even-row pixels of the display unit 210 display a second image frame, the image of the second image frame is formed at the second depth of field.
  • For example, the second image frame may be a background image such as a landscape.
  • The above display manner for the first and second image frames is only an example; the odd-row images may instead be imaged at the second depth of field and the even-row images at the first depth of field, and other display manners may of course be used.
  • In addition, the first partial light L11 of the first pixel 2101 and the second partial light L22 of the second pixel 2102 may converge at a third depth of field located between the first depth of field and the second depth of field, i.e., be imaged at the third depth of field; this imaging manner may be referred to as "pixel borrowing".
  • When the display unit 210 displays a third image frame, control may be performed so that the sub-adjustment unit 2301_1 for the first partial light L11 of the first pixel 2101 and the sub-adjustment unit 2302_2 for the second partial light L22 of the second pixel 2102 refract the collimated light incident on them,
  • while the sub-adjustment unit 2301_2 for the second partial light L12 of the first pixel 2101 and the sub-adjustment unit 2302_1 for the first partial light L21 of the second pixel 2102 totally reflect the collimated light incident on them; the same scheme is applied to the third pixel 2103 and the fourth pixel 2104 and so on, so that an image of the third image frame is formed at the third depth of field.
  • For example, when the adjustment unit is implemented by a liquid crystal lens or a grating, the above embodiment can be realized by turning on the sub-adjustment units 2301_1, 2302_2, 2303_1, and 2304_2 and turning off the sub-adjustment units 2301_2, 2302_1, 2303_2, and 2304_1 during the corresponding time period.
  • Those skilled in the art will understand that the above pixel-borrowing manner is only an example; according to actual needs, the display timing of the display unit 210 for the odd-row image (first image frame), the even-row image (second image frame), and the complete image (third image frame), combined with control over turning the sub-adjustment units on and off, can be used to form the image of the third image frame at any third depth of field between the first depth of field and the second depth of field.
  • For example, taking an adjustment unit implemented by a liquid crystal lens or a grating: in a first display period, all sub-adjustment units are turned on and the display unit 210 displays a first image frame containing only the odd-row image, which is imaged at the first depth of field; in a second display period, all sub-adjustment units are turned on and the display unit 210 displays a second image frame containing only the even-row image, which is imaged at the second depth of field; and in a third display period, part of the sub-adjustment units are turned on and the rest are turned off, and the display unit 210 displays a third image frame containing the complete image, which is imaged at a third depth of field between the first depth of field and the second depth of field. A control-loop sketch illustrating this sequencing is given below.
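  • The following minimal sketch is an illustration only, assuming an adjustment unit (for example a liquid crystal lens or a grating) whose sub-adjustment units can be switched electrically; the unit names mirror the reference numerals of FIG. 6, and the depths of field are the example values used above.

```python
# Illustrative frame-sequencing sketch (not from the disclosure): which pixel rows are
# driven and which sub-adjustment units are switched on in each display period.
ALL_UNITS = {"2301_1", "2301_2", "2302_1", "2302_2", "2303_1", "2303_2", "2304_1", "2304_2"}
BORROW_UNITS = {"2301_1", "2302_2", "2303_1", "2304_2"}  # refracting units used for pixel borrowing

SCHEDULE = [
    # (period, rows displayed,          units switched on, resulting image depth)
    ("P1", "odd rows (first frame)",    ALL_UNITS,    "first depth of field (e.g. 20 cm)"),
    ("P2", "even rows (second frame)",  ALL_UNITS,    "second depth of field (e.g. 3 m)"),
    ("P3", "all rows (third frame)",    BORROW_UNITS, "third depth of field (in between)"),
]

def drive_one_refresh_cycle() -> None:
    for period, rows, units_on, depth in SCHEDULE:
        # A real driver would program the liquid-crystal/grating electrodes here; collimated
        # light hitting a switched-off sub-unit is totally reflected and never reaches the eye.
        print(f"{period}: display {rows}; enable {sorted(units_on)}; image formed at {depth}")

drive_one_refresh_cycle()
```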
  • the adjustment unit when the adjustment unit images the corresponding pixel by diverging the incident collimated light, there may be a problem that light emitted from the pixel of the edge portion of the display unit cannot enter the human eye.
  • the parameters ⁇ ' 1 , ⁇ 2 and ⁇ ' 2 of the adjustment subunit are calculated, one of the refractive indices n1, n2 and the divergence angle ⁇ and ⁇ 1 and ⁇ 2 are set in advance.
  • the divergence angle ⁇ is predetermined, and at least one of n1, n2, and ⁇ 1 or ⁇ 2 may be set according to the region in which the corresponding pixel is located, such that the farther the pixel is from the center of the display unit, Accordingly, ⁇ ' 1 or ⁇ ' 2 is made smaller, so that even for the pixels of the edge region of the display unit, the human eye can see the light emitted by the pixel.
  • For example, when the adjustment unit 230 is implemented by prisms or lenses, the refractive indices n1 and n2 are fixed, and the value of θ1 or θ2 can be adjusted instead.
  • When designing each sub-adjustment unit according to an embodiment of the present disclosure, a person skilled in the art can therefore set a reference value of θ1 or θ2 according to the actual situation, such as the size of the display area of the display unit 210 and the perpendicular distance from the human eye to the plane of the display unit, and then increase or decrease θ1 or θ2 according to the position of the corresponding pixel.
  • When the adjustment unit 230 is implemented by, for example, a liquid crystal lens, the refractive index of the liquid crystal lens can be changed by the voltage applied to it, i.e., the refractive indices n1 and n2 are variable; the values of θ'1 and θ'2 can then be adjusted more flexibly, so that the refracted light of pixels in the edge region of the display unit can also be seen by the human eye.
  • FIG. 8 shows a flowchart of a display method in accordance with an embodiment of the present disclosure. It should be noted that the sequence numbers of the steps in the following method are merely labels used for description and should not be taken to indicate the order of execution; unless explicitly stated, the method need not be performed strictly in the order shown. As shown in FIG. 8, the display method 80 according to an embodiment of the present disclosure may include the following steps.
  • Step S801: in a first time period, display a first image frame at the odd-row pixels or the even-row pixels of the display unit, the first image frame being deflected by the adjustment unit and imaged at a first depth of field.
  • Step S803: in a second time period, display, at the even-row pixels or the odd-row pixels of the display unit, a second image frame different from the first image frame, the second image frame being deflected by the adjustment unit and imaged at a second depth of field different from the first depth of field.
  • the first image frame may be, for example, a frame in a first sequence of images for a foreground object such as a person
  • the second image frame may be, for example, a frame in a second sequence of images for a background object such as a landscape.
  • The display unit can be controlled to perform steps S801 and S803 periodically, so that the viewer visually sees an image sequence including the foreground object at the first depth of field and the background object at the second depth of field.
  • the display method may further include a third time period in which the display unit displays at least a third image frame, the third image frame is imaged by the adjustment unit at a third depth of field, and the third depth of field is at Between a depth of field and a second depth of field.
  • the display of the third image frame can be achieved by borrowing between different portions of the light that are collimated by the collimating unit and emitted by the adjacent pixels.
  • For example, the display unit 210 may be controlled so that, in the first display period, all sub-adjustment units are turned on and the display unit 210 displays a first image frame containing only the odd-row image or the even-row image, which is imaged at the first depth of field; in the second display period, all sub-adjustment units are turned on and the display unit 210 displays a second image frame containing only the even-row image or the odd-row image, which is imaged at the second depth of field; and in the third display period, part of the sub-adjustment units are turned on and the rest are turned off so that the collimated light incident on the turned-off sub-adjustment units is totally reflected, and the display unit 210 displays a third image frame containing the complete image, which is imaged at a third depth of field between the first depth of field and the second depth of field.
  • the third image frame may be one of a third sequence of images for other objects between the foreground object and the background object.
  • The display unit may be controlled to periodically perform the steps of displaying the first image frame, the second image frame, and the third image frame, so that the viewer visually sees an image sequence including the foreground object at the first depth of field, the background object at the second depth of field, and the other objects at the third depth of field.
  • Those skilled in the art will understand that the third time period in which the third image frame is displayed may be between the first time period and the second time period, or after the second time period.
  • Moreover, although only three depths of field are described in the above example, if the refresh rate of the display unit 210 is sufficiently fast, more depths of field can of course be achieved through pixel borrowing,
  • as long as the viewer can visually see an image sequence in which different objects appear at their corresponding different depths of field.
  • According to embodiments of the present disclosure, the collimating unit collimates the light in the light-exit direction of the display unit, and the adjustment unit deflects the collimated light, so that light emitted by pixels at different positions in the display unit is imaged at different depths of field.
  • This realizes a virtual near-eye display for the viewer, and the depth of field of the resulting image can be adjusted so as to reproduce real objects as faithfully as possible, without changing the physical structure of the existing display unit.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Nonlinear Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A display device (200) and a display method (80). The display device (200) includes: a display unit (210) having a plurality of pixels; a collimating unit (220) configured to collimate light in the light-exit direction of the display unit to obtain collimated light; and an adjustment unit (230) configured to deflect the collimated light so that light emitted by pixels at different positions in the display unit (210) is imaged at different depths of field.

Description

Display device and display method
Cross-Reference to Related Applications
This application claims priority to Chinese patent application CN201710008823.2, filed on January 5, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of display technology, and in particular to a display device and a display method.
Background
Virtual reality display technology virtualizes a real scene to generate a virtual image, fuses it with other content, and presents the combined image to the viewer through a display device. However, the lenses used in conventional virtual reality display technology are thick and heavy, and a single lens introduces optical aberrations that cause visual discomfort for the viewer. Moreover, such lens-based imaging cannot provide a virtual image with adjustable depth of field.
Summary
At least one embodiment of the present disclosure provides a display device and a display method to improve the viewer's visual comfort or to provide a virtual image with adjustable depth of field.
According to one aspect of the present disclosure, a display device is provided, comprising:
a display unit having a plurality of pixels;
a collimating unit configured to collimate light in the light-exit direction of the display unit to obtain collimated light; and
an adjustment unit configured to deflect the collimated light so that pixels at different positions in the display unit are imaged at different depths of field.
Preferably, the plurality of pixels in the display unit are arranged in an array, and the light emitted by each of the plurality of pixels and collimated by the collimating unit includes at least a first partial light and a second partial light;
the adjustment unit includes a plurality of sub-adjustment units, wherein a first sub-adjustment unit and a second sub-adjustment unit are provided for the first partial light and the second partial light respectively, so that a first refraction angle of the first partial light differs from a second refraction angle of the second partial light.
Preferably, for odd-row pixels of the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light and the second partial light converge at a first depth of field; and
for even-row pixels of the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light and the second partial light converge at a second depth of field different from the first depth of field.
Preferably, for adjacent odd-row and even-row pixels in the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light emitted by one of the rows of pixels and collimated by the collimating unit converges with the second partial light emitted by the other row of pixels and collimated by the collimating unit at a third depth of field, the third depth of field lying between the first depth of field and the second depth of field.
Preferably, the sub-adjustment unit for the second partial light emitted by the one row of pixels and collimated by the collimating unit and the sub-adjustment unit for the first partial light emitted by the other row of pixels and collimated by the collimating unit are arranged so that the collimated light incident on the respective sub-adjustment units is totally reflected.
Preferably, one of the first depth of field and the second depth of field lies within a range of 0.20 m to 0.30 m in the direction opposite to the light-exit direction of the display unit, and the other of the first depth of field and the second depth of field lies within a range of 3 m to 5 m in the direction opposite to the light-exit direction of the display unit.
According to another aspect of embodiments of the present disclosure, a display method for a display device is provided, the display device comprising: a display unit having a plurality of pixels; a collimating unit configured to collimate light in the light-exit direction of the display unit to obtain collimated light; and an adjustment unit configured to deflect the collimated light so that light emitted by pixels at different positions in the display unit is imaged at different depths of field; the display method comprising:
in a first time period, displaying a first image frame at odd-row pixels or even-row pixels of the display unit, the first image frame being deflected by the adjustment unit and imaged at a first depth of field; and
in a second time period, displaying, at even-row pixels or odd-row pixels of the display unit, a second image frame different from the first image frame, the second image frame being deflected by the adjustment unit and imaged at a second depth of field different from the first depth of field.
Preferably, the first image frame includes a foreground image and the second image frame includes a background image.
Preferably, the method further includes a third time period in which at least a third image frame is displayed on the display unit, the third image frame being deflected by the adjustment unit and imaged at a third depth of field, the third depth of field lying between the first depth of field and the second depth of field.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present disclosure more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present disclosure, and a person of ordinary skill in the art can obtain other drawings from them without creative effort. In the drawings:
FIG. 1 is a schematic structural diagram of a virtual display device;
FIG. 2 is a schematic cross-sectional view of a display device according to an embodiment of the present disclosure;
FIG. 3 is a schematic optical path diagram of a display device according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first example collimating unit according to an embodiment of the present disclosure;
FIG. 5A is a schematic diagram of a second example collimating unit according to an embodiment of the present disclosure;
FIG. 5B is an optical path diagram of the collimating unit of FIG. 5A;
FIG. 6 is a schematic optical path diagram of an adjustment unit according to an embodiment of the present disclosure;
FIG. 7 is an optical path diagram of rays refracted by an adjustment unit according to an embodiment of the present disclosure; and
FIG. 8 is a flowchart of a display method according to an embodiment of the present disclosure.
具体实施方式
为使本公开实施例的目的、技术方案和优点更加清楚,下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整的描述。显然,所描述的实施例是本公开的一部分实施例,而不是全部。基于所描述的本公开的实施例,本领域普通技术人员在无需创造性劳动的前提下获得的所有其他实施例都属于本公开保护的范围。应注意,贯穿附图,相同的元素由相同或相近的附图标记来表示。在以下描述中,一些具体实施例仅用于描述目的,而不应该理解为对本公开有任何限制,而只是本公开实施例的示例。在可能导致对本公开的理解造成混淆时,将省略常规结构或构造。应注意,图中各部件的形状和尺寸不反映真实大小和比例,而仅示意本公开实施例的内容。
除非另外定义,本公开实施例使用的技术术语或科学术语应当是本领域技术人员所理解的通信意义。本公开实施例中使用的“第一”、“第二”以及类似词语并不表示任何顺序、数量或重要性,而只是用于区分不同的组成部分。
一种虚拟现实显示方案如图1所示,包括:显示屏101、第一透镜111和第二透镜112。显示屏101用于显示图像。显示屏101可以是被分成两个显示区域用于分别显示左眼图像和右眼图像的单个显示屏,也可是分别显示左眼图像和右眼图像的两个显示屏。分别针对左眼图像和右眼图像设置第一透镜111和第二透镜112,使透镜的焦距大于显示屏到透镜的距离,从而使透镜起到放大镜的效果,因此双眼可看到各个放大正立的虚像并在大脑中进行融合,产生立体视觉。
本公开实施例提供了一种显示装置和显示方法,下面结合附图进行详细描述。
图2示出了根据本公开实施例的显示装置200的示意截面图。如图2所示,显示装置200包括显示单元210、准直化单元220和调节单元230。显示单元210具有多个像素。准直化单元220设置在显示单元210的出光方向上,将显示单元210的出光方向的光进行准直化,得到准直光。调节单元230使得准直光发生偏折,从而使显示单元中不同位置的像 素发出的光成像于不同的景深处。
图3示出了根据本公开实施例的显示装置200的示意光路图。显示单元210具有排列为阵列的多个像素,每个像素可以发出例如红色(R)、绿色(G)和蓝色(B)的光。为了便于演示,图3将显示单元的一列像素中的一个像素210_i示出为包括用不同阴影表示的三部分,分别发出红光、绿光和蓝光,本领域技术人员可以理解,图中所示的结构不是必需的,只要一个像素能够发出具有相应颜色的光即可。为了便于区分,图中将红光表示为实线、将绿光表示为虚线、将蓝光表示为点划线。
如图3所示,显示单元210的出光方向的光入射到准直化单元220,准直化单元220对入射的光进行准直化,得到准直光。准直光入射到调节单元230,调节单元230使得入射的准直光发生偏折,从而使显示单元中不同位置的像素发出的光成像于不同的景深处。对于人眼,在视觉上能够看到显示单元210中不同像素发出的光成像于不同的景深处。
应注意,根据本公开实施例,术语“偏折”(deflect)是指经过光学元件的折射或反射,光偏离了原有的方向。
图4示出了根据本公开实施例的第一示例准直单元420的示意图。如图4所示,根据本公开实施例第一示例的准直单元420可以包括遮挡结构423和透镜425。可以将显示单元210中包括的像素210_i视为点光源,针对每个像素210_i设置相应的透镜425并将像素210_i放置于透镜425的焦点处,H是透镜425的焦距,D是像素开口区的直径。对于透镜425的透镜孔径内的光,经过透镜425折射后成为平行光,从而使像素210_i发出的发散光被准直化为准直光。此外,遮挡结构423遮挡了可能存在于透镜孔径以外的光,进一步改进了准直单元出光的准直化。例如,遮挡结构423可以由层叠的多个黑色遮光层构成。
图5A示出了第二示例准直单元520的结构示意图,图5B示出了图5A中准直单元520的光路图。如图5A所示,准直单元520可以包括凸透镜521和抛物面反射镜523。可以将显示单元210中包括的像素210_i视为点光源,针对每个像素210_i设置对应的凸透镜521和抛物面反射镜523并将像素210_i放置于凸透镜521和抛物面反射镜523的焦点F处。对于凸透镜521的透镜孔径内的光,经过凸透镜521折射后成为平行光,从而使像素210_i发出的发散光被准直化为准直光。此外,抛物面反射镜523对凸透镜521的孔径以外的光线进行准直化处理。同时,抛物面反射镜523也避免了来自其他相邻像素的杂散光。
根据本公开实施例,显示单元210中包括的每个像素发出的经过准直单元准直后得到的光可以包括至少第一部分光和第二部分光,调节单元230被设置为使所述第一部分光和第二部分光偏折不同的角度。图6示出了根据本公开实施例的调节单元230调节光线的示意图。图6中以显示单元中一列像素中的4个像素2101、2102、2103和2104为示例来描 述调节单元230如何调节光线。为了图示清楚,图6中省略显示单元210和调节单元230之间的准直单元。例如对于第一像素2101,经过准直单元处理后得到的光可以包括第一部分光L11和第二部分光L12,类似地,第二像素2102发出的经过准直单元处理后得到的光可以包括第一部分光L21和第二部分光L22,第三像素2103发出的经过准直单元处理后得到的光可以包括第一部分光L31和第二部分光L32,以及第四像素2104发出的经过准直单元处理后得到的光可以包括第一部分光L41和第二部分光L42。分别针对第一像素2101的第一部分光L11和第二部分光L12设置第一子调节单元2301_1和第二子调节单元2301_1。类似地,分别针对第二像素2102至第四像素2104的各个第一部分光和第二部分光设置子调节单元。尽管图6将调节单元230示出为包括由棱镜构成的多个子调节单元,本领域技术人员可以理解,可以使用例如凸透镜、菲涅尔透镜、光栅、液晶透镜等其他方式来实现调节单元230,只要能够将从同一像素发出的不同部分光偏折不同角度即可。
以第一像素2101为例,分别针对第一像素2101的第一部分光L11和第二部分光L12设置第一子调节单元2301_1和第二子调节单元2301_1。图7示出了第一像素2101的第一部分光和第二部分光的折射光路图。接下来将结合图6和图7来详细描述根据本公开实施例的子调节单元的折射原理。应注意,根据本公开实施例第一部分光和第二部分光可以是包括例如R、G和B的混合光,为了图示清楚,图6和图7中将混合光等效为一个光束并且将调节单元等效示出为棱镜。
如图7所示,第一像素2101发出的经过准直单元准直化的光包括第一部分光L11和第二部分光L12,分别针对第一部分光L11和第二部分光L12设置第一子调节单元2301_1和第二子调节单元2301_1,作为示例,将第一子调节单元2301_1和第二子调节单元2301_1示出为棱镜。θ1是第一部分光L11到第一子调节单元2301_1的入射角,θ2是第二部分光L12到第二子调节单元2301_2的入射角,即,θ1和θ2分别是第一部分光L11和第二部分光L12与棱镜斜面法线的夹角。经棱镜折射后,第一部分光L11的折射光L11’与法线的夹角是θ′1,第二部分光L12的折射光L12’与法线的夹角是θ′2。当折射光L11,和L12’进入人眼时,经人眼成像后会聚到空间中的一个点A,即,人眼看到第一像素2101的像位于点A处。α是点A发出的光相对于人眼的夹角(发散角),α=α12,如图7所示。发散角α的大小决定了点A(即,第一像素2101的像)到人眼的垂直距离,α越大,点A到人眼的垂直距离越近,即人眼感觉第一像素2101距离越近,反之,α越小,点A到人眼的垂直距离越远,即人眼感觉第一像素2101距离越远。
由光的折射定律，可知
n1·sinθ1 = n2·sinθ′1
n1·sinθ2 = n2·sinθ′2        (公式1)
其中，n1是棱镜的折射率，n2为外部介质（例如空气）的折射率，根据几何关系，
α1 = θ′1 − θ1
α2 = θ′2 − θ2
θ′1 ≥ θ1
θ′2 ≥ θ2
由以上公式可得
α = α1 + α2 = θ′1 + θ′2 − θ1 − θ2        (公式2)
对于以上公式(1)和公式(2),其中n1、n2已知。假定第一像素2101的像所在的A点处于与显示单元所在平面平行的平面中,将该平面到人眼的垂直距离D称为景深,当希望成像于给定景深处时,能够根据第一像素2101的大小(与显示单元的分辨率相关)以及第一像素2101到人眼的实际垂直距离来确定出相应的发散角α,因此可以由公式(1)和公式(2)来得到对应等效棱镜的夹角。
例如，如果希望上述平面到人眼的垂直距离D（景深）是20cm，能够确定对应的发散角为0.86°，即α=0.86°，例如，假定n1=1.5，n2=1.0，θ1=0.5°，则由公式(1)和公式(2)可以计算出
θ′1=0.75°，θ2=1.2°，θ′2=1.8°。
因此,根据本公开实施例,能够针对显示单元的每个像素发出的各部分光灵活地设计子调节单元来得到预定景深。本领域技术人员可以设想,可以实际情况预先计算预定景深与发散角α之间的对应关系。以下表1示出了预定景深与发散角α之间的对应关系示例。其中,景深D在20cm-50cm之间是人眼的近视力感觉比较舒适的距离范围,景深D在1m-3m之间是人眼的远视力感觉比较舒适的距离范围。
表1
景深D 空间发散角α
20cm 0.86°
35cm 0.49°
50cm 0.34°
1m 0.17°
2m 0.086°
3m 0.057°
无穷远 0
如上所述,利用第一子调节单元2301_1和第二子调节单元2301_1,能够将第一像素2101的像成像于景深D为20cm的第一景深处。如图6所示,对于第三像素2103,可以与第一像素2101相同地设计对应的第一子调节单元2303_1和第二子调节单元2303_1,将第三像素2103的像同样成像于景深D为20cm的第一景深处。类似地,例如可以针对第五像素、第七像素......等显示单元210的奇数行像素均相同地设计对应的子调节单元,使得所有奇数行像素同样成像于景深D为20cm的第一景深处,从而当显示单元210的奇数行像素显示第一图像帧时,可以在第一景深处形成第一图像帧的像。例如,第一图像帧可以是人物等前景图像。
此外,对于第二像素2102,可以设计对应的第一子调节单元2302_1和第二子调节单元2302_1,将第二像素2102的像成像于景深D为例如3m的第二景深处。类似地,例如可以针对第四像素2104、第六像素、第八像素......等显示单元的偶数行像素均相同地设计对应的子调节单元,使得所有偶数行像素同样成像于景深D为3m的第二景深处,从而当显示单元210的偶数行像素显示第二图像帧时,能够在第二景深处形成第二图像帧的像。例如,第二图像帧可以是风景等背景图像。
本领域技术人员可以理解,以上第一图像帧和第二图像帧的显示方式仅为示例,可以将奇数行图像成像于第二景深处而将偶数行图像成像于第一景深处,当然也可以使用其他显示方式。
此外,如图6所示,例如第一像素2101的第一部分光L11和第二像素2102的第二部分光L21可以会聚在位于第一景深和第二景深之间的第三景深处,即成像于第三景深处,可以将这种成像方式称作“像素借用”。当在显示单元210显示第三图像帧时,可以进行控制,使得针对第一像素2101的第一部分光L11的子调节单元2301_1和针对第二像素2102的第二部分光L22的子调节单元2302_2对入射到子调节单元2301_1和子调节单元2302_2 的相应准直光进行折射,同时针对第一像素2101的第二部分光L12的子调节单元2301_2和针对第二像素2102的第一部分光L21的子调节单元2302_1使入射到子调节单元2301_2和子调节单元2302_1的相应准直光发生全反射;使得针对第三像素2103的第一部分光L31的子调节单元2303_1和针对第四像素2104的第二部分光L42的子调节单元2304_2对入射到子调节单元2303_1和子调节单元2304_2的相应准直光进行折射,同时针对第三像素2103的第二部分光L32的子调节单元2303_2和针对第四像素2104的第一部分光L41的子调节单元2304_1使得入射到子调节单元2303_2和子调节单元2304_1的相应准直光发生全反射;......依次类推,由此能够在第一景深和第二景深之间的第三景深处形成第三图像帧的像。例如,当由液晶透镜或光栅实现调节单元时,可以通过在相应时段开启子调节单元2301_1、2302_2、2303_1和2304_2以及关闭应子调节单元2301_2、2302_1、2303_2和2304_1,来实现以上实施例。
本领域技术人员可以理解,以上像素借用方式仅为示例,可以根据实际需要,通过控制显示单元210对于奇数行图像(第一帧图像)、偶数行图像(第二帧图像)和完整图像(第三帧图像)的显示时序,并结合对于子调节单元的开启和关闭的控制,在第一景深和第二景深之间的任意第三景深处形成第三图像帧的像。例如,以液晶透镜或光栅实现调节单元为例,在显示的第一时段,开启所有子调节单元,且显示单元210显示仅包括奇数行图像的第一图像帧,该第一图像帧成像于第一景深处;在显示的第二时段,开启所有子调节单元,且显示单元210显示仅包括偶数行图像的第二图像帧,该第二图像帧成像于第二景深处;以及在显示的第三时段,开启子调节单元中的一部分并关闭子调节单元中的另一部分,且显示单元210显示包括完整图像的第三图像帧,该第三图像帧成像于位于第一景深和第二景深之间的第三景深处。
根据本公开实施例,当调节单元通过使入射的准直光发散来使对应像素成像时,可能会存在显示单元的边缘部分的像素发出的光不能进入人眼的问题。如上所述,当计算调节子单元的参数θ′1、θ2和θ′2时,预先设定了折射率n1、n2和发散角α以及θ1和θ2之一。对于给定景深D,发散角α是预定的,可以根据对应像素所处的区域来设定n1、n2和θ1或θ2中的至少一个,使得该像素越远离显示单元中心的位置,则相应地将θ′1或θ′2越小,从而即使对于显示单元的边缘区域的像素,人眼也能够看到该像素发出的光。
例如,当调节单元230由棱镜或透镜实现时,折射率n1和n2是固定的,可以调整θ1或θ2的值。例如,对应像素所处的区域距离人眼的直线距离越远,即,该像素越远离显示单 元中心的位置,可以相应将θ1或θ2设置的越小,对应像素所处的区域距离人眼的直线距离越近,即,该像素越靠近显示单元中心的位置,则相应将θ1或θ2设置的越大,使得对于显示单元的边缘区域的像素,即使调节单元使入射光发散,人眼也能够看到该像素发出的光。因此,在设计根据本公开实施例的各个子调节单元时,本领域技术人员可以根据实际情况,例如显示单元210的显示区的大小以及人眼到显示单元所处平面的垂直距离来设置基准θ1或θ2值,并按照对应像素所处的位置来相应增大或减小θ1或θ2的值。
此外,当调节单元230由例如液晶透镜实现时,可以根据施加到液晶透镜上的电压来改变液晶透镜的折射率,即折射率n1和n2是可变的,因此能够更加灵活地调整θ′1或θ′2的值,使得显示单元的边缘区域的像素的折射光也能够被人眼看到。
根据本公开实施例,还提出了一种根据本公开实施例的显示装置的显示方法。图8示出了根据本公开实施例的显示方法的流程图。应注意,以下方法中各个步骤的序号仅作为该步骤的表示以便描述,而不应被看作表示该各个步骤的执行顺序。除非明确指出,否则该方法不需要完全按照所示顺序来执行。如图8所示,根据本公开实施例的显示方法80可以包括以下步骤。
在步骤S801,在第一时段,在显示单元的奇数行像素或偶数行像素处显示第一图像帧,所述第一图像帧经所述调节单元偏折成像于第一景深处;
在步骤S803,在第二时段,在显示单元的偶数行像素或奇数行像素处显示与第一图像帧不同的第二图像帧,所述第二图像帧经所述调节单元偏折成像于与第一景深不同的第二景深处。
例如,第一图像帧可以是例如针对人物等前景对象的第一图像序列中的一帧,第二图像帧可以是例如针对风景等背景对象的第二图像序列中的一帧。可以控制显示单元周期性地执行以上步骤S801和S803,从而观看者在视觉上能够看到包括第一景深处的前景对象和第二景深处的背景对象在内的图像序列。
根据本公开实施例的显示方法还可以包括第三时段,其中显示单元显示至少第三图像帧,所述第三图像帧经调节单元偏折成像于第三景深处,所述第三景深在第一景深和第二景深之间。优选地,可以通过相邻像素发出的经过准直化单元准直化的不同部分光之间的借用来实现第三图像帧的显示。例如,可以控制显示单元210,在显示的第一时段,开启所有子调节单元,且显示单元210显示仅包括奇数行图像或偶数行图像的第一图像帧,该第一图像帧成像于第一景深处;在显示的第二时段,开启所有子调节单元,且显示单元210 显示仅包括偶数行或奇数行图像的第二图像帧,该第二图像帧成像于第二景深处;以及在显示的第三时段,开启子调节单元中的一部分并关闭子调节单元中的另一部分使得入射到该另一部分子调节单元的准直光发生全反射,且显示单元210显示包括完整图像的第三图像帧,该第三图像帧成像于位于第一景深和第二景深之间的第三景深处。例如,第三图像帧可以是针对前景对象和背景对象之间的其他对象的第三图像序列中的一帧。可以控制显示单元周期性地执行以上显示第一图像帧、第二图像帧和第三图像帧的步骤,从而观看者在视觉上能够看到包括第一景深处的前景对象、第二景深处的背景对象以及第三景深处的其他对象在内的图像序列。
本领域技术人员可以理解,显示第三图像帧的第三时段可以在第一时段和第二时段之间,也可以在第二时段之后。此外,尽管以上示例中仅描述了三个景深的情况,在显示单元210的刷新频率足够快的情况下,当然可以通过像素借用来实现更多景深,只要观看者在视觉上能够看到包括位于不同景深处的对应不同对象在内的图像序列即可。
根据本公开实施例,准直化单元将显示单元的出光方向的光进行准直化,并由调节单元使得准直光发生偏折,从而使显示单元中不同位置的像素发出的光成像于不同的景深处,对于观看者实现了虚拟近眼显示并且所成图像的景深可以调整,以尽可能地还原真实事物,而无需改变已有显示单元的物理结构。
尽管已经参考本公开的典型实施例,具体示出和描述了本公开,但本领域普通技术人员应当理解,在不脱离所附权利要求所限定的本公开的精神和范围的情况下,可以对这些实施例进行形式和细节上的多种改变。

Claims (9)

  1. A display device, comprising:
    a display unit having a plurality of pixels;
    a collimating unit configured to collimate light in the light-exit direction of the display unit to obtain collimated light; and
    an adjustment unit configured to deflect the collimated light so that light emitted by pixels at different positions in the display unit is imaged at different depths of field.
  2. The display device according to claim 1, wherein the plurality of pixels in the display unit are arranged in an array, and the light emitted by each of the plurality of pixels and collimated by the collimating unit includes at least a first partial light and a second partial light;
    the adjustment unit includes a plurality of sub-adjustment units, wherein a first sub-adjustment unit and a second sub-adjustment unit are provided for the first partial light and the second partial light respectively, so that a first refraction angle of the first partial light differs from a second refraction angle of the second partial light.
  3. The display device according to claim 2, wherein,
    for odd-row pixels of the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light and the second partial light converge at a first depth of field; and
    for even-row pixels of the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light and the second partial light converge at a second depth of field different from the first depth of field.
  4. The display device according to claim 3, wherein, for adjacent odd-row and even-row pixels in the display unit, the first sub-adjustment unit and the second sub-adjustment unit are arranged so that the first partial light emitted by one of the rows of pixels and collimated by the collimating unit converges with the second partial light emitted by the other row of pixels and collimated by the collimating unit at a third depth of field, the third depth of field lying between the first depth of field and the second depth of field.
  5. The display device according to claim 4, wherein the sub-adjustment unit for the second partial light emitted by the one row of pixels and collimated by the collimating unit and the sub-adjustment unit for the first partial light emitted by the other row of pixels and collimated by the collimating unit are arranged so that the collimated light incident on the respective sub-adjustment units is totally reflected.
  6. The display device according to any one of claims 3-5, wherein one of the first depth of field and the second depth of field lies within a range of 0.20 m to 0.30 m in the direction opposite to the light-exit direction of the display unit, and the other of the first depth of field and the second depth of field lies within a range of 3 m to 5 m in the direction opposite to the light-exit direction of the display unit.
  7. A display method for a display device, the display device comprising: a display unit having a plurality of pixels; a collimating unit configured to collimate light in the light-exit direction of the display unit to obtain collimated light; and an adjustment unit configured to deflect the collimated light so that light emitted by pixels at different positions in the display unit is imaged at different depths of field; the display method comprising:
    in a first time period, displaying a first image frame at odd-row pixels or even-row pixels of the display unit, the first image frame being deflected by the adjustment unit and imaged at a first depth of field; and
    in a second time period, displaying, at even-row pixels or odd-row pixels of the display unit, a second image frame different from the first image frame, the second image frame being deflected by the adjustment unit and imaged at a second depth of field different from the first depth of field.
  8. The display method according to claim 7, wherein the first image frame includes a foreground image and the second image frame includes a background image.
  9. The display method according to claim 7 or 8, further comprising a third time period in which at least a third image frame is displayed on the display unit, the third image frame being deflected by the adjustment unit and imaged at a third depth of field, the third depth of field lying between the first depth of field and the second depth of field.
PCT/CN2017/096225 2017-01-05 2017-08-07 Display device and display method WO2018126677A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/749,572 US10534178B2 (en) 2017-01-05 2017-08-07 Display apparatuses and display methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710008823.2A 2017-01-05 2017-01-05 Display device and display method
CN201710008823.2 2017-01-05

Publications (1)

Publication Number Publication Date
WO2018126677A1 true WO2018126677A1 (zh) 2018-07-12

Family

ID=58336599

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/096225 WO2018126677A1 (zh) 2017-01-05 2017-08-07 显示装置和显示方法

Country Status (3)

Country Link
US (1) US10534178B2 (zh)
CN (1) CN106526864B (zh)
WO (1) WO2018126677A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11927871B2 (en) 2018-03-01 2024-03-12 Hes Ip Holdings, Llc Near-eye displaying method capable of multiple depths of field imaging

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106526864B (zh) * 2017-01-05 2019-08-30 京东方科技集团股份有限公司 显示装置和显示方法
CN107490862B (zh) * 2017-03-23 2019-10-25 华为机器有限公司 近眼显示器及近眼显示***
WO2018187955A1 (zh) * 2017-04-12 2018-10-18 陈台国 具有聚焦效果的近眼显示方法
CN111742551B (zh) * 2017-11-14 2022-11-29 弗格茨皮克斯.卡姆私人有限公司 用于改变光的聚散度以改善电子显示器的人类视觉的装置和方法
US11086129B2 (en) * 2018-02-21 2021-08-10 Valve Corporation Head-mounted display with narrow angle backlight
CN109298532A (zh) * 2018-11-22 2019-02-01 同方计算机有限公司 一种人机结合的增强视觉显示装置
CN111381371B (zh) * 2018-12-27 2022-04-15 中强光电股份有限公司 头戴式显示装置
CN111290164A (zh) 2020-03-31 2020-06-16 京东方科技集团股份有限公司 透明显示面板、显示装置及眼镜
WO2024016271A1 (zh) * 2022-07-21 2024-01-25 京东方科技集团股份有限公司 显示装置和虚拟现实设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900981A (en) * 1997-04-15 1999-05-04 Scitex Corporation Ltd. Optical system for illuminating a spatial light modulator
CN101726915A (zh) * 2008-10-24 2010-06-09 仁宝电脑工业股份有限公司 显示器
CN105827922A (zh) * 2016-05-25 2016-08-03 京东方科技集团股份有限公司 一种摄像装置及其拍摄方法
CN106094231A (zh) * 2016-08-25 2016-11-09 京东方科技集团股份有限公司 显示基板及显示装置
CN106154797A (zh) * 2016-09-09 2016-11-23 京东方科技集团股份有限公司 一种全息显示面板、全息显示装置及其显示方法
CN106526864A (zh) * 2017-01-05 2017-03-22 京东方科技集团股份有限公司 显示装置和显示方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2431728A (en) * 2005-10-31 2007-05-02 Sharp Kk Multi-depth displays
US9372349B2 (en) 2011-06-30 2016-06-21 Hewlett-Packard Development Company, L.P. Glasses-free 3D display for multiple viewers with a resonant subwavelength lens layer
US9841537B2 (en) * 2012-07-02 2017-12-12 Nvidia Corporation Near-eye microlens array displays
CN205281069U (zh) * 2016-01-08 2016-06-01 京东方科技集团股份有限公司 一种显示装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900981A (en) * 1997-04-15 1999-05-04 Scitex Corporation Ltd. Optical system for illuminating a spatial light modulator
CN101726915A (zh) * 2008-10-24 2010-06-09 仁宝电脑工业股份有限公司 显示器
CN105827922A (zh) * 2016-05-25 2016-08-03 京东方科技集团股份有限公司 一种摄像装置及其拍摄方法
CN106094231A (zh) * 2016-08-25 2016-11-09 京东方科技集团股份有限公司 显示基板及显示装置
CN106154797A (zh) * 2016-09-09 2016-11-23 京东方科技集团股份有限公司 一种全息显示面板、全息显示装置及其显示方法
CN106526864A (zh) * 2017-01-05 2017-03-22 京东方科技集团股份有限公司 显示装置和显示方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11927871B2 (en) 2018-03-01 2024-03-12 Hes Ip Holdings, Llc Near-eye displaying method capable of multiple depths of field imaging

Also Published As

Publication number Publication date
CN106526864A (zh) 2017-03-22
CN106526864B (zh) 2019-08-30
US20190018244A1 (en) 2019-01-17
US10534178B2 (en) 2020-01-14

Similar Documents

Publication Publication Date Title
WO2018126677A1 (zh) 显示装置和显示方法
US9274346B2 (en) Multi-view auto-stereoscopic display
KR102071077B1 (ko) 시준화 입체표시시스템
EP3108291B1 (en) Autostereoscopic 3d display device using holographic optical elements
KR20160093039A (ko) 몰입형 컴팩트 디스플레이 안경
US9632406B2 (en) Three-dimension light field construction apparatus
WO2020042605A1 (en) Display apparatus and display system
JP2010538313A (ja) 広視野角を有する現実的画像表示装置
JP2011501822A (ja) 表示装置及びその表示方法
CN102004317A (zh) 眼镜型图像显示装置
JP2006235415A (ja) レンズアレイおよびそれを利用した表示装置
US10642061B2 (en) Display panel and display apparatus
WO2019214366A1 (zh) 近眼显示装置和近眼显示方法
TW201805657A (zh) 影像顯示裝置及影像顯示方法
TW201708888A (zh) 影像顯示裝置
US10534192B2 (en) Stereo display panel and display device having the stereo display panel
KR101549884B1 (ko) 3차원 영상 디스플레이 장치
CN108761818A (zh) 一种自由立体显示***
US11409108B2 (en) Near-eye display panel and near-eye display device
CN108761819B (zh) 一种全视差自由立体显示***
CN107817609A (zh) 一种自由立体显示***
WO2022028301A1 (zh) 一种近眼显示装置
KR20110049048A (ko) 입체 영상 표시 방법 및 이를 수행하기 위한 입체 영상 표시 장치
CN208752319U (zh) 一种自由立体显示***
CN103676175A (zh) 裸眼3d显示方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17890611

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17890611

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/03/2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17890611

Country of ref document: EP

Kind code of ref document: A1