WO2017164573A1 - Near-eye display apparatus and near-eye display method - Google Patents


Info

Publication number
WO2017164573A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
displays
display apparatus
eye display
light
Prior art date
Application number
PCT/KR2017/002910
Other languages
French (fr)
Inventor
Tao Hong
Mingcai Zhou
Weiming Li
Zairan WANG
Zhihua Liu
Haitao Wang
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Priority claimed from CN201610170301.8A (published as CN107229119A)
Priority claimed from CN201620228921.8U (published as CN205787364U)
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2017164573A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/10Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images using integral imaging methods
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0127Head-up displays characterised by optical features comprising devices increasing the depth of field
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134Head-up displays characterised by optical features comprising binocular systems of stereoscopic type

Definitions

  • the present invention relates to the technical field of terminal apparatuses, and in particular to a near-eye display apparatus and a near-eye display method.
  • An augmented reality display apparatus can superpose virtual objects in real scenes.
  • virtual information is applied to the real world and perceived by the human senses, thereby achieving a sensory experience beyond reality. That is, the real environment and the virtual objects are superposed in real time into a same picture or space where they coexist.
  • an existing display technology provides a display device which can improve the light efficiency and the angle of view.
  • this device includes an illumination module, a first polarization beam splitter, a polarization rotator, and an end reflector.
  • the illumination module comprises a display 250, a diffusion surface 255, a second polarization beam splitter, a light source, and a condenser lens 240. Light from the light source is irradiated to the second polarization beam splitter by the condenser lens.
  • Light of S polarization state is reflected to the display 250 by the second polarization beam splitter, and then reflected by the display 250 to become light of P polarization state; light of P polarization state is transmitted to the end reflector by the second polarization beam splitter, the first polarization beam splitter and the polarization rotator, then reflected back by the end reflector, and passed through the polarization rotator to become light of S polarization state; and light of S polarization state is reflected to eyes by the first polarization beam splitter to form a 3D image.
  • However, this display device is unable to present a natural three-dimensional object, and thus viewing its stereoscopic images causes visual fatigue of the eyes.
  • Wearing such a display apparatus for a long time is harmful to the health of the eyes, and the wearing experience for users is poor.
  • an existing related display apparatus has the problem that visual fatigue of eyes is easily caused when displaying a three-dimensional virtual object.
  • An objective of the present invention is to at least solve one of the technical defects mentioned above, particularly the problem that the visual fatigue of eyes is easily caused when displaying a three-dimensional virtual object.
  • the present invention provides an augmented reality display apparatus, comprising an illumination module and an optical modulation element;
  • the illumination module is configured to output an image
  • the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image.
  • the optical modulation element includes a microlens array or pinhole array; the microlens array or pinhole array is a flat microlens array or flat pinhole array, and alternatively the microlens array or pinhole array can be a curved microlens array or curved pinhole array.
  • the optical modulation element can include a dynamic microlens array or pinhole array formed of liquid crystal elements.
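  • The integral imaging performed by the microlens or pinhole array can be illustrated with a toy pinhole model. The sketch below is illustrative only (the geometry, names and parameter values are not from the patent): a virtual 3D point is projected through each pinhole centre onto the display plane, giving the samples that the per-pinhole elemental images must carry.

```python
# Toy pinhole-array model of integral imaging: a virtual 3D point is
# projected through each pinhole centre onto the display plane,
# producing one sample per elemental image. All values illustrative.

def elemental_image_points(point, pitch, num_pinholes, gap):
    """Project a virtual point (x, z) through a 1D row of pinholes.

    point        -- (x, z): lateral position and depth of the virtual point,
                    with the pinhole array at z = 0 and the display at z = -gap
    pitch        -- spacing between adjacent pinhole centres
    num_pinholes -- number of pinholes in the row
    gap          -- distance between the pinhole array and the display panel
    Returns a list of (pinhole_index, x_on_display) pairs.
    """
    x, z = point
    hits = []
    for i in range(num_pinholes):
        cx = (i - (num_pinholes - 1) / 2) * pitch   # pinhole centre
        # Similar triangles: the ray from (x, z) through (cx, 0)
        # meets the display plane at z = -gap.
        hits.append((i, cx + (cx - x) * gap / z))
    return hits

hits = elemental_image_points(point=(0.0, 50.0), pitch=1.0,
                              num_pinholes=5, gap=2.0)
```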
  • the illumination module includes at least two displays and a beam splitting device
  • the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device;
  • the displays are configured to display an image
  • the beam splitting device is configured to conduct an image displayed by the displays.
  • the at least two displays form an angle of 45° with the beam splitting device.
  • the illumination module further includes a light source and a condenser lens
  • the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays by the condenser lens.
  • the beam splitting device is a polarization beam splitting device; and the polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, the transmitted and reflected light in the two polarization directions being used for illuminating a non-self-luminous panel.
  • a first polarization direction component which is from the light source and collimated by the condenser lens
  • the beam splitting device transmits and reflects light from the self-luminous panels.
  • the beam splitting device is a band-pass color beam splitting device; and the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
  • the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, a predetermined proportion of pixels are interleaved in the horizontal and vertical directions.
  • the n displays switch a displayed image at a frequency greater than n × 30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in the horizontal and vertical directions.
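  • The 1/n pixel interleaving can be modelled in one dimension: each of the n displays shows the samples at one of n sub-pixel offsets, and switching faster than n × 30 Hz lets the eye integrate them into an image with n times the sample density. The following sketch is illustrative only and is not code from the patent.

```python
# Illustrative 1D model of the time-division multiplexing scheme:
# n displays, each refreshing above n x 30 Hz, show images offset by
# 1/n of a pixel; the eye integrates them into an n-times-denser image.

def interleave_subimages(subimages):
    """Merge n sub-images, each offset by 1/n pixel, into one image
    with n times the sample density."""
    n = len(subimages)
    merged = []
    for px in range(len(subimages[0])):
        for k in range(n):            # sub-image k carries the k/n offset
            merged.append(subimages[k][px])
    return merged

n = 2
fine = list(range(8))                 # target high-resolution image
sub_a = fine[0::n]                    # samples at integer pixel positions
sub_b = fine[1::n]                    # samples offset by 1/2 pixel
assert interleave_subimages([sub_a, sub_b]) == fine
min_refresh_hz = n * 30               # the n x 30 Hz condition from the text
```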
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display.
  • the display apparatus further includes an optical conduction unit, the optical conduction unit including at least one lens; and the optical conduction unit conducts an image displayed by the displays to a position with a predetermined distance from the optical modulation element, on which integral imaging is performed by the optical modulation element.
  • the optical conduction unit includes two lenses
  • at least one display is located at one focal length from one lens, and the distance between the two lenses is twice the focal length; and the optical modulation element is located at a second predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
  • the optical conduction unit includes one lens
  • at least one display is located at twice the focal length from the lens
  • the optical modulation element is located at a third predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
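  • For the single-lens conduction unit, placing the display at twice the focal length follows from the Gaussian thin-lens equation 1/u + 1/v = 1/f: the relayed image then also forms at twice the focal length, with inverted unit magnification. A minimal sketch, with an illustrative focal length value:

```python
# Thin-lens sketch of the single-lens conduction unit: a display at
# u = 2f images to v = 2f with magnification -1 (Gaussian lens
# equation 1/u + 1/v = 1/f). The focal length value is illustrative.

def thin_lens_image(u, f):
    """Return (image distance v, lateral magnification m) for an object
    at distance u from a thin lens of focal length f."""
    v = 1.0 / (1.0 / f - 1.0 / u)
    return v, -v / u

f = 25.0                      # mm, illustrative
v, m = thin_lens_image(u=2 * f, f=f)
assert abs(v - 2 * f) < 1e-9  # relayed image also at twice the focal length
assert abs(m + 1.0) < 1e-9    # inverted, unit magnification
```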
  • the near-eye display apparatus further includes a reflecting element; and the reflecting element is located in a light path direction of the illumination module to guide the three-dimensional virtual image to eyes.
  • the reflecting element includes a reflector or beam splitter.
  • when the near-eye display apparatus is an augmented reality near-eye display apparatus, it further includes a correction module, and the reflecting element is a beam splitter;
  • the beam splitter divides light of a three-dimensional virtual image and an external real image into two paths which are guided to the eyes and to the correction module, respectively;
  • the correction module is configured to, based on the three-dimensional virtual image and the external real image conducted by the beam splitter, correct the three-dimensional virtual image, and display the corrected three-dimensional virtual image by the illumination module.
  • the correction module includes:
  • an image capturing unit configured to acquire a three-dimensional virtual image and an external real image from the beam splitter
  • a correction unit configured to analyze the three-dimensional virtual image and the external real image, and correct the three-dimensional virtual image according to the result of analysis
  • an image rendering unit configured to render the corrected three-dimensional virtual image.
  • the correction module includes:
  • a light source control unit configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image.
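  • The capture-analyse-correct-render loop of the correction module can be sketched as below. The class, method names and the toy brightness-offset model are hypothetical illustrations, not the patent's implementation.

```python
# Hypothetical sketch of the correction-module loop: the image capturing
# unit grabs the superposed virtual and real images from the beam
# splitter, the correction unit compares them, and the image rendering
# unit re-renders the corrected virtual image for the illumination
# module. All names and the toy offset model are illustrative.

class CorrectionModule:
    def __init__(self, capture, analyse, render):
        self.capture = capture    # image capturing unit (e.g. a camera)
        self.analyse = analyse    # correction unit: virtual vs. real mismatch
        self.render = render      # image rendering unit

    def step(self):
        virtual, real = self.capture()        # both paths via the beam splitter
        offset = self.analyse(virtual, real)  # e.g. mean brightness mismatch
        corrected = [v - offset for v in virtual]
        return self.render(corrected)

module = CorrectionModule(
    capture=lambda: ([10, 12, 14], [9, 11, 13]),               # toy images
    analyse=lambda v, r: sum(a - b for a, b in zip(v, r)) / len(v),
    render=lambda image: image,                                # pass-through
)
assert module.step() == [9, 11, 13]
```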
  • the present invention further provides a near-eye display method, including:
  • an illumination module of a near-eye display outputting an image
  • an optical modulation element of the near-eye display performing integral imaging on the image to display a three-dimensional virtual image.
  • the optical modulation element includes a microlens array or pinhole array; the microlens array or pinhole array is a flat microlens array or flat pinhole array, and alternatively the microlens array or pinhole array can be a curved microlens array or curved pinhole array.
  • the optical modulation element includes a dynamic microlens array or pinhole array formed of liquid crystal elements.
  • the illumination module includes at least two displays and a beam splitting device
  • the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device;
  • the displays are configured to display an image
  • the beam splitting device is configured to conduct an image displayed by the displays.
  • the at least two displays form an angle of 45° with the beam splitting device.
  • the illumination module further includes a light source and a condenser lens
  • the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays by the condenser lens.
  • the beam splitting device is a polarization beam splitting device; and the polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, light in polarization directions transmitted and reflected being used for illuminating a non-self-luminous panel.
  • a first polarization direction component which is from the light source and collimated by the condenser lens
  • the beam splitting device transmits and reflects light from the self-luminous panels.
  • the beam splitting device is a band-pass color beam splitting device; and the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
  • the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, a predetermined proportion of pixels are interleaved in the horizontal and vertical directions.
  • the n displays switch a displayed image at a frequency greater than n × 30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in the horizontal and vertical directions.
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display.
  • the display apparatus further includes an optical conduction unit, the optical conduction unit including at least one lens; and the optical conduction unit conducts an image displayed by the displays to a position with a predetermined distance from the optical modulation element, on which integral imaging is performed by the optical modulation element.
  • the optical conduction unit includes two lenses
  • at least one display is located at one focal length from one lens, and the distance between the two lenses is twice the focal length; and the optical modulation element is located at a second predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
  • the optical conduction unit includes one lens
  • at least one display is located at twice the focal length from the lens
  • the optical modulation element is located at a third predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
  • the present invention provides a near-eye display apparatus.
  • the near-eye display apparatus can be a near-eye display apparatus for virtual reality.
  • the near-eye display apparatus presents a natural three-dimensional object to the eyes by the integral imaging display principle. The visual fatigue caused by long-term viewing of a three-dimensional stereoscopic image is thus eliminated. This is particularly important in application scenarios where the apparatus must be worn for long periods.
  • the near-eye display apparatus is a near-eye display apparatus for augmented reality
  • the near-eye display apparatus can, according to the matching degree of an external real image and a three-dimensional virtual image acquired, correct, adjust and render the three-dimensional virtual image in real time.
  • the functionality of the near-eye display apparatus is thereby improved.
  • the near-eye display apparatus can improve the quality of display by multi-display-screen multiplexing and, at the same time, improve the light efficiency of the polarization beam splitter.
  • the above solutions as provided in the present invention make only minor modifications to existing systems, and hence do not affect system compatibility.
  • the provided solutions are both simple to implement and highly effective.
  • the present invention provides a near-eye display apparatus and a near-eye display method that solve the problem that visual fatigue of eyes is easily caused when displaying a three-dimensional virtual object.
  • Fig. 1 is a schematic device structure diagram of an existing display technology in the related art.
  • Fig. 2 is a schematic apparatus diagram of a virtual reality display apparatus according to a first embodiment of the present invention
  • Fig. 3 is a schematic apparatus diagram of an augmented reality display apparatus according to the present invention.
  • Fig. 4 is a schematic apparatus diagram of another augmented reality display apparatus according to the present invention.
  • Fig. 5a is a schematic structure diagram of a first implementation of an illumination module according to the present invention.
  • Fig. 5b is a schematic structure diagram of a second implementation of an illumination module according to the present invention.
  • Fig. 5c is a schematic structure diagram of a third implementation of an illumination module according to the present invention.
  • Fig. 5d is a schematic structure diagram of a fourth implementation of an illumination module according to the present invention.
  • Fig. 6a is a schematic structure diagram of a first implementation of an optical conduction unit according to the present invention.
  • Fig. 6b is a schematic structure diagram of a second implementation of an optical conduction unit according to the present invention.
  • Fig. 7 is a schematic structure diagram of a planar lens array according to the present invention.
  • Fig. 8 is a schematic structure diagram of a curved lens array according to the present invention.
  • Fig. 9 is a schematic structure diagram of a pinhole array according to the present invention.
  • Fig. 10 is a schematic diagram when eyes are viewing a two-dimensional image
  • Fig. 11 is a schematic diagram when eyes are viewing a three-dimensional image
  • Fig. 12 is a schematic diagram when eyes are viewing an integral imaging display according to the present invention.
  • Fig. 13 is a schematic diagram of binocular near-eye light field display according to the present invention.
  • Fig. 14 shows an augmented reality display apparatus according to a second embodiment of the present invention.
  • Fig. 15 shows an augmented reality display apparatus according to a third embodiment of the present invention.
  • Fig. 16 shows an augmented reality display apparatus according to a fourth embodiment of the present invention.
  • Fig. 17 is a schematic flowchart of three-dimensional augmented reality according to one embodiment of the present invention.
  • Fig. 18 is a schematic flowchart of a three-dimensional augmented reality engine according to one preferred embodiment of the present invention.
  • Fig. 19 is a schematic flowchart of three-dimensional light field rendering according to one preferred embodiment of the present invention.
  • the near-eye display apparatus of the present invention includes a virtual reality display apparatus or an augmented reality display apparatus.
  • the display apparatus includes an illumination module and an optical modulation element, wherein the illumination module is configured to output an image; and the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image.
  • the illumination module is configured to output an image
  • the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image.
  • the display apparatus further includes a reflecting element; and the reflecting element is located in a light path direction of the illumination module to guide the three-dimensional virtual image displayed by the optical modulation element to eyes.
  • the reflecting element includes a reflector or a beam splitter.
  • Fig. 2 shows a virtual reality display apparatus according to a first embodiment of the present invention.
  • the display apparatus includes an illumination module, an optical modulation element and a reflector.
  • the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image, and projects the virtual object light field to the eyes by the reflecting element.
  • the eyes can observe a virtual three-dimensional object, so as to realize the virtual reality display.
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display.
  • the structure of the virtual reality display apparatus and the near-eye display method as shown in Fig. 2 are both applicable to an augmented reality display apparatus.
  • in that case, the reflecting element can be a beam splitter.
  • the near-eye display apparatus further includes an optical conduction unit, the optical conduction unit including at least one lens; and the optical conduction unit conducts an image displayed by the displays to a position with a predetermined distance from the optical modulation element, on which integral imaging is performed by the optical modulation element.
  • when the near-eye display apparatus is an augmented reality display apparatus, the display apparatus further includes a correction module, and the reflecting element is a beam splitter;
  • the beam splitter splits light of a three-dimensional virtual image and an external real image into two paths which are guided to the eyes and to the correction module, respectively;
  • the correction module is configured to, based on a three-dimensional virtual image and an external real image conducted by the beam splitter, correct the three-dimensional virtual image, and display the corrected three-dimensional virtual image by the illumination module.
  • the correction module includes an image capturing unit, a correction unit and an image rendering unit; and the image capturing unit can be a camera.
  • the image capturing unit is configured to acquire a three-dimensional virtual image and an external real image from the beam splitter; the correction unit is configured to analyze the three-dimensional virtual image and the external real image, and correct the three-dimensional virtual image according to the result of analysis; and the image rendering unit is configured to render the corrected three-dimensional virtual image.
  • After replacing the beam splitter in the augmented reality display apparatus of Fig. 3 with a reflector, or after replacing the beam splitter of Fig. 4 with a reflector and removing the image capturing unit, the augmented reality display apparatus becomes a virtual reality display apparatus.
  • the correction module further includes a light source control unit; and the light source control unit is configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image.
  • the displays are self-luminous displays
  • an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device, and the optical conduction unit images the image displayed by the displays to a position with a predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display.
  • the displayed virtual object light field enters the eyes and the image capturing unit by the beam splitter, respectively.
  • the eyes and the image capturing unit can acquire an external real image, simultaneously.
  • the image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed.
  • the three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the three-dimensional virtual image is adjusted in real time.
  • the displays are non-self-luminous displays
  • light for illumination emitted by the light source illuminates the displays via the beam splitting device
  • the optical conduction unit images an image displayed by the displays to a position with a second predetermined distance from the optical modulation element so as to form integral imaging display.
  • the optical modulation element processes the image displayed by the displays into a three-dimensional virtual image for integral imaging display.
  • the virtual object light field of the three-dimensional virtual image enters the eyes and the image capturing unit by the beam splitter, respectively.
  • the eyes and the image capturing unit can acquire an external real image, simultaneously.
  • the image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed.
  • the three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the three-dimensional virtual image is adjusted in real time.
  • the illumination module includes at least two displays and a beam splitting device; the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device; and preferably, the at least two displays form an angle of 45° with the beam splitting device.
  • the displays are configured to display an image
  • the beam splitting device is configured to conduct an image displayed by the displays.
  • the illumination module further includes a light source and a condenser lens; and the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays by the condenser lens.
  • the illumination module requires no additional light source, and the beam splitting device transmits and reflects light from the self-luminous panels.
  • Here, the self-luminous panels can be OLED (organic light-emitting diode) panels.
  • the beam splitting device is a band-pass color beam splitting device; and the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
  • the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, a predetermined proportion of pixels are interleaved in the horizontal and vertical directions.
  • when there are two displays, the two displays switch a displayed image at a frequency greater than the predetermined refresh frequency, and for an image displayed by each display, 1/2 pixel is interleaved in the horizontal and vertical directions. In this case, the resolution can be doubled.
  • the n displays switch a displayed image at a frequency greater than n × 30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in the horizontal and vertical directions.
  • preferably, the n displays switch a displayed image at a frequency of n × 60 Hz, so that the eyes will not sense any flicker.
  • the illumination module in this embodiment includes any one of the following implementation solutions.
  • Solution 1: The illumination module uses a plurality of displays which are placed around the beam splitting device and form a predetermined angle, for example, an angle of 45°, with the beam splitting device.
  • the beam splitting device can be a polarization beam splitter.
  • the polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, light in polarization directions transmitted and reflected being used for illuminating a non-self-luminous panel.
  • the polarization beam splitter reflects, in the light for illumination, light of a first polarization state to the display 1 to provide light for illumination to the display 1, and transmits light of a second polarization state to the display 2 to provide light for illumination to the display 2.
  • Light of the first polarization state is reflected by the display 1 to become light of the second polarization state which, carrying displayed image information, for example, an image displayed by the display 1, is then transmitted by the polarization beam splitter.
  • Light of the second polarization state is reflected by the display 2 to become light of the first polarization state which, carrying displayed image information, for example, an image displayed by the display 2, is then reflected by the polarization beam splitter.
  • With the two displays and the polarization beam splitter, light for illumination from the light source is neither reflected back to the light source nor scattered; all of the light is used to irradiate the displays and is provided to the subsequent light path, so that the light utilization efficiency for illumination is effectively improved.
  • the loss of light is reduced, the loss of energy is decreased, and the life of the apparatus battery can be effectively prolonged.
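  • The light-efficiency argument can be checked with a small Jones-calculus model: the polarization beam splitter sends the S component to display 1 and the P component to display 2, each reflective panel flips the polarization state, and the beam splitter then passes both modulated beams onward, so no illumination power returns to the source. The matrices below are the standard idealized behaviours; this is a sketch, not the patent's implementation.

```python
# Idealized Jones-calculus sketch of the two-display PBS arrangement:
# the S component is reflected to display 1, the P component transmitted
# to display 2, each panel swaps S <-> P on reflection, and the PBS then
# routes both modulated beams into the output path. Illustrative only.
import numpy as np

P = np.array([1.0, 0.0])                   # P-polarized Jones vector
S = np.array([0.0, 1.0])                   # S-polarized Jones vector
transmit_P = np.diag([1.0, 0.0])           # ideal PBS transmission arm
reflect_S = np.diag([0.0, 1.0])            # ideal PBS reflection arm
flip = np.array([[0.0, 1.0], [1.0, 0.0]])  # panel swaps S <-> P

unpolarized = P + S                        # toy model of the lamp output
to_display1 = reflect_S @ unpolarized      # S branch illuminates display 1
to_display2 = transmit_P @ unpolarized     # P branch illuminates display 2

out1 = transmit_P @ (flip @ to_display1)   # display 1: S -> P, transmitted
out2 = reflect_S @ (flip @ to_display2)    # display 2: P -> S, reflected

# Both input components end up in the two output beams: nothing is lost.
assert np.allclose(out1, P) and np.allclose(out2, S)
```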
  • the time-division multiplexing of n displays is similar to the time-division multiplexing of two displays.
  • the predetermined refresh frequency of each panel is greater than n × 30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in the horizontal and vertical directions. Thus, the resolution of the image is increased by n times.
  • Solution 2: The illumination module uses a plurality of displays which are placed around the beam splitting device and form an angle of 45° with the beam splitting device.
  • the beam splitting device only uses a common beam splitter instead of any polarization beam splitter. As shown in Fig. 5b, there can be two displays, and the beam splitting device can be a common beam splitter.
  • Light for illumination from the light source passes through the condenser lens and is collimated into parallel light, which is irradiated onto the beam splitter at an angle of 45° with the beam splitter.
  • Part of light is reflected to the display 1 to provide light for illumination to the display 1, and the other part of light is transmitted to the display 2 to provide light for illumination to the display 2.
  • Part of the light, after being reflected by the display 1 and carrying the displayed image information, is transmitted by the beam splitter to enter the optical conduction unit; and the other part of the light, after being reflected by the display 2 and carrying the displayed image information, is reflected by the beam splitter to enter the optical conduction unit.
  • the resolution of the displayed image will be increased.
  • however, the light utilization efficiency for illumination will be lower than that of Solution 1.
  • Solution 3: The illumination module uses a plurality of OLED panels which are placed around the beam splitting device and form an angle of 45° with the beam splitting device. As shown in Fig. 5c, there are two OLED panels, and the beam splitting device is a common beam splitter.
  • Light, carrying displayed image information of an image, emitted by the two OLED panels is reflected and transmitted by a beam splitter, respectively, to enter the optical conduction unit.
  • By using a scattering surface, light which is emitted by the OLED panels and is reflected or transmitted by the beam splitter onto the scattering surface without being utilized is scattered, so that it does not enter the conduction light path. The generation of background noise in the displayed image is thereby avoided.
  • the resolution of the display apparatus is increased.
  • the illumination module uses three monochrome OLED panels, for example, red, green and blue OLED panels, which are placed around the splitting element and form an angle of 45° with the beam splitting device. As shown in Fig. 5d, the three monochrome OLED panels emit red, green and blue light, respectively; and the beam splitting device is a band-pass green and blue color beam splitter which reflects green and blue light and transmits light in other colors.
  • Monochrome images displayed by the three monochrome OLED panels form a color image by fusion, and the color image enters the optical conduction unit.
  • the resolution of the color image, which is formed by fusion of the monochrome images displayed by the three monochrome OLED panels, is increased. Theoretically, the resolution is tripled compared with a single display. With the use of the color splitting element, the utilization efficiency of light can be improved, and the energy loss of the apparatus can be further reduced.
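The fusion step above can be illustrated numerically: the color beam splitter optically stacks the three full-resolution monochrome panel images into one color image, so each channel keeps the full panel resolution. The sketch below models this with NumPy arrays; the panel resolution and contents are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

# Illustrative sketch: fuse three monochrome elemental images (one per
# OLED panel) into a single RGB image, as the color beam splitter does
# optically.  Panel size and content are assumptions for demonstration.
h, w = 480, 640
red_panel = np.random.rand(h, w)     # image shown on the red OLED panel
green_panel = np.random.rand(h, w)   # image shown on the green OLED panel
blue_panel = np.random.rand(h, w)    # image shown on the blue OLED panel

# Each color channel keeps the full panel resolution, so the fused color
# image carries three times the information of a single color-filtered panel.
color_image = np.stack([red_panel, green_panel, blue_panel], axis=-1)
print(color_image.shape)  # (480, 640, 3)
```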
  • the optical conduction unit in this embodiment can include one of the following solutions.
  • Solution 1: When the optical conduction unit includes two lenses, at least one display is located at one focal length from one lens, and the distance between the two lenses is twice the focal length; and the optical modulation element is located at a second predetermined distance from an imaging position of an image after being conducted by the optical conduction unit. That is, it can be a 4f optical system formed of two lenses, where the displays are located at one focal length from the front lens, the distance between the two lenses is twice the focal length, and the imaging surface is located at one focal length from the rear lens.
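The 4f arrangement described above can be checked with standard ray-transfer (ABCD) matrices: propagating one focal length, passing the first lens, propagating two focal lengths, passing the second lens, and propagating one more focal length yields the identity up to a sign, i.e. a relayed image at unit magnification. The focal length value below is an illustrative assumption.

```python
import numpy as np

def propagate(d):
    """ABCD matrix for free-space propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f = 50.0  # focal length in mm (illustrative value)

# 4f relay: object plane -> f -> lens -> 2f -> lens -> f -> image plane.
# Matrices multiply right-to-left in the order the ray meets the elements.
system = propagate(f) @ thin_lens(f) @ propagate(2 * f) @ thin_lens(f) @ propagate(f)
print(np.round(system, 6))
# [[-1.  0.]
#  [ 0. -1.]]  -> unit magnification (inverted), object relayed to image plane
```

The off-diagonal zeros confirm that the displays are imaged exactly onto the imaging surface, which is why the optical modulation element can then be placed at the stated predetermined distance from that surface.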
  • the displays are self-luminous displays
  • an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device
  • the displays are non-self-luminous displays
  • light for illumination illuminates the displays by the beam splitting device
  • the optical conduction unit images the image displayed by the displays to a position with a second predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display.
  • the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image, and projects the virtual object light field to eyes.
  • the optical conduction unit images an image displayed by the displays to a position near the optical modulation element, so that the displays and the optical modulation element form an integral imaging display device.
  • the resolution of integral imaging display can be increased by time-division multiplexing of a plurality of displays.
  • the specific implementation is as described above.
  • the integral imaging display contents are captured by the eyes and the camera by the beam splitter, respectively, and an image of the external real scene can also be captured by the eyes and the camera. In this way, the augmented reality display where a virtual object and a real scene are superposed is realized.
  • the optical conduction unit includes one lens
  • at least one display is located at twice the focal length from the lens
  • the optical modulation element is located at a third predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
  • the optical conduction unit is formed of one lens.
  • the displays are located at twice the focal length from the lens, and the imaging surface is also located at twice the focal length from the lens, at a position close to the microlens array, as shown in Fig. 6b.
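The single-lens case follows directly from the Gaussian thin-lens equation: an object at twice the focal length is imaged at twice the focal length on the other side, at unit (inverted) magnification. A quick check, with an illustrative focal length:

```python
# Thin-lens check for the single-lens conduction unit: displays at 2f
# are imaged at 2f on the image side with magnification -1.
f = 50.0        # focal length in mm (illustrative value)
s_o = 2 * f     # object distance: displays at twice the focal length

s_i = 1.0 / (1.0 / f - 1.0 / s_o)  # thin-lens equation: 1/f = 1/s_o + 1/s_i
m = -s_i / s_o                     # transverse magnification

print(round(s_i, 6), round(m, 6))  # 100.0 -1.0
```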
  • the structure of the display apparatus for augmented reality as shown in this embodiment can be applicable to a virtual reality display apparatus. Specifically, after replacing the beam splitter in the augmented reality display apparatus of Fig. 6a and Fig. 6b with a reflector, the augmented reality display apparatus can become a virtual reality display apparatus.
  • the optical modulation element is a microlens array.
  • An image displayed by the displays is imaged by the optical conduction unit to a position near the microlens array to form an integral imaging display system.
  • In integral imaging display, many elemental views will be displayed in the displays, each elemental view will be imaged by a corresponding microlens, and a three-dimensional object light field is formed in the space. A real three-dimensional object can be perceived by capturing this three-dimensional object light field with the eyes.
  • the lens array is a planar lens array.
  • the angle of view in integral imaging display will be limited by the area of each elemental view in the displays. Usually, every microlens corresponds to one display area in the displays. In order to prevent superposition of images, the display contents beyond this display area will be discarded. As shown in Fig. 7, the elemental views cannot be completely displayed in the display areas corresponding to the marginal lenses. Thus, the number of corresponding elemental views is limited, and the integrated image will not be observed beyond the angle of view.
  • this problem can be solved by replacing the planar lens array with a curved lens array.
  • the display areas corresponding to the marginal microlenses in the curved lens array will be increased, so that the elemental views can be completely displayed. Accordingly, the number of corresponding elemental views can be increased, and thus the angle of view will be greatly increased, as shown in Fig. 8.
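For a planar lens array, a commonly used approximation relates the viewing angle to the microlens pitch and the panel-to-lens gap; the pitch and gap values below are illustrative assumptions, not parameters of the embodiment.

```python
import math

# Common approximation for the viewing angle of a planar integral-imaging
# display: theta = 2 * atan(p / (2 * g)), where p is the microlens pitch
# (equal to the width of one elemental-image region) and g is the gap
# between the display panel and the lens array.
p = 1.0   # microlens pitch in mm (illustrative)
g = 3.0   # panel-to-lens gap in mm (illustrative)

theta = 2 * math.degrees(math.atan(p / (2 * g)))
print(round(theta, 2))  # 18.92 degrees
```

A curved array effectively enlarges the usable elemental-view area under the marginal lenses, which is why it widens this angle compared with the planar case.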
  • the optical modulation element is a pinhole array.
  • the microlens array can be replaced by a pinhole array, as shown in Fig. 9, to form an integral imaging display system.
  • the microlens array or pinhole array in the present invention can be a dynamic liquid crystal microlens or liquid crystal pinhole array formed of liquid crystal elements.
  • the microlens arrays or pinhole arrays in part or all of the areas can have a refraction function or no refraction function.
  • when having no refraction function, those arrays can become transparent elements, so that switching between two-dimensional display and three-dimensional display, or mixed display of two-dimensional and three-dimensional objects, can be realized.
  • a display apparatus can be worn over the eyes to project two images with parallax between them, one to each eye, to form stereoscopic vision.
  • images with parallax between them are generally displayed on the respective screens for the left and right eyes.
  • Such images will form three-dimensional stereoscopic vision after being processed by the human brain.
  • such a way will result in contradiction of the focus adjustment and the convergence adjustment of the eyes.
  • long-term wearing will make the eyes tired.
  • the reconstructed three-dimensional object is formed of many point light sources in the space. Those point light sources are formed by the convergence of images displayed in the displays through the refraction function of the microlens array. Those point light sources form an object light field distribution that really exists in the three-dimensional space, just as when the eyes view a real object, as shown in Fig. 12. Integral imaging display allows complete matching of the focus and the convergence of the eyes, without causing visual fatigue from long-term wearing of such an apparatus.
  • any three-dimensional object viewed by each eye is formed of the real three-dimensional object light field, just like the external real scene viewed by the eyes.
  • natural three-dimensional display is realized, the contradiction of the focus and the convergence is relieved, and visual fatigue caused by long-term wearing of a display apparatus is avoided. Therefore, it is helpful for the eye health of viewers.
  • Fig. 13 shows a case in which the present invention is applied to binocular near-eye light field display. Since the continuous convergence adjustment is provided within a certain depth of field range, such binocular display can solve the contradiction of the focus adjustment and the convergence adjustment as in common binocular stereoscopic display. It is to be noted that, after replacing the beam splitter in the augmented reality display apparatus of Fig. 13 with a reflector and removing the camera, the augmented reality display apparatus can become a virtual reality display apparatus.
  • the structure of the near-eye display apparatus and the near-eye display method as shown in the first embodiment of the present invention can be applicable to an augmented reality display apparatus and a virtual reality display apparatus.
  • Fig. 14 shows an augmented reality display apparatus according to a second embodiment of the present invention.
  • the display apparatus includes an illumination module, an optical conduction unit and an optical modulation unit, wherein the optical modulation element includes a microlens array or pinhole array.
  • the illumination module includes at least two displays and a beam splitting device, and the beam splitting device includes a polarization beam splitter.
  • the illumination module requires no additional light source.
  • the illumination module further includes a light source and a condenser lens.
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display.
  • the optical modulation element processes an image displayed by the displays to a virtual object light field; and the virtual object light field is conducted by the optical conduction unit, that is, conducted by a relay optical system, to be projected to the eyes.
  • the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
  • the displays are non-self-luminous displays
  • light for illumination is irradiated to the displays by the beam splitting device
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display.
  • the optical modulation element processes an image displayed by the displays to a virtual object light field; and the virtual object light field is conducted by the optical conduction unit, that is, conducted by a relay optical system, to be projected to the eyes.
  • the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
  • the two displays switch a displayed image at a frequency greater than a predetermined refresh frequency; the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes; and the displayed images are interleaved in the horizontal and vertical directions by half a pixel distance. That is, the resolution is increased by time-division multiplexing of the displays. For example, if the pixel positions of the two displays are displaced by half a pixel, the resolution can be doubled.
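The time-division multiplexing idea above can be sketched numerically: two displays whose pixel grids are offset by half a pixel alternate frames faster than the eye can resolve, and the eye integrates them into a grid with twice the sample density. Here the "eye" is modeled by weaving the two frames onto a doubled grid with a diagonal half-pixel offset; frame sizes and contents are illustrative assumptions.

```python
import numpy as np

h, w = 4, 4
frame_a = np.arange(h * w, dtype=float).reshape(h, w)  # content of display 1
frame_b = frame_a + 0.5                                # content of display 2 (shifted grid)

# Interleave the two frames onto a grid of doubled density: display 1
# samples the even grid sites, display 2 the sites offset by half a pixel.
fused = np.zeros((2 * h, 2 * w))
fused[0::2, 0::2] = frame_a
fused[1::2, 1::2] = frame_b
# The remaining sites would be filled by interpolation in practice; the
# perceived sample density is doubled in each direction.
print(fused.shape)  # (8, 8)
```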
  • This apparatus further includes a correction module, wherein the correction module includes an image capturing unit, for example a camera, a beam splitter, a correction unit, an image rendering unit and a light source control unit.
  • the optical modulation element is directly located in the illumination module at a first predetermined distance from the displays to form integral imaging display; the optical modulation element processes an image displayed by the displays into a virtual object light field; and the virtual object light field is conducted by the optical conduction unit to enter the eyes and the image capturing unit by the beam splitter, respectively.
  • the eyes and the image capturing unit can acquire an external real image, simultaneously.
  • the image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed.
  • the three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
  • the displays are non-self-luminous displays
  • light for illumination illuminates the displays by the beam splitting device
  • the optical modulation element is directly located in the illumination unit at a first predetermined distance from the displays to form integral imaging display
  • the optical modulation element processes an image displayed by the displays to a virtual object light field of a three-dimensional virtual image
  • the virtual object light field is conducted by the optical conduction unit to enter the eyes and the image capturing unit by the beam splitter, respectively.
  • the eyes and the image capturing unit can acquire an external real image, simultaneously.
  • the image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed.
  • the three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
  • the above-mentioned implementation solutions of the illumination module in the first embodiment of the present invention can be used.
  • the structure of the augmented reality display apparatus as shown in second embodiment can be applicable to a virtual reality display apparatus. Specifically, after replacing the beam splitter in the augmented reality display apparatus of Fig. 14 with a reflecting mirror and removing the camera, the augmented reality display apparatus can become a virtual reality display apparatus.
  • Fig. 15 shows a virtual reality display apparatus according to a third embodiment of the present invention.
  • the display apparatus includes an illumination module and an optical modulation unit, wherein the optical modulation element includes a microlens array or pinhole array.
  • the illumination module includes at least two displays and a beam splitting device, and the beam splitting device includes a polarization beam splitter.
  • the illumination module requires no additional light source.
  • the illumination module further includes a light source and a condenser lens.
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display.
  • the optical modulation element processes an image displayed by the displays to a virtual object light field, and further projects the virtual object light field to the eyes.
  • the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
  • the displays are non-self-luminous displays
  • light for illumination is irradiated to the displays by the beam splitting device
  • the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display.
  • the optical modulation element processes an image displayed by the displays to a virtual object light field, and further projects the virtual object light field to the eyes.
  • the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
  • the two displays switch a displayed image at a frequency greater than a predetermined refresh frequency; the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes; and the displayed images are interleaved in the horizontal and vertical directions by half a pixel distance. That is, the resolution is increased by time-division multiplexing of the displays. For example, if the pixel positions of the two displays are displaced by half a pixel, the resolution can be doubled.
  • the optical conduction unit is omitted in this embodiment, so that the apparatus becomes more compact.
  • This display apparatus can be used in some cases having limited space dimension.
  • This display apparatus further includes a correction module, wherein the correction module includes an image capturing unit, for example a camera, a beam splitter, a correction unit, an image rendering unit and a light source control unit.
  • the optical modulation element is directly located in the illumination module at a first predetermined distance from the displays to form integral imaging display; the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image; and the virtual object light field enters the eyes and the image capturing unit by the beam splitter, respectively.
  • the eyes and the image capturing unit can acquire an external real image, simultaneously.
  • the image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed.
  • the three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
  • the displays are non-self-luminous displays
  • light for illumination illuminates the displays by the beam splitting device
  • the optical modulation element is directly located in the illumination unit at a first predetermined distance from the displays to form integral imaging display
  • the optical modulation element processes an image displayed by the displays to a virtual object light field of a three-dimensional virtual image
  • the virtual object light field enters the eyes and the image capturing unit by the beam splitter, respectively.
  • the eyes and the image capturing unit can acquire an external real image, simultaneously.
  • the image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed.
  • the three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
  • the above-mentioned implementation solutions of the illumination module in the first embodiment of the present invention can be used.
  • the structure of the virtual reality display apparatus as shown in the third embodiment can be applicable to an augmented reality display apparatus. Specifically, after replacing the reflecting mirror in the virtual reality display apparatus of Fig. 15 with a beam splitter and adding a camera, the virtual reality display apparatus can become an augmented reality display apparatus.
  • Fig. 16 shows a virtual reality display apparatus according to a fourth embodiment of the present invention.
  • the display apparatus includes an illumination module and an optical modulation element.
  • no other elements are required.
  • a reflector is required.
  • Fig. 16 shows an arrangement mode in which the viewing position of the eyes forms an angle of 90° with the light path of the outgoing light of the illumination module; an image from the illumination module is formed into a three-dimensional virtual image for integral imaging display by the optical modulation element; and the reflector is placed at an end opposite to the illumination module to guide the three-dimensional virtual image to the eyes.
  • the display apparatus further includes a correction module.
  • the correction module includes an image capturing unit, an image rendering unit and a light source control unit; the image capturing unit is configured to acquire a three-dimensional virtual image from a reflector; the image rendering unit is configured to correct and then render the three-dimensional virtual image, and conduct the three-dimensional virtual image by the illumination module; and the light source control unit is configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image.
  • the virtual object light field of the three-dimensional image information is captured by the eyes only.
  • the following description will be given by taking the solution reformed from the first embodiment as an example.
  • the second embodiment and the third embodiment can also be similarly reformed to obtain a similar virtual reality display apparatus.
  • the display apparatus includes an illumination module, an optical conduction unit and an optical modulation unit, wherein the optical modulation element includes a microlens array or pinhole array.
  • the illumination module includes at least two displays and a beam splitting device, and the beam splitting device specifically includes a polarization beam splitter.
  • the illumination module requires no additional light source.
  • the illumination module further includes a light source and a condenser lens.
  • an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device; the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element, so as to form integral imaging display; and the optical modulation element processes an image displayed by the displays into a virtual object light field, and projects the virtual object light field to eyes.
  • the eyes can observe a virtual three-dimensional object, so as to realize the virtual reality display.
  • the displays are non-self-luminous displays
  • light for illumination illuminates the displays by the beam splitting device
  • the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display
  • the optical modulation element processes an image displayed by the displays into a virtual object light field, and projects the virtual object light field to eyes.
  • the eyes can observe a virtual three-dimensional object, so as to realize the virtual reality display.
  • the two displays switch a displayed image at a frequency greater than a predetermined refresh frequency; the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes; and the displayed images are interleaved in the horizontal and vertical directions by half a pixel distance. That is, the resolution is increased by time-division multiplexing of the displays. For example, if the pixel positions of the two displays are displaced by half a pixel, the resolution can be doubled.
  • the display apparatus includes a correction module; and the correction module includes an image capturing unit, a reflector, an image rendering unit and a light source control unit.
  • an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device; the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element so as to form integral imaging display; and the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image, and the displayed virtual object light field enters the eyes by the reflector.
  • Real-time rendering is performed by the image rendering unit, so as to adjust the displayed virtual object in real time.
  • the displays are non-self-luminous displays
  • light for illumination illuminates the displays by the beam splitting device
  • the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display
  • the optical modulation element processes an image displayed by the displays into a virtual object light field, and the displayed virtual object light field enters the eyes by the beam splitter.
  • Real-time rendering is performed by the image rendering unit, so as to adjust the displayed virtual object in real time.
  • the image capturing unit acquires a superposed image of an external real image and a three-dimensional virtual image; the captured image is then input to a processing unit to detect new salient objects and track the original objects; each new salient object is identified and checked against the GPS position and the movement direction; a three-dimensional virtual image model, including texts or images, is generated for the new salient object; the position of the three-dimensional virtual image is adjusted in real time according to the tracking data; the contrast of the region of the original object is calculated, and the color, size and shape of the three-dimensional virtual image are adjusted according to the contrast; three-dimensional image rendering is further performed on all three-dimensional virtual images; and the rendered three-dimensional virtual images are superposed into the real scene by the display unit.
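The processing loop above can be read end to end in the following sketch. All function and field names are hypothetical, and every step is stubbed; the sketch shows only the control flow of the three-dimensional augmented reality engine under those assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of the AR engine loop: detect, identify, verify,
# model, track, adapt appearance, then render the overlays.

def run_ar_frame(captured_image, gps_fix, heading):
    new_objects = detect_salient_objects(captured_image)  # detect new salient objects
    tracks = track_original_objects(captured_image)       # track previously found objects
    overlays = []
    for obj in new_objects:
        label = identify(obj)                             # identify the new salient object
        if not verify_with_gps(label, gps_fix, heading):  # cross-check by GPS and movement direction
            continue
        model = build_virtual_model(label)                # text or image annotation model
        model["position"] = tracks.get(label, obj["position"])  # follow the tracking data
        contrast = region_contrast(captured_image, obj)   # contrast of the object's region
        adapt_appearance(model, contrast)                 # adjust color, size and shape
        overlays.append(model)
    return render_3d(overlays)                            # rendered virtual images to superpose

# --- minimal stubs so the sketch runs ---
def detect_salient_objects(img): return [{"name": "sign", "position": (0, 0)}]
def track_original_objects(img): return {}
def identify(obj): return obj["name"]
def verify_with_gps(label, fix, heading): return True
def build_virtual_model(label): return {"label": label}
def region_contrast(img, obj): return 0.5
def adapt_appearance(model, contrast): model["contrast"] = contrast
def render_3d(overlays): return overlays

print(len(run_ar_frame(None, None, None)))  # 1
```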
  • Fig. 18 is a schematic flowchart of a three-dimensional augmented reality engine according to one preferred embodiment of the present invention.
  • a user can select a proper three-dimensional augmented mode as needed.
  • the three-dimensional augmented modes include the above-mentioned basic drawing mode, an intelligent gaze tracking mode or a specific category mode.
  • In the basic drawing mode, all salient objects in a scene are detected and identified, and then input to a drawing module for self-adaptive drawing.
  • In the intelligent gaze tracking mode, the system identifies the gazing direction of the current user, determines a region to be augmented in the scene according to the gazing direction, and then detects the salient object by which this region is identified and performs self-adaptive augmented display.
  • In the specific category mode, a user can set a category of interest, for example hotels, cinemas, banks or attractions in a scene; the system then detects contents of this specific category in the scene according to the selection of the user and performs self-adaptive augmented display.
  • other augmented modes can be designed.
  • the present invention provides a plurality of three-dimensional augmented modes for users to choose.
  • the mode selection interface can allow selection by a menu or by a quick command.
  • a three-dimensional augmented mode is set by a predetermined quick voice command.
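The mode selection described above amounts to a dispatch from a menu entry or quick voice command to one of the three augmented modes. The sketch below is a hypothetical illustration; the mode names, handler behaviors, and the fallback to the basic drawing mode are assumptions, not details from the embodiment.

```python
# Hypothetical dispatch from a menu choice or quick voice command to a
# three-dimensional augmented mode handler.
def basic_drawing_mode(scene): return "detect and draw all salient objects"
def gaze_tracking_mode(scene): return "augment the region along the gaze direction"
def category_mode(scene): return "augment only the user-selected category"

MODES = {
    "basic": basic_drawing_mode,
    "gaze": gaze_tracking_mode,
    "category": category_mode,
}

def select_mode(command):
    """Resolve a menu entry or quick command to a mode handler
    (assumed fallback: the basic drawing mode)."""
    return MODES.get(command.strip().lower(), basic_drawing_mode)

handler = select_mode("Gaze")
print(handler(None))  # augment the region along the gaze direction
```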
  • a basic rendering flow of a three-dimensional light field is provided.
  • the two above-mentioned near-eye apparatuses can also be near-eye light field displays
  • the two near-eye light field displays respectively correspond to displays for left and right eyes in AR displays or VR displays, and are respectively used for displaying pictures seen by the left and right eyes.
  • it is required to adjust the rendering contents of the two near-eye light field displays in an associated manner.
  • the position and angle of the rendering contents of the two near-eye light field displays can be adjusted based on the distance between the two near-eye light field displays. As shown in Fig.
  • a display apparatus having the two near-eye light field displays mentioned above determines three-dimensional model information to be superposed, and acquires parameters of each near-eye light field display, wherein the acquired parameters include at least one of the following: the pixel size of each two-dimensional panel, the pitch of the microlens array, the gap between the panel and the microlens array, and so on.
  • the display apparatus further acquires the distance between the near-eye light field displays for the left and right eyes, so as to match the pupillary distance of the viewers.
  • the display apparatus sets the positions of left and right virtual cameras in a three-dimensional drawing engine according to the distance between the left and right near-eye light field displays, puts a three-dimensional model to be drawn into the three-dimensional drawing engine, and respectively sets parameters of the virtual cameras according to the parameters of each near-eye light field display.
  • the set parameters include the resolution, the angle of view, and so on.
  • images at a plurality of angles are drawn according to the pixel distribution under the microlens, the images at a plurality of angles of the left and right cameras are interleaved and fused to generate two elemental image arrays, and the interleaved and fused left and right elemental image arrays are input to the left and right near-eye light field displays to be displayed.
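The binocular rendering steps above can be sketched as follows. The parameter names (the pupillary distance `ipd`, the per-lens pixel count) and the nearest-view interleaving rule are illustrative assumptions, not the patent's exact method: the left and right virtual camera positions come from the display separation, and per-angle renders are woven into one elemental image array per eye.

```python
import numpy as np

def camera_positions(ipd):
    """Left/right virtual camera x-positions from the display separation."""
    return (-ipd / 2.0, +ipd / 2.0)

def interleave_views(views, lenses_x, lenses_y, px_per_lens):
    """Weave per-angle renders into one elemental image array: each pixel
    under a lens takes its value from the view matching that pixel's
    angular index (views: n_views x H x W, n_views == px_per_lens**2)."""
    eia = np.zeros((lenses_y * px_per_lens, lenses_x * px_per_lens))
    for vy in range(px_per_lens):
        for vx in range(px_per_lens):
            v = views[vy * px_per_lens + vx]
            # sample view v at the lens centers, place it at sub-pixel (vy, vx)
            eia[vy::px_per_lens, vx::px_per_lens] = v[:lenses_y, :lenses_x]
    return eia

ipd = 63.0                       # assumed pupillary distance in mm
left_x, right_x = camera_positions(ipd)
views = np.random.rand(4, 8, 8)  # 2x2 views per lens, illustrative renders
eia = interleave_views(views, lenses_x=8, lenses_y=8, px_per_lens=2)
print(left_x, right_x, eia.shape)  # -31.5 31.5 (16, 16)
```

In practice the same interleaving would be run once per eye, and the two elemental image arrays sent to the left and right near-eye light field displays respectively.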


Abstract

The present invention provides a near-eye display apparatus comprising an illumination module and an optical modulation element; the illumination module is configured to output an image; and the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image. The near-eye display apparatus can be a near-eye display apparatus for virtual reality. The near-eye display apparatus presents a natural three-dimensional object to the eyes by the integral imaging display principle. The visual fatigue problem caused by long-term viewing of a three-dimensional stereoscopic image is eliminated. This is particularly important for application scenarios where the apparatus is required to be worn long-term.

Description

NEAR-EYE DISPLAY APPARATUS AND NEAR-EYE DISPLAY METHOD
The present invention relates to the technical field of terminal apparatuses, and in particular to a near-eye display apparatus and a near-eye display method.
An augmented reality display apparatus can superpose virtual objects in real scenes. By such an augmented reality display apparatus, virtual information is applied to the real world to be perceived by human senses, thereby achieving sensual experience beyond reality. That is, the real environment and the virtual objects are superposed in real time into a same picture or space where they coexist.
An existing display technology provides a display device which can improve the light efficiency and the angle of view. As shown in Fig. 1, this device includes an illumination module, a first polarization beam splitter, a polarization rotator, and an end reflector. The illumination module comprises a display 250, a diffusion surface 255, a second polarization beam splitter, a light source, and a condenser lens 240. Light from the light source is irradiated to the second polarization beam splitter by the condenser lens. Light of S polarization state is reflected to the display 250 by the second polarization beam splitter, and then reflected by the display 250 to become light of P polarization state; light of P polarization state is transmitted to the end reflector by the second polarization beam splitter, the first polarization beam splitter and the polarization rotator, then reflected back by the end reflector, and passed through the polarization rotator to become light of S polarization state; and light of S polarization state is reflected to eyes by the first polarization beam splitter to form a 3D image. When light of the light source is irradiated to the second polarization beam splitter by the condenser lens, light of P polarization state is transmitted to the diffusion surface 255 by the second polarization splitter, and this part of light is diffused by the diffusion surface. In this solution, the light efficiency can be improved by polarization beam splitters, and the angle of view can be improved by concave reflectors.
However, this display device is unable to display a natural three-dimensional object, so visual fatigue of the eyes is caused when viewing stereoscopic images. As a result, wearing such a display apparatus for a long time is harmful to eye health, and the wearing experience of users is poor.
In conclusion, an existing related display apparatus has the problem that visual fatigue of eyes is easily caused when displaying a three-dimensional virtual object.
An objective of the present invention is to at least solve one of the technical defects mentioned above, particularly the problem that the visual fatigue of eyes is easily caused when displaying a three-dimensional virtual object.
The present invention provides an augmented reality display apparatus, comprising an illumination module and an optical modulation element;
the illumination module is configured to output an image; and
the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image.
The optical modulation element includes a microlens array or a pinhole array, which can be either flat or curved. Alternatively, the optical modulation element can include a dynamic microlens array or pinhole array formed of liquid crystal elements.
Preferably, the illumination module includes at least two displays and a beam splitting device;
specifically, the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device;
wherein the displays are configured to display an image, and the beam splitting device is configured to conduct the image displayed by the displays.
Preferably, the at least two displays form an angle of 45° with the beam splitting device.
Optionally, when the displays are non-self-luminous panels, the illumination module further includes a light source and a condenser lens; and
the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays by the condenser lens.
Preferably, the beam splitting device is a polarization beam splitting device; and the polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, the transmitted and reflected light in the two polarization directions being used for illuminating a non-self-luminous panel.
Optionally, when the displays are self-luminous panels, the beam splitting device transmits and reflects light from the self-luminous panels.
Optionally, when the displays are monochrome self-luminous panels, the beam splitting device is a band-pass color beam splitting device; and the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
Preferably, the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, a predetermined proportion of pixels are interleaved in horizontal and vertical directions.
Further preferably, when the number of the displays is n, the n displays switch a displayed image at a frequency greater than n×30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in horizontal and vertical directions.
Preferably, the optical modulation element is located in the illumination module at a first predetermined distance from each display.
Preferably, the display apparatus further includes an optical conduction unit, the optical conduction unit including at least one lens; and the optical conduction unit conducts an image displayed by the displays to a position at a predetermined distance from the optical modulation element, on which integral imaging is performed by the optical modulation element.
Optionally, when the optical conduction unit includes two lenses, at least one display is located at one focal length from one lens, and the distance between the two lenses is twice the focal length; and the optical modulation element is located at a second predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
Optionally, when the optical conduction unit includes one lens, at least one display is located at twice the focal length from the lens; and the optical modulation element is located at a third predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
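As a hedged illustration (not part of the disclosure itself), the single-lens arrangement above follows from the thin-lens equation: a display placed at twice the focal length is imaged at twice the focal length on the other side of the lens, at unit magnification. A minimal sketch, with illustrative numbers:

```python
def thin_lens_image(f, u):
    """Thin-lens equation 1/u + 1/v = 1/f (real-is-positive convention).

    f: focal length, u: object (display) distance.
    Returns image distance v and lateral magnification m (negative = inverted).
    """
    v = 1.0 / (1.0 / f - 1.0 / u)
    m = -v / u
    return v, m

# Display at twice the focal length (f = 25 mm, u = 50 mm, both illustrative):
v, m = thin_lens_image(f=25.0, u=50.0)
# v == 50.0: the image also forms at twice the focal length
# m == -1.0: unit magnification, inverted
```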
Preferably, the near-eye display apparatus further includes a reflecting element; and the reflecting element is located in a light path direction of the illumination module to guide the three-dimensional virtual image to eyes.
The reflecting element includes a reflector or beam splitter.
Preferably, when the near-eye display apparatus is an augmented reality near-eye display apparatus, the near-eye display apparatus further includes a correction module, and the reflecting element is a beam splitter;
the beam splitter divides light of a three-dimensional virtual image and an external real image into two paths which are guided to the eyes and to the correction module, respectively; and
the correction module is configured to, based on the three-dimensional virtual image and the external real image conducted by the beam splitter, correct the three-dimensional virtual image, and display the corrected three-dimensional virtual image by the illumination module.
Preferably, the correction module includes:
an image capturing unit configured to acquire a three-dimensional virtual image and an external real image from the beam splitter;
a correction unit configured to analyze the three-dimensional virtual image and the external real image, and correct the three-dimensional virtual image according to the result of analysis; and
an image rendering unit configured to render the corrected three-dimensional virtual image.
Further preferably, the correction module includes:
a light source control unit configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image.
The present invention further provides a near-eye display method, including:
by an illumination module of a near-eye display, outputting an image; and
by an optical modulation element of the near-eye display, performing integral imaging on the image to display a three-dimensional virtual image.
The optical modulation element includes a microlens array or a pinhole array, which can be either flat or curved. Alternatively, the optical modulation element can include a dynamic microlens array or pinhole array formed of liquid crystal elements.
Preferably, the illumination module includes at least two displays and a beam splitting device;
specifically, the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device;
wherein the displays are configured to display an image, and the beam splitting device is configured to conduct the image displayed by the displays.
Preferably, the at least two displays form an angle of 45° with the beam splitting device.
Optionally, when the displays are non-self-luminous panels, the illumination module further includes a light source and a condenser lens; and
the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays by the condenser lens.
Preferably, the beam splitting device is a polarization beam splitting device; and the polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, light in polarization directions transmitted and reflected being used for illuminating a non-self-luminous panel.
Optionally, when the displays are self-luminous panels, the beam splitting device transmits and reflects light from the self-luminous panels.
Optionally, when the displays are monochrome self-luminous panels, the beam splitting device is a band-pass color beam splitting device; and the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
Preferably, the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, a predetermined proportion of pixels are interleaved in horizontal and vertical directions.
Further preferably, when there are n displays, the n displays switch a displayed image at a frequency greater than n×30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in horizontal and vertical directions.
Preferably, the optical modulation element is located in the illumination module at a first predetermined distance from each display.
Preferably, the display apparatus further includes an optical conduction unit, the optical conduction unit including at least one lens; and the optical conduction unit conducts an image displayed by the displays to a position at a predetermined distance from the optical modulation element, on which integral imaging is performed by the optical modulation element.
Optionally, when the optical conduction unit includes two lenses, at least one display is located at one focal length from one lens, and the distance between the two lenses is twice the focal length; and the optical modulation element is located at a second predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.
Optionally, when the optical conduction unit includes one lens, at least one display is located at twice the focal length from the lens; and the optical modulation element is located at a third predetermined distance from an imaging position of an image after being conducted by the optical conduction unit.

The present invention provides a near-eye display apparatus. For example, the near-eye display apparatus can be a near-eye display apparatus for virtual reality. The near-eye display apparatus presents a natural three-dimensional object to the eyes by the integral imaging display principle, so that the visual fatigue caused by viewing a three-dimensional stereoscopic image for a long time is eliminated. This is particularly important for application scenarios where the apparatus must be worn for long periods. Further, if the near-eye display apparatus is a near-eye display apparatus for augmented reality, it can, according to the matching degree of an acquired external real image and a three-dimensional virtual image, correct, adjust and render the three-dimensional virtual image in real time, so that the functionality of the near-eye display apparatus is enhanced. Still further, the near-eye display apparatus can improve the quality of display by multiplexing multiple display screens and, meanwhile, improve the light efficiency by the polarization beam splitter. Moreover, the above solutions provided in the present invention make only minor modifications to existing systems and hence will not influence system compatibility, and their implementations are both simple and highly effective.
Additional aspects and advantages of the present invention will be appreciated and become apparent from the descriptions below, or will be learned from the practice of the present invention.
The present invention provides a near-eye display apparatus and a near-eye display method that solve the problem that visual fatigue of eyes is easily caused when displaying a three-dimensional virtual object.
The above and/or additional aspects and advantages of the present invention will become apparent from and be more readily appreciated from the following description of embodiments taken with reference to the accompanying drawings, in which:
Fig. 1 is a schematic device structure diagram of an existing display technology;
Fig. 2 is a schematic apparatus diagram of a virtual reality display apparatus according to a first embodiment of the present invention;
Fig. 3 is a schematic apparatus diagram of an augmented reality display apparatus according to the present invention;
Fig. 4 is a schematic apparatus diagram of another augmented reality display apparatus according to the present invention;
Fig. 5a is a schematic structure diagram of a first implementation of an illumination module according to the present invention;
Fig. 5b is a schematic structure diagram of a second implementation of an illumination module according to the present invention;
Fig. 5c is a schematic structure diagram of a third implementation of an illumination module according to the present invention;
Fig. 5d is a schematic structure diagram of a fourth implementation of an illumination module according to the present invention;
Fig. 6a is a schematic structure diagram of a first implementation of an optical conduction unit according to the present invention;
Fig. 6b is a schematic structure diagram of a second implementation of an optical conduction unit according to the present invention;
Fig. 7 is a schematic structure diagram of a planar lens array according to the present invention;
Fig. 8 is a schematic structure diagram of a curved lens array according to the present invention;
Fig. 9 is a schematic structure diagram of a pinhole array according to the present invention;
Fig. 10 is a schematic diagram when eyes are viewing a two-dimensional image;
Fig. 11 is a schematic diagram when eyes are viewing a three-dimensional image;
Fig. 12 is a schematic diagram when eyes are viewing an integral imaging display according to the present invention;
Fig. 13 is a schematic diagram of binocular near-eye light field display according to the present invention;
Fig. 14 shows an augmented reality display apparatus according to a second embodiment of the present invention;
Fig. 15 shows an augmented reality display apparatus according to a third embodiment of the present invention;
Fig. 16 shows an augmented reality display apparatus according to a fourth embodiment of the present invention;
Fig. 17 is a schematic flowchart of three-dimensional augmented reality according to one embodiment of the present invention;
Fig. 18 is a schematic flowchart of a three-dimensional augmented reality engine according to one preferred embodiment of the present invention; and
Fig. 19 is a schematic flowchart of three-dimensional light field rendering according to one preferred embodiment of the present invention.
Embodiments of the present invention will be described in detail hereafter. The examples of these embodiments have been illustrated in the drawings throughout which same or similar reference numerals refer to same or similar elements or elements having same or similar functions. The embodiments described hereafter with reference to the drawings are illustrative, merely used for explaining the present invention and should not be regarded as any limitations thereto.
It should be understood by those skilled in the art that singular forms "a", "an", "the", and "said" may be intended to include plural forms as well, unless otherwise stated. It should be further understood that terms "include/including" used in this specification specify the presence of the stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof. It should be understood that when a component is referred to as being "connected to" or "coupled to" another component, it may be directly connected or coupled to other elements or provided with intervening elements therebetween. In addition, "connected to" or "coupled to" as used herein may include wireless connection or coupling. As used herein, term "and/or" includes all or any of one or more associated listed items or combinations thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by those skilled in the art to which the present invention belongs. It shall be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meanings in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein. It is to be noted that, the near-eye display apparatus of the present invention includes a virtual reality display apparatus or an augmented reality display apparatus.
In one embodiment of the present invention, the display apparatus includes an illumination module and an optical modulation element, wherein the illumination module is configured to output an image; and the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image. In this embodiment, a natural three-dimensional object is presented to eyes by the integral imaging display principle. The visual fatigue problem caused by long-term viewing a three-dimensional stereoscopic image is eliminated.
Preferably, the display apparatus further includes a reflecting element; and the reflecting element is located in a light path direction of the illumination module to guide the three-dimensional virtual image displayed by the optical modulation element to eyes. The reflecting element includes a reflector or a beam splitter.
Fig. 2 shows a virtual reality display apparatus according to a first embodiment of the present invention. The display apparatus includes an illumination module, an optical modulation element and a reflector. In this embodiment, the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image, and projects the virtual object light field to the eyes by the reflector. By this embodiment, the eyes can observe a virtual three-dimensional object, so as to realize virtual reality display.
Preferably, the optical modulation element is located in the illumination module at a first predetermined distance from each display.
It is to be noted that, the structure of the virtual reality display apparatus and the near-eye display method as shown in Fig. 2 are both applicable to an augmented reality display apparatus. Here, the beam splitting element can be a beam splitter.
Preferably, the near-eye display apparatus further includes an optical conduction unit, the optical conduction unit including at least one lens; and the optical conduction unit conducts an image displayed by the displays to a position at a predetermined distance from the optical modulation element, on which integral imaging is performed by the optical modulation element.

Preferably, as shown in Fig. 3, when the near-eye display apparatus is an augmented reality display apparatus, the display apparatus further includes a correction module, and the reflecting element is a beam splitter;
the beam splitter splits light of a three-dimensional virtual image and an external real image into two paths which are guided to the eyes and to the correction module, respectively; and
the correction module is configured to, based on a three-dimensional virtual image and an external real image conducted by the beam splitter, correct the three-dimensional virtual image, and display the corrected three-dimensional virtual image by the illumination module.
Preferably, as shown in Fig. 4, the correction module includes an image capturing unit, a correction unit and an image rendering unit; and the image capturing unit can be a camera.
The image capturing unit is configured to acquire a three-dimensional virtual image and an external real image from the beam splitter; the correction unit is configured to analyze the three-dimensional virtual image and the external real image, and correct the three-dimensional virtual image according to the result of analysis; and the image rendering unit is configured to render the corrected three-dimensional virtual image.
After replacing the beam splitter in the augmented reality display apparatus of Fig. 3 with a reflector, or after replacing the beam splitter of Fig. 4 with a reflector and removing the image capturing unit, the augmented reality display apparatus can become a virtual reality display apparatus.
Further preferably, the correction module further includes a light source control unit; and the light source control unit is configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image. For example, when the displays are self-luminous displays, an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device, and the optical conduction unit images the image displayed by the displays to a position at a predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display. The displayed virtual object light field enters the eyes and the image capturing unit by the beam splitter, respectively. The eyes and the image capturing unit can simultaneously acquire an external real image. The image capturing unit can receive the same image contents as observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed. The three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the three-dimensional virtual image is adjusted in real time.
For another example, when the displays are non-self-luminous displays, light for illumination emitted by the light source illuminates the displays by the beam splitting device, and the optical conduction unit images an image displayed by the displays to a position at a second predetermined distance from the optical modulation element, so as to form an integral imaging display. The optical modulation element processes the image displayed by the displays into a three-dimensional virtual image for integral imaging display. The virtual object light field of the three-dimensional virtual image enters the eyes and the image capturing unit by the beam splitter, respectively. The eyes and the image capturing unit can simultaneously acquire an external real image. The image capturing unit can receive the same image contents as observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed. The three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the three-dimensional virtual image is adjusted in real time.
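The capture-analyze-correct-render loop described above can be sketched as follows. This is a minimal illustrative sketch only: the brightness-matching correction, the function names, and the target value are assumptions for illustration, not the correction method of the disclosure.

```python
def correction_loop(capture_frame, render, brightness_target=0.5):
    """One illustrative pass of the correction module: capture the superposed
    virtual + real image, analyze it (here: mean brightness as a stand-in),
    correct the virtual image, and hand it back for real-time re-rendering."""
    frame = capture_frame()                         # image from the beam splitter
    mean_level = sum(frame) / len(frame)            # stand-in for image analysis
    gain = brightness_target / mean_level if mean_level else 1.0
    corrected = [min(1.0, p * gain) for p in frame]  # simple brightness match
    render(corrected)                               # image rendering unit
    return gain

rendered = []
gain = correction_loop(lambda: [0.2, 0.3, 0.25], rendered.append)
# gain == 2.0 for this frame (mean level 0.25 vs. target 0.5)
```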
The implementations of constituent parts of the first embodiment will be described, respectively.
I. Illumination module
The illumination module includes at least two displays and a beam splitting device; the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device; and preferably, the at least two displays form an angle of 45° with the beam splitting device.
The displays are configured to display an image, and the beam splitting device is configured to conduct an image displayed by the displays.
Optionally, when the displays are non-self-luminous panels, the illumination module further includes a light source and a condenser lens; and the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays by the condenser lens.
Optionally, when the displays are self-luminous panels, for example, organic light-emitting diode (OLED) displays, the illumination module requires no additional light source, and the beam splitting device transmits and reflects light from the self-luminous panels.
Optionally, when the displays are monochrome self-luminous panels, the beam splitting device is a band-pass color beam splitting device; and the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
Preferably, the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, a predetermined proportion of pixels are interleaved in horizontal and vertical directions. For example, the at least two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, and for an image displayed by each display, 1/2 pixels are interleaved in horizontal and vertical directions. In this case, the resolution can be doubled.
Further preferably, when there are n displays, the n displays switch a displayed image at a frequency greater than n×30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in horizontal and vertical directions. For example, n displays switch a displayed image at a frequency of n×60 Hz, so that the eyes will not sense any flicker.
The illumination module in this embodiment includes any one of the following implementation solutions.
Solution 1: The illumination module uses a plurality of displays which are placed around the beam splitting device and form a predetermined angle, for example, an angle of 45°, with the beam splitting device. As shown in Fig. 5a, there can be two displays, and the beam splitting device can be a polarization beam splitter. The polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, light in polarization directions transmitted and reflected being used for illuminating a non-self-luminous panel.
Light for illumination from the light source is passed through the condenser lens and then collimated to parallel light, and the parallel light is irradiated to the polarization beam splitter at an angle of 45° with the polarization beam splitter. The polarization beam splitter reflects, in the light for illumination, light of a first polarization state to the display 1 to provide light for illumination to the display 1, and transmits light of a second polarization state to the display 2 to provide light for illumination to the display 2. Light of the first polarization state is reflected by the display 1 to become light of the second polarization state which, carrying displayed image information, for example, an image displayed by the display 1, is then transmitted by the polarization beam splitter. Light of the second polarization state is reflected by the display 2 to become light of the first polarization state which, carrying displayed image information, for example, an image displayed by the display 2, is then reflected by the polarization beam splitter.
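The polarization bookkeeping just described can be sketched as follows, assuming idealized components not stated in the disclosure: a polarization beam splitter that perfectly reflects the S state and transmits the P state, and reflective display panels that flip the polarization state of the light they modulate.

```python
def pbs(state):
    """Ideal polarization beam splitter: reflects S-polarized light,
    transmits P-polarized light."""
    return ("reflect", state) if state == "S" else ("transmit", state)

def panel_reflect(state):
    """Reflective display panel: flips the polarization state on reflection."""
    return "P" if state == "S" else "S"

# Display 1 path: S light is reflected onto the panel, returns as P carrying
# the displayed image, and the P light is then transmitted by the PBS onward.
route_in, s = pbs("S")            # illumination branch toward display 1
image_light = panel_reflect(s)    # "P", now carrying the image of display 1
route_out, _ = pbs(image_light)   # transmitted into the subsequent light path

# Display 2 path is complementary: P transmitted in, returns as S, reflected out.
assert pbs(panel_reflect(pbs("P")[1]))[0] == "reflect"
```

Because every ray ends up either illuminating a panel or carrying an image onward, no illumination light is sent back to the source in this idealized model, which is the light-efficiency argument of Solution 1.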
In this embodiment, by the two displays and the polarization beam splitter, light for illumination from the light source will not be reflected back to the light source or scattered; all light is used to irradiate the displays and is provided to the subsequent light path, so that the utilization efficiency of light for illumination is effectively improved. Compared with the method of diffusing part of the light for illumination by a diffusing film in the prior art, the loss of light is reduced, the loss of energy is decreased, and the life of the apparatus battery can be effectively prolonged.
The time-division multiplexing of two displays is specifically as follows: image contents are displayed by the two displays; the two displays switch a displayed image at a frequency greater than a predetermined refresh frequency, where the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes, for example, greater than 2×30 Hz = 60 Hz; and the displayed images are interleaved in horizontal and vertical directions by half a pixel pitch of a display. In this way, during synthesis in the eyes, the resolution of the displayed image is doubled. The time-division multiplexing of n displays is similar to the time-division multiplexing of two displays. The predetermined refresh frequency of each panel is greater than n×30 Hz, and for an image displayed by each display, 1/n pixels are interleaved in horizontal and vertical directions. Thus, the resolution of the image will be increased by n times.
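The time-multiplexing bookkeeping above can be sketched as a small helper; the function name and the uniform-diagonal-offset scheme are illustrative assumptions, not the claimed interleaving pattern.

```python
def multiplex_params(n, pixel_pitch):
    """For n time-multiplexed panels: minimum switching frequency (Hz) and an
    illustrative per-panel sub-pixel offset of k/n of a pixel pitch in both
    horizontal and vertical directions, raising the perceived resolution n-fold."""
    min_switch_hz = n * 30
    offsets = [(k * pixel_pitch / n, k * pixel_pitch / n) for k in range(n)]
    return min_switch_hz, offsets

hz, offs = multiplex_params(2, pixel_pitch=1.0)
# hz == 60; offsets are (0.0, 0.0) and (0.5, 0.5): the half-pixel interleave
# of the two-display case described above
```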
In this solution, on one hand, increasing the resolution of the displayed image by time-division multiplexing of n displays can provide excellent quality of display, and on the other hand, light for illumination is fully utilized and thus the utilization efficiency of light for illumination is improved.
Solution 2: The illumination module uses a plurality of displays which are placed around the beam splitting device and form an angle of 45° with the beam splitting device. The beam splitting device only uses a common beam splitter instead of any polarization beam splitter. As shown in Fig. 5b, there can be two displays, and the beam splitting device can be a common beam splitter.
Light for illumination from the light source is passed through the condenser lens and then collimated to parallel light, and the parallel light is irradiated to the beam splitter at an angle of 45° with the beam splitter. Part of the light is reflected to the display 1 to provide light for illumination to the display 1, and the other part of the light is transmitted to the display 2 to provide light for illumination to the display 2. Light for illumination, after being reflected by the display 1 and carrying displayed image information of an image, is transmitted by the beam splitter to enter the optical conduction unit; and the other part of the light, after being reflected by the display 2 and carrying displayed image information of an image, is reflected by the beam splitter to enter the optical conduction unit. Like the time-division multiplexing used in Solution 1, the resolution of the displayed image will be increased. However, the utilization efficiency of light for illumination will be lower than that of Solution 1.
Solution 3: The illumination module uses a plurality of OLED panels which are placed around the beam splitting device and form an angle of 45° with the beam splitting device. As shown in Fig. 5c, there are two OLED panels, and the beam splitting element is a common beam splitter.
Light emitted by the two OLED panels, carrying displayed image information, is reflected and transmitted by the beam splitter, respectively, to enter the optical conduction unit. A scattering surface scatters the unutilized light that is reflected or transmitted onto it by the beam splitter, so that this light does not enter the conduction light path and the generation of background noise in the displayed image is avoided.
In this implementation, by using two OLED panels, the resolution of the display apparatus is increased.
Solution 4: The illumination module uses three monochrome OLED panels, for example, red, green and blue OLED panels, which are placed around the beam splitting device and form an angle of 45° with it. As shown in Fig. 5d, the three monochrome OLED panels emit red, green and blue light, respectively; and the beam splitting device is a band-pass color beam splitter which reflects green and blue light and transmits light in other colors.
Monochrome images displayed by the three monochrome OLED panels form a color image by fusion, and the color image enters the optical conduction unit. The resolution of the color image, which is formed by the monochrome images displayed by the three monochrome OLED panels by fusion, is increased. Theoretically, the resolution is increased by three times when compared with a single display. With the use of the color splitting element, the utilization efficiency of light can be improved, and the apparatus energy loss can be further reduced.
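The fusion of the three monochrome images can be modeled as stacking three full-resolution color channels. The sketch below uses illustrative toy sizes and values (not parameters of the disclosure) to show why each channel keeps the full panel resolution:

```python
import numpy as np

# Each monochrome OLED panel contributes one full-resolution color channel.
# Optical fusion by the color beam splitter is modeled here as channel
# stacking; the 4x4 size and pixel values are illustrative only.
h, w = 4, 4
red   = np.full((h, w), 200, dtype=np.uint8)   # image on the red panel
green = np.full((h, w), 120, dtype=np.uint8)   # image on the green panel
blue  = np.full((h, w),  50, dtype=np.uint8)   # image on the blue panel

color = np.dstack([red, green, blue])
print(color.shape)   # (4, 4, 3): every channel at full panel resolution
```

Compared with a single panel whose pixels are divided among red, green and blue subpixels, every channel here retains the full panel resolution, which is the source of the roughly threefold gain stated above.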
II. Optical conduction unit
The optical conduction unit in this embodiment can include one of the following solutions. Solution 1: When the optical conduction unit includes two lenses, at least one display is located at one focal length from one lens, and the distance between the two lenses is twice the focal length; and the optical modulation element is located at a second predetermined distance from an imaging position of an image after being conducted by the optical conduction unit. That is, it can be a 4f optical system formed of two lenses, where the displays are located at one focal length from the front lens, the distance between the two lenses is twice the focal length, and the imaging surface is located at one focal length from the rear lens.
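The unit-magnification relay property of such a 4f system can be checked with ray-transfer (ABCD) matrices. The sketch below is illustrative; the focal length value is assumed, not taken from the embodiments.

```python
import numpy as np

# Ray-transfer (ABCD) check that the 4f system relays the display plane to
# the image plane with unit magnification (image inverted). Free-space
# propagation over distance d is [[1, d], [0, 1]]; a thin lens of focal
# length f is [[1, 0], [-1/f, 1]]. The focal length is an assumed value.
f = 0.05  # meters, illustrative

prop = lambda d: np.array([[1.0, d], [0.0, 1.0]])
lens = lambda fl: np.array([[1.0, 0.0], [-1.0 / fl, 1.0]])

# Light travels: display -> f -> lens 1 -> 2f -> lens 2 -> f -> image plane.
# Matrices are applied right-to-left in the order the ray meets the elements.
system = prop(f) @ lens(f) @ prop(2 * f) @ lens(f) @ prop(f)
print(np.round(system, 10))  # [[-1, 0], [0, -1]]: inverted 1:1 relay
```

The resulting matrix equals minus the identity, confirming that the display image is relayed 1:1 (inverted) onto the imaging surface one focal length behind the rear lens.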
For example, when the displays are self-luminous displays, an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device; when the displays are non-self-luminous displays, light for illumination illuminates the displays by the beam splitting device. The optical conduction unit images the image displayed by the displays to a position at a second predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display. That is, the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image, and projects the virtual object light field to the eyes. In this embodiment, the optical conduction unit images an image displayed by the displays to a position near the optical modulation element, so that the displays and the optical modulation element form an integral imaging display device. With such a structure, the whole system becomes simpler, the use of a plurality of optical modulation elements is avoided, and the cost is reduced. Furthermore, the distance between the optical modulation element and the eyes is shortened, which helps increase the visual angle for users.
Meanwhile, the resolution of integral imaging display can be increased by time-division multiplexing of a plurality of displays; the specific implementation is as described above. The integral imaging display contents are directed to the eyes and to the camera by the beam splitter, respectively, and an image of the external real scene can also be captured by both. In this way, augmented reality display, in which a virtual object and a real scene are superposed, is realized.
Solution 2: When the optical conduction unit includes one lens, at least one display is located at twice the focal length from the lens; and the optical modulation element is located at a third predetermined distance from the imaging position of an image after being conducted by the optical conduction unit. For example, the optical conduction unit is formed of one lens; the displays are located at twice the focal length from the lens, and the imaging plane is also located at twice the focal length from the lens, at a position close to the microlens array, as shown in Fig. 6b.
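The corresponding single-lens geometry follows from the thin-lens equation; the focal length below is an assumed illustrative value.

```python
# Thin-lens check of the 2f-2f layout: an object (the display) at twice the
# focal length images at twice the focal length on the other side, with
# magnification -1. The focal length is an assumed illustrative value.
f = 0.05                       # focal length in meters
u = 2 * f                      # object distance: display at 2f
v = 1.0 / (1.0 / f - 1.0 / u)  # thin-lens equation: 1/v = 1/f - 1/u
m = -v / u                     # transverse magnification
print(round(v, 6), round(m, 6))  # image at 2f, inverted, 1:1
```

This is why the imaging plane in Fig. 6b sits at twice the focal length on the microlens-array side of the lens.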
It is to be noted that, the structure of the display apparatus for augmented reality as shown in this embodiment can be applicable to a virtual reality display apparatus. Specifically, after replacing the beam splitter in the augmented reality display apparatus of Fig. 6a and Fig. 6b with a reflector, the augmented reality display apparatus can become a virtual reality display apparatus.
III. Optical modulation element
Solution 1: The optical modulation element is a microlens array.
An image displayed by the displays is imaged by the optical conduction unit to a position near the microlens array to form an integral imaging display system. In integral imaging display, many elemental views will be displayed in the displays, each elemental view will be imaged by a corresponding microlens, and a three-dimensional object light field is formed in the space. A real three-dimensional object can be perceived by capturing this three-dimensional object light field by the eyes.
In this embodiment, the lens array is a planar lens array. The angle of view in integral imaging display will be limited by the area of each elemental view in the displays. Usually, every microlens corresponds to one display area in the displays. In order to prevent superposition of images, the display contents beyond this display area will be abandoned. As shown in Fig. 7, the elemental views cannot be completely displayed in the display areas corresponding to the marginal lens. Thus, the number of corresponding elemental views is limited, and the integrated image will not be observed beyond the angle of view.
Preferably, this problem can be solved by replacing the planar lens array with a curved lens array. The display areas corresponding to the marginal microlens in the curved lens array will be increased, so that the elemental views can be completely displayed. Accordingly, the corresponding elemental views can be increased, and thus the angle of view will be greatly increased, as shown in Fig. 8.
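For the planar case, the limit on the angle of view can be quantified: each microlens of pitch p at gap g from the display covers an elemental-view region subtending roughly 2·arctan(p/2g). This is the standard integral-imaging approximation; the numeric values below are illustrative, not parameters of the embodiments.

```python
import math

# Approximate angle of view of one integral-imaging unit: a microlens of
# pitch p at gap g from the display covers an elemental-view region that
# subtends about 2*atan(p / (2*g)). Values are illustrative only.
def viewing_angle_deg(pitch_mm, gap_mm):
    return math.degrees(2.0 * math.atan(pitch_mm / (2.0 * gap_mm)))

print(viewing_angle_deg(1.0, 3.0))  # about 18.9 degrees for p = 1 mm, g = 3 mm
```

A curved lens array relaxes this limit for the marginal lenses by enlarging their usable elemental-view regions, which is the effect described above.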
Solution 2: The optical modulation element is a pinhole array.
In the present invention, the microlens array can be replaced by a pinhole array, as shown in Fig. 9, to form an integral imaging display system.
Both the pinhole array and the microlens array have the function of conducting light at a specific position in a specific direction. The microlens array or pinhole array in the present invention can be a dynamic liquid crystal microlens array or liquid crystal pinhole array formed of liquid crystal elements. In this way, by controlling the liquid crystal elements, the microlens array or pinhole array in part or all of its area either has or lacks a refraction function. When it has no refraction function, the array becomes a transparent element, so that switching between two-dimensional and three-dimensional display, or mixed display of two-dimensional and three-dimensional objects, can be realized.
In an augmented reality head-mounted display apparatus, displaying a planar two-dimensional object can no longer meet people's demands. If a three-dimensional object is to be displayed, a display apparatus can be worn over the eyes to project two images with parallax from each other, respectively, to form stereoscopic vision. In existing three-dimensional stereoscopic display, images with parallax from each other are generally displayed on the respective screens for the left and right eyes. Such images form three-dimensional stereoscopic vision after being processed by the human brain. However, this approach results in a contradiction between the focus adjustment and the convergence adjustment of the eyes, so that long-term wearing will tire the eyes. As shown in Fig. 10, when viewing a two-dimensional object on the screen, the focus distance and the convergence distance of each eye are consistent; therefore, long-term viewing will not tire the eyes. However, when viewing a three-dimensional object, since the two eyes are viewing images having parallax from each other, people adjust their eyes to focus on the screen in order to view a clear image. Due to the parallax, the human brain processes the images so that the three-dimensional image appears at a certain distance from the screen; as a result, the focus distance and the convergence distance are not consistent, as shown in Fig. 11. At this time, the adjustment of the eyes makes the eyes converge on this three-dimensional image. Since the focus of the eyes is adjusted to the display screen while their convergence is directed at the space point, the eyes keep adjusting and adapting. Thus, visual fatigue is caused by wearing such augmented reality display apparatuses for a long time.
In the present invention, with regard to integral imaging display, by a microlens array or pinhole array, the reconstructed three-dimensional object is formed of many point light sources in the space. Those point light sources are formed by the convergence of images displayed on the displays through the refraction function of the microlens array. Those point light sources form an object light field distribution that really exists in three-dimensional space, just as when the eyes view a real object, as shown in Fig. 12. Integral imaging display allows complete matching of the focus and the convergence of the eyes, without the visual fatigue caused by long-term wearing of such an apparatus. In the present invention, by the integral imaging display principle, any three-dimensional object viewed by each eye is formed of a real three-dimensional object light field, just like the external real scene viewed by the eyes. Thus, natural three-dimensional display is realized, the contradiction between the focus and the convergence is relieved, and the visual fatigue caused by long-term wearing of a display apparatus is avoided. Therefore, it is helpful for the health of the viewers' eyes.
Fig. 13 shows a case in which the present invention is applied to binocular near-eye light field display. Since the continuous convergence adjustment is provided within a certain depth of field range, such binocular display can solve the contradiction of the focus adjustment and the convergence adjustment as in common binocular stereoscopic display. It is to be noted that, after replacing the beam splitter in the augmented reality display apparatus of Fig. 13 with a reflector and removing the camera, the augmented reality display apparatus can become a virtual reality display apparatus.
It is to be noted that, the structure of the near-eye display apparatus and the near-eye display method as shown in the first embodiment of the present invention can be applicable to an augmented reality display apparatus and a virtual reality display apparatus.
Fig. 14 shows an augmented reality display apparatus according to a second embodiment of the present invention. The display apparatus includes an illumination module, an optical conduction unit and an optical modulation unit, wherein the optical modulation element includes a microlens array or pinhole array.
The illumination module includes at least two displays and a beam splitting device, and the beam splitting device includes a polarization beam splitter.
When the displays are self-luminous displays, for example, when the displays are OLED displays, the illumination module requires no additional light source. When the displays are non-self-luminous panels, the illumination module further includes a light source and a condenser lens.
When the displays are self-luminous displays, the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display. The optical modulation element processes an image displayed by the displays to a virtual object light field; and the virtual object light field is conducted by the optical conduction unit, that is, conducted by a relay optical system, to be projected to the eyes. By this embodiment, the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
When the displays are non-self-luminous displays, light for illumination is irradiated to the displays by the beam splitting device, and the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display. The optical modulation element processes an image displayed by the displays to a virtual object light field; and the virtual object light field is conducted by the optical conduction unit, that is, conducted by a relay optical system, to be projected to the eyes. By this embodiment, the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
In this embodiment, the two displays switch the displayed image at a frequency greater than a predetermined refresh frequency; the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes; and the displayed images are interleaved in the horizontal and vertical directions by half a pixel pitch. That is, the resolution is increased by time-division multiplexing of the displays. For example, if the pixel positions of the two displays are displaced by half a pixel, the resolution can be doubled.
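The half-pixel interleaving can be sketched as follows: two sub-frames whose sampling grids are offset diagonally by half a pixel pitch are combined on a grid of twice the sampling density. The toy arrays below are illustrative; in the real apparatus the two sub-frames alternate in time and are fused by the eye rather than being combined explicitly.

```python
import numpy as np

# Toy sketch of half-pixel interleaving: two 2x2 sub-frames whose sampling
# grids are offset diagonally by half a pixel pitch are placed on one grid
# with twice the sampling density per axis. Values are illustrative only.
a = np.arange(4).reshape(2, 2)        # sub-frame shown by display 1
b = np.arange(4).reshape(2, 2) + 10   # sub-frame shown by display 2 (offset grid)

combined = np.zeros((4, 4), dtype=int)
combined[0::2, 0::2] = a              # display-1 samples on the original grid
combined[1::2, 1::2] = b              # display-2 samples on the half-pitch-offset grid
print(combined.shape)                 # (4, 4): twice the sampling per axis
```

The combined grid carries twice as many samples in each direction as either display alone, which is the sense in which the effective resolution is doubled.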
This apparatus further includes a correction module, wherein the correction module includes an image capturing unit for example a camera, a beam splitter, a correction unit, an image rendering unit and a light source control unit.
When the displays are self-luminous displays, the optical modulation element is located directly in the illumination module at a first predetermined distance from the displays to form integral imaging display; the optical modulation element processes an image displayed by the displays into a virtual object light field; and the virtual object light field is conducted by the optical conduction unit to enter the eyes and the image capturing unit by the beam splitter, respectively. The eyes and the image capturing unit can acquire an external real image simultaneously. The image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed. The three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
When the displays are non-self-luminous displays, light for illumination illuminates the displays by the beam splitting device; the optical modulation element is directly located in the illumination unit at a first predetermined distance from the displays to form integral imaging display; the optical modulation element processes an image displayed by the displays to a virtual object light field of a three-dimensional virtual image; and the virtual object light field is conducted by the optical conduction unit to enter the eyes and the image capturing unit by the beam splitter, respectively. The eyes and the image capturing unit can acquire an external real image, simultaneously. The image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed. The three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
As the implementation solutions of the illumination module in the second embodiment of the present invention, the above-mentioned implementation solutions of the illumination module in the first embodiment of the present invention can be used.
As the implementation solutions of the optical conduction unit in the second embodiment of the present invention, the above-mentioned implementation solutions of the optical conduction unit in the first embodiment of the present invention can be used.
It is to be noted that, the structure of the augmented reality display apparatus as shown in the second embodiment can be applicable to a virtual reality display apparatus. Specifically, after replacing the beam splitter in the augmented reality display apparatus of Fig. 14 with a reflecting mirror and removing the camera, the augmented reality display apparatus can become a virtual reality display apparatus.
Fig. 15 shows a virtual reality display apparatus according to a third embodiment of the present invention. The display apparatus includes an illumination module and an optical modulation unit, wherein the optical modulation element includes a microlens array or pinhole array.
The illumination module includes at least two displays and a beam splitting device, and the beam splitting device includes a polarization beam splitter.
When the displays are self-luminous displays, for example, when the displays are OLED displays, the illumination module requires no additional light source. When the displays are non-self-luminous panels, the illumination module further includes a light source and a condenser lens.
When the displays are self-luminous displays, the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display. The optical modulation element processes an image displayed by the displays to a virtual object light field, and further projects the virtual object light field to the eyes. By this embodiment, the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
When the displays are non-self-luminous displays, light for illumination is irradiated to the displays by the beam splitting device, and the optical modulation element is located in the illumination module at a first predetermined distance from each display to form integral imaging display. The optical modulation element processes an image displayed by the displays to a virtual object light field, and further projects the virtual object light field to the eyes. By this embodiment, the eyes can observe a virtual three-dimensional object superposed in the real scene, so as to realize the augmented reality display.
In this embodiment, the two displays switch the displayed image at a frequency greater than a predetermined refresh frequency; the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes; and the displayed images are interleaved in the horizontal and vertical directions by half a pixel pitch. That is, the resolution is increased by time-division multiplexing of the displays. For example, if the pixel positions of the two displays are displaced by half a pixel, the resolution can be doubled.
Compared with the first embodiment and the second embodiment of the present invention, the optical conduction unit is omitted in this embodiment, so that the apparatus becomes more compact. This display apparatus can be used in cases where the available space is limited.
This display apparatus further includes a correction module, wherein the correction module includes an image capturing unit for example a camera, a beam splitter, a correction unit, an image rendering unit and a light source control unit.
When the displays are self-luminous displays, the optical modulation element is located in the illumination module at a first predetermined distance from the displays directly to form integral imaging display; the optical modulation element processes an image displayed by the displays to a virtual object light field of a three-dimensional virtual image; and the virtual object light field enters the eyes and the image capturing unit by the beam splitter, respectively. The eyes and the image capturing unit can acquire an external real image, simultaneously. The image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed. The three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
When the displays are non-self-luminous displays, light for illumination illuminates the displays by the beam splitting device; the optical modulation element is directly located in the illumination unit at a first predetermined distance from the displays to form integral imaging display; the optical modulation element processes an image displayed by the displays to a virtual object light field of a three-dimensional virtual image; and the virtual object light field enters the eyes and the image capturing unit by the beam splitter, respectively. The eyes and the image capturing unit can acquire an external real image, simultaneously. The image capturing unit can receive image contents the same as what are observed by the eyes, i.e., an augmented reality display image where the three-dimensional virtual image and the external real image are superposed. The three-dimensional virtual image is corrected by the correction unit, and the corrected three-dimensional virtual image is re-rendered in real time by the image rendering unit, so that the displayed virtual image is adjusted in real time.
As the implementation solutions of the illumination module in the third embodiment of the present invention, the above-mentioned implementation solutions of the illumination module in the first embodiment of the present invention can be used.
It is to be noted that, the structure of the display apparatus as shown in the third embodiment is applicable to both augmented reality display and virtual reality display. Specifically, after replacing the beam splitter in the augmented reality display apparatus of Fig. 15 with a reflector and removing the camera, the augmented reality display apparatus can become a virtual reality display apparatus.
Fig. 16 shows a virtual reality display apparatus according to a fourth embodiment of the present invention. The display apparatus includes an illumination module and an optical modulation element. Optionally, when the viewing position of the eyes is on the light path of the outgoing light of the illumination module, no other elements are required. Optionally, when the viewing position of the eyes forms an angle with the light path of the outgoing light of the illumination module, a reflector is required. Fig. 16 shows an arrangement mode in which the viewing position of the eyes forms an angle of 90° with the light path of the outgoing light of the illumination module; an image from the illumination module is formed into a three-dimensional virtual image for integral imaging display by the optical modulation element; and the reflector is placed at an end opposite to the illumination module to guide the three-dimensional virtual image to the eyes. The display apparatus further includes a correction module. The correction module includes an image capturing unit, an image rendering unit and a light source control unit; the image capturing unit is configured to acquire a three-dimensional virtual image from a reflector; the image rendering unit is configured to correct and then render the three-dimensional virtual image, and conduct the three-dimensional virtual image by the illumination module; and the light source control unit is configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image.
That is, on the basis of the three embodiments of the present invention, by removing the image capturing unit and the image correction unit and by replacing the beam splitter by the reflector, the virtual object light field of the three-dimensional image information is captured by the eyes only. The following description will be given by taking the solution reformed from the first embodiment as an example. The second embodiment and the third embodiment can also be similarly reformed to obtain a similar virtual reality display apparatus.
As shown in Fig. 16, the display apparatus includes an illumination module, an optical conduction unit and an optical modulation element, wherein the optical modulation element includes a microlens array or pinhole array.
The illumination module includes at least two displays and a beam splitting device, and the beam splitting device specifically includes a polarization beam splitter.
When the displays are self-luminous displays, for example, when the displays are OLED displays, the illumination module requires no additional light source. When the displays are non-self-luminous panels, the illumination module further includes a light source and a condenser lens.
For example, when the displays are self-luminous displays, an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device; the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element, so as to form integral imaging display; and the optical modulation element processes an image displayed by the displays into a virtual object light field, and projects the virtual object light field to eyes. By this embodiment, the eyes can observe a virtual three-dimensional object, so as to realize the virtual reality display.
For example, when the displays are non-self-luminous displays, light for illumination illuminates the displays by the beam splitting device; the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display; and the optical modulation element processes an image displayed by the displays into a virtual object light field, and projects the virtual object light field to eyes. By this embodiment, the eyes can observe a virtual three-dimensional object, so as to realize the virtual reality display.
In this embodiment, the two displays switch the displayed image at a frequency greater than a predetermined refresh frequency; the predetermined refresh frequency is far higher than the refresh frequency that can be distinguished by the eyes; and the displayed images are interleaved in the horizontal and vertical directions by half a pixel pitch. That is, the resolution is increased by time-division multiplexing of the displays. For example, if the pixel positions of the two displays are displaced by half a pixel, the resolution can be doubled.
The display apparatus includes a correction module; and the correction module includes an image capturing unit, a reflector, an image rendering unit and a light source control unit.
When the displays are self-luminous displays, an image displayed by the displays is transferred to the optical conduction unit by the beam splitting device; the optical conduction unit images the image displayed by the displays to a position having a second predetermined distance from the optical modulation element so as to form integral imaging display; and the optical modulation element processes an image displayed by the displays into a virtual object light field of a three-dimensional virtual image, and the displayed virtual object light field enters the eyes by the reflector. Real-time rendering is performed by the image rendering unit, so as to adjust the displayed virtual object in real time.
When the displays are non-self-luminous displays, light for illumination illuminates the displays by the beam splitting device; the optical conduction unit images the image displayed by the displays to a position at a second predetermined distance from the optical modulation element, so as to form a three-dimensional virtual image for integral imaging display; and the optical modulation element processes an image displayed by the displays into a virtual object light field, and the displayed virtual object light field enters the eyes by the reflector. Real-time rendering is performed by the image rendering unit, so as to adjust the displayed virtual object in real time.

As the implementation solutions of the illumination module in the fourth embodiment of the present invention, the above-mentioned implementation solutions of the illumination module in the first embodiment of the present invention can be used.
As the implementation solutions of the optical conduction unit in the fourth embodiment of the present invention, the above-mentioned implementation solutions of the optical conduction unit in the first embodiment of the present invention can be used.
In another preferred embodiment of the present invention, the basic drawing flow of three-dimensional augmented reality will be described in detail. As shown in Fig. 17, the image capturing unit acquires a superposed image of the external real image and the three-dimensional virtual image; the captured image is then input to a processing unit, which detects new salient objects and tracks the original objects; identifies each new salient object and checks it by GPS and the movement direction; generates a three-dimensional virtual image model, including texts or images, for the new salient object; adjusts the position of the three-dimensional virtual image in real time according to the tracking data; calculates the contrast of the region of the original object; adjusts the color, size and shape of the three-dimensional virtual image according to the contrast; performs three-dimensional image rendering on all three-dimensional virtual images; and superposes the rendered three-dimensional virtual images into the real scene by the display unit.
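The steps of the flow can be summarized in a pseudocode sketch. Every name below (detect_salient_objects, verify, build_virtual_model, and so on) is a hypothetical placeholder for a processing step of Fig. 17, not an interface defined by the present invention.

```
# Pseudocode sketch of the basic drawing flow of Fig. 17; all names here are
# hypothetical placeholders for the described processing steps.
captured = image_capturing_unit.acquire()            # real image + 3D virtual image
new_objects = detect_salient_objects(captured)       # detect new salient objects
tracked = tracker.update(captured)                   # track the original objects
models = []
for obj in new_objects:
    label = identify(obj)                            # identify the new salient object
    label = verify(label, gps_position, heading)     # check by GPS and movement direction
    model = build_virtual_model(label)               # 3D model with texts or images
    model.position = tracked.position_of(obj)        # follow the tracking data in real time
    model.adapt(region_contrast(captured, obj))      # adjust color, size, shape by contrast
    models.append(model)
render_all_3d(models)                                # render and superpose into the real scene
```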
Fig. 18 is a schematic flowchart of a three-dimensional augmented reality engine according to one preferred embodiment of the present invention. A user can select a proper three-dimensional augmented mode as needed. The three-dimensional augmented modes include the above-mentioned basic drawing mode, an intelligent gaze tracking mode, and a specific category mode. In the basic drawing mode, all salient objects in a scene are detected and identified, and then input to a drawing module for self-adaptive drawing. In the intelligent gaze tracking mode, the system identifies the gazing direction of the current user, determines the region to be augmented in the scene according to the gazing direction, then detects and identifies the salient object in this region, and performs self-adaptive augmented display. In the specific category display mode, a user can set a category of interest, for example, hotel, cinema, bank or attraction; the system then detects contents of this specific category in the scene according to the user's selection and performs self-adaptive augmented display. In addition to the three modes mentioned above, other augmented modes can be designed. The present invention provides a plurality of three-dimensional augmented modes for users to choose from. Mode selection can be performed through a menu or by a quick command; for example, a three-dimensional augmented mode can be set by a predetermined quick voice command.
In another preferred embodiment of the present invention, a basic rendering flow of a three-dimensional light field is provided. In practical applications, two of the above-mentioned near-eye apparatuses (which can also be near-eye light field displays) are required. The two near-eye light field displays respectively correspond to the displays for the left and right eyes in AR or VR displays, and are respectively used for displaying the pictures seen by the left and right eyes. Thus, the rendering contents of the two near-eye light field displays must be adjusted in an associated manner. During the specific implementation, the position and angle of the rendering contents of the two near-eye light field displays can be adjusted based on the distance between them. As shown in Fig. 19, a display apparatus having the two near-eye light field displays determines the three-dimensional model information to be superposed, and acquires parameters of each near-eye light field display, wherein the acquired parameters include at least one of the following: the pixel size of each two-dimensional panel, the pitch of the microlens array, the gap between the panel and the microlens, or more. The display apparatus further acquires the distance between the near-eye light field displays for the left and right eyes, so as to match the pupillary distance of the viewer. The display apparatus sets the positions of the left and right virtual cameras in a three-dimensional drawing engine according to the distance between the left and right near-eye light field displays, puts the three-dimensional model to be drawn into the three-dimensional drawing engine, and respectively sets parameters of the virtual cameras according to the parameters of each near-eye light field display. The set parameters include resolution, angle of view or more.
Then, for the left and right virtual cameras respectively, images at a plurality of angles are drawn according to the pixel distribution under the microlenses; the images at the plurality of angles of the left and right cameras are interleaved and fused to generate two elemental image arrays; and the interleaved and fused left and right elemental image arrays are input to the left and right near-eye light field displays for display.
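The interleaving step can be sketched with NumPy. This sketch assumes a square N x N block of panel pixels under each microlens and a single-channel image; the actual pixel distribution under the microlens in the embodiment may differ, and the function name is hypothetical.

```python
import numpy as np

def interleave_views(views):
    """Interleave per-angle renderings into one elemental image array.

    views: array of shape (N, N, H, W) -- one H x W rendering per view
    angle. Each microlens covers an N x N block of panel pixels, and
    pixel (i, j) under every lens shows the view from angle (i, j), so
    the resulting elemental image array has shape (N*H, N*W).
    """
    n, n2, h, w = views.shape
    assert n == n2, "expected a square grid of view angles"
    out = np.empty((n * h, n * w), dtype=views.dtype)
    for i in range(n):
        for j in range(n):
            # every n-th panel pixel, at offset (i, j) within each
            # lens block, is taken from the view at angle (i, j)
            out[i::n, j::n] = views[i, j]
    return out
```

Running this once per eye yields the two elemental image arrays that are sent to the left and right near-eye light field displays.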
The foregoing descriptions are merely some implementations of the present invention. It should be noted that, to those skilled in the art, various improvements and modifications may be made without departing from the principle of the present invention, and these improvements and modifications shall be regarded as falling into the protection scope of the present invention.

Claims (15)

  1. A near-eye display apparatus, comprising an illumination module and an optical modulation element;
    the illumination module is configured to output an image; and
    the optical modulation element is configured to perform integral imaging on the image to display a three-dimensional virtual image.
  2. The near-eye display apparatus according to claim 1, wherein the optical modulation element comprises a microlens array or a pinhole array.
  3. The near-eye display apparatus according to claim 2, wherein the microlens array or pinhole array is a flat microlens array or flat pinhole array, or alternatively a curved microlens array or curved pinhole array.
  4. The near-eye display apparatus according to claim 2 or 3, wherein the optical modulation element comprises a dynamic microlens array or pinhole array formed of liquid crystal elements.
  5. The near-eye display apparatus according to claim 1, wherein the illumination module comprises at least two displays and a beam splitting device;
    the at least two displays are located on two sides of the beam splitting device, and form a predetermined angle with the beam splitting device;
    wherein the displays are configured to display an image, and the beam splitting device is configured to conduct an image displayed by the displays.
  6. The near-eye display apparatus according to claim 5, wherein the at least two displays form an angle of 45° with the beam splitting device.
  7. The near-eye display apparatus according to claim 5, wherein, when the displays are non-self-luminous panels, the illumination module further comprises a light source and a condenser lens; and
    the condenser lens is located between the light source and the displays, and light emitted by the light source illuminates the displays through the condenser lens.
  8. The near-eye display apparatus according to claim 7, wherein the beam splitting device is a polarization beam splitting device; and
    the polarization beam splitting device reflects light of a first polarization direction component, which is from the light source and is collimated by the condenser lens, and transmits light of a second polarization direction component orthogonal thereto, the transmitted and reflected light in the two polarization directions being used for illuminating the non-self-luminous panels.
  9. The near-eye display apparatus according to claim 5, wherein, when the displays are self-luminous panels, the beam splitting device transmits and reflects light from the self-luminous panels.
  10. The near-eye display apparatus according to claim 5, wherein, when the displays are monochrome self-luminous panels, the beam splitting device is a band-pass color beam splitting device; and
    the band-pass color beam splitting device reflects light which is from the monochrome self-luminous panels and is in a same color as that of the band-pass color beam splitting device, and transmits light in other colors.
  11. The near-eye display apparatus according to claim 1, further comprising a reflecting element; and
    the reflecting element is located in a light path direction of the illumination module to guide the three-dimensional virtual image to eyes.
  12. The near-eye display apparatus according to claim 1, wherein, when the near-eye display apparatus is a near-eye display apparatus for augmented reality, the near-eye display apparatus further comprises a correction module, and the reflecting element is a beam splitter;
    the beam splitter splits light of a three-dimensional virtual image and an external real image into two paths which are guided to the eyes and to the correction module, respectively; and
    the correction module is configured to, based on the three-dimensional virtual image and the external real image conducted by the beam splitter, perform the correction on the three-dimensional virtual image, and display the corrected three-dimensional virtual image by the illumination module.
  13. The near-eye display apparatus according to claim 12, wherein the correction module comprises:
    an image capturing unit configured to acquire a three-dimensional virtual image and an external real image from the beam splitter;
    a correction unit configured to analyze the three-dimensional virtual image and the external real image, and correct the three-dimensional virtual image according to the result of analysis; and
    an image rendering unit configured to render the corrected three-dimensional virtual image.
  14. The near-eye display apparatus according to claim 13, wherein the correction module further comprises:
    a light source control unit configured to adjust the brightness of light emitted by the light source according to the corrected three-dimensional virtual image.
  15. A near-eye display method, comprising:
    by an illumination module of a near-eye display, outputting an image; and
    by an optical modulation element of the near-eye display, performing integral imaging on the image to display a three-dimensional virtual image.
PCT/KR2017/002910 2016-03-23 2017-03-17 Near-eye display apparatus and near-eye display method WO2017164573A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610170301.8 2016-03-23
CN201610170301.8A CN107229119A (en) 2016-03-23 2016-03-23 The method that near-eye display device and nearly eye are shown
CN201620228921.8 2016-03-23
CN201620228921.8U CN205787364U (en) 2016-03-23 2016-03-23 Near-eye display device

Publications (1)

Publication Number Publication Date
WO2017164573A1 true WO2017164573A1 (en) 2017-09-28

Family

ID=59899357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/002910 WO2017164573A1 (en) 2016-03-23 2017-03-17 Near-eye display apparatus and near-eye display method

Country Status (1)

Country Link
WO (1) WO2017164573A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107678166A (en) * 2017-11-22 2018-02-09 深圳创维新世界科技有限公司 Augmented reality display device
CN108333780A (en) * 2018-04-20 2018-07-27 深圳创维新世界科技有限公司 Near-eye display system
CN108333778A (en) * 2018-04-20 2018-07-27 深圳创维新世界科技有限公司 Near-eye display system
CN108375832A (en) * 2018-04-20 2018-08-07 深圳创维新世界科技有限公司 Augmented reality shows optics module and augmented reality display system
CN108398793A (en) * 2018-04-20 2018-08-14 深圳创维新世界科技有限公司 Augmented reality display system
CN112433386A (en) * 2019-08-09 2021-03-02 中山大学 Compact optical structure for light field display
US11249311B2 (en) 2018-03-20 2022-02-15 Seiko Epson Corporation Virtual-image display apparatus
CN114326129A (en) * 2022-02-22 2022-04-12 亿信科技发展有限公司 Virtual reality glasses
EP3958037A4 (en) * 2019-03-21 2022-11-02 BOE Technology Group Co., Ltd. Integrated imaging display system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100002154A1 (en) * 2007-01-18 2010-01-07 The Az Bd Of Regents On Behalf Of The Univ. Of Az Polarized head-mounted projection display
US20130162673A1 (en) * 2011-12-23 2013-06-27 David D. Bohn Pixel opacity for augmented reality
US20130215235A1 (en) * 2011-04-29 2013-08-22 Austin Russell Three-dimensional imager and projection device
US20140340424A1 (en) * 2013-05-17 2014-11-20 Jeri J. Ellsworth System and method for reconfigurable projected augmented/virtual reality appliance
US20150235455A1 (en) * 2013-11-27 2015-08-20 Magic Leap, Inc. Using polarization modulators for augmented or virtual reality



Similar Documents

Publication Publication Date Title
WO2017164573A1 (en) Near-eye display apparatus and near-eye display method
WO2021002641A1 (en) Electronic device and method for displaying augmented reality
US6215532B1 (en) Image observing apparatus for observing outside information superposed with a display image
WO2012044130A2 (en) 3d display device using barrier and driving method thereof
EP3914959A1 (en) Electronic device and method for displaying augmented reality
EP0667721B1 (en) Image communication apparatus
WO2020122488A1 (en) Camera-based mixed reality glass apparatus, and mixed reality display method
WO2018139880A1 (en) Head-mounted display apparatus, and method thereof for generating 3d image information
WO2018094928A1 (en) Three-dimensional imaging device, three-dimensional imaging system and three-dimensional imaging method
WO2020004850A1 (en) Wearable smart optical system using hologram optical element
JP2001186442A (en) Video display device
WO2014178509A1 (en) Multi-projection system for extending visual element of main image
EP3225025A1 (en) Display device and method of controlling the same
WO2021010603A1 (en) Near-eye display device, augmented reality glasses including same, and operating method therefor
EP0577268B1 (en) Optical system
WO2018182159A1 (en) Smart glasses capable of processing virtual object
US7226167B2 (en) Autostereoscopic display apparatus
US10764567B2 (en) Display apparatus and method of displaying
US11061237B2 (en) Display apparatus
WO2019156409A1 (en) Floating hologram display device using multilayered display faces and multiple image generation method therefor
WO2022010152A1 (en) Device and method for correcting user's vision and performing calibration
WO2016163783A1 (en) Display device and method of controlling the same
JP3205552B2 (en) 3D image pickup device
WO2013024920A1 (en) Method for processing an image and electronic device for same
WO2017179912A1 (en) Apparatus and method for three-dimensional information augmented video see-through display, and rectification apparatus

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17770544

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17770544

Country of ref document: EP

Kind code of ref document: A1