WO2023130129A2 - Device having an optical see-through 3D display and methods for adapting the same - Google Patents

Device having an optical see-through 3D display and methods for adapting the same

Info

Publication number
WO2023130129A2
WO2023130129A2 (PCT/US2023/021532)
Authority
WO
WIPO (PCT)
Prior art keywords
light
microlens
display
spacer
over
Prior art date
Application number
PCT/US2023/021532
Other languages
English (en)
Other versions
WO2023130129A3 (fr)
Inventor
Pingfan Wu
Jiechen WANG
Ning Lu
Original Assignee
Futurewei Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futurewei Technologies, Inc. filed Critical Futurewei Technologies, Inc.
Publication of WO2023130129A2
Publication of WO2023130129A3

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0006Arrays
    • G02B3/0037Arrays characterized by the distribution or form of lenses
    • G02B3/0056Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Definitions

  • the present invention relates generally to a device having a three-dimensional (3D) display, and in particular embodiments, to techniques and mechanisms for a device having an optical see-through three-dimensional (3D) display and methods for adapting the same.
  • the metaverse is part of the next-generation internet that is real-time, interactive, social, and persistent.
  • a key technology for the metaverse is augmented reality that includes displays that provide real imagery overlaid with generated display items.
  • Optical combiner displays may be used for augmented reality once such functionality can be produced. As the interest in augmented reality has grown, a need for improved optical combiner displays that support such functionality has emerged.
  • a device includes: a first light-emitting unit including: a first transistor array over a first substrate, the first substrate being flat and opaque to visible light; a first light-emitting diode array over the first transistor array; a first spacer over the first light-emitting diode array, the first spacer including a liquid crystal material having an ordinary refractive index and an extraordinary refractive index; and a first microlens over the first spacer; and a second light-emitting unit spaced apart from the first light-emitting unit, the second light-emitting unit including: a second transistor array over a second substrate, the second substrate being flat and opaque to visible light; a second light-emitting diode array over the second transistor array; a second spacer over the second light-emitting diode array, the second spacer including the liquid crystal material; and a second microlens over the second spacer.
  • the ordinary refractive index is about 1.52 and the extraordinary refractive index is about 1.73.
  • the first light-emitting unit and the second light-emitting unit each have a field of view in a range of 5 degrees to 20 degrees, and the first light-emitting unit and the second light-emitting unit are part of a grid of light-emitting units, the grid of the light-emitting units having from five to twenty of the light-emitting units arranged along a horizontal direction of the grid.
  • the first microlens and the second microlens each have a microlens size, the first microlens being separated from the second microlens by a separation distance, the separation distance being greater than the microlens size.
  • a ratio of the separation distance to the microlens size is in a range of 1 to 6.
  • the first microlens and the second microlens are round microlenses in a top-down view.
  • the first microlens and the second microlens are rectangular microlenses in a top-down view.
  • the first microlens and the second microlens are hexagonal microlenses in a top-down view.
  • a first center line through the first light-emitting unit and a second center line through the second light-emitting unit extend through a same point.
  • in accordance with another embodiment, a device includes: an eyeglasses frame; a display panel coupled to the eyeglasses frame, the display panel including: a first light-emitting diode array; a first spacer over the first light-emitting diode array; a second light-emitting diode array; and a second spacer over the second light-emitting diode array; a first microlens over the first spacer of the display panel; and a second microlens over the second spacer of the display panel, the first microlens and the second microlens each having a microlens size, the first microlens being separated from the second microlens by a separation distance, the separation distance being greater than the microlens size.
  • the display panel is curved, the first light-emitting diode array is flat, and the second light-emitting diode array is flat. In some embodiments of the device, the display panel is curved in a lateral direction and in a longitudinal direction. In some embodiments of the device, a ratio of the separation distance to the microlens size is in a range of 1 to 6. In some embodiments of the device, the first microlens and the second microlens are part of a microlens array over the display panel. In some embodiments of the device, the first microlens is part of a first microlens strip over the display panel, and the second microlens is part of a second microlens strip over the display panel. In some embodiments of the device, the first microlens and the second microlens are disposed along a circle. In some embodiments of the device, the circle has a radius in a range of 15 mm to 28 mm.
  • a method includes: obtaining eyeglasses including a three-dimensional display, the three-dimensional display including a plurality of light-emitting units, each of the light-emitting units including: a light-emitting diode array; a liquid crystal spacer over the light-emitting diode array; and a microlens over the liquid crystal spacer; and programming an image depth of the three-dimensional display by programming a refractive index of the liquid crystal spacer of each of the light-emitting units.
  • the eyeglasses include a controller and the image depth is stored in a memory of the controller.
  • the liquid crystal spacer of each of the light-emitting units includes a liquid crystal material having an ordinary refractive index and an extraordinary refractive index.
  • the ordinary refractive index is about 1.52 and the extraordinary refractive index is about 1.73.
  • Figures 1A-1B are views of a pair of eyeglasses, in accordance with some embodiments.
  • Figure 2 is a block diagram of a pair of eyeglasses, in accordance with some embodiments;
  • Figure 3 is a flow diagram of a method for displaying augmented reality content on a three-dimensional display, in accordance with some embodiments;
  • Figure 4 is a schematic diagram of a three-dimensional display during operation, in accordance with some embodiments.
  • Figure 5 is a schematic diagram of three-dimensional displays during display of a 3D image, in accordance with some embodiments.
  • Figures 6A-6B are views of a three-dimensional display, in accordance with some embodiments.
  • Figures 7A-7G are top-down views of three-dimensional displays, in accordance with various embodiments.
  • Figures 8-11 are cross-sectional views of intermediate stages in the manufacturing of a three-dimensional display, in accordance with some embodiments.
  • Figure 12 is a cross-sectional view of an intermediate stage in the manufacturing of a three-dimensional display, in accordance with some embodiments.
  • Figure 13 is a cross-sectional view of an intermediate stage in the manufacturing of a three-dimensional display, in accordance with some embodiments.
  • Figure 14 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments.
  • Figure 15 is a schematic diagram of a 3D light-emitting unit during operation, in accordance with some embodiments.
  • Figure 16 is a top-down view of a portion of a pair of eyeglasses, in accordance with some embodiments.
  • Figure 17 is a top-down view of a three-dimensional display, in accordance with some embodiments.
  • Figures 18-22 are cross-sectional views of intermediate stages in the manufacturing of a three-dimensional display, in accordance with some embodiments.
  • Figure 23 is a cross-sectional view of an intermediate stage in the manufacturing of a three-dimensional display, in accordance with some embodiments;
  • Figure 24 is a cross-sectional view of a transparent layer for a three-dimensional display, in accordance with some embodiments;
  • Figure 25 is a flow diagram of a method for adapting operation of a pair of eyeglasses, in accordance with some embodiments; and
  • Figure 26 illustrates a diagram of an embodiment processing system.
  • a device includes a three-dimensional display.
  • the device is a wearable device.
  • the three-dimensional display may be used as an optical combiner display to display augmented reality content to the user of the device as well as ordinary imagery that can be viewed.
  • the three-dimensional display is formed using microlenses of a particular size and spacing that allow the display to be optical see-through (OST) while still having a desired display resolution.
  • the microlenses are spaced apart such that there are gaps or spaces among the microlenses, and therefore a user of the device perceives the display as being transparent.
  • the three-dimensional display may be a direct-emissive display, which may be more compact than other types of displays.
  • Figures 1A-1B are views of a pair of eyeglasses 100, in accordance with some embodiments.
  • Figure 1A is a front view of the eyeglasses 100.
  • Figure 1B is a side view of the eyeglasses 100.
  • the eyeglasses 100 are a wearable device having optical see-through three-dimensional (3D) displays.
  • the eyeglasses 100 include a frame 102, three-dimensional displays 104, sensors 106, and a human interface device 108.
  • the three-dimensional displays 104 are optical see-through displays that may act as optical combiner displays.
  • the three-dimensional displays 104 are lenses for the eyeglasses 100.
  • the human interface device 108 (subsequently described) is for interacting with the eyeglasses 100.
  • the eyeglasses 100 include other components, such as speakers 110.
  • the eyeglasses 100 also include a controller 112 and a transceiver 114 (see Figure 2).
  • the controller 112 and the transceiver 114 are disposed with the human interface device 108. Alternatively, they may be disposed within any portion of or attached to the frame 102.
  • the frame 102 is an eyeglasses frame that holds the lenses (e.g., the three-dimensional displays 104) in the proper position for the user (e.g., wearer) of the eyeglasses 100.
  • the frame 102 includes a pair of rims that surround, or at least partially surround, and hold the three-dimensional displays 104 in place, a bridge which connects the rims to one another, and earpieces connected to the sides of the rims.
  • the frame 102 may be formed of any acceptable material such as plastic, metal, a combination thereof, or the like.
  • the three-dimensional displays 104 are coupled to the frame 102. Specifically, the three-dimensional displays 104 are held by the rims of the frame 102, such that the frame 102 extends at least partially around the three-dimensional displays 104 and the bridge of the frame 102 extends between the three-dimensional displays 104.
  • the three-dimensional displays 104 are direct-emissive, optical see-through, three-dimensional displays. Specifically, the three-dimensional displays 104 are optical see-through light field displays. In this context, a display is optical see-through when the user of the eyeglasses 100 perceives the display as being transparent at a desired viewing distance.
  • the three-dimensional displays 104 are perceived as transparent when a real-world environment is visible through the three-dimensional displays 104 to the user of the eyeglasses 100.
  • the three-dimensional displays 104 may be used to display content such as an augmented reality overlay, a user interface (UI), or the like to a user of the eyeglasses 100.
  • a display is direct-emissive when the display directly emits a desired light field without using a backlight for light emission. Specifically, the display emits light from each pixel with a desired intensity and color, such that the image is produced directly on the display.
  • a direct-emissive display may simply be referred to as an emissive display.
  • the sensors 106 include one or more of visual sensors, audio sensors, position sensors, environment sensors, and the like.
  • Examples of the sensors 106 include cameras such as wide-angle cameras, infrared cameras, or the like; depth sensors; position sensors such as orientation sensors, magnetometers, or the like; motion sensors such as accelerometers, gravity sensors, gyroscopes, or the like; environment sensors such as light sensors, temperature sensors, humidity sensors, air pressure sensors, or the like; audio sensors such as microphones; medical sensors such as blood-oxygen sensors, brain wave sensors, pulse oximeter sensors, optical heart rate sensors, or the like; satellite navigation sensors such as global positioning system (GPS) sensors; or the like.
  • the sensors 106 are disposed in the rims and bridge of the frame 102.
  • the sensors 106 may be disposed symmetrically or asymmetrically around the frame 102, and may face towards and/or away from the user of the eyeglasses 100.
  • the sensors 106 may include cameras facing towards the user of the eyeglasses 100, which may be utilized to track the position(s) of the head, face, and/or eyes of the user.
  • the sensors 106 may include cameras facing away from the user of the eyeglasses 100, which may be utilized to track hand gestures of the user.
  • Some of the sensors 106 may be networked sensors that are external to the frame 102 and are communicated with through the transceiver 114 (subsequently described for Figure 2).
  • the human interface device 108 includes features for interacting with the user interface of the eyeglasses 100.
  • the human interface device 108 may include a touch pad, buttons, or the like.
  • the human interface device 108 is attached to the frame 102, such as to the earpieces of the frame 102.
  • the human interface device 108 is a networked human interface device that is external to the frame 102 and is communicated with through the transceiver 114 (subsequently described for Figure 2).
  • the speakers 110 are used for outputting information to the user of the eyeglasses 100.
  • the speakers 110 may be used to play a notification or alert.
  • speakers 110 are not used to play a notification or alert if the transceiver 114 (subsequently described for Figure 2) is paired with remote speakers (e.g., Apple® AirPod®-type ear speakers using a communication protocol such as, but not limited to, Bluetooth®).
  • the controller 112 sends data to the transceiver 114 for wireless transmission instead of sending audio data or information to the speakers 110.
  • Figure 2 is a block diagram of the eyeglasses 100, in accordance with some embodiments. As noted above, the eyeglasses 100 also include a controller 112 and a transceiver 114.
  • the controller 112 is adapted to control the components of the eyeglasses 100 during operation. Specifically, the controller 112 is adapted to control the components of the eyeglasses 100 by receiving input signals from the input devices of the eyeglasses 100 (e.g., the sensors 106) and transmitting output signals to the output devices of the eyeglasses 100 (e.g., the three-dimensional displays 104 and the speakers 110).
  • the controller 112 may include a control circuit, a processor, an application-specific integrated circuit, a microcontroller, or the like.
  • the controller 112 may include one or more processors and memories, such as non-transitory computer readable storage mediums, that store programming for execution by the processors.
  • One or more modules within the controller 112 may be partially or wholly embodied as software and/or hardware for performing any functionality described herein.
  • the controller 112 is adapted to display augmented reality content on the three-dimensional displays 104.
  • the transceiver 114 is used by the controller 112 to communicate with external devices.
  • the transceiver 114 is adapted to transmit and receive signaling over a network.
  • the transceiver 114 may be adapted to communicate with an external device that assists the controller 112 with processing or even to receive information for generating imagery for the three-dimensional displays 104.
  • the transceiver 114 may include a transmitter and receiver for a wireless telecommunications protocol, such as a cellular protocol (e.g., 5G, long-term evolution (LTE), etc.), a wireless local area network (WLAN) protocol (e.g., Wi-Fi, etc.), or any other type of wireless protocol (e.g., Bluetooth, near field communication (NFC), etc.).
  • the transceiver 114 includes one or more antenna/radiating elements for transmitting and/or receiving communication signals.
  • FIG 3 is a flow diagram of a method 300 for displaying augmented reality content on a three-dimensional display, in accordance with some embodiments.
  • the method 300 is described in conjunction with Figures 1A-2.
  • augmented reality content includes 3D images that are displayed on a three-dimensional display 104 such that the 3D images are overlaid on the real-world environment visible through the three-dimensional displays 104.
  • the augmented reality content is spatially aligned with the real-world environment visible through the three-dimensional display 104.
  • signals from the input devices of the eyeglasses 100 are analyzed.
  • the signals may be sensor signals received from the sensors 106.
  • the information from analyzing the signals from the input devices may be spatial alignment information that is used to spatially align augmented reality content on the three-dimensional display 104. For example, the position(s) of the head, face, and/or eyes of the user of the eyeglasses 100 may be determined by analyzing the signals from the sensors 106 (e.g., the cameras facing towards the user of the eyeglasses 100).
  • augmented reality content is rendered according to the analyzed signals.
  • Rendering the augmented reality content includes calculating the locations to display 3D images on the three-dimensional display 104 so that the 3D images are spatially aligned with the real-world environment.
  • the spatial alignment information may be used to render the augmented reality content in desired locations on the three-dimensional display 104.
  • the rendered augmented reality content is output to the three-dimensional display 104.
  • the 3D images may be output to the three-dimensional display 104 by transmitting signals to the three-dimensional display 104. Accordingly, the three-dimensional display 104 displays the 3D images.
  • At least a portion of the method 300 may be performed by the controller 112. In some embodiments, each step of the method 300 is performed by the controller 112. In some embodiments, some steps of the method 300 are performed by the controller 112, and other steps of the method 300 are performed by an external device, such as an external processor (e.g., a server), a cloud computing processor, an edge computing processor, or the like. For example, the rendering of the augmented reality content may take a considerable amount of computing power.
  • steps 302 and 306 are performed by a front-end processor (e.g., the controller 112) while step 304 is performed by a back-end processor (e.g., an external device).
  • the front-end processor may communicate with the back-end processor via the transceiver 114.
  • step 302 may be performed in only some circumstances. For example, a calibration process may be performed using the input devices of the eyeglasses 100 (e.g., the sensors 106, specifically, the cameras facing towards the user of the eyeglasses 100).
  • the calibration process may produce the spatial alignment information that is used to spatially align the augmented reality content on the three-dimensional display 104.
  • the spatial alignment information may be stored in a memory of the controller 112 and then reused for multiple rendering steps, without repeated recomputing of the spatial alignment information.
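  • The sequence described above (analyze signals, render, output, with optional caching of the spatial alignment information and optional offloading of rendering to a back-end processor) might be organized as in the following hypothetical sketch; all names are assumptions for illustration and are not part of the disclosure:

```python
from typing import Callable, Dict, Optional

def display_loop(
    analyze: Callable[[], Dict],        # step 302: analyze input-device signals
    render: Callable[[Dict], bytes],    # step 304: render AR content (may run on a back-end processor)
    output: Callable[[bytes], None],    # step 306: drive the three-dimensional display
    cached_alignment: Optional[Dict] = None,  # spatial alignment reused from a calibration process
) -> None:
    alignment = cached_alignment if cached_alignment is not None else analyze()
    frame = render(alignment)
    output(frame)

# Trivial stand-ins for the three steps:
display_loop(
    analyze=lambda: {"eye_position": (0.0, 0.0, 10.0)},
    render=lambda a: repr(a["eye_position"]).encode(),
    output=lambda frame: print(len(frame), "bytes sent to the display"),
)
```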
  • FIG 4 is a schematic diagram of a three-dimensional display 104 during operation, in accordance with some embodiments.
  • the three-dimensional display 104 is a direct-emissive, optical see-through, three-dimensional display.
  • the three-dimensional display 104 is used to display 3D images to a target 400.
  • the three-dimensional display 104 is an eyeglasses lens, and the target 400 is an eye of the user.
  • the three-dimensional display 104 is adapted to display 3D images to the target 400.
  • the three-dimensional display 104 is a light field display.
  • the three-dimensional display 104 includes a plurality of 3D light-emitting units 402.
  • a 3D light-emitting unit 402 includes a microlens 604 (subsequently described for Figures 6A-6B) and a light-emitting diode array 614 (subsequently described for Figures 6A-6B), which includes a cluster of light-emitting diodes.
  • Each 3D light-emitting unit 402 emits one or more light fields.
  • a light field is described by a vector function that describes the amount of light flowing in a plurality of directions through a plurality of points in space.
  • a light field includes a plurality of light rays 404 defined by a plenoptic function.
  • Each light ray 404 has a radiance, which is a measurement of the amount of light traveling along the light ray 404.
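  • For reference only (this formula is general optics background, not recited in the disclosure), the plenoptic function can be written as a radiance defined over ray position and direction; a simplified five-dimensional form is:

```latex
% Simplified 5D plenoptic function: radiance L carried by the ray passing
% through the point (x, y, z) in the direction (theta, phi).
L = L(x, y, z, \theta, \phi)
```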
  • each 3D light-emitting unit 402 emits a plurality of light rays 404. In the illustrated example, only some of the light rays 404 radiating towards the target 400 are illustrated.
  • the light rays 404 each travel in a direction that forms an acute angle with a direction that is perpendicular to a major surface of the three-dimensional display 104. As such, an appearance of three-dimensional depth may be created by the light rays 404 that are emitted by the 3D light-emitting units 402 (e.g., by the combination of the light-emitting diode arrays 614 and the microlenses 604).
  • the 3D light-emitting units 402 are self-emitting, and do not use a backlight for light emission. Accordingly, the three-dimensional display 104 is a direct-emissive display.
  • Utilizing a direct-emissive display instead of a backlit display allows for a reduction in the size of the three-dimensional display 104, and also allows for enhanced brightness and contrast of the three-dimensional display 104.
  • the 3D light-emitting units 402 also do not use a projector or waveguide for light emission.
  • a direct-emissive display may have lower power consumption than a backlit or waveguided display.
  • the three-dimensional display 104 is also an optical see-through display. Accordingly, an object 406 may be visible to the target 400 through the three-dimensional display 104.
  • the 3D light-emitting units 402 are spaced apart from one another, such that each 3D light-emitting unit 402 is separated from others of the 3D light-emitting units 402.
  • the ratio of the distance between the 3D light-emitting units 402 to the size of the 3D light-emitting units 402 is a particular ratio that permits light rays 408 from the object 406 to pass through the three-dimensional display 104.
  • the three-dimensional display 104 can display 3D images to the target 400 while also allowing the light rays 408 to pass through the three-dimensional display 104 and be visible to the target 400. Accordingly, augmented reality content displayed with the three-dimensional display 104 may be spatially aligned with the real-world environment (e.g., the object 406) visible through the three-dimensional display 104.
  • FIG. 5 is a schematic diagram of three-dimensional displays during display of a 3D image, in accordance with some embodiments.
  • in a wearable device (e.g., the eyeglasses 100), different three-dimensional displays 104 are used to display different light fields to different targets 400, e.g., to different eyes of the user.
  • the three-dimensional displays 104 are near-eye displays.
  • the light fields are generated by the three-dimensional displays 104 so that the user perceives a virtual object at a displayed location 502 in three-dimensional space.
  • the sharpness of displaying a 3D image is determined by the pixel density of the three-dimensional displays 104. If the pixel density of the three-dimensional displays 104 is insufficient to display 3D images, then the displayed location 502 of a virtual object may be different from an intended location 504 of the virtual object. When the error between a displayed location 502 and an intended location 504 of a virtual object is large, augmented reality content may not be correctly overlaid on the real-world environment visible through the three-dimensional displays 104. Specifically, the displayed augmented reality content may include high-order grating diffraction patterns that appear to the user as ghost images.
  • the pixel density of a three-dimensional display 104 is determined by the density of the 3D light-emitting units 402 and by the viewing distance between the three-dimensional display 104 and a target 400.
  • the pixel density of the three-dimensional display 104 is in the range of 1 pixel per degree (PPD) to 10 PPD.
  • a pixel density of less than 1 PPD may be insufficient to display 3D images, such that there is a large error between a displayed location 502 and an intended location 504 of a virtual object.
  • a pixel density of greater than 10 PPD may be difficult to manufacture.
  • in an embodiment where the viewing distance is about 10 mm and where the size of the 3D light-emitting units 402 is about 50 µm, the pixel density of the three-dimensional display 104 is about 3 PPD.
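  • For illustration only, the figures above are consistent with a small-angle estimate of angular pixel density; the sketch below is hypothetical (the function name and the treatment of each 3D light-emitting unit as one displayed pixel are assumptions, not part of the disclosure):

```python
import math

def pixels_per_degree(unit_size_mm: float, viewing_distance_mm: float) -> float:
    """Approximate pixels per degree (PPD) of a near-eye display, treating each
    3D light-emitting unit as one displayed pixel subtending a small angle."""
    angle_per_unit_deg = math.degrees(
        2 * math.atan(unit_size_mm / (2 * viewing_distance_mm))
    )
    return 1.0 / angle_per_unit_deg

# ~50 um units viewed from ~10 mm, as in the example above:
print(round(pixels_per_degree(0.05, 10.0), 1))  # ~3.5, i.e. about 3 PPD
```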
  • Figures 6A-6B are views of a three-dimensional display 104, in accordance with some embodiments. Specifically, Figure 6A is a top-down view of a portion of the three-dimensional display 104, and Figure 6B is a cross-sectional view of a portion of the three-dimensional display 104.
  • the three-dimensional display 104 includes a display panel 602 and a plurality of microlenses 604.
  • the display panel 602 is shown schematically in Figures 6A-6B, where some features (subsequently described for Figures 8-12) are omitted for illustration clarity.
  • the display panel 602 includes a transparent layer 612, a plurality of light-emitting diode arrays 614 in/on the transparent layer 612, and a plurality of transistor arrays 616 under the light-emitting diode arrays 614.
  • the transistor arrays 616 may include thin film transistors (TFTs) as indicated in various figures (e.g., Figure 8) of this disclosure.
  • the display panel 602 further includes spacers (e.g., spacers 618 as described with respect to Figures 15- 16) on the light-emitting diode arrays 614.
  • the transparent layer 612 at least laterally surrounds the light-emitting diode arrays 614.
  • the transparent layer 612 may include a rigid layer (such as a glass layer, a plastic layer, or the like), an ultraviolet-absorbent layer, a liquid crystal layer, or the like.
  • the material of the transparent layer 612 may be selected based on the desired application of the three-dimensional display 104.
  • a rigid layer is a clear layer that provides good transparency when the three-dimensional display 104 is a lens for clear eyeglasses.
  • An ultraviolet-absorbent layer is a photochromic layer that changes transmission based on the brightness of ambient light, which provides good sun protection when the three-dimensional display 104 is a lens for sunglasses.
  • a liquid crystal layer may be programmable such that its transmission may be changed as desired.
  • a liquid crystal layer may have a programmable refractive index that can be changed to provide a desired lens power for correcting the vision of the user.
  • a liquid crystal layer may be programmed to have high transparency in a low-light environment and may be programmed to have low transparency in a high-light environment.
  • the controller 112 (see Figure 2) may be adapted to program the liquid crystal layer.
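  • A minimal sketch of how a controller might program such a layer's transmission from an ambient-light reading (the thresholds, transmission values, and function name are assumptions for illustration, not values from the disclosure):

```python
def target_transmission(ambient_lux: float) -> float:
    """Return a transmission fraction for the liquid crystal layer:
    high transparency in low light, low transparency in bright light."""
    low_lux, high_lux = 100.0, 10_000.0   # assumed ambient-light thresholds
    t_high, t_low = 0.9, 0.2              # assumed transmission limits
    if ambient_lux <= low_lux:
        return t_high
    if ambient_lux >= high_lux:
        return t_low
    # Linear interpolation between the two regimes.
    frac = (ambient_lux - low_lux) / (high_lux - low_lux)
    return t_high - frac * (t_high - t_low)

print(target_transmission(50.0), target_transmission(20_000.0))  # 0.9 0.2
```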
  • a standard corrective lens may be used for this layer.
  • the transparent layer 612 is flat.
  • the transparent layer 612 (and thus the display panel 602) is curved. Utilizing a curved display panel allows it to provide a desired lens power for correcting the vision of the user.
  • the light-emitting diode arrays 614 are light-emitting pixel arrays, and each include any quantity of light-emitting diodes.
  • each light-emitting diode array 614 includes a grid having from 10 to 15 columns of light-emitting diodes and from 10 to 15 rows of light-emitting diodes.
  • the light-emitting diodes are diodes that, during operation, self-emit light in a desired color without a backlight. Examples of suitable light-emitting diodes include organic light-emitting diodes (OLEDs), micro light-emitting diodes (µLEDs), and the like.
  • the light-emitting diode arrays 614 are spaced apart from one another.
  • the spacing between the light-emitting diodes of adjacent light-emitting diode arrays 614 is greater than the spacing between the light-emitting diodes within a light-emitting diode array 614.
  • the microlenses 604 are placed over the display panel 602. Specifically, the microlenses 604 are over the light-emitting diode arrays 614 with a one-to-one correspondence. Each microlens 604 thus overlaps an underlying light-emitting diode array 614.
  • a microlens 604 may be narrower than, wider than, or the same width as the underlying light-emitting diode array 614.
  • the microlenses 604 may be half-sphere lenses, spherical lenses, aspherical lenses, flat lenses, holographic lenses, metalenses (e.g., nano-sized meta-surface lenses), Fresnel lenses, or the like. Additionally, the microlenses 604 may be single-layered lenses, multi-element lenses, or the like. The microlenses 604 are spaced apart from one another such that the microlenses 604 do not contact one another in at least one direction. The spacings between the microlenses 604 expose portions of the transparent layer 612 in the top-down view.
  • the transistor arrays 616 are under the light-emitting diode arrays 614. Specifically, the transistor arrays 616 are beneath the light-emitting diode arrays 614 with a one-to-one correspondence. The transistor arrays 616 are spaced apart from one another. The transistor arrays 616 include transistors that control the diodes of the respective overlying light-emitting diode arrays 614.
  • the transistor arrays 616 are formed on opaque layers (e.g., semiconductor layers, which are flat, rigid layers that are opaque to visible light) that block light from passing through the portions of the three-dimensional display 104 where the microlenses 604 are located.
  • the microlenses 604 distort (e.g., refract) light, and preventing light from passing through the three-dimensional display 104 where the microlenses 604 are located helps reduce distortion of the light passing through the three-dimensional display 104.
  • Each 3D light-emitting unit 402 includes a microlens 604, a light-emitting diode array 614, and a transistor array 616.
  • a 3D light-emitting unit 402 further includes a spacer (e.g., a spacer 618 as shown in Figures 15-16).
  • the microlens 604 refracts light rays emitted by the light-emitting diode array 614.
  • Light rays emitted from a 3D light-emitting unit 402 may be directed in a desired direction by illuminating certain diodes of the light-emitting diode array 614 such that the light emitted by the diodes is refracted in the desired direction by the microlens 604.
  • the relative position between a light-emitting diode and the center of the microlens 604 defines the exit angle of a light ray exiting from the microlens 604.
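  • As background (a common thin-lens approximation, not a relation recited in the disclosure), an emitter offset laterally by x from the microlens axis, at roughly the focal distance f, produces a near-collimated exit ray at an angle set by that offset; the outermost emitters of the light-emitting diode array therefore set the field of view of the light-emitting unit:

```latex
% Thin-lens approximation: exit angle of a ray from an emitter offset x,
% and the resulting field of view for an emitter array of width w.
\theta \approx \arctan\!\left(\frac{x}{f}\right), \qquad
\mathrm{FOV} \approx 2\arctan\!\left(\frac{w}{2f}\right)
```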
  • the display panel 602 has a plurality of emitting regions 602E and a plurality of transparent regions 602T.
  • the emitting regions 602E are opaque regions where ambient light may not pass through the display panel 602. Instead, the emitting regions 602E emit light.
  • the light-emitting diode arrays 614 and the transistor arrays 616 are formed in the emitting regions 602E.
  • the microlenses 604 completely overlap the emitting regions 602E.
  • the transparent regions 602T are regions where light may pass through the display panel 602.
  • a transparent region 602T is between adjacent emitting regions 602E.
  • the transparent layer 612 is formed in the transparent regions 602T.
  • the microlenses 604 do not overlap the transparent regions 602T.
  • although the emitting regions 602E are opaque (e.g., not transparent), the transparent regions 602T are large enough relative to the emitting regions 602E that the display panel 602 is optical see-through.
  • the separation distance D between the microlenses 604 (in at least one direction) is large relative to the size S of the microlenses 604. Specifically, the separation distance D between the microlenses 604 is larger than the size S of the microlenses 604.
  • in some embodiments, the size S of the microlenses 604 is in the range of 20 µm to 50 µm, the separation distance D between the microlenses 604 is in the range of 30 µm to 100 µm, and the pitch P of the microlenses 604 is in the range of 100 µm to 150 µm.
  • in other embodiments, the size S of the microlenses 604 is in the range of 5 µm to 2000 µm (such as 200 µm to 2000 µm), the separation distance D between the microlenses 604 is in the range of 200 µm to 2000 µm, and the pitch P of the microlenses 604 is in the range of 400 µm to 4000 µm.
  • the size S, the separation distance D, and the pitch P may also have other values, particularly as technology scales down.
  • the ratio of the microlens separation distance D to the microlens size S determines whether the three-dimensional display 104 is optical see-through. In some embodiments, the ratio of the microlens separation distance D to the microlens size S is in the range of 1 to 6, and more specifically, in the range of 2 to 5.
  • if the ratio of the microlens separation distance D to the microlens size S is less than 2, scattering or diffraction may occur and the three-dimensional display 104 may display ghost images. If the ratio of the microlens separation distance D to the microlens size S is greater than 5, the pixel density of the three-dimensional display 104 may be outside of the desired range (previously described) and insufficient to display 3D images.
  • the microlenses 604 occupy a minority (e.g., less than half) of the area of the display panel 602 in the top-down view. Therefore, the total area of the transparent regions 602T is greater than the total area of the emitting regions 602E.
  • the ratio of the total area of the transparent regions 602T to the total area of the emitting regions 602E is in the range of 1 to 6. In some embodiments, the ratio is about 2, such that the emitting regions 602E occupy about one-third of the area of the display panel 602, and the transparent regions 602T occupy about two-thirds of the area of the display panel 602.
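  • As a hypothetical check of the one-third/two-thirds split above (assuming, for illustration, a strip layout in which opaque regions of width S repeat with pitch S + D; the function name and example values are assumptions, not part of the disclosure):

```python
def emitting_area_fraction(size_s_um: float, separation_d_um: float) -> float:
    """Fraction of the panel occupied by emitting regions when opaque strips of
    width S repeat with pitch S + D (the transparent fraction is the remainder)."""
    return size_s_um / (size_s_um + separation_d_um)

# A D/S ratio of about 2 gives roughly one-third emitting, two-thirds transparent.
frac = emitting_area_fraction(33.0, 66.0)
print(round(frac, 2), round(1.0 - frac, 2))  # 0.33 0.67
```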
  • Figures 7A-7G are top-down views of three-dimensional displays 104, in accordance with various embodiments.
  • the microlenses 604 may have various shapes in the top-down views. Additionally, the microlenses 604 may have various layouts in the top-down views. It should be appreciated that the lightemitting diode arrays underlying the microlenses 604 have the same layout as the microlenses 604.
  • the microlenses 604 are round microlenses and are laid out in a grid microlens array.
  • the microlenses 604 are aligned along rows and columns, in the top-down view.
  • the spacing between the rows of the microlenses 604 may be different from the spacing between the columns of the microlenses 604.
  • the columns of the microlenses 604 are closer together than the rows of the microlenses 604, such that the microlenses 604 are in dense horizontal lines.
  • the spacing between the rows/columns of the microlenses 604 determines the field of view of the three-dimensional display 104.
  • the horizontal field of view of the three-dimensional display 104 may be larger than the vertical field of view of the three-dimensional display 104.
  • the microlenses 604 are round microlenses and are laid out in a checkerboard microlens array. In a checkerboard microlens array, the microlenses 604 are aligned along diagonal directions, in the top-down view. When the microlenses 604 are in a checkerboard layout, the horizontal field of view of the three-dimensional display 104 may be equal to the vertical field of view of the three-dimensional display 104.
  • the microlenses 604 are rectangular microlenses (e.g., square microlenses) and are laid out in microlens strips, where each of the microlens strips is spaced apart.
  • the microlenses 604 are aligned along a same direction and are densely packed along the strip.
  • the microlenses 604 within a strip may be closer to one another than previously described, and in some embodiments, the microlenses 604 within a strip are not spaced apart and instead contact one another.
  • microlenses 604 may have other polygon shapes.
  • the microlenses 604 may have non-polygon shapes.
  • the microlenses 604 may be truncated circular microlenses.
  • the microlenses 604 are rectangular microlenses and are laid out in microlens strips, where some of the microlens strips are not spaced apart and instead contact one another.
  • the microlens strips may be grouped such that a plurality of microlens strips extend along and are in contact with one another, and the groups of the microlens strips are spaced apart.
  • the spacing between groups of the microlens strips may be large.
  • the microlens strips may each have a width of about 33 µm, three microlens strips may be grouped along each row for a total row width of about 100 µm, and the separation distance between the rows may be about 250 µm.
  • the microlenses 604 are hexagonal microlenses and are laid out in microlens strips, where some of the microlens strips are not spaced apart and instead contact one another.
  • the hexagonal microlenses may be grouped in microlens strips with a honeycomb layout.
  • the honeycomb structures are spaced apart.
  • the honeycomb structures are non-truncated.
  • the edges of the hexagonal microlenses define jagged edges of the microlens strips in the top-down view.
  • the microlenses 604 are hexagonal microlenses and are laid out in microlens strips with a honeycomb layout, where the honeycomb structures are truncated. As such, the edges of the hexagonal microlenses define straight edges of the microlens strips in the top- down view.
  • the microlenses 604 may be disposed closer to one another along a first direction (e.g., the horizontal direction in Figures 7A and 7C-7F) than along a second direction (e.g., the vertical direction in Figures 7A and 7C-7F).
  • the horizontal direction is parallel to a line connecting the eyes of the user (e.g., a line between the three-dimensional displays 104 of the eyeglasses 100, see Figure 1).
  • the microlens separation distance along the first direction is smaller than the desired separation distance (previously described for Figures 6A-6B).
  • the microlenses 604 may not be spaced apart in the first direction.
  • the microlens separation distance along at least one direction, such as the second direction, is within the desired separation distance such that the three-dimensional display 104 is optical see-through.
  • the human eye has a greater horizontal field of view than vertical field of view.
  • the microlenses 604 are spaced apart by the desired separation distance along the vertical direction, such that the gaps between the microlenses 604 are horizontally oriented, e.g., oriented along a direction with a greater field of view.
  • the ratio of the microlens separation distance to the microlens size may vary across a three-dimensional display 104.
  • the ratio may have a first value in a first region of a three-dimensional display 104 and may have a second value in a second region of the three-dimensional display 104, where the first value is different from the second value, and the first and second values are both in the previously described range.
  • the ratio in a central portion of a three-dimensional display 104 is less than the ratio in the edge portions of the three-dimensional display 104, where the central portion is between the edge portions.
  • the edge portions may include the upper and lower portions of the three-dimensional display 104.
  • the separation distance between the microlenses 604 may be smaller in the central portion of a three-dimensional display 104 than in the edge portions of the three-dimensional display 104.
  • the microlens strips in the central portion of the three-dimensional display 104 may be closer together than the microlens strips in the edge portions of the three-dimensional display 104.
  • Other acceptable microlens layouts may be utilized.
  • the microlenses 604 are laid out in a randomized microlens array. Specifically, the spacings between adjacent microlenses 604 may be randomized.
  • the ratio of the microlens separation distance to the microlens size of the adjacent microlenses 604 may still be in the previously described range. In other words, the spacings between adjacent microlenses 604 vary, but the spacings are in the previously described range. Randomizing the spacing between adjacent microlenses 604 may help decrease the interference of light passing through the spaces between the microlenses 604.
  • while Figure 7G shows a randomized microlens array with round microlenses, the microlenses 604 of the randomized microlens array may be rectangular microlenses, hexagonal microlenses, or the like.
  • Figures 8-11 are cross-sectional views of intermediate stages in the manufacturing of a three-dimensional display 104, in accordance with some embodiments.
  • the three-dimensional display 104 is manufactured by separately manufacturing the light-emitting diode arrays 614 and then transferring the light-emitting diode arrays 614 to the transparent layer 612.
  • a thin-film transistor layer 802 is formed.
  • a light-emitting diode layer 804 is formed on the thin-film transistor layer 802.
  • the thin-film transistor layer 802 includes transistors, which may be arranged in transistor arrays 616, and is formed on an opaque layer, such as a semiconductor substrate.
  • the light-emitting diode layer 804 includes light-emitting diodes, which may be arranged in light-emitting diode arrays 614.
  • the transistors of the thin-film transistor layer 802 and the diodes of the light-emitting diode layer 804 may be formed by an acceptable complementary metal-oxide semiconductor (CMOS) process.
  • deposition, lithography, and etching processes may be performed to form various features (e.g., buffer layers, channel layers, pixel-defining layers, cathodes, etc.) for the transistors of the thin-film transistor layer 802 and the diodes of the light-emitting diode layer 804.
  • the microlenses 604 are then placed on the light-emitting diode layer 804.
  • the microlenses 604 may be placed on a microlens sheet 806.
  • the microlens sheet 806 may be formed of glass or the like.
  • the microlens sheet 806 may then be placed on the light-emitting diode layer 804 and may be aligned with the light-emitting diode layer 804 such that the microlenses 604 are aligned with the respective underlying light-emitting diode arrays of the light-emitting diode layer 804.
  • the microlens sheet 806, the light-emitting diode layer 804, and the thin-film transistor layer 802 are diced to form display components 808.
  • the display components 808 include the transistor arrays 616 (portions of the thin-film transistor layer 802), the light-emitting diode arrays 614 (portions of the light-emitting diode layer 804), and the microlenses 604.
  • each display component 808 is an individual unit comprising a single transistor array 616, a single light-emitting diode array 614, and a single microlens 604.
  • each display component 808 is a strip comprising a plurality of transistor arrays 616, a plurality of light-emitting diode arrays 614, and a plurality of microlenses 604. A dicing process is performed along scribe line regions, e.g., between the display components 808.
  • the dicing process may include performing a sawing process, a laser cutting process, or the like.
  • the dicing process separates the adjacent display components 808.
  • the microlenses 604 may be trimmed by the dicing of the display components 808. After the dicing process, the respective light-emitting diode arrays 614, transistor arrays 616, and microlenses 604 are laterally coterminous.
  • the microlens sheet 806/microlenses 604 are placed directly on the light-emitting diode arrays 614.
  • in some embodiments, spacers (e.g., spacers 618 as described with respect to Figures 15-16) are disposed between the microlenses 604 and the light-emitting diode arrays 614.
  • a transparent layer 612 is formed as shown.
  • Recesses 810 are patterned in the transparent layer 612.
  • the recesses 810 may be patterned utilizing acceptable photolithography and etching techniques.
  • in embodiments where the microlenses 604 are laid out in a microlens array (e.g., a grid microlens array or checkerboard microlens array, see Figures 7A-7B), the recesses 810 are pits.
  • in embodiments where the microlenses 604 are laid out in microlens strips (see Figures 7C-7F), the recesses 810 are grooves.
  • the recesses 810 are patterned where the emitting regions 602E of the display panel 602 (see Figure 11) will be located.
  • the display components 808 are transferred to the transparent layer 612, thereby forming the three-dimensional display 104. After transfer, the display components 808 are coupled to the transparent layer 612.
  • the display components 808 may be transferred to the transparent layer 612 by placing the display components 808 in the recesses 810 (see Figure 10).
  • the display components 808 may be placed in the recesses 810 by an acceptable pick-and-place process.
  • portions of the display components 808 are disposed in the transparent layer 612.
  • the transistor arrays 616 and the light-emitting diode arrays 614 are disposed in the transparent layer 612.
  • Other portions of the display components 808 protrude from the transparent layer 612.
  • the microlenses 604 protrude from the transparent layer 612. Electrical connections to the devices of the display components 808 (e.g., the transistors of the transistor array 616) may be formed on the transparent layer 612.
  • the microlenses 604 are placed on the light-emitting diode arrays 614 before the display components 808 are placed in the recesses 810, e.g., before the display components 808 are diced.
  • the microlenses 604 are placed on the light-emitting diode arrays 614 after the display components 808 are placed in the recesses 810, e.g., after the display components 808 are diced.
  • a microlens sheet 806 (see Figure 8), including the microlenses 604, may be placed on the transparent layer 612 and the display components 808.
  • one or more surfaces of the microlenses 604 are coated.
  • the sidewalls 812 of the microlenses 604 may be coated with a reflective coating, such as a metal coating, which may help block light from obliquely passing through a transparent region 602T and a microlens 604. Such light would otherwise be deflected by the microlens 604.
  • the rounded top surfaces 814 of the microlenses 604 may be coated with an anti-reflective coating, such as a dielectric multi-layer coating, which may help increase light transmission through the microlenses 604.
  • Figure 12 is a cross-sectional view of an intermediate stage in the manufacturing of a three-dimensional display 104, in accordance with some embodiments.
  • This intermediate stage is similar to the intermediate stage of Figure 11, except the display components 808 are placed on a top surface of the transparent layer 612 instead of being placed in recesses in the transparent layer 612.
  • the recessing of the transparent layer 612 may be omitted when the transparent layer 612 is a layer that cannot be patterned without causing damage, such as a liquid crystal layer.
  • Figure 13 is a cross-sectional view of a lens at an intermediate stage in the manufacturing of a three-dimensional display 104, in accordance with some embodiments.
  • This intermediate stage is similar to the intermediate stage of Figure 11, except the transparent layer 612 includes a liquid crystal layer 622 and a support layer 624.
  • the transparent layer 612 may further include a first protective layer 626 and/or a second protective layer 628.
  • the liquid crystal layer 622 is formed of a liquid crystal material having an ordinary refractive index and an extraordinary refractive index.
  • the refractive index of the liquid crystal material may be set to any value between the ordinary and extraordinary indexes during operation.
  • Acceptable liquid crystal materials include nematic phase liquid crystal materials such as cyanobiphenyl or the like.
  • the liquid crystal layer 622 is formed of a liquid crystal material having, at a wavelength of about 632 nm, an ordinary refractive index of about 1.52 and an extraordinary refractive index of about 1.73.
  • the liquid crystal layer 622 is formed of a liquid crystal material having a dispersion with an Abbe number of about 57.4 as measured by an Abbe refractometer.
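  • For reference, the Abbe number is conventionally defined from the refractive indices at the d, F, and C spectral lines (this definition is general optics background, not recited in the disclosure):

```latex
% Abbe number (reciprocal dispersion) from indices at 587.6 nm (d),
% 486.1 nm (F), and 656.3 nm (C).
V_d = \frac{n_d - 1}{n_F - n_C}
```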
  • the liquid crystal layer 622 is thicker than the first protective layer 626 and the second protective layer 628. In some embodiments, the liquid crystal layer 622 has a thickness of about 1.5 mm.
  • the support layer 624 may be formed of a glass, plastic, or the like.
  • the liquid crystal material of the liquid crystal layer 622 maybe brittle, and the support layer 624 acts as a substrate to protect the liquid crystal layer 622.
  • the support layer 624 is formed of a plastic, such as a polycarbonate or poly(methyl methacrylate) (PMMA).
  • the support layer 624 is thicker than the first protective layer 626 and the second protective layer 628. In some embodiments, the support layer 624 has a thickness of about 1.5 mm.
  • the first protective layer 626 (if present) is disposed on the liquid crystal layer 622. Accordingly, the liquid crystal layer 622 is disposed between the first protective layer 626 and the support layer 624.
  • the first protective layer 626 may be a conductive coating formed of a conductive material, such as indium-tin-oxide.
  • the first protective layer 626 is thinner than the liquid crystal layer 622. In some embodiments, the first protective layer 626 has a thickness of about 0.1 mm.
  • the second protective layer 628 (if present) is disposed on the support layer 624. Accordingly, the support layer 624 is disposed between the second protective layer 628 and the liquid crystal layer 622.
  • the second protective layer 628 may be a conductive coating formed of a conductive material, such as indium-tin-oxide.
  • the second protective layer 628 is thinner than the support layer 624. In some embodiments, the second protective layer 628 has a thickness of about 0.1 mm.
  • the material of the first protective layer 626 may be different from the material of the second protective layer 628.
  • the first protective layer 626 may have a different refractive index than the second protective layer 628. In some embodiments, at a wavelength of about 625 nm, the first protective layer 626 has a refractive index of about 1.4 and the second protective layer 628 has a refractive index of about 1.2.
  • the support layer 624 is disposed on the liquid crystal layer 622.
  • the display components 808 are disposed in recesses that may be patterned in the support layer 624, in a similar manner as described for Figure 10. Accordingly, the recessing of the transparent layer 612 may be performed even when the transparent layer 612 includes a liquid crystal layer.
  • FIG 14 is a flow chart of a method 1400 of manufacturing a three-dimensional display, in accordance with some embodiments.
  • the display may be a direct-emissive display for a wearable device.
  • the method 1400 may be implemented using appropriate steps of the processes described for Figures 8-12.
  • In step 1402, a thin-film transistor layer and a light-emitting diode layer are diced to form a first display component and a second display component.
  • the first display component includes a first transistor array and a first light-emitting diode array.
  • the second display component includes a second transistor array and a second light-emitting diode array.
  • In step 1404, the first display component and the second display component are transferred to a transparent layer.
  • transferring the first display component and the second display component to the transparent layer includes patterning a first recess and a second recess in the transparent layer and placing the first display component and the second display component in, respectively, the first recess and the second recess.
  • transferring the first display component and the second display component to the transparent layer includes placing the first display component and the second display component on a top surface of the transparent layer.
  • in step 1406, a first microlens and a second microlens are placed on, respectively, the first light-emitting diode array and the second light-emitting diode array.
  • the first microlens and the second microlens may be placed before transferring the first display component and the second display component to the transparent layer.
  • placing the first microlens and the second microlens includes placing a microlens sheet on the light-emitting diode layer, the microlens sheet diced with the light-emitting diode layer.
  • the first microlens and the second microlens may be placed after transferring the first display component and the second display component to the transparent layer.
  • placing the first microlens and the second microlens includes placing a microlens sheet on the transparent layer, the first microlens and the second microlens being aligned to, respectively, the first light-emitting diode array and the second light-emitting diode array.
  • the eyeglasses 100 may contain features that allow their operation to be adapted during operation.
  • the plane in which a 3D image is displayed by a three- dimensional display 104 may be changed during operation.
  • the display plane may be changed during operation to reduce simulation sickness of the user. Changing the display plane changes the image depth of the three-dimensional display 104.
  • the eyeglasses 100 may be used to correct the vision of the user.
  • the three-dimensional displays 104 are lenses for the eyeglasses 100, and the three-dimensional displays 104 may have one or more features that allow them to provide a desired lens power for correcting the vision of the user.
  • FIG. 15 is a schematic diagram of a 3D light-emitting unit 402 during operation, in accordance with some embodiments.
  • the 3D light-emitting unit 402 is one of a plurality of light-emitting units of a three-dimensional display.
  • the three-dimensional display (including the 3D light-emitting unit 402) is used to display 3D images to a target 400.
  • the 3D light-emitting unit 402 of the three-dimensional display further includes a spacer 618.
  • the spacer 618 is disposed between the microlens 604 and the light-emitting diode array 614 of the 3D light-emitting unit 402.
  • the spacer 618 is formed of a liquid crystal material having an ordinary refractive index and an extraordinary refractive index.
  • the refractive index of the liquid crystal material may be set to any value between the ordinary and extraordinary indexes during operation.
  • Acceptable liquid crystal materials include nematic phase liquid crystal materials such as cyanobiphenyl or the like.
  • the spacer 618 is formed of a liquid crystal material having, at a wavelength of about 632 nm, an ordinary refractive index of about 1.52 and an extraordinary refractive index of about 1.73.
  • the spacer 618 is a liquid crystal spacer.
  • the spacer 618 may include conductive layers (not separately illustrated) for applying a voltage to change the refractive index of the spacer 618.
  • the controller 112 (see Figure 2) may be electrically coupled to the layers of the spacer 618, and may be adapted to change the refractive index of the spacer 618 during operation.
  • Light rays emitted by the light-emitting diode arrays 614 are deflected by the spacer 618.
  • the amount of deflection by the spacer 618 affects the length of an optical path from the light-emitting diode array 614 to the target 400.
  • a relatively larger amount of deflection by the spacer 618 results in a relatively longer optical path length.
  • a relatively smaller amount of deflection by the spacer 618 results in a relatively shorter optical path length.
  • a first optical path 1502 and a second optical path 1504 are illustrated as an example. Less deflection by the spacer 618 occurs along the first optical path 1502 than along the second optical path 1504. Accordingly, the second optical path 1504 is longer than the first optical path 1502.
  • the effective focal length f of the 3D light-emitting unit 402 is given by Equation 1, where d1 is the length of the portion of the optical path from the light-emitting diode array 614 to the microlens 604, and d2 is the length of the portion of the optical path from the microlens 604 to the target 400. In some embodiments, d1 is less than f, such that d2 is a negative value.
  • the amount of deflection by the spacer 618 depends on the refractive index of the spacer 618.
  • when the refractive index of the spacer 618 is changed during operation, the length of the optical path also changes.
  • d2 in Equation 1 is also changed.
  • the effective focal length f of the 3D light-emitting unit 402 may be changed by changing the refractive index of the spacer 618.
  • the plane in which a 3D image is displayed is determined by the effective focal length f of the 3D light-emitting unit 402.
  • the display plane of the 3D light-emitting unit 402 may thus be changed by changing the refractive index of the spacer 618.
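Equation 1 is set out earlier in the application and is not reproduced in this excerpt. As a minimal sketch, assuming Equation 1 is the Gaussian thin-lens relation between d1, d2, and the effective focal length f, the behavior described in the preceding items follows directly:

$$\frac{1}{f} = \frac{1}{d_1} + \frac{1}{d_2} \quad\Longrightarrow\quad \frac{1}{d_2} = \frac{1}{f} - \frac{1}{d_1}$$

Under this assumption, d1 < f makes the right-hand side negative, so d2 is negative (a virtual image), consistent with the note above; and because programming the refractive index of the spacer 618 changes the optical path length and hence d2, it changes the effective focal length f and, with it, the display plane.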
  • Figure 16 is a top-down view of a portion of the eyeglasses 100, in accordance with some embodiments. Specifically, Figure 16 is a cutaway top-down view showing the three-dimensional displays 104 and a portion of the frame 102 (e.g., bridge) between the three-dimensional displays 104. In this embodiment, the transparent layers 612 of the display panels 602 are curved and the 3D light-emitting units 402 include spacers 618.
  • the transparent layers 612 of the display panels 602 are curved so as to provide a desired lens power for correcting the vision of the user.
  • the 3D light-emitting units 402 of a three-dimensional display 104 are oriented so as to display a 3D image in a display plane that extends through a location 1602 behind the three-dimensional display 104. Center lines extending through the centers of the 3D light-emitting units 402 (e.g., the microlenses 604) intersect at the same point (e.g., the location 1602).
  • the location 1602 corresponds to the center of the pupil of an eye of the user.
  • Each 3D light-emitting unit 402 (e.g., each microlens 604) of the three-dimensional display 104 is disposed along a circle having a radius R measured from the location 1602.
  • the center lines extending through the centers of the 3D light-emitting units 402 intersect at the center of the circle.
  • the distance R is in the range of 15 mm to 28 mm, such as in the range of 15 mm to 22 mm or the range of 22 mm to 28 mm.
  • each 3D light-emitting unit 402 of the three-dimensional display 104 may be disposed the same distance from the location 1602.
  • the three-dimensional displays 104 have a large horizontal field of view.
  • the horizontal field of view H is up to 120 degrees, such as about 90 degrees. Utilizing a large horizontal field of view increases the immersiveness of augmented reality when the eyeglasses 100 are used for augmented reality.
  • the quantity of 3D light-emitting units 402 along the horizontal direction of each three-dimensional display 104 is determined by the horizontal field of view H.
  • Each 3D light-emitting unit 402 provides a small field of view, and multiple 3D light-emitting units 402 are used so that their light fields stitch together to provide a desired horizontal field of view H.
  • the horizontal field of view H is equal to the sum of the horizontal fields of view of the 3D light-emitting units 402.
  • the vertical field of view of the three-dimensional displays 104 may be determined by the quantity of 3D light-emitting units 402 along the vertical direction of the three-dimensional displays 104.
  • the vertical field of view of the three-dimensional displays 104 may be smaller than the horizontal field of view. In some embodiments, the vertical field of view is about 60 degrees.
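The circular arrangement and field-of-view stitching described above can be illustrated with a short sketch. This is not from the application; the function, its name, and the chosen example values (R = 22 mm, H = 100 degrees, 20 degrees per unit, drawn from the ranges quoted in this description) are illustrative assumptions only.

```python
import math

def microlens_layout(r_mm: float, fov_h_deg: float, fov_per_unit_deg: float):
    """Place microlens centers on an arc of radius r_mm about the pupil location 1602.

    Each 3D light-emitting unit covers fov_per_unit_deg, and enough units are stitched
    side by side so that their fields of view sum to the horizontal field of view
    fov_h_deg. Returns (x_mm, y_mm, pointing_deg) per microlens, with the pupil at the
    origin and angles measured from the central viewing axis.
    """
    n_units = math.ceil(fov_h_deg / fov_per_unit_deg)  # e.g. 100 deg / 20 deg = 5 units
    layout = []
    for i in range(n_units):
        # Center angle of the i-th unit, chosen so the per-unit fields of view tile H.
        angle_deg = -fov_h_deg / 2 + (i + 0.5) * fov_per_unit_deg
        a = math.radians(angle_deg)
        # The microlens center sits on the circle of radius R; its center line points
        # back toward the origin, so all center lines intersect at the same point.
        layout.append((r_mm * math.sin(a), r_mm * math.cos(a), angle_deg))
    return layout

# Illustrative values drawn from the ranges above: R = 22 mm, H = 100 deg, 20 deg per unit.
for x, y, pointing in microlens_layout(22.0, 100.0, 20.0):
    print(f"x = {x:+7.2f} mm, y = {y:6.2f} mm, pointing {pointing:+6.1f} deg")
```

Placing each microlens on the circle and pointing its center line back toward the origin reproduces the property described above that all center lines intersect at the location 1602.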
  • the three-dimensional display 104 and the frame 102 may have various dimensions to cause a desired amount of vision correction.
  • the length L between the tips of the microlenses 604 at distal ends of a three-dimensional display 104 is about 26 mm.
  • the pupillary distance PD between the three-dimensional displays 104 is about 61 mm.
  • Other acceptable lengths L and pupillary distances PD may be utilized.
  • the frame 102 may have an adjustable length in order to allow for adjustment of the pupillary distance PD.
  • the frame 102 may have an adjustable bridge.
  • each display panel 602 may be curved in multiple directions.
  • each display panel 602 may be a domed display panel.
  • a domed display panel may be curved in both a horizontal direction (e.g., lateral direction) and a vertical direction (e.g., longitudinal direction) of a three-dimensional display 104.
  • FIG 17 is a top-down view of a three-dimensional display 104, in accordance with some embodiments.
  • a layout for the microlenses 604 in the three-dimensional displays 104 of Figure 16 is shown.
  • each microlens 604 corresponds to a 3D light-emitting unit.
  • each 3D light-emitting unit has a field of view of about 20 degrees. Therefore, five of the microlenses 604 are arranged along the horizontal direction of the three-dimensional display 104 to obtain a total horizontal field of view of about 100 degrees.
  • three of the microlenses 604 are arranged along the vertical direction of the three-dimensional display 104 to obtain a total vertical field of view of about 60 degrees.
  • microlenses are omitted from the corners of the three-dimensional display 104.
  • each three-dimensional display 104 of this embodiment includes eleven 3D light-emitting units (a five-by-three grid of fifteen positions, minus the four corners).
  • each 3D light-emitting unit has a field of view in the range of 5 degrees to 20 degrees and the microlenses 604 have a size of about 2000 μm. In some embodiments, from about five to about twenty of the microlenses 604 are arranged along the horizontal direction of the three-dimensional display 104.
  • Figures 18-22 are cross-sectional views of intermediate stages in the manufacturing of a three-dimensional display, in accordance with some embodiments. This embodiment is similar to that of Figures 8-11, except the display components 808 further include spacers 618 (see Figure 19).
  • a thin-film transistor layer 802 is formed.
  • a light-emitting diode layer 804 is formed on the thin-film transistor layer 802.
  • the thin-film transistor layer 802 and the light-emitting diode layer 804 may be formed in a similar manner as previously described for Figure 8.
  • a spacer layer 816 is formed on the light-emitting diode layer 804.
  • the spacer layer 816 is formed of a liquid crystal material, which may be formed by an acceptable deposition process.
  • the microlenses 604 are then placed on the spacer layer 816.
  • the microlenses 604 may be placed on a microlens sheet 806.
  • the microlens sheet 806 may then be placed on the light-emitting diode layer 804 and may be aligned with the light-emitting diode layer 804 such that the microlenses 604 are aligned with the respective underlying light-emitting diode arrays of the light-emitting diode layer 804.
  • the spacer layer 816, the microlens sheet 806, the light-emitting diode layer 804, and the thin-film transistor layer 802 are diced to form display components 808.
  • the display components 808 include the spacers 618 (portions of the spacer layer 816), the transistor arrays 616 (portions of the thin- film transistor layer 802), the light-emitting diode arrays 614 (portions of the light-emitting diode layer 804), and the microlenses 604.
  • the spacers 618 are disposed between the microlenses 604 and the light-emitting diode arrays 614.
  • the dicing process may be performed in a similar manner as previously described for Figure 9.
  • a transparent layer 612 is formed.
  • the transparent layer 612 is curved.
  • the transparent layer 612 is formed having an outer radius R1 of the outer surface of the curve and an inner radius R2 of the inner surface of the curve so that the transparent layer 612 has a desired focal length (and thus lens power).
  • the outer radius R1 is greater than the inner radius R2.
  • the outer radius R1 and the inner radius R2 may be predetermined, such as with an eyeglass prescription for the user.
  • the transparent layer 612 is also formed with a thickness T, which may (or may not) also be predetermined.
  • the outer radius R1 is about 19 mm
  • the inner radius R2 is about 25 mm
  • the thickness T is about 3.2 mm, which results in a focal length of about 3969 mm.
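The application does not state the refractive index or the sign convention behind the quoted focal length, so the following is only a generic thick-lens (lensmaker's equation) sketch showing how R1, R2, and T map to a focal length under an assumed index. It is not claimed to reproduce the figure quoted above; the index value of 1.49 and the sign convention are assumptions.

```python
def thick_lens_focal_length_mm(r1_mm: float, r2_mm: float, t_mm: float, n: float) -> float:
    """Focal length of a thick lens in air from the lensmaker's equation.

    r1_mm, r2_mm: signed radii of curvature of the outer and inner surfaces
    (positive when the center of curvature lies on the transmission side),
    t_mm: center thickness, n: refractive index of the lens material.
    """
    inv_f = (n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm
                         + (n - 1.0) * t_mm / (n * r1_mm * r2_mm))
    return 1.0 / inv_f

# Hypothetical example: a 3.2 mm thick meniscus with 19 mm and 25 mm radii and an
# assumed index of 1.49 (typical of PMMA); the index is not given in the text.
f_mm = thick_lens_focal_length_mm(19.0, 25.0, 3.2, 1.49)
print(f"focal length ≈ {f_mm:.0f} mm, power ≈ {1000.0 / f_mm:.2f} diopters")
```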
  • recesses 810 are patterned in the transparent layer 612.
  • the recesses 810 may be patterned in a similar manner as previously described for Figure 10.
  • the recesses 810 are pits.
  • the recesses 810 are grooves.
  • the transparent layer 612 is formed with the recesses 810.
  • the transparent layer 612 may be formed by molding, such that it is molded to have a desired outer radius R1, a desired inner radius R2, a desired thickness T, and the recesses 810.
  • the display components 808 are transferred to the transparent layer 612, thereby forming the three-dimensional display 104.
  • the display components 808 may be transferred to the transparent layer 612 in a similar manner as previously described for Figure 11.
  • Each display component 808 (e.g., each transistor array 616 and each light-emitting diode array 614) is flat, even when the transparent layer 612 is curved.
  • the display components 808 may be placed on a curved surface of the transparent layer 612 instead of being placed in recesses in the transparent layer 612, in a similar manner as described for Figure 12.
  • the eyeglasses 100 may contain additional features that allow their operation to be adapted during operation. Specifically, when the eyeglasses 100 are used to correct the vision of the user, the corrective lens power may be adapted during operation.
  • the transparent layers 612 of the three-dimensional displays 104 have a programmable refractive index, which may allow the transparent layers 612 to provide a desired lens power.
  • FIG 23 is a cross-sectional view of a transparent layer 612 for a three-dimensional display, in accordance with some embodiments.
  • the transparent layer 612 is a curved transparent layer that further includes a liquid crystal layer 622 and a support layer 624, each of which are curved.
  • the liquid crystal layer 622 is disposed on an outer surface of the support layer 624.
  • the transparent layer 612 is part of a three-dimensional display 104 that acts as a lens for the eyeglasses 100.
  • the focal length of the transparent layer 612 determines its lens power for correcting the vision of the eyeglasses user.
  • the focal length (and thus lens power) of the transparent layer 612 is determined by the refractive index of the transparent layer 612.
  • the refractive index of the transparent layer 612 is determined in part by the refractive indexes of the liquid crystal layer 622 and the support layer 624. Because the liquid crystal layer 622 has a programmable refractive index, the focal length of the transparent layer 612 may be changed during operation.
  • the liquid crystal layer 622 may include conductive layers (not separately illustrated) for applying a voltage to change the refractive index of the liquid crystal layer 622.
  • the controller 112 (see Figure 2) may be electrically coupled to the liquid crystal layer 622 and may be adapted to change the refractive index of the liquid crystal layer 622 during operation.
  • the focal length (and thus lens power) of a three-dimensional display 104 may be set during operation.
  • the eyeglasses 100 may thus be programmed to provide a desired lens power to correct the vision of a user after manufacturing of the eyeglasses 100.
  • the focal length (and thus lens power) of the transparent layer 612 is also determined in part by the curvature of the surfaces of the transparent layer 612.
  • the outer surface of the support layer 624 and the outer surface of the liquid crystal layer 622 are formed with radii of curvature that allow the transparent layer 612 to have a desired focal length (and thus provide a desired lens power).
  • the radii of curvature of the outer surfaces of the liquid crystal layer 622 and the support layer 624 may be selected such that the lens power may be varied over a broad range when the refractive index of the liquid crystal layer 622 is varied.
  • the radius of curvature of the transparent layer 612 is about 15 mm
  • the liquid crystal layer 622 is formed of a liquid crystal material having an ordinary refractive index of about 1.52 and an extraordinary refractive index of about 1.73
  • the lens power may be varied from less than 100 degrees (when the liquid crystal material is programmed to have its ordinary refractive index) to greater than 300 degrees (when the liquid crystal material is programmed to have its extraordinary refractive index).
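One way to see why programming the liquid crystal index shifts the lens power (an illustrative approximation, not a relation stated in the application) is to consider the refracting power contributed by a single curved interface between air and a medium of refractive index n with radius of curvature R:

$$P_{\text{surface}} = \frac{n-1}{R}, \qquad \Delta P = \frac{n_e - n_o}{R}$$

Switching the liquid crystal between its ordinary index n_o and its extraordinary index n_e therefore shifts that interface's power contribution by (n_e − n_o)/R. The net power of the full stack also depends on the support layer and the other interfaces, so this is only a qualitative sketch of the tuning mechanism, not the application's design equation.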
  • Figure 24 is a view of a three-dimensional display 104, in accordance with some embodiments. This embodiment is similar to the embodiment of Figure 22, except the transparent layer 612 includes a liquid crystal layer 622 and a support layer 624, in a similar manner as the embodiment of Figure 23. In this embodiment, the liquid crystal layer 622 is disposed on an outer surface of the support layer 624.
  • FIG. 25 is a flow diagram of a method for adapting operation of a pair of eyeglasses, in accordance with some embodiments.
  • eyeglasses are obtained.
  • the eyeglasses comprise a three-dimensional display.
  • the three-dimensional display is programmed to have a desired lens power.
  • the three-dimensional display includes a transparent layer.
  • the transparent layer includes a liquid crystal layer, and programming the lens power of the three-dimensional display includes programming the refractive index of the liquid crystal layer.
  • the three-dimensional display is programmed to have a desired image depth (e.g., display plane) for 3D images.
  • the three-dimensional display includes light-emitting units.
  • the light-emitting units include liquid crystal spacers, and programming the image depth of the three-dimensional display includes programming the refractive index of the liquid crystal spacers.
  • the desired lens power and/or image depth of the three- dimensional display may be values stored in a memory of the controller 112 (see Figure 2), such as values received as input from the user.
  • the liquid crystal spacers of a three-dimensional display are separately controlled to obtain a desired image depth.
  • a first liquid crystal spacer of a three-dimensional display may be programmed to have a first refractive index
  • the second liquid crystal spacer of the three-dimensional display may be programmed to have a second refractive index that is different than the first refractive index.
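A short sketch of per-spacer programming as described above. The class, method names, and clamping behavior are hypothetical and only illustrate the idea that each spacer 618 can be set independently to any index between the ordinary and extraordinary values; the 1.52/1.73 values are the example indexes quoted earlier in this description.

```python
from dataclasses import dataclass

@dataclass
class LiquidCrystalSpacer:
    """Illustrative model of one spacer 618; names and clamping behavior are hypothetical."""
    n_ordinary: float = 1.52       # example ordinary index quoted in the description
    n_extraordinary: float = 1.73  # example extraordinary index quoted in the description
    refractive_index: float = 1.52

    def program(self, n: float) -> None:
        # The controller may set any value between the ordinary and extraordinary indexes.
        self.refractive_index = min(max(n, self.n_ordinary), self.n_extraordinary)

# Two spacers of the same display programmed to different indexes, so different parts
# of the display can present different image depths.
spacer_a, spacer_b = LiquidCrystalSpacer(), LiquidCrystalSpacer()
spacer_a.program(1.55)
spacer_b.program(1.70)
print(spacer_a.refractive_index, spacer_b.refractive_index)
```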
  • FIG. 26 illustrates a block diagram of an embodiment processing system 2600 for performing methods described herein, which may be installed in a host device.
  • the processing system 2600 may be used to implement the controller 112 (see Figure 2).
  • the processing system 2600 includes a processor 2604, a memory 2606, and interfaces 2610-2614, which may (or may not) be arranged as shown in Figure 26.
  • the processor 2604 may be any component or collection of components adapted to perform computations and/or other processing-related tasks
  • the memory 2606 may be any component or collection of components adapted to store programming and/or instructions for execution by the processor 2604.
  • the memory 2606 includes a non-transitory computer readable medium.
  • the interfaces 2610, 2612, 2614 may be any component or collection of components that allow the processing system 2600 to communicate with other devices/components and/or a user.
  • one or more of the interfaces 2610, 2612, 2614 may be adapted to communicate data, control, or management messages from the processor 2604 to applications installed on the host device and/or a remote device.
  • one or more of the interfaces 2610, 2612, 2614 may be adapted to allow a user or user device (e.g., personal computer (PC), etc.) to interact/communicate with the processing system 2600.
  • the processing system 2600 may include additional components not depicted in Figure 26, such as long-term storage (e.g., non-volatile memory, etc.).
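A minimal, purely illustrative sketch of the processing system 2600 as described (a processor executing instructions held in memory, plus interfaces for communicating with the host device, other devices, and a user). The class and method names are hypothetical and not from the application.

```python
from typing import Callable, Dict, List

class ProcessingSystem2600:
    """Hypothetical stand-in for the processing system 2600: a processor executing
    instructions held in memory, plus interfaces to the host device, other devices,
    and a user. Structure and names are illustrative only."""

    def __init__(self) -> None:
        self.memory: Dict[str, float] = {}             # e.g. stored lens power / image depth values
        self.interfaces: List[Callable[[str], None]] = []

    def add_interface(self, send: Callable[[str], None]) -> None:
        self.interfaces.append(send)

    def store(self, key: str, value: float) -> None:
        self.memory[key] = value

    def notify(self, message: str) -> None:
        # Communicate data, control, or management messages over every registered interface.
        for send in self.interfaces:
            send(message)

controller = ProcessingSystem2600()
controller.store("lens_power_degrees", 150.0)  # hypothetical user-supplied setting
controller.add_interface(print)
controller.notify("lens power updated")
```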
  • Embodiments may achieve advantages. Placing the microlenses 604 on the display panel 602 allows the resulting 3D light-emitting units 402 to emit a light field that is suitable for displaying 3D images. Utilizing the particular ratio (previously described) of the microlens separation distance D to the microlens size S allows the three-dimensional display 104 to be optical see-through. Because the three-dimensional displays 104 of the eyeglasses 100 are direct-emissive displays, they may be more compact than other types of displays, such as backlit displays and waveguide combiner displays. Avoiding the use of backlights in the eyeglasses 100 may reduce the power consumption of the eyeglasses 100.
  • the three-dimensional displays 104, being direct-emissive, have high optical emission efficiency that may reduce the power consumption to as low as 10 mW. Additionally, direct-emissive displays have better optical delivery efficiency than waveguide combiner displays. Avoiding the use of waveguide combiner displays in the eyeglasses 100 may reduce exposure of the user to excessively bright lights.
  • It should be appreciated that one or more steps of the embodiment methods provided herein may be performed by corresponding units or modules. For example, a signal may be transmitted by a transmitting unit or a transmitting module. A signal may be received by a receiving unit or a receiving module. A signal may be processed by a processing unit or a processing module.
  • Other steps may be performed by an analyzing unit/module, a rendering unit/module, an input unit/module, an output unit/module, a displaying unit/module, a controlling unit/module, a sensing unit/module, an obtaining unit/module, a programming unit/module, and/or a networking unit/module.
  • the respective units/modules may be hardware, software, or a combination thereof.
  • one or more of the units/modules may be an integrated circuit, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Liquid Crystal (AREA)
  • Electroluminescent Light Sources (AREA)

Abstract

According to one embodiment, a device comprises: a first light-emitting unit comprising: a first transistor array on a first substrate, the first substrate being flat and opaque to visible light; a first light-emitting diode array on the first transistor array; a first spacer on the first light-emitting diode array, the first spacer comprising a liquid crystal material having an ordinary refractive index and an extraordinary refractive index; and a first microlens on the first spacer; and a second light-emitting unit spaced apart from the first light-emitting unit, the second light-emitting unit comprising: a second transistor array on a second substrate, the second substrate being flat and opaque to visible light; a second light-emitting diode array on the second transistor array; a second spacer on the second light-emitting diode array, the second spacer comprising the liquid crystal material; and a second microlens on the second spacer.
PCT/US2023/021532 2022-11-02 2023-05-09 Dispositif ayant un affichage 3d transparent optique et ses procédés d'adaptation WO2023130129A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263421849P 2022-11-02 2022-11-02
US63/421,849 2022-11-02

Publications (2)

Publication Number Publication Date
WO2023130129A2 true WO2023130129A2 (fr) 2023-07-06
WO2023130129A3 WO2023130129A3 (fr) 2023-09-14

Family

ID=85199034

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/US2022/082564 WO2023039617A2 (fr) 2022-11-02 2022-12-29 Dispositif ayant un écran 2d-3d combiné
PCT/US2022/082548 WO2023039615A2 (fr) 2022-11-02 2022-12-29 Dispositif ayant un écran optique 3d transparent
PCT/US2022/082558 WO2023039616A2 (fr) 2022-11-02 2022-12-29 Dispositif ayant un affichage 3d transparent optique
PCT/US2023/061076 WO2023056487A2 (fr) 2022-11-02 2023-01-23 Dispositif ayant un afficheur 3d à transparence optique et procédés d'adaptation de celui-ci
PCT/US2023/021532 WO2023130129A2 (fr) 2022-11-02 2023-05-09 Dispositif ayant un affichage 3d transparent optique et ses procédés d'adaptation
PCT/US2023/021527 WO2023130128A2 (fr) 2022-11-02 2023-05-09 Dispositif ayant un affichage 3d transparent optique et ses procédés d'adaptation

Family Applications Before (4)

Application Number Title Priority Date Filing Date
PCT/US2022/082564 WO2023039617A2 (fr) 2022-11-02 2022-12-29 Dispositif ayant un écran 2d-3d combiné
PCT/US2022/082548 WO2023039615A2 (fr) 2022-11-02 2022-12-29 Dispositif ayant un écran optique 3d transparent
PCT/US2022/082558 WO2023039616A2 (fr) 2022-11-02 2022-12-29 Dispositif ayant un affichage 3d transparent optique
PCT/US2023/061076 WO2023056487A2 (fr) 2022-11-02 2023-01-23 Dispositif ayant un afficheur 3d à transparence optique et procédés d'adaptation de celui-ci

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2023/021527 WO2023130128A2 (fr) 2022-11-02 2023-05-09 Dispositif ayant un affichage 3d transparent optique et ses procédés d'adaptation

Country Status (1)

Country Link
WO (6) WO2023039617A2 (fr)

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010102295A1 (fr) * 2009-03-06 2010-09-10 The Curators Of The University Of Missouri Verre adaptatif pour correction de la vision
US8172403B2 (en) * 2009-05-21 2012-05-08 Eastman Kodak Company Projection with curved speckle reduction element surface
US9129295B2 (en) * 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US8582209B1 (en) * 2010-11-03 2013-11-12 Google Inc. Curved near-to-eye display
US8508830B1 (en) * 2011-05-13 2013-08-13 Google Inc. Quantum dot near-to-eye display
KR20140059213A (ko) * 2011-08-30 2014-05-15 마이크로소프트 코포레이션 홍채 스캔 프로파일링을 이용하는 헤드 마운티드 디스플레이
EP2974307A1 (fr) * 2013-03-12 2016-01-20 Koninklijke Philips N.V. Dispositif d'affichage autostéréoscopique transparent
US10884493B2 (en) * 2013-06-20 2021-01-05 Uday Parshionikar Gesture based user interfaces, apparatuses and systems using eye tracking, head tracking, hand tracking, facial expressions and other user actions
CN103995361B (zh) * 2014-06-17 2017-01-11 上海新视觉立体显示科技有限公司 裸眼3d显示像素单元及多视图裸眼3d图像显示设备
CN104155769A (zh) * 2014-07-15 2014-11-19 深圳市亿思达显示科技有限公司 2d/3d共融显示装置及广告装置
US10101586B2 (en) * 2014-12-24 2018-10-16 Seiko Epson Corporation Display device and control method for display device
US10205814B2 (en) * 2016-11-03 2019-02-12 Bragi GmbH Wireless earpiece with walkie-talkie functionality
WO2018107150A1 (fr) * 2016-12-09 2018-06-14 Applied Materials, Inc. Dispositif d'affichage à champs lumineux à del collimatés
DE102017107303A1 (de) * 2017-04-05 2018-10-11 Osram Opto Semiconductors Gmbh Vorrichtung zur darstellung eines bildes
WO2018223150A1 (fr) * 2017-06-01 2018-12-06 Pogotec Inc. Système à réalité augmentée à fixation amovible pour lunettes
US10634912B2 (en) * 2017-06-01 2020-04-28 NewSight Reality, Inc. See-through near eye optical module
US12032161B2 (en) * 2017-06-01 2024-07-09 NewSight Reality, Inc. Systems and methods for reducing stray light in augmented reality and related applications
CN115176192A (zh) * 2019-12-11 2022-10-11 E-视觉智能光学公司 具有阵列光学器件的近眼显示器
CN111175982B (zh) * 2020-02-24 2023-01-17 京东方科技集团股份有限公司 近眼显示装置和可穿戴设备

Also Published As

Publication number Publication date
WO2023130128A3 (fr) 2023-09-28
WO2023130129A3 (fr) 2023-09-14
WO2023039617A2 (fr) 2023-03-16
WO2023039617A3 (fr) 2023-08-24
WO2023039615A3 (fr) 2023-09-14
WO2023130128A2 (fr) 2023-07-06
WO2023039616A2 (fr) 2023-03-16
WO2023056487A3 (fr) 2023-08-31
WO2023039615A2 (fr) 2023-03-16
WO2023056487A2 (fr) 2023-04-06
WO2023039616A3 (fr) 2023-08-24

Similar Documents

Publication Publication Date Title
US11726325B2 (en) Near-eye optical imaging system, near-eye display device and head-mounted display device
US11056032B2 (en) Scanning display systems with photonic integrated circuits
US9684174B2 (en) Imaging structure with embedded light sources
US9726887B2 (en) Imaging structure color conversion
US9164351B2 (en) Freeform-prism eyepiece with illumination waveguide
US10690910B2 (en) Plenoptic cellular vision correction
US20170301270A1 (en) Imaging Structure Emitter Configurations
US20170163977A1 (en) Imaging Structure Emitter Calibration
US20130021226A1 (en) Wearable display devices
WO2020042605A1 (fr) Appareil d'affichage et système d'affichage
CN104755994A (zh) 显示设备
TWI717602B (zh) 頭戴式顯示裝置
ES2928489T3 (es) Conjuntos de visualización con transparencia emulada electrónicamente
WO2023130129A2 (fr) Dispositif ayant un affichage 3d transparent optique et ses procédés d'adaptation
US11727891B2 (en) Integrated electronic and photonic backplane architecture for display panels
KR101878981B1 (ko) 표면 플라즈몬 칼라 필터를 구비한 평판 표시 패널 및 이를 이용한 개인 몰입형 장치의 표시장치
JP7475751B1 (ja) コリメート機能付きコンタクトレンズおよびxrグラス
US20240248307A1 (en) Head-mounted display apparatus and optical unit
US20240248308A1 (en) Virtual image display device and head-mounted display apparatus
US20230244023A1 (en) Phase plate and fabrication method for color-separated laser backlight in display systems
US20240248309A1 (en) Virtual image display device and head-mounted display apparatus
JP2024082376A (ja) 虚像表示装置及び頭部装着型表示装置
JP2024102491A (ja) 虚像表示装置及び頭部装着型表示装置
JP2024102735A (ja) 頭部装着型表示装置及び光学ユニット