WO2022245801A1 - Method and system for calibrating a wearable heads-up display including integrated prescription lenses to produce aligned and color corrected images
- Publication number: WO2022245801A1 (PCT application PCT/US2022/029597)
- Authority: WIPO (PCT)
Classifications
- G02B27/0172 — Head-up displays; head mounted; characterised by optical features
- G02B2027/0138 — Head-up displays characterised by optical features, comprising image capture systems, e.g. camera
- G02B2027/0183 — Display position adjusting means not related to the information to be displayed; adaptation to parameters characterising the motion of the vehicle
- G02B2027/0185 — Display position adjusting means not related to the information to be displayed; displaying image at variable distance
- G02B7/08 — Mountings, adjusting means, or light-tight connections, for lenses; with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
Definitions
- a combiner is an optical apparatus that combines two light sources, for example, light transmitted from a micro-display and directed to the combiner via a lightguide, and environmental light.
- Optical combiners are used in heads up displays (HUDs), examples of which include head-mounted displays (HMDs) or near-eye displays, which allow a user to view computer-generated content (e.g., text, images, or video content) superimposed over a user’s environment viewed through the HMD, creating what is known as augmented reality (AR) or mixed reality (MR).
- an HMD is implemented in an eyeglass frame form factor with an optical combiner forming at least one of the lenses within the frame.
- components of the HMD are generally calibrated using images captured by a test set-up camera that is positioned where the user’s eye is expected to be located when the user is wearing the HMD. These images simulate what the user is likely to see projected from the HMD, as well as the real-world environment as viewed through the optical combiner of an AR or MR HMD.
- FIG. 1 illustrates an example head-mounted display (HMD) having an optical combiner with an integrated corrective prescription in accordance with some embodiments.
- FIG. 2 shows a block diagram of a calibration station in which aspects of an HMD, such as the HMD of FIG. 1, are measured and calibrated in accordance with some embodiments.
- FIG. 3 shows a block diagram of processing systems associated with a micro-display of the HMD of FIG. 1 and the calibration station of FIG. 2 in accordance with some embodiments.
- FIG. 4 shows a block diagram of a calibration system including the calibration station of FIG. 2 and peripheral devices in accordance with some embodiments.
- FIG. 5 illustrates a method of calibrating an HMD, such as the HMD of FIG. 1 , using the calibration system of FIG. 4.
- Typical methods for including a corrective optical prescription in an HMD require configuring the HMD’s optical combiner to accommodate both a lightguide and a separate prescription lens, either as part of eyeglasses worn by the user or as a lens that is inserted into, or attached to, the optical combiner.
- the result is often a bulky system that can be uncomfortable for a user to wear, thus detracting from the user experience.
- boundary lines of a corrective prescription lens included in a combiner as an insert or attachment are often visible to the user, which also detracts from the user experience.
- optical combiners with integrated corrective prescriptions have been developed. Although there are complications in simultaneously correcting both the light from within the combiner and the environmental light such that a user does not experience undesirable optical aberrations or distortions when viewing an augmented reality scene, these complications can be diagnosed and ameliorated by calibrating the micro-display so that the device provides good visual acuity for the user.
- integrating a corrective prescription into an optical combiner of an HMD presents challenges to calibration because, in some cases, the corrective prescription introduces both defocus and distortion into images projected by the micro-display and into light from the environment transmitted through the optical combiner. To avoid such aberrations, it is desirable that, during a calibration process, the environment and displayed content be viewed as through a user’s eye in order to calibrate the micro-display.
- components of a calibration system used to perform calibration of displayed content viewed through a prescription lens are configured to simulate the user’s eye that is in need of refractive correction.
- FIGs. 1-5 illustrate systems and methods for inspecting and calibrating an HMD having at least one optical combiner with an integrated user-specific corrective prescription.
- the output of the micro-display of the HMD and a reference target, as viewed through the optical combiner having a corrective prescription, are measured at a specialized calibration station to calibrate for optical aberrations and artifacts caused by the corrective prescription.
- the calibration station includes a camera and at least one tunable correction unit, which together mimic the user's eye that is in need of refractive correction. That is, the camera sensor of the camera mimics the retina of a user’s eye to receive light in order to “see” an image, and the tunable correction unit mimics the lens of a user’s eye to focus light such that the image is seen as in-focus by the camera sensor.
- the tunable corrective unit compensates for the shift in focal point caused by the corrective prescription of the optical combiner and allows the camera of the calibration station to capture the sharp images required to perform measurement and calibration of the HMD.
- Calibration of the HMD is comparative such that the micro-display of the HMD is calibrated to match the reference target, both of which are optically modified by the corrective prescription of the optical combiner and the tunable corrective unit.
- the optically modified light from the micro-display of the HMD is matched to the optically modified light from the reference target so that a user of the HMD will see content displayed at the optical combiner and the environment beyond the optical combiner with good visual acuity.
- the tunable correction unit can be adjusted manually by an operator or by an automatic process to create a focused image at the camera during a calibration stage of the production process or as part of the re-calibration of an HMD in need of repair. Because components of the tunable correction unit can be adjusted, the calibration station can simulate a number of different corrective prescriptions and allow simulation of how various users would see the real world and/or displayed content when wearing an HMD configured with their specific prescription.
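The comparative idea above, simulating a given user's eye by configuring the tunable correction unit against that user's prescription, can be sketched as follows. All function and field names are illustrative, not from the patent:

```python
# Sketch: deriving tunable-correction-unit settings that cancel a user's
# integrated prescription, so the calibration camera sees an in-focus image.
# Names and the sphere/cylinder/axis representation are illustrative.

def compensation_settings(sphere_d, cylinder_d, axis_deg):
    """Return tunable-unit powers (in diopters) that reverse the focal
    shift introduced by an integrated corrective prescription.

    A prescription of +P diopters is cancelled by adding -P diopters in
    the optical path; the cylinder axis is unchanged because the
    compensating cylinder must be oriented the same way.
    """
    return {
        "sphere": -sphere_d,
        "cylinder": -cylinder_d,
        "axis": axis_deg,
    }

# A combiner with a +4 spherical prescription (the example used later in
# the text) is compensated with a -4 spherical setting.
settings = compensation_settings(4.0, 0.0, 0)
```

Swapping in a different user's prescription reconfigures the simulated eye without any change to the camera itself.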
- FIG. 1 illustrates an example HMD 100 employing an optical combiner 102 having an integrated corrective prescription.
- the HMD 100 has a support structure 104 that includes a frame 106, which houses a micro-display (shown in FIG. 2), such as a laser projector or light-emitting diode (LED) display, that generates visible light in order to project images toward the eye of a user via the optical combiner 102, such that the user perceives the projected images as being displayed in a field of view (FOV) area 108 at the combiner 102.
- the micro-display also generates infrared light for eye tracking purposes.
- Support structure 104 also includes components to allow the support structure 104 to be worn in a position in front of a user’s eyes. Examples of such components are arms 110 and 112 to be supported by a user’s ears. A strap, or straps (not shown), configured to be worn around and/or on top of a user’s head may be used in place of one or more of the arms in some embodiments to secure the support structure 104 in front of a user’s eyes.
- the HMD 100 is symmetrically configured such that lens element 114 is also a combiner and a micro-display is housed in the portion of the frame 106 proximate to arm 112 to project images to a FOV area within lens element 114. Either or both of combiner 102 and lens element 114 can be configured with eye-side and world-side surfaces having curvatures that, together, provide prescription correction of light that is transmitted to a user’s eye(s).
- the HMD 100 is a near-eye display system in which the support structure 104 is configured to be worn on the head of a user and has a general shape and appearance (or “form factor”) of an eyeglasses frame.
- the support structure 104 contains or otherwise includes various components to facilitate the projection of such images toward the eye of the user, such as a processing system described in greater detail below with reference to FIG. 3.
- the support structure 104 further includes various sensors, such as one or more front-facing cameras, rear-facing cameras, other light sensors, motion sensors, accelerometers, and the like.
- the support structure 104 further can include one or more radio frequency (RF) interfaces or other wireless interfaces, such as a Bluetooth(TM) interface, a WiFi interface, and the like. Further, in some embodiments, the support structure 104 includes one or more batteries or other portable power sources for supplying power to the electrical and processing components, such as one or more processors of a processing system of the HMD 100. In some embodiments, some or all of these components of the HMD 100 are fully or partially contained within an inner volume of support structure 104, such as within arm 110 and the portion of the frame 106 in region 116 of the support structure 104. It should be noted that while an example form factor is depicted, it will be appreciated that in other embodiments the HMD 100 may have a different shape and appearance from the eyeglasses frame depicted in FIG. 1.
- combiner 102 of the HMD 100 provides an AR display in which rendered graphical content can be superimposed over or otherwise provided in conjunction with a real-world view as viewed by the user through combiner 102.
- light used to form a perceptible image or series of images may be projected by a microdisplay of the HMD 100 onto the eye of the user via a series of optical elements, such as a lightguide formed at least partially in combiner 102 and one or more lenses and/or filters disposed between the micro-display and the lightguide.
- Optical combiner 102 includes at least a portion of a lightguide that routes display light received by an incoupler of the lightguide to an outcoupler of the lightguide, which outputs the display light toward an eye of a user of the HMD 100.
- optical combiner 102 is sufficiently transparent to allow a user to see through combiner 102 to provide a field of view of the user’s real-world environment such that the image appears superimposed over at least a portion of the user’s real-world environment.
- components of the HMD are tuned using a calibration station, an example of which is described in greater detail below with reference to FIG. 2.
- FIG. 2 shows a block diagram of a calibration station 200 in which aspects of a display system, such as HMD 100, are measured and calibrated.
- the calibration station 200 includes a holder 202 into which an optical combiner, such as optical combiner 102, is placed.
- the position of the holder 202 is adjustable to allow the optical combiner 102 to be positioned in a primary optical path of light 220 traveling from a micro-display 204 associated with the optical combiner 102 to a camera 206 of the calibration station 200.
- the camera 206 is a combination of a camera sensor for sensing light and a camera lens, or combination of lenses, for focusing light on the camera sensor.
- a tunable correction unit 208 is also disposed in the primary optical path and located between holder 202 and camera 206.
- the tunable correction unit 208 is a phoropter including a variety of lenses, including, but not limited to, spherical lenses, cylindrical lenses, filtered lenses, and prismatic lenses.
- the phoropter, in some embodiments, also includes specialized measurement devices, such as Maddox rods and a Jackson cross-cylinder.
- the tunable correction unit 208 is a focus tunable lens, such as a fluid-filled shape-changing lens in which the radius of the lens can be changed by deforming a membrane housing the fluid or by pumping the fluid into or out of the membrane.
- the tunable correction unit 208 is a trial lens kit including a variety of spherical concave lenses, spherical convex lenses, cylindrical concave lenses, cylindrical convex lenses, prismatic lenses, and specialized lenses such as colored lenses, an occluder lens, pinhole lenses, and cross-cylinder lenses, which can be placed in the optical path between the holder 202 and camera 206.
- a neutral-density (ND) filter is positioned between camera 206 and tunable correction unit 208 to reduce or modify the intensity of the light that enters the camera 206 to avoid capturing images that are too bright or “overexposed”.
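The ND filter choice above follows directly from the filter's optical density: a filter of density OD transmits a fraction 10^(−OD) of the incident light. A minimal sketch, with illustrative light levels:

```python
# Sketch: choosing a neutral-density filter so the camera sensor is not
# overexposed. A filter of optical density OD transmits T = 10**(-OD) of
# the incident light. The specific light levels here are illustrative.
import math

def nd_transmission(optical_density):
    """Fraction of incident light transmitted by an ND filter."""
    return 10.0 ** (-optical_density)

def min_density(incident_lux, sensor_max_lux):
    """Smallest optical density that keeps the light reaching the sensor
    at or below its saturation level."""
    if incident_lux <= sensor_max_lux:
        return 0.0
    return math.log10(incident_lux / sensor_max_lux)

# Light 10x brighter than the sensor can tolerate needs an ND1.0 filter,
# which transmits 10% of the light.
od = min_density(incident_lux=5000.0, sensor_max_lux=500.0)
```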
- the tunable correction unit 208 is used to reverse the blurriness caused by the corrective prescription so that camera 206 of the calibration station 200 can capture a sharp, in-focus image on which to perform display measurement and calibration.
- the corrective prescription of the optical combiner 102 works in conjunction with the lens and cornea of a user’s eye to focus light on the retina of the eye. Vision problems occur when the focus accommodation of the eye cannot bring an object into focus or when the eye is not symmetric and suffers from astigmatism.
- the corrective prescription of the optical combiner 102 (measured in diopters) changes the focal point of light entering the user’s eye such that the light is correctly focused on the retina, allowing the user to view their environment in focus.
- a corrective prescription is integrated into the optical combiner 102 by shaping the optical combiner 102 to shift the perceived depth of the real world into a common plane, in the case of astigmatism correction, and into a plane that falls within the user's focus accommodation, in the case of spherical correction. Because the degree to which the focal point is shifted varies with a user’s specific corrective prescription, images viewed through an optical combiner 102 with an integrated prescription will appear blurry or out-of-focus to another user or, in the case of the calibration station 200, to the camera 206.
- the lenses or the shape of the tunable correction unit 208 is adjusted to reverse the shift in focal point imposed by the optical combiner 102 having a corrective prescription such that the light is correctly focused at the camera 206 and the camera 206 can capture an in-focus image of the environment viewed through the tunable correction unit 208 and optical combiner 102.
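In thin-lens terms, the reversal described above is a vergence calculation: each optical element adds its power (in diopters) to the vergence of the light passing through it, so a tunable setting equal and opposite to the combiner's prescription restores the original vergence at the camera. A minimal sketch, treating the elements as thin lenses in contact (an idealization the patent does not state):

```python
# Sketch: why a tunable correction of -P diopters restores focus when the
# combiner's prescription adds +P diopters. Thin lenses in contact add
# their powers to the vergence of the light; this idealized model is an
# assumption for illustration, not the patent's optical design.

def outgoing_vergence(incoming_vergence_d, *element_powers_d):
    """Vergence (diopters) after passing through thin elements in contact."""
    v = incoming_vergence_d
    for p in element_powers_d:
        v += p
    return v

# Light from a distant target (vergence ~0 D) through a +4 D prescription
# and a -4 D tunable correction leaves with 0 D, i.e. collimated again,
# so the camera captures an in-focus image.
residual = outgoing_vergence(0.0, 4.0, -4.0)
```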
- the calibration station 200 includes at least one optical relay.
- the example calibration station 200 shown in FIG. 2 includes two optical relays 212, 214.
- the first optical relay 212 is disposed between the holder 202 and the tunable correction unit 208 and the second optical relay 214 is disposed between the tunable correction unit 208 and the camera 206.
- the first optical relay 212 has at least a first set of relay lenses 218 to relay light from the micro-display 204 to the tunable correction unit 208 and the second optical relay 214 has at least a second set of relay lenses 218 to relay light from the tunable correction unit 208 to the camera 206. Because camera 206 is used to simulate the view of the user, without at least one optical relay, the camera 206 would need to be placed very close to the optical combiner 102 being held in the holder 202 in order to emulate the position of the user’s eye.
- the optical relays 212, 214 serve to elongate the primary optical path to accommodate positioning of the tunable correction unit 208 between the camera 206 and the holder 202 and to allow the camera 206 to capture images of digital content projected at the optical combiner 102 and/or a reference target 224 located beyond the optical combiner 102 as a user would see the digital content and reference target 224.
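One common way a relay elongates the path while preserving the image is the "4f" configuration: two lenses of focal length f separated by 2f re-image the input plane a distance 4f further along at unit magnification. The patent does not specify the relay design, so the 4f form below is an illustrative assumption:

```python
# Sketch: path elongation from optical relays. A standard 4f relay (two
# lenses of focal length f, separated by 2f) re-images its input plane
# 4f further along at unit magnification. The 4f form and the numbers
# are assumptions for illustration; the patent does not specify them.

def relay_length_mm(focal_length_mm):
    """Extra optical path length contributed by one 4f relay."""
    return 4.0 * focal_length_mm

def total_path_mm(eye_relief_mm, relay_focal_lengths_mm):
    """Nominal eye-relief distance plus the length added by each relay."""
    return eye_relief_mm + sum(relay_length_mm(f) for f in relay_focal_lengths_mm)

# Two relays built from 50 mm lenses add 400 mm of path, leaving room for
# the tunable correction unit between the holder and the camera.
path = total_path_mm(eye_relief_mm=20.0, relay_focal_lengths_mm=[50.0, 50.0])
```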
- the reference target 224 is positioned at a set distance from the holder 202 such that camera 206 views the reference target 224 through the optical combiner 102 positioned in the holder 202.
- the reference target 224 is generally a physical item or image, such as a checkerboard grid or other pattern having distinguishable features, that is used to ensure the camera 206 is capturing a focused image of the reference target 224.
- the reference target 224 is a static or dynamic image that is projected onto a surface positioned a set distance from the holder. While the reference target 224 is generally illuminated with white light, colored lighting can also be used to illuminate the reference target 224.
- a beam splitter 216 is positioned between the microdisplay 204 and the tunable correction unit 208.
- the beam splitter 216 redirects a portion of the light 220 from the micro-display 204 away from the primary optical path towards measurement devices, such as a spectrometer, power meter, and/or integrating sphere as described in greater detail below with reference to FIG. 3.
- the beam splitter 216 is used to inject light into the primary optical path.
- the beam splitter 216 is positioned between the tunable correction unit 208 and the camera to partially compensate for the impact of the tunable correction unit 208 on the light from the micro-display 204. In some embodiments, a beam splitter 216 is positioned on either side of the tunable correction unit 208 to fully compensate for the impact of the tunable correction unit 208.
- FIG. 3 shows a block diagram of a processing system 300 associated with the microdisplay 204 of the HMD and the calibration station 200.
- the processing system 300 includes an application processor (AP) 302, which is an integrated circuit (e.g., a microprocessor) that runs one or more software programs to control the microdisplay 204 and other components of the HMD 100.
- AP 302 includes a processor 304, GPU 306, and memory 308.
- Processor 304 and GPU 306 are communicatively coupled to memory 308.
- the memory 308 is configured as temporary storage to hold data and instructions that can be accessed quickly by processor 304 and GPU 306.
- storage 310 is a more permanent storage to hold data and instructions.
- processor 304 is a programmed computer that performs computational operations.
- processor 304 is implemented as a central processing unit (CPU), a microprocessor, a controller, an application specific integrated circuit (ASIC), system on chip (SOC), or a field-programmable gate array (FPGA).
- GPU 306 receives source images from processor 304 and writes or renders the source images into a projector frame buffer, which is transmitted to display controller 322 of the micro-display 204.
- micro-display 204 uses the frame buffer data to generate drive controls for laser diodes or other light sources in the micro-display 204.
- any corrections to the source images to be projected to optical combiner 102 are applied when rendering the source images into the projector frame buffer.
- the display controller 322 applies the corrections prior to providing the frame buffer data to the micro-display 204.
- corrections such as geometric distortions (determined by the calibration process described herein), color correction, and/or other corrections due to physical changes (e.g., thermal changes) in the optical system, are applied to the source images to achieve a corrected image that is displayed to a user.
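Applying a geometric correction while rendering into the frame buffer amounts to sampling each output pixel from calibrated source coordinates. A minimal sketch with numpy; the remap arrays stand in for whatever distortion model the calibration produces, and nearest-neighbour sampling is a simplification:

```python
# Sketch: writing a source image into the projector frame buffer through a
# precomputed geometric-correction remap. Nearest-neighbour sampling is an
# illustrative simplification; a real renderer would typically interpolate.
import numpy as np

def render_corrected(source, map_y, map_x):
    """Fill a frame buffer by sampling each output pixel from the
    calibrated source coordinates (map_y, map_x)."""
    ys = np.clip(np.rint(map_y).astype(int), 0, source.shape[0] - 1)
    xs = np.clip(np.rint(map_x).astype(int), 0, source.shape[1] - 1)
    return source[ys, xs]

# Identity maps leave the image unchanged; a real distortion model would
# offset these coordinates to pre-warp the source image.
src = np.arange(16, dtype=np.uint8).reshape(4, 4)
my, mx = np.meshgrid(np.arange(4.0), np.arange(4.0), indexing="ij")
framebuf = render_corrected(src, my, mx)
```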
- an HMD employing a micro-display 204 having laser diodes that each project a different wavelength of light produces virtual images at the FOV 108 by projecting multiple source images from different regions of the projector frame buffer.
- the HMD is designed such that the virtual images overlap in the FOV 108 to be viewed as one image.
- the HMD as manufactured does not automatically produce virtual images that are aligned in the FOV 108, which can result in a user seeing the images as out-of-focus or as “ghost” images (i.e., slightly offset, overlapping images).
- distortion models are applied to source images such that when the source images are distorted (“corrected”) and projected to the FOV 108, the distorted source images form virtual images that are aligned in a target region(s) of the FOV 108.
- color correction and/or intensity models can be applied to the source images. These models vary the distribution of colors and brightness within certain regions of the source images such that, when projected at the FOV 108, the resulting virtual images have uniform color and brightness as viewed by the user.
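A per-region intensity model of the kind described above can be represented as a spatially varying gain map multiplied into the source image before projection. A minimal sketch, assuming the gain map comes from calibration measurements (the values here are illustrative):

```python
# Sketch: applying a spatially varying per-channel gain map to a source
# image so the projected virtual image appears uniform in brightness and
# color. The gain values are illustrative placeholders for measured data.
import numpy as np

def apply_uniformity_model(source_rgb, gain_map):
    """Scale each pixel's RGB by a per-channel gain, rounding and
    clipping to the displayable 8-bit range."""
    corrected = source_rgb.astype(np.float32) * gain_map
    return np.clip(np.rint(corrected), 0, 255).astype(np.uint8)

# A region measured 20% too bright is attenuated by a gain of 1/1.2.
src = np.full((2, 2, 3), 120, dtype=np.uint8)
gain = np.ones((2, 2, 3), dtype=np.float32)
gain[0, 0, :] = 1.0 / 1.2
out = apply_uniformity_model(src, gain)
```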
- the processing system 300 further includes a calibration processor 312, which is communicatively coupled to camera 206 in order to receive images and/or light intensity data captured by camera 206 during the calibration process, as explained further with reference to FIG. 4.
- the calibration processor can be decoupled from the HMD 100 once calibration is complete.
- a processor that executes a calibration process as described herein may be referred to as a calibration processor.
- calibration processor 312 is a programmed computer that performs computational operations.
- calibration processor 312 can be a central processing unit (CPU), a microprocessor, a controller, an application specific integrated circuit (ASIC), system on chip (SOC), or a field-programmable gate array (FPGA).
- a display screen may be communicatively coupled to calibration processor 312 to allow interaction with a calibration program 314 running on calibration processor 312 and/or to allow calibration processor 312 to display calibration results from the calibration program 314.
- the calibration processor 312 is communicatively coupled to AP processor 302 for calibration purposes.
- calibration processor 312 is shown executing instructions of calibration program 314.
- Calibration program 314 may be stored in memory 316 and accessed by calibration processor 312 at run time.
- Calibration program 314 includes decision logic 318, which when executed by calibration processor 312, in some embodiments, provides test patterns in a defined sequence to AP 302. The defined sequence may be adjusted during the calibration process.
- AP 302 renders the test patterns in the defined sequence into the projector frame buffer.
- Camera 206 captures images of the test patterns as the test patterns are projected by micro-display 204 by way of the optical combiner 102 and generates display data 320.
- Camera 206 also captures images of reference target 224, as viewed through the optical combiner 102 and the tunable correction unit 208, and generates reference target data 324.
- Calibration processor 312 receives display data 320 and reference target data 324 from camera 206 and the calibration program 314 uses the display data 320 and reference target data 324 to determine eye space to projector space mappings that are subsequently used to generate warped or distorted source images.
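The eye-space to projector-space mapping mentioned above can be estimated from matched feature points, such as checkerboard corners detected in the display data and reference-target data. A planar homography fit by least squares is one plausible form of such a mapping; the patent does not specify the model, so this is an illustrative sketch:

```python
# Sketch: estimating an eye-space -> projector-space mapping from matched
# points via a planar homography (direct linear transform with the scale
# fixed at h22 = 1). The homography model is an assumption; the patent
# does not state what form the mapping takes.
import numpy as np

def fit_homography(eye_pts, proj_pts):
    """Solve for H (3x3) such that proj ~ H @ eye in homogeneous terms."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(eye_pts, proj_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    h, *_ = np.linalg.lstsq(np.asarray(rows, float),
                            np.asarray(rhs, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pt):
    """Map a 2D point through H with perspective division."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Four correspondences from a 2x scale-and-shift mapping recover it exactly.
eye = [(0, 0), (1, 0), (0, 1), (1, 1)]
proj = [(10, 20), (12, 20), (10, 22), (12, 22)]
H = fit_homography(eye, proj)
```

In practice many more correspondences would be used so the least-squares fit averages out measurement noise.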
- spectrum and intensity measurements of the test patterns projected by micro-display 204 are taken by peripheral components, such as those described in greater detail with reference to FIG. 4.
- FIG. 4 shows a block diagram of a calibration system 400, including the calibration station 200 and peripheral devices.
- peripheral devices include a spectrometer 404 and a power meter 406, which are positioned in the optical path of the light 222 that is redirected by the beam splitter 216.
- the peripheral devices include an integrating sphere 408 to measure the total power (flux) of the light 222 that is redirected by the beam splitter 216.
- the measurements of color uniformity and intensity taken at the spectrometer 404, integrating sphere 408, and power meter 406 are communicated to the calibration processor 312 associated with the processing system 300 and configured to determine calibration parameters for the specific HMD 100 being analyzed at the calibration station 200.
- the calibration processor 312 is communicatively coupled to the camera 206, measurement devices, such as the spectrometer 404 and power meter 406, as well as the AP processor 302.
- FIG. 5 illustrates a method 500 of calibrating an HMD, such as HMD 100, using the calibration system 400.
- an optical combiner 102 is positioned in the holder 202 of the calibration station 200.
- the optical combiner 102 will be attached to a frame, such as frame 106 housing micro-display 204 and other HMD components, such as the display controller 322 and AP 302.
- the tunable correction unit 208 is adjusted, at block 504, to correct defocus caused by any corrective prescription included in the optical combiner 102.
- for example, if the optical combiner 102 integrates a +4 spherical prescription, the tunable correction unit 208 is adjusted to compensate for the defocus that would be seen by the camera 206 by providing a -4 spherical correction.
- the camera 206 and tunable correction unit 208 act together to simulate a user’s eye that requires +4 spherical correction.
- the tunable correction unit 208 can be adjusted manually by an operator or automatically during the calibration process.
- camera 206 captures images of a reference target, such as reference target 224, as viewed through the tunable correction unit 208 and the optical combiner 102.
- the tunable correction unit 208 acts to correct for defocus of the reference target 224 introduced by the corrective prescription of the optical combiner 102, as described above with reference to FIG. 2; however, in some cases, some distortion of the reference target 224 will still be present in the captured images. In some embodiments, this distortion is desired in the final calibrated HMD because users requiring refractive correction adapt to viewing the world with this distortion. Thus, the final display calibration should match the user’s expectation in viewing their environment.
- the captured images of reference target 224 are sent to the calibration processor 312, where they are measured and analyzed for distortion.
- a distortion model is generated.
- the distortion model is provided to the AP processor 302 of the HMD in order to calibrate the micro-display 204, at block 512, to project images that have the same geometrical distortion as the captured images of the reference target 224.
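A simple way to build such a distortion model is to fit a parametric distortion to the difference between ideal and observed target points, then apply the same distortion to the display coordinates. The single-coefficient radial model below is an illustrative choice, not the patent's:

```python
# Sketch: fitting a radial distortion model r' = r * (1 + k * r**2) to
# captured reference-target points, so the micro-display can be driven to
# show the *same* distortion the prescription produces. The one-parameter
# model and the synthetic points are illustrative assumptions.
import numpy as np

def fit_radial_k(ideal_pts, observed_pts, center=(0.0, 0.0)):
    """Least-squares fit of k in r_obs = r_ideal * (1 + k * r_ideal**2)."""
    cx, cy = center
    r_ideal = np.hypot(ideal_pts[:, 0] - cx, ideal_pts[:, 1] - cy)
    r_obs = np.hypot(observed_pts[:, 0] - cx, observed_pts[:, 1] - cy)
    a = r_ideal ** 3
    return float(np.dot(a, r_obs - r_ideal) / np.dot(a, a))

def distort(pts, k, center=(0.0, 0.0)):
    """Apply the fitted radial distortion to 2D points."""
    c = np.asarray(center, float)
    d = pts - c
    r2 = (d ** 2).sum(axis=1, keepdims=True)
    return c + d * (1.0 + k * r2)

ideal = np.array([[1.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
observed = distort(ideal, 0.05)   # synthetic stand-in for captured points
k = fit_radial_k(ideal, observed)
```

Once k is known, `distort` can pre-warp the source-image coordinates so the projected content matches the distorted view of the environment.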
- the microdisplay 204 provides light representing an image or text to the optical combiner 102, which outputs light 220 along the primary optical path between the optical combiner 102 and the camera 206 of the calibration station 200.
- a portion of the output light is redirected away from the primary optical path by the beam splitter 216 and towards measurement devices of the calibration station 200, such as the spectrometer 404 and power meter 406, where the light is analyzed for uniformity of color and brightness.
- a uniformity correction model is generated at block 518 and provided to the AP processor 302 at block 520.
- the uniformity correction model is applied to the light 220 projected by the micro-display 204 to compensate for non-uniformities in the light output by the optical combiner 102 and to match a predetermined target white point.
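The patent does not state how the uniformity correction model is represented. One simple, illustrative possibility (an assumption, not the disclosed method) is a per-pixel, per-channel gain map computed from a full-field white capture: gains are chosen so every pixel lands on the target white-point channel ratios, then normalized so no gain exceeds unity (avoiding drive-value clipping):

```python
import numpy as np

def uniformity_gain_map(measured_rgb, target_white_rgb):
    """Per-pixel, per-channel gains that flatten the measured output
    and steer it toward a target white point.

    measured_rgb: (H, W, 3) channel luminances captured while the
    display shows full-field white.
    target_white_rgb: (3,) desired channel ratios at white.
    """
    measured = np.asarray(measured_rgb, dtype=float)
    target = np.asarray(target_white_rgb, dtype=float)
    # Aim every pixel at the target channel ratios...
    desired = target / target.max()
    gains = desired / measured
    # ...then scale so the largest gain is 1.0 (gains <= 1 avoid
    # commanding more light than the display can produce).
    gains /= gains.max()
    return gains
```

Multiplying the captured white field by this map yields an output that is spatially uniform in each channel and proportional to the target white-point ratios.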
- the method of calibration described in blocks 502 through 520 is repeated for a second lens of the HMD 100, in the case of a binocular HMD. The result is an HMD 100 which is optimally calibrated for a user’s eye(s) that require refractive correction.
- certain aspects of the techniques described above may be implemented by one or more processors of a processing system executing software.
- the software comprises one or more sets of executable instructions stored or otherwise tangibly embodied on a non-transitory computer-readable storage medium.
- the software can include the instructions and certain data that, when executed by the one or more processors, manipulate the one or more processors to perform one or more aspects of the techniques described above.
- the non-transitory computer-readable storage medium can include, for example, a magnetic or optical disk storage device, solid state storage devices such as Flash memory, a cache, random access memory (RAM), or other non-volatile memory device or devices, and the like.
- the executable instructions stored on the non-transitory computer-readable storage medium may be in source code, assembly language code, object code, or other instruction format that is interpreted or otherwise executable by one or more processors.
- a computer-readable storage medium may include any storage medium, or combination of storage media, accessible by a computer system during use to provide instructions and/or data to the computer system.
- Such storage media can include, but is not limited to, optical media (e.g., compact disc (CD), digital versatile disc (DVD), Blu-Ray disc), magnetic media (e.g., floppy disc, magnetic tape, or magnetic hard drive), volatile memory (e.g., random access memory (RAM) or cache), non-volatile memory (e.g., read-only memory (ROM) or Flash memory), or microelectromechanical systems (MEMS)-based storage media.
- the computer-readable storage medium may be embedded in the computing system (e.g., system RAM or ROM), fixedly attached to the computing system (e.g., a magnetic hard drive), removably attached to the computing system (e.g., an optical disc or Universal Serial Bus (USB)-based Flash memory), or coupled to the computer system via a wired or wireless network (e.g., network accessible storage (NAS)).
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237037205A KR20230161513A (en) | 2021-05-18 | 2022-05-17 | Systems and methods for calibrating and evaluating wearable head-up displays with INTEGRATED CORRECTIVE PRESCRIPTION (ICP) |
EP22728735.6A EP4302152A1 (en) | 2021-05-18 | 2022-05-17 | Method and system for calibrating a wearable heads-up display including integrated prescription lenses to produce aligned and color corrected images |
CN202280024862.0A CN117178221A (en) | 2021-05-18 | 2022-05-17 | Method and system for calibrating a wearable heads-up display including an integrated prescription lens |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163190158P | 2021-05-18 | 2021-05-18 | |
US63/190,158 | 2021-05-18 | ||
US17/408,955 | 2021-08-23 | ||
US17/408,955 US20220373802A1 (en) | 2021-05-18 | 2021-08-23 | Systems and methods for calibrating and evaluating a wearable heads-up display with integrated corrective prescription |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022245801A1 true WO2022245801A1 (en) | 2022-11-24 |
Family
ID=81975087
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/029597 WO2022245801A1 (en) | 2021-05-18 | 2022-05-17 | Method and system for calibrating a wearable heads-up display including integrated prescription lenses to produce aligned and color corrected images |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4302152A1 (en) |
KR (1) | KR20230161513A (en) |
WO (1) | WO2022245801A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5731902A (en) * | 1996-08-19 | 1998-03-24 | Delco Electronics Corporation | Head-up display combiner binocular test fixture |
US20140285429A1 (en) * | 2013-03-15 | 2014-09-25 | John Castle Simmons | Light Management for Image and Data Control |
US10043430B1 (en) * | 2016-07-25 | 2018-08-07 | Oculus Vr, Llc | Eyecup-display alignment testing apparatus |
US20200058266A1 (en) * | 2018-08-14 | 2020-02-20 | Oculus Vr, Llc | Display device with throughput calibration |
WO2021021155A1 (en) * | 2019-07-31 | 2021-02-04 | Hewlett-Packard Development Company, L.P. | Head-mounted displays |
2022
- 2022-05-17 KR KR1020237037205A patent/KR20230161513A/en unknown
- 2022-05-17 WO PCT/US2022/029597 patent/WO2022245801A1/en active Application Filing
- 2022-05-17 EP EP22728735.6A patent/EP4302152A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4302152A1 (en) | 2024-01-10 |
KR20230161513A (en) | 2023-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11531303B2 (en) | Video display and method providing vision correction for multiple viewers | |
CN109964167B (en) | Method for determining an eye parameter of a user of a display device | |
US10048750B2 (en) | Content projection system and content projection method | |
KR102599889B1 (en) | Virtual focus feedback | |
KR101443322B1 (en) | Method for optimizing and/or manufacturing eyeglass lenses | |
US11675197B1 (en) | System and method for automatic vision correction in near-to-eye displays | |
US20180084232A1 (en) | Optical See-Through Head Worn Display | |
US11150476B2 (en) | Method for providing a display unit for an electronic information device | |
CN109922707B (en) | Method for determining an eye parameter of a user of a display device | |
US20190258442A1 (en) | Using detected pupil location to align optical components of a head-mounted display | |
US10573080B2 (en) | Apparatus and method for augmented reality presentation | |
US10775624B2 (en) | Binocular device comprising a monocular display device | |
JP2018170554A (en) | Head-mounted display | |
WO2019235059A1 (en) | Video projection system, video projection device, optical element for diffracting video display light, tool, and method for projecting video | |
US20220373802A1 (en) | Systems and methods for calibrating and evaluating a wearable heads-up display with integrated corrective prescription | |
WO2022245801A1 (en) | Method and system for calibrating a wearable heads-up display including integrated prescription lenses to produce aligned and color corrected images | |
CN117957479A (en) | Compact imaging optics with distortion compensation and image sharpness enhancement using spatially positioned freeform optics | |
JP2024522470A (en) | Method and system for calibrating a wearable head-up display with integrated prescription lenses to produce aligned and color-corrected images - Patents.com | |
CN117178221A (en) | Method and system for calibrating a wearable heads-up display including an integrated prescription lens | |
JP6832318B2 (en) | Eye projection system | |
US11327313B2 (en) | Method and system for rendering an image with a pupil enhanced accommodation of the eye | |
CN112748573B (en) | Head-mounted display device and display method | |
EP4218540A1 (en) | Method for simulating an ophthalmic lens on an eye of a subject viewing a virtual three-dimensions scene using a light field display | |
KR20210083810A (en) | Augmented Reality Optical System to Correct Weares's Vision | |
KR20120053419A (en) | Display system having function eyesight correction and recording media for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22728735 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022728735 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022728735 Country of ref document: EP Effective date: 20231005 |
|
ENP | Entry into the national phase |
Ref document number: 20237037205 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023571620 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |