WO2024059383A1 - Fiducial based temporal arm deformation estimation - Google Patents

Fiducial based temporal arm deformation estimation

Info

Publication number
WO2024059383A1
Authority
WO
WIPO (PCT)
Prior art keywords
gaze tracking
lens
fiducial markers
head mounted
wearable device
Prior art date
2022-09-14
Application number
PCT/US2023/071487
Other languages
French (fr)
Inventor
Zhiheng Jia
Mingsong DOU
Eric M. Meisner
Chao GUO
Andrew Logan
Thuyh NGUYEN
Jessica Lynn Busch
Original Assignee
Google Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-09-14
Application filed by Google Llc
Publication of WO2024059383A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/0093 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/017 - Head mounted
    • G02B 27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/32 - Fiducial marks and measuring scales within the optical system
    • G02B 27/36 - Fiducial marks and measuring scales within the optical system adjustable
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 - Detection arrangements using opto-electronic means
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0101 - Head-up displays characterised by optical features
    • G02B 2027/0138 - Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0101 - Head-up displays characterised by optical features
    • G02B 2027/014 - Head-up displays characterised by optical features comprising information/image processing systems
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/017 - Head mounted
    • G02B 2027/0178 - Eyeglass type
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0179 - Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187 - Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • This description relates in general to deformation in a wearable device, and in particular to the estimation of deformation in a head mounted wearable device.
  • Wearable computing devices may include, for example, head mounted wearable devices, wrist worn wearable devices, hand worn wearable devices, pendants, and the like.
  • Head mounted wearable devices may include, for example, smart glasses, headsets, goggles, ear buds, and the like.
  • Wrist/hand worn wearable devices may include, for example, smart watches, smart bracelets, smart rings, and the like.
  • Wearable computing devices may include various types of electronic components that provide for functionality of the wearable computing device. In some situations, deformation of the housing or frame of the wearable computing device may affect the functionality of the electronic components and/or the wearable computing device.
  • a frame of a head mounted wearable device may experience deformation due to, for example, size of the head of the user, shape of the head of the user, movement or slippage of the head mounted wearable device while worn by the user, and the like. In some situations, deformation of the frame may impact the accuracy of eye/gaze tracking capability of the head mounted wearable device.
  • Systems and methods, in accordance with implementations described herein, provide for the detection of deformation, and the estimation of an amount of deformation, experienced by a head mounted wearable device including eye tracking capability.
  • the head mounted wearable device includes one or more fiducial markers, for example, on one or both lenses of the head mounted wearable device. Deformation of the frame is detected and estimated based on detection of the one or more fiducial markers.
  • Eye/gaze tracking performed by a gaze tracking device of the head mounted wearable device is adjusted based on the estimated amount of deformation. For example, corrections and/or adjustments may be incorporated into eye/gaze tracking algorithms based on the detected deformation, and the detected shift in position and/or orientation of the one or more fiducial markers. This may improve the accuracy of eye/gaze tracking in the head mounted wearable device, and may improve the accuracy of inputs associated with eye/gaze tracking.
  • FIG. 1A illustrates an example head mounted wearable device worn by a user.
  • FIG. 1B is a front view, and FIG. 1C is a rear view, of the example head mounted wearable device shown in FIG. 1A.
  • FIGs. 2A-2D illustrate operation of an example gaze tracking device of the example head mounted wearable device shown in FIGs. 1A-1C.
  • FIG. 3A is a perspective view, FIG. 3B is a top view, and FIG. 3C is a side view of a frame of the example head mounted wearable device shown in FIGs. 1A-1C, in a reference state.
  • FIG. 4A is a perspective view, FIG. 4B is a top view, and FIG. 4C is a side view of a frame of the example head mounted wearable device shown in FIGs. 1A-1C, in a first deformed state.
  • FIG. 5A is a perspective view, FIG. 5B is a top view, and FIG. 5C is a side view of a frame of the example head mounted wearable device shown in FIGs. 1A-1C, in a second deformed state.
  • FIGs. 6A-6E illustrate example positioning of example fiducial markers on an example lens of the example head mounted wearable device shown in FIGs. 1A-1C.
  • FIG. 7A illustrates example positioning of example fiducial markers on an example lens of the example head mounted wearable device shown in FIGs. 1A-1C.
  • FIG. 7B is a cross-sectional view of an example fiducial marker shown in FIG. 7A.
  • FIG. 8 is a flowchart of an example method.
  • This disclosure relates to systems and methods providing for the detection of deformation in a head mounted wearable device, and the estimation of an amount of deformation exhibited by the head mounted wearable device.
  • the deformation may be detected based on detection of change in position and/or orientation of one or more fiducial markers on the lens(es) of the head mounted wearable device, from a reference position/orientation at which little to no deformation is exhibited.
  • the position and/or orientation of the one or more fiducial markers may be determined in a world coordinate system or a coordinate system fixed to a portion of the head mounted wearable device (e.g., an image sensor).
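  • As a concrete illustration of the coordinate-system choice described in the preceding bullet, the following minimal sketch re-expresses a marker position between an image-sensor-fixed frame and a world frame. The rigid-transform values and function names are hypothetical calibration inputs, not taken from the disclosure.

```python
import numpy as np

# Hypothetical sensor-to-world calibration: rotation R_ws and translation t_ws.
R_ws = np.eye(3)                   # sensor axes aligned with world axes, for illustration
t_ws = np.array([0.02, 0.0, 0.0])  # sensor origin offset in the world frame (meters)

def sensor_to_world(p_sensor: np.ndarray) -> np.ndarray:
    """Re-express a fiducial marker position from the coordinate system fixed
    to the image sensor into the world coordinate system (rigid transform)."""
    return R_ws @ p_sensor + t_ws

print(sensor_to_world(np.array([0.0, 0.01, 0.035])))  # -> [0.02 0.01 0.035]
```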
  • the detected change in position/orientation of the one or more fiducial markers may represent a shift in a position of the lens(es) relative to an image sensor of a gaze tracking device of the head mounted wearable device.
  • the detected change in position/orientation of the one or more fiducial markers and/or associated amount of deformation may be taken into account in the eye/gaze tracking performed by the gaze tracking device.
  • corrections and/or adjustments may be incorporated into eye/gaze tracking algorithms based on the detected change in position/orientation of the one or more fiducial markers and associated deformation. This may improve the accuracy of user eye gaze tracking, and may improve the detection and processing of user inputs associated with user eye gaze.
  • an implemented eye/gaze tracking algorithm comprises at least one parameter (e.g., a correction coefficient or factor) that depends on the detected position and/or orientation of the one or more fiducial markers or on a change of the detected position and/or orientation of the one or more fiducial markers.
  • the claimed adjusting of the gaze tracking algorithm thus is to include adjusting the parameter.
  • the parameter is applied to input data (e.g., values representing an intensity of detected light or data representing an image of the user’s eye) processed by the eye/gaze tracking algorithm.
  • light reflected or diffracted by the one or more fiducial markers is detected by the gaze tracking device, and the parameter of the gaze tracking algorithm is set or changed depending on the detected light (e.g., depending on values representing an intensity of the detected reflected or diffracted light).
  • a change of the parameter may be triggered dependent on whether or not a threshold is met as set out further below.
  • at least one image of the one or more fiducial markers may be captured by the gaze tracking device and the parameter of the eye/gaze tracking algorithm may be set based on the captured image, e.g., depending on a position and/or shape of the one or more fiducial markers in the captured image.
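  • A minimal sketch of the parameter handling described in the preceding bullets, assuming an additive image-space offset as the correction parameter; the class and attribute names (`GazeTracker`, `correction_offset`) and the simple model are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

class GazeTracker:
    """Toy pipeline: a correction parameter derived from fiducial observations
    is applied to the eye-image input data before gaze estimation."""

    def __init__(self, reference_markers: np.ndarray):
        self.reference_markers = reference_markers  # calibrated marker positions (pixels)
        self.correction_offset = np.zeros(2)        # the adjustable parameter

    def update_correction(self, detected_markers: np.ndarray) -> None:
        # Set the parameter from a captured image of the markers: here, the
        # mean image-space shift of the markers from their reference positions.
        self.correction_offset = (detected_markers - self.reference_markers).mean(axis=0)

    def estimate_gaze(self, pupil_center: np.ndarray) -> np.ndarray:
        # Apply the parameter to the input data (pupil coordinates) before the
        # gaze mapping; the mapping itself is omitted from this sketch.
        return pupil_center - self.correction_offset

tracker = GazeTracker(np.array([[40.0, 30.0], [200.0, 28.0], [120.0, 180.0]]))
tracker.update_correction(np.array([[43.0, 31.0], [203.0, 29.0], [123.0, 181.0]]))
print(tracker.estimate_gaze(np.array([128.0, 96.0])))  # pupil shifted back by ~(3, 1)
```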
  • the detected change in position/orientation of the one or more fiducial markers may represent a shift in a position of the lens(es) relative to a display device of the head mounted wearable device.
  • the detected change in position/orientation of the one or more fiducial markers and/or associated amount of deformation is taken into account in the output of visual content by the display device.
  • corrections and/or adjustments are incorporated into display control algorithms (e.g., a display output algorithm) based on the detected change in position/orientation of the one or more fiducial markers and associated deformation, so that visual content is visible to the user even in the deformed state of the head mounted wearable device.
  • a display control algorithm comprises at least one parameter (e.g., a correction coefficient or factor) that depends on the detected position and/or orientation of the one or more fiducial markers or on a change of the detected position and/or orientation of the one or more fiducial markers.
  • the adjusting of the display control algorithm is to include adjusting the parameter.
  • FIG. 1A illustrates a user wearing an example head mounted wearable device 100 in the form of smart glasses, or augmented reality glasses, including display capability, eye/gaze tracking capability, and computing and/or processing capability.
  • FIG. 1B is a front view, and FIG. 1C is a rear view, of the example head mounted wearable device 100 shown in FIG. 1A.
  • the example head mounted wearable device 100 includes a frame 110.
  • the frame 110 includes a front frame portion 120, and a pair of temple arm portions 130 rotatably coupled to the front frame portion 120 by respective hinge portions 140.
  • the front frame portion 120 includes rim portions 123.
  • the rim portions 123 surround respective optical portions in the form of lenses 127, such that the lenses 127 are received in the rim portions 123.
  • a bridge portion 129 connects the rim portions 123 at a central portion of the front frame portion 120.
  • the temple arm portions 130 are coupled, for example, pivotably or rotatably coupled, to the front frame portion 120 at peripheral portions of the respective rim portions 123.
  • one of the temple arm portions 130 is coupled, by one of the hinge portions 140, to a first end portion of the front frame portion 120 corresponding to one of the rim portions 123, and the other of the temple arm portions 130 is coupled, by the other of the hinge portions 140, to a second end portion of the front frame portion 120 corresponding to the other of the rim portions 123.
  • the lenses 127 are corrective/prescription lenses received in the rim portions 123.
  • the lenses 127 are an optical material including glass and/or plastic portions, received in the rim portions 123, that do not necessarily incorporate corrective/prescription parameters.
  • the wearable device 100 includes a display device 104 that can output visual content, for example, at an output coupler 105, so that the visual content is visible to the user.
  • the display device 104 is provided in one of the two arm portions 130, simply for purposes of discussion and illustration.
  • a display device 104 is provided in each of the two arm portions 130 to provide for binocular output of visual content.
  • the display device 104 is a see through near eye display.
  • the display device 104 is configured to project light from a display source onto a portion of teleprompter glass functioning as a beamsplitter seated at an angle (e.g., 30-45 degrees).
  • the beamsplitter provides for reflection and transmission values that allow the light from the display source to be partially reflected while the remaining light is transmitted through.
  • Such an optic design may allow a user to see both physical items in the world, for example, through the lenses 127, next to content (for example, digital images, user interface elements, virtual content, and the like) output by the display device 104.
  • waveguide optics may be used to depict content on the display device 104.
  • the head mounted wearable device 100 includes at least one of an audio output device 106 (such as, for example, one or more speakers), an illumination device 108, a sensing system 111, a control system 112, at least one processor 114, and an outward facing image sensor 116, or camera.
  • the sensing system 111 includes various sensing devices and the control system 112 includes various control system devices including, for example, one or more processors 114 operably coupled to the components of the control system 112.
  • the control system 112 includes a communication module providing for communication and exchange of information between the wearable computing device 100 and other external devices.
  • the head mounted wearable device 100 includes a memory.
  • the memory stores executable instructions for execution by, for example, a processor of the control system 112.
  • the head mounted wearable device 100 includes a power storage device, or battery, that stores and distributes power to components of the head mounted wearable device 100.
  • the head mounted wearable device 100 includes a gaze tracking device 115 to detect and track eye gaze direction and movement. Data captured by the gaze tracking device 115 may be processed to detect and track gaze direction and movement as a user input.
  • the gaze tracking device 115 is provided in one of the two arm portions 130, simply for purposes of discussion and illustration.
  • the gaze tracking device 115 is provided in the same arm portion 130 as the display device 104, so that user eye gaze can be tracked not only with respect to objects in the physical environment, but also with respect to the content output for display by the display device 104.
  • a gaze tracking device 115 is provided in each of the two arm portions 130 to provide for gaze tracking of each of the two eyes of the user.
  • FIGs. 2A-2D illustrate operation of the example gaze tracking device 115.
  • FIGs. 2A and 2B are partial perspective views of the example gaze tracking device 115 provided in one of the two arm portions 130 of the head mounted wearable device 100, simply for ease of discussion and illustration. As noted above, in some examples, a gaze tracking device 115 is provided in each of the two arm portions 130.
  • FIGs. 2C and 2D are schematic illustrations, respectively corresponding to FIGs. 2A and 2B, of the operation of the gaze tracking device 115.
  • the gaze tracking device 115 includes an image sensor 117 (for example, a camera) and a light source 119.
  • the lens 127 includes a reflective portion.
  • the image sensor 117 may capture an image of the eye of the user based on a reflection of the eye of the user at the reflective portion of the lens 127.
  • the reflective portion of the lens 127 may be defined by a reflective coating applied to the lens 127.
  • the reflective coating may be made of a material that provides reflective properties, but does not obstruct the user’s view through the lens 127.
  • the reflective coating may be a near infrared coating material.
  • the capture of the reflected image of the eye of the user may be facilitated by illuminating the eye of the user.
  • the light source 119 may emit light toward the lens 127 of the head mounted wearable device 100.
  • the light emitted by the light source 119 may be reflected by the lens 127, for example, the reflective portion of the lens 127, toward the eye of the user to illuminate the eye of the user.
  • the image sensor 117 may capture an image of the illuminated eye of the user reflected at the lens 127, for example, at the reflective portion of the lens 127.
  • the light source 119 may emit light that is not visible to the user, so that the light emitted by the light source 119 is not a distraction, a source of discomfort, and the like while the head mounted wearable device 100 is worn.
  • the light source 119 may emit infrared light, so that the light is not visible to the user.
  • the frame 110 of the head mounted wearable device 100 may experience deflection, or deformation. This may occur due to, for example, a head size and/or shape of the user wearing the head mounted wearable device 100, movement or slippage of the head mounted wearable device 100, and other such factors. Deformation or deflection or slippage of the frame 110 that causes, for example, a relative shift in position and/or orientation between the image sensor 117 of the gaze tracking device 115 and the lens 127 may affect the accuracy of eye/gaze tracking performed by the gaze tracking device 115 based on the images captured by the image sensor 117 of the gaze tracking device 115.
  • deformation or deflection or slippage that causes a relative shift in position and/or orientation between one or both of the arm portion(s) 130 in which the display device(s) 104 is/are provided and the front frame portion 120 of the frame 110 may impact the user’s ability to view visual content output by the display device 104.
  • FIGs. 3A-3C provide a perspective view, a top view, and a side view of the frame 110 of the head mounted wearable device 100 in an at-rest state, or a baseline state, or a reference state. In the reference state, little to no forces are applied to the frame 110 that cause any type of deflection, or deformation of the frame 110.
  • the gaze tracking device 115 may be calibrated with the frame 110 in the reference state, so that gaze tracking done by the gaze tracking device 115 may be coordinated with content output by the display device 104, for the detection of user inputs and/or interactions with the content output by the display device 104 and/or with objects in the physical environment, and the like.
  • FIGs. 4A-4C provide a perspective view, a top view, and a side view, of the frame 110 of the head mounted wearable device 100, in a first example deformed state.
  • a force has been applied to the frame 110 causing one or both of the arm portions 130 to be deflected outward, for example outward with respect to the front frame portion 120 of the frame 110.
  • a first force has been applied to a first arm portion 130A, causing the first arm portion 130A to be deflected outward, in the direction of the arrow A1.
  • a second force has been applied to a second arm portion 130B, causing the second arm portion 130B to be deflected in the direction of the arrow A2.
  • In FIGs. 4B and 4C, the contour of the front frame portion 120 and the arm portions 130 in the reference state is shown in dashed lines, so that the change in contour is visible.
  • FIGs. 5A-5C provide a perspective view, a top view, and a side view, of the frame 110 of the head mounted wearable device 100, in a second example deformed state in which a twisting force has been applied to the frame 110.
  • a first force has been applied to the first arm portion 130A, causing the first arm portion 130A to be deflected upward, in the direction of the arrow B1, and a second force has been applied to the second arm portion 130B, causing the second arm portion 130B to be deflected downward, in the direction of the arrow B2.
  • the position of the first arm portion 130A and the second arm portion 130B in the at-rest state is shown in dashed lines, so that the deflection is visible.
  • the deflection of the first arm portion 130A and the second arm portion 130B in this manner may be due to, for example, a shifting of the head mounted wearable device 100 on the head of the user, wear at the hinge portions 140, and the like.
  • the deflection of the arm portions 130 has also caused twisting of the front frame portion 120 of the frame 110, as can be seen in FIG. 5C.
  • the first example deformed state and the second example deformed state shown in FIGs. 4A-5C provide just two examples of how deflection of the arm portions 130 of the frame 110 may cause deformation of the frame 110.
  • deflection or deformation of the frame 110 may result in a shifting of a relative position and/or orientation of the image sensor 117 and the lens 127 from which the image of the user’s eye is captured.
  • the principles to be described herein may be applied in response to other types of deformation experienced by the frame 110 that are not explicitly shown herein.
  • the shift (from the reference state, at which the gaze tracking device was calibrated) in position and/or orientation of the image sensor 117 relative to the lens 127, from which the image of the user’s eye is captured, may affect the accuracy of eye/gaze tracking performed by the gaze tracking device 115.
  • the corresponding shift in position and/or orientation of the gaze tracking device 115 relative to the lens 127, from the reference state to the deformed state, may fall outside of the calibration thresholds, and adversely affect the accuracy of the eye/gaze tracking performed by the gaze tracking device 115.
  • the corresponding shift in position and/or orientation of the display device 104 relative to the lens 127, from the reference state to the deformed state may fall outside of the calibration thresholds of the display device 104, and adversely affect the user’s visibility of visual content output by the display device 104.
  • the trend toward lighter weight, smaller form factor head mounted wearable devices may make the frame 110 more susceptible to deflection and/or deformation and/or slippage.
  • Systems and methods, in accordance with implementations described herein, provide for the detection of deformation and/or slippage of the frame of a head mounted wearable device, such as the example head mounted wearable device 100 described above.
  • systems and methods, in accordance with implementations described above, provide for the detection of deformation and/or slippage of a frame of the head mounted wearable device, and the estimation of an amount, or degree, or magnitude of deformation, so that it can be determined whether or not the detected deformation and/or slippage may cause a gaze tracking device, such as the example gaze tracking device 115 described above or other gaze tracking device, to fall outside of calibration thresholds, which could adversely impact the accuracy of the output of the gaze tracking device.
  • the estimation of an amount, or degree, or magnitude of deformation provides for a determination of whether or not the detected deformation and/or slippage may cause a display device, such as the example display device 104 described above or other display device, to be outside of calibration thresholds that could adversely impact the visibility of visual content output by the display device.
  • the deformation and/or slippage of the frame causes a shift in a position and/or orientation of an image sensor of the gaze tracking device relative to a lens of the head mounted wearable device, from which an image of the user’s eye is captured by the image sensor to perform the eye tracking.
  • one or more fiducial markers are provided on the lens. Detection of the one or more fiducial markers provides for the determination of a position and/or an orientation of the lens with respect to the image sensor of the gaze tracking device in the deformed state of the frame.
  • an amount, or degree, or magnitude of a difference between a reference position (based on a reference position of the one or more fiducial markers) of the image sensor with respect to the lens and the determined position in the deformed state (based on the detection of the one or more fiducial markers) is used to determine whether or not the deformation and/or slippage will cause the gaze tracking device to operate outside of set calibration thresholds.
  • a correction factor is applied to one or more of the gaze tracking algorithm(s) based on the detected change in position of the one or more fiducial markers. This correction factor may take into account, or compensate for, the changes caused by the deformation and/or slippage of the frame, to maintain accuracy of the eye/gaze tracking performed by the gaze tracking device.
  • At least one correction parameter (e.g., a correction factor) of the gaze tracking algorithm may be determined based on the position and/or orientation of the at least one fiducial marker.
  • the correction parameter is updated based on the detected position and/or orientation of the one or more fiducial markers or a change of the position and/or orientation of the one or more fiducial markers.
  • the correction parameter may be applied to input data to be processed by the gaze tracking algorithm such as light intensity values.
  • a display control algorithm may be adjusted based on the detected position and/or orientation of the one or more fiducial markers or a change of the position and/or orientation of the one or more fiducial markers.
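  • A rough sketch of the threshold-gated correction described in the preceding bullets. The displacement metric (largest per-marker shift), the pixel threshold, and the mean-offset correction factor are illustrative assumptions; the disclosure leaves the specific metric and threshold values open.

```python
import numpy as np

RECALIBRATION_THRESHOLD_PX = 2.0  # hypothetical calibration threshold (pixels)

def deformation_magnitude(reference: np.ndarray, detected: np.ndarray) -> float:
    """Scalar deformation estimate: largest displacement of any fiducial
    marker between its reference and detected image positions."""
    return float(np.linalg.norm(detected - reference, axis=1).max())

def maybe_update_correction(reference, detected, current_offset):
    """Re-derive the correction factor only when the detected shift is large
    enough to take the gaze tracking device outside its calibration threshold."""
    if deformation_magnitude(reference, detected) >= RECALIBRATION_THRESHOLD_PX:
        return (detected - reference).mean(axis=0)  # new correction factor
    return current_offset                           # still within calibration

ref = np.array([[40.0, 30.0], [200.0, 28.0], [120.0, 180.0]])
det = np.array([[45.0, 32.0], [200.5, 28.0], [121.0, 180.5]])
print(maybe_update_correction(ref, det, np.zeros(2)))
```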
  • the following examples are described with respect to a gaze tracking device 115 provided in and/or on one of the two arm portions 130, relative to a corresponding one of the two lenses 127 of the example head mounted wearable device 100, simply for purposes of discussion and illustration.
  • the principles to be described herein are similarly applicable to a head mounted wearable device including gaze tracking devices 115 provided in each of the two arm portions 130 (to provide for eye/gaze tracking of both eyes of the user), and/or a head mounted wearable device including display devices 104 provided in each of the two arm portions 130 (to provide for binocular display of visual content).
  • FIG. 6A is a schematic illustration of a plurality of fiducial markers 600 on one of the lenses 127 of the example head mounted wearable device 100 described above.
  • the example shown in FIG. 6A includes three fiducial markers 600, simply for purposes of discussion and illustration. The principles described herein can be applied to a head mounted wearable device including more, or fewer, fiducial markers, arranged similarly to or differently from the plurality of fiducial markers 600 shown in FIG. 6A.
  • the example shown in FIG. 6A includes a first fiducial marker 600A, a second fiducial marker 600B, and a third fiducial marker 600C arranged on the lens 127.
  • the fiducial markers 600 are arranged outside of a viewing area 650 of the lens 127, so that the fiducial markers 600 do not inhibit or interfere with the user’s view of objects in the ambient environment, content output by the display device 104, and the like.
  • one or more of the fiducial markers 600 may be made of a retroreflective thin film material applied to the surface of the lens 127.
  • the one or more fiducial markers 600 made of the thin film retroreflective material have retroreflective properties that are different from the reflective properties of the reflective portion of the lens 127 described above, so that the one or more fiducial markers 600 are detectable and distinguishable.
  • one or more of the fiducial markers 600 may be formed by a removal of material, for example, from the reflective portion of the lens 127.
  • one or more of the fiducial markers 600 may be formed by masking as retroreflective material is applied to the lens 127 to form the reflective portion of the lens 127.
  • one or more of the fiducial markers 600 may be a laser etched marker formed in the surface of the lens 127. In some examples, one or more of the fiducial markers 600 may be an optical grating formed in the surface of the lens 127. The fiducial markers 600 may be formed so that they are not visible to the user when the head mounted wearable device 100 is worn by the user.
  • the fiducial markers 600, i.e., the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C, are spatially separated, or spaced apart, on the surface of the lens 127.
  • This spatial separation of the fiducial markers 600 on the surface of the lens 127 may make the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C more easily distinguishable from each other.
  • This spatial separation of the fiducial markers 600 on the surface of the lens 127, and the resulting distinction of the fiducial markers 600 may facilitate the detection of changes in position of the fiducial markers 600 in response to deformation of the frame 110 as described above.
  • the fiducial markers 600 are detected, for example, by the image sensor 117 of the gaze tracking device 115. Illumination of the surface of the lens 127 by, for example, the light source 119 of the gaze tracking device 115, may facilitate the detection of the fiducial markers 600 in images captured by the image sensor 117.
  • the light emitted by the light source 119 may be, for example, infrared light, so that the light is not visible to the user wearing the head mounted wearable device 100.
  • a reflection of the infrared light by the fiducial markers 600 may be detectable by the image sensor 117, to enable the detection of the fiducial markers 600.
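  • A minimal sketch of such detection, assuming the markers appear as bright retroreflective returns in the infrared camera frame; the intensity threshold, the blob model, and the synthetic frame are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def detect_fiducials(ir_frame: np.ndarray, intensity_threshold: float = 200.0) -> np.ndarray:
    """Locate bright fiducial reflections in an IR image by thresholding and
    taking the centroid of each connected component (one row per marker)."""
    bright = ir_frame > intensity_threshold  # retroreflectors return strongly
    labels, count = ndimage.label(bright)    # group bright pixels into blobs
    centroids = ndimage.center_of_mass(bright, labels, range(1, count + 1))
    return np.array(centroids)               # (row, col) image coordinates

frame = np.zeros((120, 160))
frame[28:32, 38:42] = 255.0     # synthetic marker reflection
frame[90:94, 120:124] = 255.0   # second synthetic marker
print(detect_fiducials(frame))  # ~[[29.5, 39.5], [91.5, 121.5]]
```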
  • the positioning of the fiducial markers 600 shown in FIG. 6A represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C with the frame 110 in the reference state, in which the frame 110 experiences little to no deformation.
  • the positioning of the fiducial markers 600 in FIG. 6C represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C in response to outward deflection of the arm portions 130, as described above with respect to FIGs. 4A-4C.
  • FIG. 6D represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C in response to further outward deflection of the arm portions 130, beyond the state shown in FIG. 6C.
  • the state shown in FIG. 6D may represent a further outward rotation or deflection of the arm portions 130, beyond a stop point of a rotation device of the hinge portions 140.
  • this further deflection of the arm portions 130 causes a change in contour (for example, a relative flattening) of the front frame portion 120 of the frame 110.
  • as shown in FIG. 6C, the rotation and outward deflection of the arm portions 130 has caused a change in position of the first fiducial marker 600A.
  • the further outward deflection of the arm portions 130 has caused a further change in position of the first fiducial marker 600A, and also a slight change in position of the third fiducial marker 600C.
  • the positioning of the fiducial markers 600 in FIG. 6E represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C in response to a twisting force applied to the frame 110, as described above with respect to FIGs. 5A-5C.
  • Changes in positions and/or orientations of the fiducial markers 600 are detected based on the analysis of images captured by the image sensor 117.
  • the detected changes may include, for example, a horizontal shift or change in position (i.e., along the X axis).
  • the detected changes may include, for example, a vertical shift or change in position (i.e., along the Y axis).
  • the detected changes may include, for example, movement closer to or further from the image sensor 117 (i.e., along the Z axis).
  • detection of a movement of the fiducial markers 600 closer to or further from the image sensor 117 is based on, for example, changes in a detected size and/or shape of the fiducial markers 600.
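  • The size cue for Z-axis movement can be related to distance with a pinhole-camera model. A minimal sketch follows; the focal length and physical marker size are hypothetical values, and the disclosure does not commit to a specific camera model.

```python
FOCAL_LENGTH_PX = 600.0  # hypothetical focal length of the image sensor, in pixels
MARKER_SIZE_MM = 2.0     # hypothetical physical diameter of a fiducial marker

def marker_distance_mm(apparent_size_px: float) -> float:
    """Distance of a marker along the optical (Z) axis, inferred from its
    apparent size in the image under a pinhole-camera model."""
    return FOCAL_LENGTH_PX * MARKER_SIZE_MM / apparent_size_px

# Reference state vs. deformed state: the marker image shrinking from 40 px to
# 36 px implies the lens moved further from the image sensor.
z_shift = marker_distance_mm(36.0) - marker_distance_mm(40.0)
print(f"Z shift: {z_shift:+.2f} mm")  # positive: moved away from the sensor
```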
  • the detected changes in positions and/or orientations of the fiducial markers 600 may be used to provide for adjustment of one or more eye/gaze tracking algorithm(s) implemented by the gaze tracking device 115 to account for the shift in the relative positions of the lens 127 and the gaze tracking device 115, and preserve the accuracy of the eye/gaze tracking data output by the gaze tracking device 115. That is, the detected change in position, and orientation, of the first fiducial marker 600A and/or the second fiducial marker 600B and/or the third fiducial marker 600C in this example may be used to determine an adjustment factor representative of the corresponding change in position of the lens 127 relative to the image sensor 117.
  • the adjustment factor may be applied to one or more of the eye/gaze tracking algorithm(s) implemented by the gaze tracking device 115, to compensate for the change in position of the lens 127 relative to the image sensor 117.
  • This re-calibration of the gaze tracking device 115 may allow the gaze tracking device 115 to accurately track user eye gaze, even in the event of deformation of the frame 110.
  • the tracking of the positions of the fiducial markers 600 may be done substantially continuously while the gaze tracking device 115 is in operation, so that the re-calibration can be performed and updated in substantially real time.
  • the detected changes in positions of the fiducial markers 600 are used to provide for adjustment of one or more display control algorithm(s) implemented by the control system 112, to control the output of content by the display device 104.
  • display parameters associated with the output of content by the display device 104 may be adjusted to account for the shift in the relative positions of the lens 127 and the display device 104.
  • display parameters may be adjusted so that visual content output by the display device 104 remains visible to the user, for example, within the eye box of the user, in the deformed state of the frame 110.
  • a detected change in position and/or orientation, of the first fiducial marker 600A and/or a detected change in position and/or orientation of the second fiducial marker 600B and/or a detected change in position and/or orientation of the third fiducial marker 600C in this example may be used to determine a first adjustment factor representative of the corresponding change in position of the lens 127 relative to the image sensor 117.
  • the first adjustment factor is applied to the one or more display control algorithm(s) implemented by the control system 112, to compensate for the change in position of the lens 127.
  • a second adjustment factor may be determined, based on the first adjustment factor, that is representative of a change in position of the lens 127 relative to the display device 104.
  • the second adjustment factor is applied to the one or more display control algorithm(s) implemented by the control system 112, to compensate for the change in position of the lens 127.
  • This re-calibration of the display device 104 may allow the display device 104 to output visual content in a manner that is visible to the user, i.e., at the output coupler 105/within the eye box of the user, even in the event of deformation of the frame 110.
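  • A minimal sketch of the two adjustment factors described in the preceding bullets, assuming simple linear relationships; the sensor-to-display mapping matrix and all names here are hypothetical stand-ins for device-specific calibration data.

```python
import numpy as np

def first_adjustment(reference_markers: np.ndarray, detected_markers: np.ndarray) -> np.ndarray:
    """First adjustment factor: lens shift relative to the image sensor,
    taken as the mean image-space displacement of the fiducial markers."""
    return (detected_markers - reference_markers).mean(axis=0)

def second_adjustment(first_factor: np.ndarray, sensor_to_display: np.ndarray) -> np.ndarray:
    """Second adjustment factor: the sensor-relative shift mapped into a
    display-relative shift via a (hypothetical) calibration matrix."""
    return sensor_to_display @ first_factor

def shift_display_output(content_origin: np.ndarray, display_factor: np.ndarray) -> np.ndarray:
    # Re-position rendered content so it stays within the user's eye box.
    return content_origin - display_factor

ref = np.array([[40.0, 30.0], [200.0, 28.0]])
det = np.array([[44.0, 31.0], [204.0, 29.0]])
f1 = first_adjustment(ref, det)
f2 = second_adjustment(f1, np.eye(2))  # identity mapping, for illustration only
print(shift_display_output(np.array([64.0, 48.0]), f2))  # -> [60. 47.]
```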
  • FIG. 7A illustrates an example in which different positioning and/or numbers and/or combinations of fiducial markers 700 are incorporated into an example lens 127.
  • the fiducial markers 700 are offset from each other. In some examples, the fiducial markers 700 are offset from each other in a vertical direction (in the example orientation shown in FIG. 7A). In some examples, the fiducial markers 700 are offset from each other in a horizontal direction (in the example orientation shown in FIG. 7A). In some examples, the fiducial markers 700 are offset from each other in both a vertical direction and a horizontal direction (in the example orientation shown in FIG. 7A). In some examples, some of the fiducial markers 700 are offset from each other in a vertical direction, and some of the fiducial markers are offset from each other in a horizontal direction (in the example orientation shown in FIG. 7A).
  • FIGs. 6A-7A illustrate just some example positioning/combinations of positions/numbers of fiducial markers that can be incorporated into the lens of the head mounted wearable device 100, simply for purposes of discussion and illustration. More, or fewer, fiducial markers can be incorporated into the lens 127, in similar or different combinations, arranged similarly to or differently from the examples shown in FIGs. 6A-7A. Numbers and/or sizing and/or positioning/placement of fiducial markers on the lens 127 may be determined based on numerous factors including, for example, a particular configuration of the lens (size, curvature, and the like), relative positioning/placement of the lens with respect to the display device, the eye/gaze tracking device, and other such factors.
  • some, or all, of the fiducial markers are laser etched into the example lens 127.
  • laser etching of the lens 127 includes laser ablation of at least one of, or a plurality of, anti-reflective coating layers 730.
  • the laser ablation removes the anti-reflective coating layer(s) 730 to form a laser ablated area, or etched area 740, in a primer or bonding layer 720 coupling the anti-reflective coating layer(s) 730 to a lens substrate 710.
  • the roughed, or etched area 740 formed by the laser ablation process defines the fiducial marker 700 that creates a localized scattering effect when light is directed at the lens 127.
  • the roughened or etched area 740 may be filled with a material that will still allow for reflection and localized scattering from the etched area 740.
  • the etched area 740 may be small enough so that the etched area 740 remains unfilled, and is not noticeable to the user. That is, in some examples, a dimension, i.e., a size, or a depth, or a contour, of the etched area 740 may be small enough, for example, proportionally small compared to a thickness of the lens substrate 710 and/or an overall size of the lens 127 and the like, that it remains unnoticeable to the wearer of the head mounted wearable device 100.
  • FIG. 8 is a flowchart of an example method 800, in accordance with implementations described herein.
  • the example method may be performed by computing systems implemented in the example head mounted wearable device 100 described above, or other such device having processing/computing capability and/or display capability and/or eye/gaze tracking capability.
  • the example method may include detection, by a gaze tracking device (such as the gaze tracking device 115 described above), of fiducial markers (such as the example fiducial markers 600 described above, or other fiducial markers/combinations of fiducial markers) provided on a lens of a head mounted wearable device.
  • Detection of the fiducial markers may include, for example, illumination of the surface(s) of the lens(es) by a light source of the gaze tracking device and detection of the fiducial markers by an image sensor of the gaze tracking device, such that the fiducial markers are detected through analysis of images captured by the image sensor.
  • detection of the fiducial markers includes detection, by the image sensor, of a reflection of light from the fiducial markers. Detection of the fiducial markers by the image sensor may be performed substantially continuously while the gaze tracking device operates, so that a current position of the fiducial markers can be compared to a previous position of the fiducial markers (block 830). A detected change in position and/or orientation of one or more of the fiducial markers (block 840) may trigger an adjustment to the eye/gaze tracking algorithm implemented by the gaze tracking device/re-calibration of the gaze tracking device (block 850).
  • a detected change in position and/or orientation of one or more of the fiducial markers that is greater than or equal to a set threshold triggers an adjustment to the eye/gaze tracking algorithm implemented by the gaze tracking device 115.
  • the set threshold is representative of a change in position of one or more of the fiducial markers along one of an X axis, a Y axis, or a Z axis that is great enough to trigger a re-calibration of the gaze tracking device.
  • the process may be repeatedly performed until the operation of the gaze tracking device is terminated (block 860).
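  • The loop structure of the example method might be sketched as follows. The callback names, the pixel threshold, and the canned detections are illustrative assumptions; the block numbers refer to the flowchart references in the bullets above.

```python
import numpy as np

def max_marker_change(current: np.ndarray, previous: np.ndarray) -> float:
    """Largest per-marker displacement between consecutive detections (block 830)."""
    return float(np.linalg.norm(current - previous, axis=1).max())

def run_tracking_loop(get_marker_positions, recalibrate, is_running, threshold_px=2.0):
    previous = None
    while is_running():                   # repeat until terminated (block 860)
        current = get_marker_positions()  # detect fiducials in a captured image
        if previous is not None and max_marker_change(current, previous) >= threshold_px:
            recalibrate(current)          # change met the threshold (block 840):
                                          # adjust the gaze algorithm (block 850)
        previous = current

# Dry run with canned marker detections standing in for the device interfaces:
frames = [np.array([[40.0, 30.0]]), np.array([[43.0, 31.0]])]
run_tracking_loop(
    get_marker_positions=lambda: frames.pop(0),
    recalibrate=lambda markers: print("re-calibrated at", markers),
    is_running=lambda: len(frames) > 0,
)
```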
  • spatially relative terms such as “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature in relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may be interpreted accordingly.
  • Example implementations of the concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized implementations (and intermediate structures) of example implementations. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example implementations of the described concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example implementations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)

Abstract

Systems and methods for detecting deformation of a frame of a head mounted wearable device, and for estimating an amount of deformation of the frame, are provided. One or more fiducial markers are provided on a lens of the head mounted wearable device. Positions of the fiducial marker(s) may be detected by an image sensor of the head mounted wearable device. Changes in positions of the fiducial marker(s) may be correlated with a corresponding adjustment to be made in an eye/gaze tracking algorithm implemented by a gaze tracking device of the head mounted wearable device, to maintain accuracy of the eye/gaze tracking performed by the gaze tracking device.

Description

FIDUCIAL BASED TEMPORAL ARM DEFORMATION ESTIMATION
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application is a continuation of, and claims priority to, U.S. Provisional Application No. 63/375,599, filed on September 14, 2022, entitled “FIDUCIAL BASED TEMPORAL ARM DEFORMATION ESTIMATION,” the disclosure of which is incorporated by reference herein in its entirety.
TECHNICAL FIELD
[0002] This description relates in general to deformation in a wearable device, and in particular to the estimation of deformation in a head mounted wearable device.
BACKGROUND
[0003] Wearable computing devices may include, for example, head mounted wearable devices, wrist worn wearable devices, hand worn wearable devices, pendants, and the like. Head mounted wearable devices may include, for example, smart glasses, headsets, goggles, ear buds, and the like. Wrist/hand worn wearable devices may include, for example, smart watches, smart bracelets, smart rings, and the like. Wearable computing devices may include various types of electronic components that provide for functionality of the wearable computing device. In some situations, deformation of the housing or frame of the wearable computing device may affect the functionality of the electronic components and/or the wearable computing device.
SUMMARY
[0004] A frame of a head mounted wearable device may experience deformation due to, for example, size of the head of the user, shape of the head of the user, movement or slippage of the head mounted wearable device while worn by the user, and the like. In some situations, deformation of the frame may impact the accuracy of eye/gaze tracking capability of the head mounted wearable device. Systems and methods, in accordance with implementations described herein, provide for the detection of deformation, and the estimation of an amount of deformation, experienced by a head mounted wearable device including eye tracking capability. The head mounted wearable device includes one or more fiducial markers, for example, on one or both lenses of the head mounted wearable device. Deformation of the frame is detected and estimated based on detection of the one or more fiducial markers. Eye/gaze tracking performed by a gaze tracking device of the head mounted wearable device is adjusted based on the estimated amount of deformation. For example, corrections and/or adjustments may be incorporated into eye/gaze tracking algorithms based on the detected deformation, and the detected shift in position and/or orientation of the one or more fiducial markers. This may improve the accuracy of eye/gaze tracking in the head mounted wearable device, and may improve the accuracy of inputs associated with eye/gaze tracking.
[0005] The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] FIG. 1A illustrates an example head mounted wearable device worn by a user.
[0007] FIG. 1B is a front view, and FIG. 1C is a rear view of the example head mounted wearable device shown in FIG. 1A.
[0008] FIGs. 2A-2D illustrate operation of an example gaze tracking device of the example head mounted wearable device shown in FIGs. 1A-1C.
[0009] FIG. 3A is a perspective view, FIG. 3B is a top view, and FIG. 3C is a side view of a frame of the example head mounted wearable device shown in FIGs. 1A-1C, in a reference state.
[0010] FIG. 4A is a perspective view, FIG. 4B is a top view, and FIG. 4C is a side view of a frame of the example head mounted wearable device shown in FIGs. 1A-1C, in a first deformed state.
[0011] FIG. 5A is a perspective view, FIG. 5B is a top view, and FIG. 5C is a side view of a frame of the example head mounted wearable device shown in FIGs. 1A-1C, in a second deformed state.
[0012] FIGs. 6A-6E illustrate example positioning of example fiducial markers on an example lens of the example head mounted wearable device shown in FIGs. 1A-1C.
[0013] FIG. 7A illustrates example positioning of example fiducial markers on an example lens of the example head mounted wearable device shown in FIGs. 1A-1C.
[0014] FIG. 7B is a cross-sectional view of an example fiducial marker shown in FIG. 7A.
[0015] FIG. 8 is a flowchart of an example method.
DETAILED DESCRIPTION
[0016] This disclosure relates to systems and methods providing for the detection of deformation in a head mounted wearable device, and the estimation of an amount of deformation exhibited by the head mounted wearable device. In some examples, the deformation may be detected based on detection of change in position and/or orientation of one or more fiducial markers on the lens(es) of the head mounted wearable device, from a reference position/orientation at which little to no deformation is exhibited. For example, the position and/or orientation of the one or more fiducial markers may be determined in a world coordinate system or a coordinate system fixed to a portion of the head mounted wearable device (e.g., an image sensor). In some examples, the detected change in position/orientation of the one or more fiducial markers may represent a shift in a position of the lens(es) relative to an image sensor of a gaze tracking device of the head mounted wearable device. In some examples, the detected change in position/orientation of the one or more fiducial markers and/or associated amount of deformation may be taken into account in the eye/gaze tracking performed by the gaze tracking device. In some examples, corrections and/or adjustments may be incorporated into eye/gaze tracking algorithms based on the detected change in position/orientation of the one or more fiducial markers and associated deformation. This may improve the accuracy of user eye gaze tracking, and may improve the detection and processing of user inputs associated with user eye gaze. For example, an implemented eye/gaze tracking algorithm comprises at least one parameter (e.g., a correction coefficient or factor) that depends on the detected position and/or orientation of the one or more fiducial markers or on a change of the detected position and/or orientation of the one or more fiducial markers. The claimed adjusting of the gaze tracking algorithm thus is to include adjusting the parameter. For example, the parameter is applied to input data (e.g., values representing an intensity of detected light or data representing an image of the user’s eye) processed by the eye/gaze tracking algorithm. For example, light reflected or diffracted by the one or more fiducial markers is detected by the gaze tracking device and the parameter of the gaze tracking algorithm is set or changed depending on the detected light (e.g., depending on values representing an intensity of the detected reflected or diffracted light). A change of the parameter may be triggered dependent on whether or not a threshold is met as set out further below. Further, at least one image of the one or more fiducial markers may be captured by the gaze tracking device and the parameter of the eye/gaze tracking algorithm may be set based on the captured image, e.g., depending on a position and/or shape of the one or more fiducial markers in the captured image. In some examples, the detected change in position/orientation of the one or more fiducial markers may represent a shift in a position of the lens(es) relative to a display device of the head mounted wearable device. In some examples, the detected change in position/orientation of the one or more fiducial markers and/or associated amount of deformation is taken into account in the output of visual content by the display device. 
In some examples, corrections and/or adjustments are incorporated into display control algorithms (e.g., a display output algorithm) based on the detected change in position/orientation of the one or more fiducial markers and associated deformation, so that visual content is visible to the user even in the deformed state of the head mounted wearable device. For example, similarly to the gaze tracking algorithm, a display control algorithm comprises at least one parameter (e.g., a correction coefficient or factor) that depends on the detected position and/or orientation of the one or more fiducial markers or on a change of the detected position and/or orientation of the one or more fiducial markers. The adjusting of the display control algorithm is to include adjusting the parameter.
[0017] FIG. 1A illustrates a user wearing an example head mounted wearable device 100 in the form of smart glasses, or augmented reality glasses, including display capability, eye/gaze tracking capability, and computing and/or processing capability. FIG. 1B is a front view, and FIG. 1C is a rear view, of the example head mounted wearable device 100 shown in FIG. 1A. The example head mounted wearable device 100 includes a frame 110. The frame 110 includes a front frame portion 120, and a pair of temple arm portions 130 rotatably coupled to the front frame portion 120 by respective hinge portions 140. The front frame portion 120 includes rim portions 123. The rim portions 123 surround respective optical portions in the form of lenses 127, such that the lenses 127 are received in the rim portions 123. A bridge portion 129 connects the rim portions 123 at a central portion of the front frame portion 120. The temple arm portions 130 are coupled, for example, pivotably or rotatably coupled, to the front frame portion 120 at peripheral portions of the respective rim portions 123. In particular, one of the temple arm portions 130 is coupled, by one of the hinge portions 140, to a first end portion of the front frame portion 120 corresponding to one of the rim portions 123, and the other of the temple arm portions 130 is coupled, by the other of the hinge portions 140, to a second end portion of the front frame portion 120 corresponding to the other of the rim portions 123. In some examples, the lenses 127 are corrective/prescription lenses received in the rim portions 123. In some examples, the lenses 127 are an optical material including glass and/or plastic portions, received in the rim portions 123, that do not necessarily incorporate corrective/prescription parameters.
[0018] In some examples, the wearable device 100 includes a display device 104 that can output visual content, for example, at an output coupler 105, so that the visual content is visible to the user. In the example shown in FIGs. 1B and 1C, the display device 104 is provided in one of the two arm portions 130, simply for purposes of discussion and illustration. In some examples, a display device 104 is provided in each of the two arm portions 130 to provide for binocular output of visual content. In some examples, the display device 104 is a see-through near-eye display. In some examples, the display device 104 is configured to project light from a display source onto a portion of teleprompter glass functioning as a beamsplitter seated at an angle (e.g., 30-45 degrees). In some examples, the beamsplitter provides for reflection and transmission values that allow the light from the display source to be partially reflected while the remaining light is transmitted through. Such an optic design may allow a user to see both physical items in the world, for example, through the lenses 127, next to content (for example, digital images, user interface elements, virtual content, and the like) output by the display device 104. In some implementations, waveguide optics may be used to depict content on the display device 104.
[0019] In some examples, the head mounted wearable device 100 includes at least one of an audio output device 106 (such as, for example, one or more speakers), an illumination device 108, a sensing system 111, a control system 112, at least one processor 114, and an outward facing image sensor 116, or camera. In some examples, the sensing system 111 includes various sensing devices and the control system 112 includes various control system devices including, for example, one or more processors 114 operably coupled to the components of the control system 112. In some examples, the control system 112 includes a communication module providing for communication and exchange of information between the head mounted wearable device 100 and other external devices. In some examples, the head mounted wearable device 100 includes a memory. In some examples, the memory stores executable instructions for execution by, for example, a processor of the control system 112. In some examples, the head mounted wearable device 100 includes a power storage device, or battery, that stores and distributes power to components of the head mounted wearable device 100.
[0020] In some examples, the head mounted wearable device 100 includes a gaze tracking device 115 to detect and track eye gaze direction and movement. Data captured by the gaze tracking device 115 may be processed to detect and track gaze direction and movement as a user input. In the example shown in FIGs. 1B and 1C, the gaze tracking device 115 is provided in one of the two arm portions 130, simply for purposes of discussion and illustration. In the example arrangement shown in FIGs. 1B and 1C, the gaze tracking device 115 is provided in the same arm portion 130 as the display device 104, so that user eye gaze can be tracked not only with respect to objects in the physical environment, but also with respect to the content output for display by the display device 104. In some examples, a gaze tracking device 115 is provided in each of the two arm portions 130 to provide for gaze tracking of each of the two eyes of the user.
[0021] FIGs. 2A-2D illustrate operation of the example gaze tracking device 115. FIGs. 2A and 2B are partial perspective views of the example gaze tracking device 115 provided in one of the two arm portions 130 of the head mounted wearable device 100, simply for ease of discussion and illustration. As noted above, in some examples, a gaze tracking device 115 is provided in each of the two arm portions 130. FIGs. 2C and 2D are schematic illustrations, respectively corresponding to FIGs. 2A and 2B, of the operation of the gaze tracking device 115.
[0022] In this example, the gaze tracking device 115 includes an image sensor 117 (for example, a camera) and a light source 119. In some examples, the lens 127 includes a reflective portion. The image sensor 117 may capture an image of the eye of the user based on a reflection of the eye of the user at the reflective portion of the lens 127. In some examples, the reflective portion of the lens 127 may be defined by a reflective coating applied to the lens 127. In some examples, the reflective coating may be made of a material that provides reflective properties, but does not obstruct the user’s view through the lens 127. For example, the reflective coating may be a near infrared coating material. In some examples, the capture of the reflected image of the eye of the user may be facilitated by illuminating the eye of the user. As shown in FIGs. 2A and 2C, the light source 119 may emit light toward the lens 127 of the head mounted wearable device 100. The light emitted by the light source 119 may be reflected by the lens 127, for example, the reflective portion of the lens 127, toward the eye of the user to illuminate the eye of the user. As shown in FIGs. 2B and 2D, the image sensor 117 may capture an image of the illuminated eye of the user reflected at the lens 127, for example, at the reflective portion of the lens 127. The light source 119 may emit light that is not visible to the user, so that the light emitted by the light source 119 is not a distraction, a source of discomfort, and the like while the head mounted wearable device 100 is worn. For example, the light source 119 may emit infrared light, so that the light is not visible to the user.
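To make the capture-and-track step concrete, the following Python sketch (using OpenCV, which the disclosure does not mandate) locates a pupil center in a captured infrared eye image. The dark-pupil assumption, the threshold value, and the function name are illustrative assumptions, not details of the disclosure.

```python
import cv2

def find_pupil_center(ir_frame):
    """Locate the pupil in an 8-bit grayscale infrared eye image, such as
    the reflected eye image captured by the image sensor.

    Minimal sketch: the pupil is assumed to appear as the largest dark
    blob under infrared illumination (a common 'dark pupil' setup; the
    actual device may differ)."""
    blurred = cv2.GaussianBlur(ir_frame, (7, 7), 0)
    # Keep only the darkest pixels as pupil candidates.
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    # Centroid (x, y) of the pupil blob in image coordinates.
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```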
[0023] In some situations, the frame 110 of the head mounted wearable device 100 may experience deflection, or deformation. This may occur due to, for example, a head size and/or shape of the user wearing the head mounted wearable device 100, movement or slippage of the head mounted wearable device 100, and other such factors. Deformation or deflection or slippage of the frame 110 that causes, for example, a relative shift in position and/or orientation between the image sensor 117 of the gaze tracking device 115 and the lens 127 may affect the accuracy of eye/gaze tracking performed by the gaze tracking device 115 based on the images captured by the image sensor 117 of the gaze tracking device 115. Similarly, deformation or deflection or slippage that causes a relative shift in position and/or orientation between one or both of the arm portion(s) 130 in which the display device(s) 104 is/are provided and the front frame portion 120 of the frame 110 may impact the user’s ability to view visual content output by the display device 104.
[0024] FIGs. 3A-3C provide a perspective view, a top view, and a side view of the frame 110 of the head mounted wearable device 100 in an at-rest state, or a baseline state, or a reference state. In the reference state, little to no forces are applied to the frame 110 that cause any type of deflection, or deformation, of the frame 110. In some examples, the gaze tracking device 115 may be calibrated with the frame 110 in the reference state, so that gaze tracking done by the gaze tracking device 115 may be coordinated with content output by the display device 104, for the detection of user inputs and/or interactions with the content output by the display device 104 and/or with objects in the physical environment, and the like.
[0025] FIGs. 4A-4C provide a perspective view, a top view, and a side view of the frame 110 of the head mounted wearable device 100, in a first example deformed state. In the state shown in FIGs. 4A-4C, a force has been applied to the frame 110 causing one or both of the arm portions 130 to be deflected outward, for example outward with respect to the front frame portion 120 of the frame 110. In particular, a first force has been applied to a first arm portion 130A, causing the first arm portion 130A to be deflected outward, in the direction of the arrow A1. A second force has been applied to a second arm portion 130B, causing the second arm portion 130B to be deflected in the direction of the arrow A2. In FIG. 4B, the position of the arm portions 130 in the reference state is shown in dashed lines, so that the deflection from the reference state is visible. This outward deflection of the arm portions 130 may be due to, for example, the head size of the user being relatively large compared to the width of the front frame portion 120 of the frame 110, shifting of the head mounted wearable device 100 on the head of the user, and the like. In this example, the outward deflection of the arm portions 130 has also caused a change in contour of the front frame portion 120 of the frame 110. In FIGs. 4B and 4C, the contour of the front frame portion 120 and the arm portions 130 in the reference state is shown in dashed lines, so that the change in contour is visible.
[0026] FIGs. 5A-5C provide a perspective view, a top view, and a side view of the frame 110 of the head mounted wearable device 100, in a second example deformed state in which a twisting force has been applied to the frame 110. In the state shown in FIGs. 5A-5C, a first force has been applied to the first arm portion 130A, causing the first arm portion 130A to be deflected upward, in the direction of the arrow B1, and a second force has been applied to the second arm portion 130B, causing the second arm portion 130B to be deflected downward, in the direction of the arrow B2. In FIG. 5C, the position of the first arm portion 130A and the second arm portion 130B in the at-rest state is shown in dashed lines, so that the deflection is visible. The deflection of the first arm portion 130A and the second arm portion 130B in this manner may be due to, for example, a shifting of the head mounted wearable device 100 on the head of the user, wear at the hinge portions 140, and the like. In this example, the deflection of the arm portions 130 has also caused twisting of the front frame portion 120 of the frame 110, as can be seen in FIG. 5C.
[0027] The first example deformed state and the second example deformed state shown in FIGs. 4A-5C provide just two examples of how deflection of the arm portions 130 of the frame 110 may cause deformation of the frame 110. As noted above, deflection or deformation of the frame 110 may result in a shifting of a relative position and/or orientation of the image sensor 117 and the lens 127 from which the image of the user's eye is captured. The principles to be described herein may be applied in response to other types of deformation experienced by the frame 110 that are not explicitly shown herein. The shift (from the reference state, at which the gaze tracking device was calibrated) in position and/or orientation of the image sensor 117 relative to the lens 127, from which the image of the user's eye is captured, may affect the accuracy of eye/gaze tracking performed by the gaze tracking device 115. For example, depending on a degree, or an amount, or a magnitude, of the deflection or deformation experienced by the frame 110, the corresponding shift in position and/or orientation of the gaze tracking device 115 relative to the lens 127, from the reference state to the deformed state, may fall outside of the calibration thresholds, and adversely affect the accuracy of the eye/gaze tracking performed by the gaze tracking device 115. Similarly, depending on a degree, or an amount, or a magnitude, of the deflection or deformation experienced by the frame 110, the corresponding shift in position and/or orientation of the display device 104 relative to the lens 127, from the reference state to the deformed state, may fall outside of the calibration thresholds of the display device 104, and adversely affect the user's visibility of visual content output by the display device 104.
[0028] The trend toward lighter weight, smaller form factor head mounted wearable devices may make the frame 110 more susceptible to deflection and/or deformation and/or slippage. This, coupled with the increasing use of eye/gaze tracking as a user input mode and/or for user interaction with content and/or objects in the physical environment, puts more emphasis on the ability to detect and/or estimate deformation and/or slippage of the frame 110, and to provide for the re-calibration of the gaze tracking device 115 to correct for, or account for, the deformation and preserve accuracy of the eye/gaze tracking.
[0029] Systems and methods, in accordance with implementations described herein, provide for the detection of deformation and/or slippage of the frame of a head mounted wearable device, such as the example head mounted wearable device 100 described above. In particular, systems and methods, in accordance with implementations described herein, provide for the detection of deformation and/or slippage of a frame of the head mounted wearable device, and the estimation of an amount, or degree, or magnitude of deformation, so that it can be determined whether or not the detected deformation and/or slippage may cause a gaze tracking device, such as the example gaze tracking device 115 described above or other gaze tracking device, to be outside of calibration thresholds that could adversely impact the accuracy of the output of the gaze tracking device. Similarly, the estimation of an amount, or degree, or magnitude of deformation provides for a determination of whether or not the detected deformation and/or slippage may cause a display device, such as the example display device 104 described above or other display device, to be outside of calibration thresholds that could adversely impact the visibility of visual content output by the display device.
[0030] In some examples, the deformation and/or slippage of the frame causes a shift in a position and/or orientation of an image sensor of the gaze tracking device relative to a lens of the head mounted wearable device, from which an image of the user's eye is captured by the image sensor to perform the eye tracking. In some examples, one or more fiducial markers are provided on the lens. Detection of the one or more fiducial markers provides for the determination of a position and/or an orientation of the lens with respect to the image sensor of the gaze tracking device in the deformed state of the frame. In some examples, an amount, or degree, or magnitude of a difference between a reference position (based on a reference position of the one or more fiducial markers) of the image sensor with respect to the lens and the determined position in the deformed state (based on the detection of the one or more fiducial markers) is used to determine whether or not the deformation and/or slippage will cause the gaze tracking device to operate outside of set calibration thresholds. In some examples, a correction factor is applied to one or more of the gaze tracking algorithm(s) based on the detected change in position of the one or more fiducial markers. This correction factor may take into account, or compensate for, the changes caused by the deformation and/or slippage of the frame, to maintain accuracy of the eye/gaze tracking performed by the gaze tracking device. For example, at least one correction parameter (e.g., a correction factor) of the gaze tracking algorithm may be determined based on the position and/or orientation of the at least one fiducial marker. For example, the correction parameter is updated based on the detected position and/or orientation of the one or more fiducial markers or a change of the position and/or orientation of the one or more fiducial markers. The correction parameter may be applied to input data to be processed by the gaze tracking algorithm, such as light intensity values. Similarly to adjusting the gaze tracking algorithm, a display control algorithm may be adjusted based on the detected position and/or orientation of the one or more fiducial markers or a change of the position and/or orientation of the one or more fiducial markers.
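As one hedged illustration of determining such a correction from the fiducial markers, the sketch below fits a 2D similarity transform between the reference and currently detected marker positions and uses it to correct eye-image features before the nominal gaze mapping. The least-squares (Procrustes-style) fit is an assumed estimation method; the disclosure does not specify how the correction factor is computed.

```python
import numpy as np

def estimate_lens_shift(reference_pts, detected_pts):
    """Estimate the apparent lens motion relative to the image sensor as a
    2D similarity transform (scale s, rotation R, translation t) fitted to
    reference and detected fiducial marker positions, each of shape (N, 2).

    Least-squares fit; assumes no reflection (det(R) = +1 not enforced)."""
    P = np.asarray(reference_pts, float)
    Q = np.asarray(detected_pts, float)
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - mu_p, Q - mu_q
    U, S, Vt = np.linalg.svd(Pc.T @ Qc)
    R = (U @ Vt).T                  # best-fit rotation: Q ~ s * R @ P + t
    s = S.sum() / (Pc ** 2).sum()   # best-fit scale
    t = mu_q - s * (R @ mu_p)       # best-fit translation
    return s, R, t

def apply_correction(eye_feature_px, s, R, t):
    """Undo the estimated lens shift on a feature measured in the eye image
    before passing it to the nominal (reference-state) gaze mapping."""
    q = np.asarray(eye_feature_px, float)
    return np.linalg.inv(s * R) @ (q - t)
```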
[0031] Hereinafter, systems and methods will be described with respect to a gaze tracking device 115 provided in and/or on one of the two arm portions 130 relative to a corresponding one of the two lenses 127 of the example head mounted wearable device 100, simply for purposes of discussion and illustration. The principles to be described herein are similarly applicable to a head mounted wearable device including gaze tracking devices 115 provided in each of the two arm portions 130 (to provide for eye/gaze tracking of both eyes of the user), and/or a head mounted wearable device including display devices 104 provided in each of the two arm portions 130 (to provide for binocular display of visual content).
[0032] FIG. 6A is a schematic illustration of a plurality of fiducial markers 600 on one of the lenses 127 of the example head mounted wearable device 100 described above. The example shown in FIG. 6A includes three fiducial markers 600, simply for purposes of discussion and illustration. The principles described herein can be applied to a head mounted wearable device including more, or fewer, fiducial markers, arranged similarly to or differently from the plurality of fiducial markers 600 shown in FIG. 6A. The example shown in FIG. 6A includes a first fiducial marker 600A, a second fiducial marker 600B, and a third fiducial marker 600C arranged on the lens 127. In the example shown in FIG. 6A, the fiducial markers 600 are arranged outside of a viewing area 650 of the lens 127, so that the fiducial markers 600 do not inhibit or interfere with the user's view of objects in the ambient environment, content output by the display device 104, and the like.
[0033] In some examples, one or more of the fiducial markers 600 may be made of a retroreflective thin film material applied to the surface of the lens 127. For example, the one or more fiducial markers 600 made of the thin film retroreflective material have retroreflective properties that are different from the reflective properties of the reflective portion of the lens 127 described above, so that the one or more fiducial markers 600 are detectable and distinguishable. In some examples, one or more of the fiducial markers 600 may be formed by a removal of material, for example, from the reflective portion of the lens 127. For example, one or more of the fiducial markers 600 may be formed by masking as retroreflective material is applied to the lens 127 to form the reflective portion of the lens 127. In some examples, one or more of the fiducial markers 600 may be a laser etched marker formed in the surface of the lens 127. In some examples, one or more of the fiducial markers 600 may be an optical grating formed in the surface of the lens 127. The fiducial markers 600 may be formed so that they are not visible to the user when the head mounted wearable device 100 is worn by the user.
[0034] In the example arrangement shown in FIG. 6A, the fiducial markers 600, i.e., the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C, are spatially separated, or spaced apart, on the surface of the lens 127. This spatial separation of the fiducial markers 600 on the surface of the lens 127 may make the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C more easily distinguishable from each other. This spatial separation of the fiducial markers 600 on the surface of the lens 127, and the resulting distinction of the fiducial markers 600, may facilitate the detection of changes in position of the fiducial markers 600 in response to deformation of the frame 110 as described above. As shown in FIG. 6B, the fiducial markers 600 are detected, for example, by the image sensor 117 of the gaze tracking device 115. Illumination of the surface of the lens 127 by, for example, the light source 119 of the gaze tracking device 115, may facilitate the detection of the fiducial markers 600 in images captured by the image sensor 117. In some examples, the light emitted by the light source 119 may be, for example, infrared light, so that the light is not visible to the user wearing the head mounted wearable device 100. A reflection of the infrared light by the fiducial markers 600 may be detectable by the image sensor 117, to enable the detection of the fiducial markers 600.
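A minimal detection sketch, under the assumption that retroreflective markers appear as small, saturated bright blobs in the infrared image, is shown below (OpenCV-based; the threshold and area limits are illustrative values, not parameters from the disclosure):

```python
import cv2

def detect_fiducials(ir_frame, min_area=4, max_area=400):
    """Detect fiducial markers on the lens as small bright blobs in an
    8-bit grayscale infrared image captured by the image sensor.

    Minimal sketch; the intensity threshold and area limits would be
    tuned to the actual marker size, coating, and illumination."""
    _, bright = cv2.threshold(ir_frame, 230, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(bright)
    markers = []
    for i in range(1, n):  # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if min_area <= area <= max_area:
            # Record the (x, y) centroid and apparent area of each
            # candidate marker for later comparison across frames.
            markers.append((tuple(centroids[i]), float(area)))
    return markers
```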
[0035] The positioning of the fiducial markers 600 shown in FIG. 6A represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C with the frame 110 in the reference state, in which the frame 110 experiences little to no deformation. The positioning of the fiducial markers 600 in FIG. 6C represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C in response to outward deflection of the arm portions 130, as described above with respect to FIGs. 4A-4C. The positioning of the fiducial markers 600 in FIG. 6D represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C in response to further outward deflection of the arm portions 130, beyond the state shown in FIG. 6C. In particular, the state shown in FIG. 6D may represent a further outward rotation or deflection of the arm portions 130, beyond a stop point of a rotation device of the hinge portions 140. In this example, this further deflection of the arm portions 130 causes a change in contour (for example, a relative flattening) of the front frame portion 120 of the frame 110. As can be seen in FIG. 6C, the rotation, and outward deflection, of the arm portions 130 has caused a change in position of the first fiducial marker 600A. In FIG. 6D, the further outward deflection of the arm portions 130 has caused a further change in position of the first fiducial marker 600A, and also a slight change in position of the third fiducial marker 600C. The positioning of the fiducial markers 600 in FIG. 6E represents an example positioning of the first fiducial marker 600A, the second fiducial marker 600B and the third fiducial marker 600C in response to a twisting force applied to the frame 110, as described above with respect to FIGs. 5A-5C.
[0036] Changes in positions and/or orientations of the fiducial markers 600 are detected based on the analysis of images captured by the image sensor 117. The detected changes may include, for example, a horizontal shift, or change in position (i.e., along the X axis). The detected changes may include, for example, a vertical shift, or change in position (i.e., along the Y axis). The detected changes may include, for example, movement closer to or further from the image sensor 117 (i.e., along the Z axis). In some examples, detection of a movement of the fiducial markers 600 closer to or further from the image sensor 117 is based on, for example, changes in a detected size and/or shape of the fiducial markers 600. The detected changes in positions and/or orientations of the fiducial markers 600 may be used to provide for adjustment of one or more eye/gaze tracking algorithm(s) implemented by the gaze tracking device 115 to account for the shift in the relative positions of the lens 127 and the gaze tracking device 115, and preserve the accuracy of the eye/gaze tracking data output by the gaze tracking device 115. That is, the detected change in position, and orientation, of the first fiducial marker 600A and/or the second fiducial marker 600B and/or the third fiducial marker 600C in this example may be used to determine an adjustment factor representative of the corresponding change in position of the lens 127 relative to the image sensor 117.
The adjustment factor may be applied to one or more of the eye/gaze tracking algorithm(s) implemented by the gaze tracking device 115, to compensate for the change in position of the lens 127 relative to the image sensor 117. This re-calibration of the gaze tracking device 115 may allow the gaze tracking device 115 to accurately track user eye gaze, even in the event of deformation of the frame 110. The tracking of the positions of the fiducial markers 600 may be done substantially continuously while the gaze tracking device 115 is in operation, so that the re-calibration can be performed and updated in substantially real time.
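A hedged sketch of the per-axis estimation described above follows: X and Y motion from the centroid shift of a marker, and Z motion from the change in its apparent size under a pinhole-camera assumption. The function name and the calibration inputs `focal_px` and `ref_depth_mm` are hypothetical, not values from the disclosure.

```python
import math

def marker_shift_xyz(ref_centroid, ref_area, cur_centroid, cur_area,
                     focal_px, ref_depth_mm):
    """Estimate per-axis motion of one fiducial marker relative to the
    image sensor: X/Y from the centroid shift, Z from the change in
    apparent size (pinhole model: apparent size scales as 1/depth)."""
    dx_px = cur_centroid[0] - ref_centroid[0]
    dy_px = cur_centroid[1] - ref_centroid[1]
    # Apparent linear size ratio; blob area scales as the square of size.
    size_ratio = math.sqrt(cur_area / ref_area)
    cur_depth_mm = ref_depth_mm / size_ratio
    dz_mm = cur_depth_mm - ref_depth_mm
    # Convert the pixel shift to millimeters at the current depth.
    dx_mm = dx_px * cur_depth_mm / focal_px
    dy_mm = dy_px * cur_depth_mm / focal_px
    return dx_mm, dy_mm, dz_mm
```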
[0037] In some examples, the detected changes in positions of the fiducial markers 600 are used to provide for adjustment of one or more display control algorithm(s) implemented by the control system 112, to control the output of content by the display device 104. For example, display parameters associated with the output of content by the display device 104 may be adjusted to account for the shift in the relative positions of the lens 127 and the display device 104. For example, display parameters may be adjusted so that visual content output by the display device 104 remains visible to the user, for example, within the eye box of the user, in the deformed state of the frame 110. That is, a detected change in position and/or orientation of the first fiducial marker 600A and/or a detected change in position and/or orientation of the second fiducial marker 600B and/or a detected change in position and/or orientation of the third fiducial marker 600C in this example may be used to determine a first adjustment factor representative of the corresponding change in position of the lens 127 relative to the image sensor 117. In some examples, the first adjustment factor is applied to the one or more display control algorithm(s) implemented by the control system 112, to compensate for the change in position of the lens 127. In some examples, a second adjustment factor may be determined, based on the first adjustment factor, that is representative of a change in position of the lens 127 relative to the display device 104. In some examples, the second adjustment factor is applied to the one or more display control algorithm(s) implemented by the control system 112, to compensate for the change in position of the lens 127. This re-calibration of the display device 104 may allow the display device 104 to output visual content in a manner that is visible to the user, i.e., at the output coupler 105/within the eye box of the user, even in the event of deformation of the frame 110. A sketch of this two-factor derivation is given after this paragraph.
[0038] The example described above with respect to FIGs. 6A-6E includes three fiducial markers 600, simply for purposes of discussion and illustration. FIG. 7A illustrates an example in which different positioning and/or numbers and/or combinations of fiducial markers 700 are incorporated into an example lens 127. As can be seen from the reference markings provided on the example lens 127 shown in FIG. 7A, in some examples, the fiducial markers 700 are offset from each other. In some examples, the fiducial markers 700 are offset from each other in a vertical direction (in the example orientation shown in FIG. 7A). In some examples, the fiducial markers 700 are offset from each other in a horizontal direction (in the example orientation shown in FIG. 7A). In some examples, the fiducial markers 700 are offset from each other in both a vertical direction and a horizontal direction (in the example orientation shown in FIG. 7A). In some examples, some of the fiducial markers 700 are offset from each other in a vertical direction, and some of the fiducial markers are offset from each other in a horizontal direction (in the example orientation shown in FIG. 7A).
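The sketch below illustrates one hedged way to derive the second adjustment factor from the first, assuming the image sensor and display device sit rigidly in the same arm portion so that a fixed sensor-to-display transform relates the two. `sensor_to_display` and `px_per_mm` are hypothetical calibration quantities; the disclosure does not specify the derivation.

```python
import numpy as np

def display_adjustment(first_adjustment_mm, sensor_to_display):
    """Derive the second adjustment factor (lens motion relative to the
    display device) from the first adjustment factor (lens motion relative
    to the image sensor), given a rigid sensor-to-display rotation.

    `sensor_to_display` is an assumed 3x3 rotation matrix taking vectors
    from the image-sensor frame to the display frame."""
    d = np.asarray(first_adjustment_mm, float)  # lens shift, sensor frame
    return sensor_to_display @ d                # lens shift, display frame

def shift_display_output(content_offset_px, second_adjustment_mm, px_per_mm):
    """Apply the second adjustment factor as a display parameter update,
    nudging the rendered content so it stays within the user's eye box."""
    dx, dy = second_adjustment_mm[0], second_adjustment_mm[1]
    return (content_offset_px[0] - dx * px_per_mm,
            content_offset_px[1] - dy * px_per_mm)
```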
[0039] FIGs. 6A-7A illustrate just some example positioning/combinations of positions/numbers of fiducial markers that can be incorporated into the lens of the head mounted wearable device 100, simply for purposes of discussion and illustration. More, or fewer, fiducial markers can be incorporated into the lens 127, in similar or different combinations, arranged similarly to or differently from the examples shown in FIGs. 6A-7A. Numbers and/or sizing and/or positioning/placement of fiducial markers on the lens 127 may be determined based on numerous factors including, for example, a particular configuration of the lens (size, curvature, and the like), relative positioning/placement of the lens with respect to the display device, the eye/gaze tracking device, and other such factors.
[0040] In some examples, some, or all, of the fiducial markers, including the example fiducial markers 600 shown in FIGs. 6A-6E and/or the example fiducial markers 700 shown in FIG. 7A, are laser etched into the example lens 127. A schematic diagram of a cross-section of the example lens 127, in an area of one of the example fiducial markers 700, is shown in FIG. 7B.
[0041] In some examples, laser etching of the lens 127 includes laser ablation of at least one of, or a plurality of, anti-reflective coating layers 730. The laser ablation removes the anti-reflective coating layer(s) 730 to form a laser ablated area, or etched area 740, in a primer or bonding layer 720 coupling the anti-reflective coating layer(s) 730 to a lens substrate 710. The roughened, or etched, area 740 formed by the laser ablation process defines the fiducial marker 700 that creates a localized scattering effect when light is directed at the lens 127. In some examples, the roughened or etched area 740 (from which the anti-reflective coating material has been removed) may be filled with a material that will still allow for reflection and localized scattering from the etched area 740. In some examples, the etched area 740 may be small enough so that the etched area 740 remains unfilled, and is not noticeable to the user. That is, in some examples, a dimension, i.e., a size, or a depth, or a contour, of the etched area 740 may be small enough, for example, proportionally small compared to a thickness of the lens substrate 710 and/or an overall size of the lens 127 and the like, that it remains unnoticeable to the wearer of the head mounted wearable device 100.
[0042] FIG. 8 is a flowchart of an example method 800, in accordance with implementations described herein. The example method may be performed by computing systems implemented in the example head mounted wearable device 100 described above, or other such device having processing/computing capability and/or display capability and/or eye/gaze tracking capability.
[0043] During operation of a gaze tracking device (such as the gaze tracking device 115 described above, or other such gaze tracking device) of a head mounted wearable device (block 810), one or more fiducial markers (such as the example fiducial markers 600 described above, or other fiducial markers/combinations of fiducial markers provided on a lens of a head mounted wearable device) are detected on one or more lenses of the head mounted wearable device (block 820). Detection of the fiducial markers may include, for example, illumination of the surface(s) of the lens(es) by a light source of the gaze tracking device and detection of the fiducial markers by an image sensor of the gaze tracking device, such that the fiducial markers are detected through analysis of images captured by the image sensor. In some examples, detection of the fiducial markers includes detection of a reflection of light from the fiducial markers, detected by the image sensor. Detection of the fiducial markers by the image sensor may be performed substantially continuously while the gaze tracking device operates, so that a current position of the fiducial markers is compared to a previous position of the fiducial markers (block 830). A detected change in position and/or orientation of one or more of the fiducial markers (block 840) may trigger an adjustment to the eye/gaze tracking algorithm implemented by the gaze tracking device/re-calibration of the gaze tracking device (block 850). In particular, a detected change in position and/or orientation of one or more of the fiducial markers that is greater than or equal to a set threshold triggers an adjustment to the eye/gaze tracking algorithm implemented by the gaze tracking device 115. In some examples, the set threshold is representative of a change in position of one or more of the fiducial markers along one of an X axis, a Y axis, or a Z axis that is great enough to trigger a re-calibration of the gaze tracking device. The process may be repeatedly performed until the operation of the gaze tracking device is terminated (block 860).
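To tie the blocks of FIG. 8 together, the following end-to-end Python sketch runs the detect/compare/adjust loop. The `device` object bundling the capture, detection, and adjustment routines sketched earlier is a hypothetical abstraction, not part of the disclosure, and the default threshold is illustrative.

```python
def run_gaze_tracking(device, threshold_px=2.0):
    """Sketch of the method of FIG. 8: while the gaze tracking device
    operates (block 810), detect the fiducial markers (block 820), compare
    current positions to previous positions (block 830), and when a change
    meets the set threshold (block 840), adjust the eye/gaze tracking
    algorithm (block 850), repeating until operation ends (block 860)."""
    previous = device.detect_fiducials()                 # block 820
    while device.is_operating():                         # blocks 810 / 860
        current = device.detect_fiducials()              # block 820
        for ref, cur in zip(previous, current):          # block 830
            dx = cur[0] - ref[0]
            dy = cur[1] - ref[1]
            if (dx * dx + dy * dy) ** 0.5 >= threshold_px:   # block 840
                device.adjust_gaze_algorithm(current)        # block 850
                previous = current
                break
        device.track_gaze()   # run the (possibly re-calibrated) algorithm
```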
[0044] The terminology used herein is for the purpose of describing particular implementations only and is not intended to be limiting of the implementations. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of the stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
[0045] It will be understood that when an element is referred to as being "coupled," "connected," or "responsive" to, or "on," another element, it can be directly coupled, connected, or responsive to, or on, the other element, or intervening elements may also be present. In contrast, when an element is referred to as being "directly coupled," "directly connected," or "directly responsive" to, or "directly on," another element, there are no intervening elements present. As used herein the term "and/or" includes any and all combinations of one or more of the associated listed items.
[0046] Spatially relative terms, such as "beneath," "below," "lower," "above," "upper," and the like, may be used herein for ease of description to describe one element or feature in relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the term "below" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may be interpreted accordingly.
[0047] Example implementations of the concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized implementations (and intermediate structures) of example implementations. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example implementations of the described concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example implementations.
[0048] It will be understood that although the terms "first," "second," etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a "first" element could be termed a "second" element without departing from the teachings of the present implementations.
[0049] Unless otherwise defined, the terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which these concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present specification and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0050] While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components, and/or features of the different implementations described.

Claims

WHAT IS CLAIMED IS:
1. A head mounted wearable device, comprising: a frame, including: a front frame portion; an arm portion rotatably coupled to a first end portion of the front frame portion; a lens received in a rim portion of the front frame portion; at least one fiducial marker provided on the lens; and a gaze tracking device provided in the arm portion, the at least one fiducial marker being detectable by an image sensor of the gaze tracking device, wherein the gaze tracking device is configured to detect a change in position and/or orientation of the at least one fiducial marker based on image data captured by the image sensor, and to adjust implementation of a gaze tracking algorithm by the gaze tracking device in response to the change in position of the at least one fiducial marker.
2. The head mounted wearable device of claim 1, wherein the gaze tracking device is configured to repeatedly capture image data, to compare a current position of the at least one fiducial marker to a previous position of the at least one fiducial marker, and to apply an updated adjustment to the gaze tracking algorithm based on the comparison.
3. The head mounted wearable device of claim 1 or 2, wherein a plurality of fiducial markers is arranged on a surface of the lens.
4. The head mounted wearable device of claim 3, wherein the gaze tracking device is configured to detect a change in a position of at least one of the plurality of fiducial markers, and to adjust the gaze tracking algorithm based on a relative change in position of the plurality of fiducial markers.
5. The head mounted wearable device of claim 3 or 4, wherein each of the plurality of fiducial markers is one of a retroreflective thin film marker, an optical grating defined in the lens, or a laser etching defined in the lens.
6. The head mounted wearable device of any of claims 3 to 5, wherein the plurality of fiducial markers are spatially separated on the surface of the lens, outside of a previously defined viewing area of the lens.
7. The head mounted wearable device of any of the preceding claims, wherein the lens includes a reflective portion, the gaze tracking device includes a light source configured to illuminate the reflective portion of the lens, and the image sensor of the gaze tracking device is configured to capture images of an eye of a user of the head mounted wearable device reflected by the reflective portion of the lens.
8. The head mounted wearable device of any of the preceding claims, wherein the at least one fiducial marker is defined by a laser etched portion of the lens, from which an anti-reflective coating of the lens has been removed, and wherein the image sensor of the gaze tracking device is configured to detect the at least one fiducial marker based on a scattering of light reflected from the laser etched portion of the lens defining the at least one fiducial marker.
9. The head mounted wearable device of any of the preceding claims, further comprising a display device provided on the frame and configured to output display content, wherein the display device is configured to adjust implementation of a display output algorithm in response to the change in position of the at least one fiducial marker.
10. A non-transitory, computer-readable medium including a memory storing instructions that, when executed by at least one processor of a head mounted wearable device, cause the at least one processor to: detect, by a gaze tracking device coupled in a frame of the head mounted wearable device, a plurality of fiducial markers provided on a lens of the head mounted wearable device; detect, by the gaze tracking device, a change in a position and/or orientation of at least one of the plurality of fiducial markers; adjust implementation of a gaze tracking algorithm by the gaze tracking device in response to the change in the position and/or orientation of the at least one of the plurality of fiducial markers; and operate the gaze tracking device in accordance with the adjusted implementation of the gaze tracking algorithm.
11. The non-transitory, computer-readable medium of claim 10, wherein the gaze tracking device includes at least one image sensor, and wherein the instructions cause the at least one processor to: capture, by the at least one image sensor, image data including the plurality of fiducial markers on the lens; and detect the plurality of fiducial markers from the image data captured by the at least one image sensor.
12. The non-transitory, computer-readable medium of claim 11, wherein the instructions cause the at least one processor to: repeatedly capture the image data including the plurality of fiducial markers on the lens; compare a previous position and/or orientation of the plurality of fiducial markers in previously captured image data to a current position and/or orientation of the plurality of fiducial markers detected in currently captured image data; detect an updated change in the position and/or orientation of the at least one of the plurality of fiducial markers based on the comparison; adjust a previous gaze tracking algorithm to a current gaze tracking algorithm based on the updated change in the position and/or orientation of the at least one of the plurality of fiducial markers; and operate the gaze tracking device in accordance with the current gaze tracking algorithm.
13. The non-transitory, computer-readable medium of any of claims 10 to 12, wherein the plurality of fiducial markers are arranged on a surface of the lens, each of the plurality of fiducial markers being one of a retroreflective thin film marker applied on the surface of the lens, an optical grating defined on the surface of the lens, or a laser etching defined on the surface of the lens.
14. The non-transitory, computer-readable medium of any of claims 10 to 13, wherein the plurality of fiducial markers are defined by a corresponding plurality of laser etched portions of the lens, from which an anti-reflective coating of the lens has been removed, and wherein the image sensor of the gaze tracking device is configured to detect the plurality of fiducial markers based on a scattering of light reflected from the plurality of laser etched portions of the lens defining the plurality of fiducial markers.
15. The non-transitory, computer-readable medium of any of claims 10 to 14, wherein the head mounted wearable device includes a display device coupled in the frame and configured to output display content, and wherein the instructions cause the at least one processor to: adjust a display output algorithm implemented by a display device of the head mounted wearable device in response to detecting the change in the position of the at least one of the plurality of fiducial markers; and operate the display device in accordance with the adjusted display output algorithm.
PCT/US2023/071487 2022-09-14 2023-08-02 Fiducial based temporal arm deformation estimation WO2024059383A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263375599P 2022-09-14 2022-09-14
US63/375,599 2022-09-14

Publications (1)

Publication Number Publication Date
WO2024059383A1

Family

ID=87801498

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/071487 WO2024059383A1 (en) 2022-09-14 2023-08-02 Fiducial based temporal arm deformation estimation

Country Status (1)

Country Link
WO (1) WO2024059383A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210263307A1 (en) * 2020-02-21 2021-08-26 Fotonation Limited Multi-perspective eye acquisition
US11269406B1 (en) * 2019-10-24 2022-03-08 Facebook Technologies, Llc Systems and methods for calibrating eye tracking
US11520152B1 (en) * 2020-08-06 2022-12-06 Apple Inc. Head-mounted display systems with gaze tracker alignment monitoring



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23761373

Country of ref document: EP

Kind code of ref document: A1