CN117980795A - Eye reflection using IR light sources on transparent substrates - Google Patents

Eye reflection using IR light sources on transparent substrates

Publication number: CN117980795A
Application number: CN202280064175.1A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: R·G·辉扎, B·佩托简斯基, 黄琼, S·P·福斯特
Assignee: Apple Inc
Application filed by Apple Inc; priority claimed from PCT/US2022/043956 (WO2023049065A1)
Legal status: Pending
Classification: Eye Examination Apparatus
Abstract

Various embodiments disclosed herein include electronic devices, systems, and methods that detect reflections of light that is generated by a plurality of light sources and reflected from an eye. An example electronic device may include a frame, an image sensor, a transparent substrate coupled to the frame, a waveguide coupled to the transparent substrate, and a processor coupled to a plurality of IR light sources. The transparent substrate may include the plurality of Infrared (IR) light sources, which may be disposed within or on a surface of the transparent substrate in a spatial arrangement. The waveguide may be configured to display a projected image. The processor may be configured to receive sensor data from the image sensor. The sensor data may correspond to a plurality of reflections of light generated by the plurality of IR light sources and reflected from the eye.

Description

Eye reflection using IR light sources on transparent substrates
Technical Field
The present disclosure relates generally to electronic devices, and in particular, to systems, methods, and devices for determining eye characteristics of a user of an electronic device.
Background
Existing eye tracking techniques analyze glints reflected from the user's eyes and captured via an image sensor. Some head-mounted systems may include eye tracking techniques that analyze glints produced by light projected from a light source located at an edge of the device (e.g., a frame of a pair of eyeglasses). Such eye tracking systems may lack accuracy, may require more than one camera to capture a sufficient number of glints, and may rely on sub-optimal eye camera placement for capturing a sufficient number of glints. Accordingly, it may be desirable to provide a means for effectively positioning light sources to produce glints for assessing eye characteristics (e.g., gaze direction, eye orientation, identifying the iris of an eye, etc.) of a head-mountable system.
Disclosure of Invention
Various implementations disclosed herein include devices, systems, and methods that evaluate eye characteristics (e.g., gaze direction, eye orientation, identifying an iris of an eye, etc.) of a user wearing a Head Mounted Device (HMD). Eye characteristic evaluation is based on determining the locations of glints generated using light sources such as Infrared (IR) Light Emitting Diodes (LEDs), micro IR LEDs, mini IR LEDs, etc. The light sources are placed on a transparent substrate (e.g., a lens) that can be placed between the display (or view of the physical environment) and the human eye to illuminate the eye and produce glints (e.g., reflections of the IR LEDs on the eye). The light sources (e.g., IR LEDs) are positioned on or within the transparent substrate, rather than (or in addition to) being positioned on the peripheral edge of the HMD.
In some implementations, the light sources may be connected to the power source with a transparent conductive material and may be separately driven. The light sources, such as IR LEDs (e.g., micro-IR LEDs, mini-IR LEDs, etc.), may be small enough (e.g., less than 100 μm) that the user is unlikely to notice them during use of the HMD, in view of their size and close proximity to the eye. In some implementations, different wavelengths generated by the IR LEDs may be used for different applications.
Compared with placing light sources on an edge of the eyepiece/lens frame, including light sources in the lens/transparent substrate allows a wider range of placement options and can potentially reduce the thickness of the frame surrounding the lens. In addition, due to the multi-stack architecture (e.g., a lens including bias (-), air gap, waveguide, and bias (+) layers), a light source (e.g., an IR LED) may be embedded in the middle of the stack. Light sources in such a multi-stack structure would be imperceptible to the user during use and would provide better accuracy than light sources on the peripheral edge of the HMD, because they are closer to the optical axis of the user's eye. The positioning of each light source relative to adjacent light sources is also important. For example, the light sources should not be too close to each other, because the reflected light (e.g., glints) would then be too close together and reduce the accuracy of gaze estimation or evaluation of other eye characteristics. In addition, a light source should not be too far from the optical axis (e.g., around the edge of the frame), because the light source will not reflect properly for some ocular characteristics (e.g., the reflection may end up on the sclera rather than the pupil), as illustrated by the sketch below.
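For illustration only, the following sketch encodes these two placement constraints as a simple check. The 7 mm minimum spacing and 15 mm off-axis limit, as well as all function names, are assumptions chosen for the example rather than values taken from this disclosure.

```python
import math

MIN_SPACING_MM = 7.0       # assumed: glints from closer sources may merge
MAX_AXIS_OFFSET_MM = 15.0  # assumed: farther sources may reflect off the sclera

def placement_ok(points_mm):
    """points_mm: (x, y) light-source positions relative to the optical axis."""
    for i, (xi, yi) in enumerate(points_mm):
        if math.hypot(xi, yi) > MAX_AXIS_OFFSET_MM:
            return False   # too far off-axis: glint may land on the sclera
        for xj, yj in points_mm[i + 1:]:
            if math.hypot(xi - xj, yi - yj) < MIN_SPACING_MM:
                return False  # too close: merged glints degrade gaze accuracy
    return True

# Example: eight sources on an ellipse centered on the optical axis.
ring = [(14 * math.cos(k * math.pi / 4), 10 * math.sin(k * math.pi / 4))
        for k in range(8)]
print(placement_ok(ring))  # True for this arrangement
```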
In general, one innovative aspect of the subject matter described in this specification can be embodied in an electronic device that includes: a frame; an image sensor; a transparent substrate coupled to the frame, the transparent substrate comprising a plurality of Infrared (IR) light sources, wherein the plurality of IR light sources are configured in a spatial arrangement within or on a surface of the transparent substrate; a waveguide coupled to the transparent substrate, wherein the waveguide is configured to display a projected image; and a processor coupled to the plurality of IR light sources. The processor is configured to receive sensor data from the image sensor, the sensor data corresponding to a plurality of reflections of light generated by the plurality of IR light sources and reflected from the eye.
These and other embodiments can each optionally include one or more of the following features.
In some aspects, the processor is coupled to the plurality of IR light sources via a transparent conductor.
In some aspects, the transparent substrate is configured to display content. In some aspects, the transparent substrate includes a bias layer, and the plurality of IR light sources are configured in a spatial arrangement on a surface of the bias layer. In some aspects, the transparent substrate includes a waveguide.
In some aspects, each light source is equidistant from adjacent light sources. In some aspects, each light source is spaced apart from each adjacent light source based on a minimum distance constraint. In some aspects, the plurality of light sources are embedded within the transparent substrate. In some aspects, the plurality of light sources are connected to a power source via a transparent conductor.
In some aspects, each of the plurality of light sources has a diameter of less than 200 microns. In some aspects, each of the plurality of light sources has a diameter of less than 100 microns. In some aspects, the plurality of light sources are individually addressable. In some aspects, the plurality of light sources are micro Light Emitting Diodes (LEDs) (e.g., also referred to herein as "micro LEDs"). In some aspects, the plurality of light sources are micro Infrared (IR) LEDs. In some aspects, the plurality of light sources are miniature light emitting diodes ("mini LEDs").
In some aspects, the plurality of light sources is divided into subgroups, each subgroup comprising two or more light sources of the plurality of light sources. In some aspects, the subset of the plurality of light sources is dispersed throughout the transparent substrate.
In some aspects, the spatial arrangement includes a geometric shape. In some aspects, the geometric shape includes a parabola, ellipse, hyperbola, or cycloid. In some aspects, the geometric shape is based on a transcendental curve or an algebraic curve.
In some aspects, the plurality of IR light sources are imperceptible to a human eye having average visual acuity when viewed from a distance of 1 cm to 5 cm. In some aspects, the electronic device is a Head Mounted Device (HMD).
In general, one innovative aspect of the subject matter described in this specification can be embodied in methods that include, at an electronic device having a processor, the acts of: light is generated from a plurality of light sources configured in a spatial arrangement within or on a surface of a transparent substrate, wherein the plurality of light sources are configured to direct light toward an eye, and wherein a waveguide is coupled to the transparent substrate and configured to display a projected image. The method further comprises the acts of: receiving sensor data from an image sensor, the sensor data corresponding to a plurality of reflections of light reflected from the eye; and evaluating a characteristic of the eye based on the sensor data.
These and other embodiments can each optionally include one or more of the following features.
In some aspects, evaluating the characteristic of the eye includes determining an orientation of the eye based on identifying a pattern of the plurality of reflections of light reflected from the eye.
In some aspects, evaluating the characteristic of the eye includes determining a gaze direction of the eye based on the plurality of reflections of light reflected from the eye. In some aspects, evaluating the characteristic of the eye includes performing authentication. In some aspects, evaluating the characteristic of the eye is based on sensor data from a single sensor. In some aspects, performing authentication includes identifying an iris of the eye.
In some aspects, the transparent substrate is configured to display content. In some aspects, the transparent substrate includes a waveguide. In some aspects, the plurality of light sources are embedded within the transparent substrate. In some aspects, the plurality of light sources are located on a surface of the transparent substrate. In some aspects, the transparent substrate includes a bias layer, and the plurality of IR light sources are configured in a spatial arrangement on a surface of the bias layer.
In some aspects, the plurality of light sources are connected to a power source via a transparent conductive material. In some aspects, each of the plurality of light sources has a diameter of less than 200 microns. In some aspects, each of the plurality of light sources has a diameter of less than 100 microns. In some aspects, the plurality of light sources are individually addressable.
In some aspects, the plurality of light sources is divided into subgroups, each subgroup comprising two or more light sources of the plurality of light sources. In some aspects, the subset of the plurality of light sources is dispersed throughout the transparent substrate.
In some aspects, determining the eye characteristic includes determining positions of portions of the eye based on determining the positions of the plurality of glints. In some aspects, the light is Infrared (IR) light. In some aspects, the sensor includes an image sensor, and receiving the reflected light includes receiving the reflected light in accordance with image data from the sensor. In some aspects, the plurality of IR light sources are imperceptible to a human eye having average visual acuity when viewed from a distance of 1 cm to 5 cm. In some aspects, the electronic device is a Head Mounted Device (HMD).
In general, one innovative aspect of the subject matter described in this specification can be embodied in an electronic device that includes a transparent substrate, a plurality of light sources on the transparent substrate, a sensor, a non-transitory computer-readable storage medium, and one or more processors coupled to the non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium includes program instructions that, when executed on the one or more processors, cause the device to perform operations. The operations include: generating light from the plurality of light sources on the transparent substrate, wherein the plurality of light sources are configured to direct the light toward an eye; receiving sensor data from an image sensor, the sensor data corresponding to a plurality of reflections of light reflected from an eye of a user; and evaluating the eye characteristic based on the sensor data.
In accordance with some implementations, a non-transitory computer readable storage medium has stored therein instructions that are computer executable to perform or cause to be performed any of the methods described herein. According to some implementations, an apparatus includes one or more processors, a non-transitory memory, and one or more programs stored in the non-transitory memory and configured to be executed by the one or more processors, and the one or more programs include instructions for performing or causing performance of any of the methods described herein.
Drawings
So that the present disclosure can be understood by those of ordinary skill in the art, a more detailed description may refer to aspects of some illustrative embodiments, some of which are illustrated in the accompanying drawings.
FIG. 1 illustrates a device that displays content and obtains physiological data from a user, according to some implementations.
Fig. 2A illustrates an example of a user wearing a Head Mounted Display (HMD) according to some implementations.
Fig. 2B shows an example view of a transparent substrate (lens) of the HMD of fig. 2A.
FIG. 3 illustrates an example eye tracking system according to some implementations.
Fig. 4 illustrates an example of a plurality of light sources within a transparent substrate according to some implementations.
Fig. 5A-5D illustrate different spatial arrangements of multiple light sources within a transparent substrate of an HMD according to some implementations.
Fig. 6 illustrates different spatial arrangements of clusters of light sources of the plurality of light sources of fig. 5A-5D, according to some implementations.
FIG. 7 is a flowchart representation of a method for evaluating an eye characteristic of a user based on reflected light from a plurality of light sources on a transparent substrate, according to some implementations.
Fig. 8 is a block diagram illustrating device components of an exemplary device according to some implementations.
In accordance with common practice, the various features shown in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some figures may not depict all of the components of a given system, method, or apparatus. Finally, like reference numerals may be used to refer to like features throughout the specification and drawings.
Detailed Description
Numerous details are described to provide a thorough understanding of the example implementations shown in the drawings. However, the drawings illustrate only some example aspects of the disclosure and therefore should not be considered limiting. It will be apparent to one of ordinary skill in the art that other effective aspects or variations do not include all of the specific details set forth herein. Moreover, well-known systems, methods, components, devices, and circuits have not been described in detail so as not to obscure the more pertinent aspects of the example implementations described herein.
Fig. 1 shows an environment 5 (e.g., a room) that includes a device 110 with a display 15. In some implementations, the device 110 displays the content 20 to the user 25. For example, the content 20 may be a button, a user interface icon, a text box, a graphic, an avatar of the user or another user, or the like. In some implementations, the content 20 may occupy the entire display area of the display 15.
Device 110 obtains image data, motion data, and/or physiological data (e.g., pupil data, facial feature data, etc.) from user 25 via one or more sensors (e.g., sensor 32). The device 110 may obtain the eye gaze characteristic data 40 via the sensor 32. In addition, the device 110 includes a light source 34 (e.g., a Light Emitting Diode (LED)) that may be used to illuminate the specularly reflective and diffuse portions of the eye 45 of the user 25.
While this example and other examples discussed herein show a single device 110 in the real-world environment 5, the techniques disclosed herein are applicable to multiple devices and other real-world environments. For example, the functions of device 110 may be performed by a plurality of devices with sensor 32 and light source 34 divided among each respective device, or in any combination.
In some implementations, as shown in fig. 1, the device 110 is a handheld electronic device (e.g., a smart phone or tablet computer). In some implementations, the device 110 is a laptop computer or a desktop computer. In some implementations, the device 110 has a touch pad, and in some implementations, the device 110 has a touch sensitive display (also referred to as a "touch screen" or "touch screen display"). In some implementations, the device 110 is a wearable device such as a Head Mounted Device (HMD).
In some implementations, the device 110 includes an eye tracking system for detecting eye position and eye movement via the eye gaze characteristic data 40. For example, the eye tracking system may include one or more Infrared (IR) LEDs (e.g., light source 34), a camera sensitive to the wavelength emitted by the LEDs (e.g., a Near-IR (NIR) camera), and an illumination source (e.g., an NIR light source) that emits light (e.g., NIR light) toward the eyes of user 25. The LEDs or IR LEDs may come in different size ranges. In some implementations, a "mini LED" may be utilized, which may be in the range of about 100 μm x 100 μm ± 20 μm. Additionally or alternatively, in some implementations, a "micro LED" may be utilized, which may be in the range of about 10 μm x 10 μm ± 2 μm. Further, the illumination source of the device 110 may emit NIR light to illuminate the eyes of the user 25, and the NIR camera may capture images of the eyes of the user 25. In some implementations, images captured by the eye tracking system may be analyzed to detect the position and movement of the eyes of the user 25, or to detect other information about the eyes such as color, shape, status (e.g., fully open eyes, squinting eyes, etc.), pupil dilation, or pupil diameter. Further, gaze points estimated from eye-tracking images may enable gaze-based interactions with content shown on a near-eye display of device 110.
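As a simple illustration of analyzing such images, the sketch below thresholds near-saturated specular glints in a synthetic NIR frame and returns their centroids. The function name, threshold, and synthetic data are assumptions; a production eye tracker would add filtering for false glints (e.g., from eyelashes, tear film, or ambient IR).

```python
import numpy as np
from scipy import ndimage

def detect_glints(nir_image, threshold=240):
    """Return (row, col) centroids of bright specular glints in an 8-bit NIR frame."""
    bright = nir_image >= threshold            # near-saturated specular spots
    labels, n = ndimage.label(bright)          # group adjacent bright pixels
    return ndimage.center_of_mass(bright, labels, range(1, n + 1))

# Toy example: a synthetic 480x640 frame containing two glints.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:103, 200:203] = 255
frame[110:113, 260:263] = 255
print(detect_glints(frame))  # ≈ [(101.0, 201.0), (111.0, 261.0)]
```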
In some implementations, the device 110 has a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing a plurality of functions. In some implementations, the user 25 interacts with the GUI through finger contacts and gestures on the touch-sensitive surface. In some implementations, these functions include image editing, drawing, rendering, word processing, web page creation, disk editing, spreadsheet making, game playing, phone calls, video conferencing, email sending and receiving, instant messaging, fitness support, digital photography, digital video recording, web browsing, digital music playing, and/or digital video playing. Executable instructions for performing these functions may be included in a computer-readable storage medium or other computer program product configured for execution by one or more processors.
In some implementations, physiological data in the form of pupillary responses (e.g., eye gaze characteristic data 40) of one or both eyes 45 of the user 25 (including one or both pupils 50 of the user 25) is detected from glint analysis. The pupillary response of user 25 may cause a change in the size or diameter of pupil 50 via the optic and oculomotor cranial nerves. For example, the pupillary response may include a constrictive response (pupil constriction), i.e., pupil narrowing, or a dilation response (pupil dilation), i.e., pupil widening. In some implementations, the device 110 can detect a pattern of physiological data representing a time-varying pupil diameter.
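A minimal sketch of classifying such a time-varying pupil-diameter pattern follows; the 0.3 mm change threshold, the sampling rate, and the linear-trend summary are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def classify_pupil_response(diam_mm, fps=60.0, min_change_mm=0.3):
    """Label a pupil-diameter series as constriction, dilation, or stable."""
    diam_mm = np.asarray(diam_mm, dtype=float)
    t = np.arange(diam_mm.size) / fps
    slope_mm_per_s = np.polyfit(t, diam_mm, 1)[0]  # overall trend in mm/s
    change = diam_mm[-1] - diam_mm[0]
    if change <= -min_change_mm:
        return "constriction", slope_mm_per_s
    if change >= min_change_mm:
        return "dilation", slope_mm_per_s
    return "stable", slope_mm_per_s

print(classify_pupil_response([4.0, 3.9, 3.7, 3.4, 3.2]))  # ('constriction', ...)
```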
User data (e.g., eye gaze characteristic data 40) may change over time, and device 110 may use the user data to generate and/or provide a representation of the user.
According to some implementations, an electronic device (e.g., device 110) described herein may generate and present an extended reality (XR) environment to a user. In contrast to a physical environment in which people may sense and/or interact without the assistance of an electronic device, an extended reality (XR) environment refers to a completely or partially simulated environment in which people sense and/or interact via an electronic device. For example, the XR environment may include Augmented Reality (AR) content, Mixed Reality (MR) content, Virtual Reality (VR) content, and the like. In the case of an XR system, a subset of the physical movements of a person, or a representation thereof, is tracked, and in response one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner consistent with at least one physical law. As one example, the XR system may detect head movements and, in response, adjust the graphical content and sound field presented to the person in a manner similar to the manner in which such views and sounds would change in the physical environment. As another example, the XR system may detect movement of an electronic device (e.g., mobile phone, tablet computer, laptop computer, etc.) presenting the XR environment and, in response, adjust the graphical content and sound field presented to the person in a manner similar to how such views and sounds would change in the physical environment. In some cases (e.g., for accessibility reasons), the XR system may adjust characteristics of graphical content in the XR environment in response to representations of physical movements (e.g., voice commands).
There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include head-mounted systems, projection-based systems, head-up displays (HUDs), vehicle windshields integrated with display capabilities, windows integrated with display capabilities, displays formed as lenses designed for placement on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablet computers, and desktop/laptop computers. The head-mounted system may have an integrated opaque display and one or more speakers. Alternatively, the head-mounted system may be configured to accept an external opaque display (e.g., a smart phone). The head-mounted system may incorporate one or more imaging sensors for capturing images or video of the physical environment and/or one or more microphones for capturing audio of the physical environment. The head-mounted system may have a transparent or translucent display instead of an opaque display. The transparent or translucent display may have a medium through which light representing an image is directed to the eyes of a person. The display may utilize digital light projection, OLED, LED, μLED, liquid crystal on silicon, laser scanning light source, or any combination of these techniques. The medium may be an optical waveguide, a holographic medium, an optical combiner, an optical reflector, or any combination thereof. In some implementations, the transparent or translucent display may be configured to selectively become opaque. Projection-based systems may employ retinal projection techniques that project a graphical image onto a person's retina. The projection system may also be configured to project the virtual object into the physical environment, for example as a hologram or on a physical surface.
Fig. 2A illustrates an example operating environment 200 in a real-world environment 5 (e.g., a room) that includes a user wearing a device 210 (an HMD). In this example, device 210 is an HMD that includes a transparent or translucent display, which includes a medium through which light representing an image is directed to the eyes of user 25. Specifically, device 210 is an HMD, which may also be referred to herein as "AR glasses" or "XR glasses." Such XR glasses may include a transparent display to view the physical environment, together with a display for viewing other content via retinal projection techniques that project graphical images within or onto the field of view of the human retina.
As shown, the device 210 includes a frame 212 that may be worn on the user's head and may include additional extensions (e.g., arms) that rest over the ears of the user 25 to hold the frame in place on the user's head. The device 210 comprises two displays, one for each of the left and right eyes of the user 25. The frame 212 supports a first lens 215a and a second lens 215b. Each lens 215 includes a transparent substrate. Each lens 215 may be configured as a stack including bias (+/-) layers for a prescription lens, a waveguide, transparent conductors for housing or embedding multiple IR light sources, and the like.
The device 210 further comprises a detector 220a, 220b for each lens 215a, 215b, respectively. The detector 220 may be an image sensor, such as an IR camera, that detects light, such as a flash, reflected from the user's eyes.
In some implementations, the device 210 also includes a projector 240a, 240b for each lens 215a, 215b, respectively. Projector 240 may be used to display XR content to a user (e.g., virtual content presented to the user at a focal distance from device 210 based on a configuration of the lenses). The waveguides stacked within lens 215 may be configured to bend and/or combine light directed toward the eyes of user 25 to provide the appearance of virtual content within the real physical environment 5, as further shown herein with reference to fig. 4. In some implementations, the device 210 may include only one projector 240. For example, a pair of XR glasses may display XR content on only one side of device 210, so that user 25 is less distracted and retains a larger view of physical environment 5.
In some implementations, the device 210 also includes a controller 250. For example, the controller 250 may include a processor and a power source that control the emission of light from the light sources. In some implementations, the controller 250 is a microcontroller that can control the processes described herein for assessing characteristics of the eye (e.g., gaze direction, eye orientation, identifying the iris of the eye) based on sensor data obtained from the detector 220. Alternatively, the controller 250 may be communicatively coupled (e.g., in wireless communication) with another device (such as a mobile phone, tablet computer, etc.), and the controller 250 may send the data collected from the detector 220 for analysis by the other device. In an exemplary implementation, device 210 (with controller 250) is a stand-alone unit that does not communicate with another device and that can project virtual content via projector 240 and evaluate characteristics of the eye via the light sources for eye tracking purposes. In some implementations, the plurality of light sources are individually addressable. For example, a processor within the controller 250 may individually control each light source 230, 232, 234, 236, etc. A pattern of IR glints may be formed based on the spatial arrangement of the light sources, and the controller may control each light source to provide the desired pattern.
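A minimal sketch of such individually addressable control follows. The `GlintController` class and the `set_led` hardware hook are hypothetical names used for illustration; an actual device would drive each LED through its transparent conductor trace.

```python
class GlintController:
    """Sketch of a controller that individually addresses in-lens IR LEDs."""

    def __init__(self, num_leds, set_led):
        self.num_leds = num_leds
        self.set_led = set_led  # assumed hardware hook: set_led(index, on)

    def show_pattern(self, on_indices):
        """Light exactly the requested subset to form a desired glint pattern."""
        on = set(on_indices)
        for i in range(self.num_leds):
            self.set_led(i, i in on)

# Example with a stub driver that just records each LED's state.
states = {}
ctrl = GlintController(8, lambda i, on: states.__setitem__(i, on))
ctrl.show_pattern([0, 2, 4, 6])  # alternating pattern around a ring of eight
```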
Fig. 2B shows an example view of a transparent substrate (e.g., lens 215). In particular, fig. 2B shows a transparent substrate for device 210 with components (some transparent/translucent) for an eye tracking system and an XR display. In this example, the lens 215 includes a plurality of light sources (e.g., mini-LEDs, micro-IR LEDs, etc.) 230, 232, 234, 236, a detector 220, and a controller 250. The controller 250 may control and provide power to the light sources 230, 232, 234, 236 via transparent conductors. For example, as shown, the light source 234 is powered and controlled by the controller 250 via the transparent conductor 252, and the light source 236 is powered and controlled by the controller 250 via the transparent conductor 254. The transparent conductors 252, 254 are configured to have a sufficiently small size and/or to be made of one or more transparent materials (e.g., Transparent Conductive Films (TCFs)) so as to be undetectable by the human eye, and thus will be perceived as transparent and/or translucent when viewing content through the lens 215. Transparent conductors 252, 254 include optically transparent and electrically conductive materials including, but not limited to, Indium Tin Oxide (ITO), broad-spectrum Transparent Conductive Oxides (TCOs), conductive polymers, metal grids and random metal networks, Carbon Nanotubes (CNTs), graphene, nanowire meshes, and/or ultrathin films. In some implementations, the transparent conductors 252, 254 may include a translucent conductor material, such as silver nano-traces, and the like. For example, a translucent material may refer to a material that is not necessarily transparent but is thin enough to be imperceptible to the human eye.
In some implementations, the plurality of light sources 230, 232, 234, 236 are IR sources, such as IR LEDs. In some implementations, the plurality of light sources 230, 232, 234, 236 are micro-IR LEDs. In some implementations, the plurality of light sources 230, 232, 234, 236 are mini-LEDs, also referred to herein as mini-IR LEDs. For example, the size of the micro-IR LEDs or mini-IR LEDs may be small enough to be undetectable by the human eye, and they will therefore be perceived as transparent and/or translucent when viewing content (such as the directly viewed content of the physical environment 5 or XR content via the display 245) through the lens 215. For example, mini-IR LEDs may be 200 μm, 100 μm, 75 μm, or 50 μm, and micro-LEDs may be 25 μm, 10 μm, 5 μm, 1 μm, or another size that is undetectable by the human eye under ordinary use conditions.
The XR display system of device 210 includes projector 240 and display 245 within lens 215; content may be presented to the user at the location of display 245 as shown. However, when powered and controlled by the controller 250, the light from projector 240 is not projected directly to that location as illustrated. Instead, the light from projector 240 is bent via the waveguide such that XR content displayed at display 245 is presented to user 25 at a focal distance from device 210 based on the configuration of the waveguide.
In some implementations, device 210 may cause only one of the lenses 215 to display XR content (e.g., left eye lens 215b would be a normal lens, without detector 220b, without projector 240, and thus without display 245). For example, the left eye view would present only the direct content of physical environment 5 (e.g., as with a pair of ordinary glasses), and the right eye view would present the direct content of physical environment 5 and would also be able to present XR content. For example, only right lens 215a would include the light sources (e.g., micro-IR LEDs, mini-IR LEDs, etc.) 230, 232, 234, 236, projector 240, and display 245 to present XR content.
Fig. 3 illustrates an example environment 300 for an eye tracking system according to some implementations. In particular, the eye tracking system of the example environment 300 illustrates tracking characteristics of the eye 45 via a light source 230 (e.g., micro IR LED, mini IR LED, etc.) of the example lens 215 of the device 210. The eye tracking system of example environment 300 illustrates a single-light system in which a glint from the eye 45 is reflected into a camera (e.g., detector 220) having a lens. For example, as shown in fig. 3, a light source 230 (e.g., an IR LED, etc.) directs light toward the user's eye 45. The light waves are then reflected from the cornea of eye 45 and detected by detector 220. In one aspect, the light source 230 is used to illuminate both the specularly reflective and diffuse portions of the object (e.g., eye 45) and thus may provide at least a threshold illumination level. Providing at least such a threshold illumination level may result in a glint that can be detected in an image captured by detector 220. For example, light rays 302, 304, 306 from light source 230 will produce specular glints 312, 314, 316, respectively.
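Because the cornea acts approximately as a convex mirror, each glint appears where the law of reflection sends a source ray into the camera. The sketch below shows only that single reflection step under this simplified assumption; names and values are illustrative.

```python
import numpy as np

def reflect(incident, normal):
    """Law of reflection: r = d - 2 (d . n) n, with n the unit surface normal."""
    d = np.asarray(incident, dtype=float)
    d = d / np.linalg.norm(d)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# A ray arriving at 45 degrees leaves at 45 degrees on the other side.
print(reflect([1.0, -1.0, 0.0], [0.0, 1.0, 0.0]))  # ≈ [0.707, 0.707, 0.0]
```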
FIG. 4 illustrates an example environment 400 of a plurality of light sources within a transparent substrate according to some implementations. In particular, environment 400 shows an eye 45 looking through a lens 410 in a stacked configuration. Lens 410 illustrates an example lens, such as the lens 215 used with device 210 (e.g., an HMD). The lens 410, as a transparent substrate, is stacked from the user side (e.g., the side facing the user's eye 45) to the world side in the following layers: bias (-) 422, air gap 412, waveguide 415, air gap 414, and bias (+) 424. Each bias 422, 424 (also referred to herein as a "bias layer") may be used for prescription glasses, for example, to change the prescription level based on the size and shape of each bias layer. In some implementations, the prescription level is changed by modifying only one bias layer (e.g., bias (+) 424).
The lens 410 (also referred to herein as a "stack") may include different transparent or translucent layers (e.g., layers that are opaque but thin enough that they are not perceptible), may not include every layer shown in fig. 4 (e.g., only one air gap 412 or 414 may be utilized), and/or the layers may be used in different combinations. For example, additional layers may include a tinted layer for the lens (e.g., tinting for different shades of eyewear), a cover glass (CG) layer, and a tinted cover glass layer. The tinted layer may include an organic Electrochromic (EC) material, also referred to herein as an organic EC tint layer.
The lens 410 includes a plurality of light sources 430, 432, 434, 436, 438, 440 (also referred to herein as light sources 430) embedded within the lens 410 between the waveguide 415 and the bias (+) 424. In some implementations, each light source can be attached to the bias (+) 424 layer. Alternatively, each light source 430-440 may be attached to or embedded within waveguide 415. Alternatively, each light source 430-440 may be attached to any surface of a layer within lens 410. Each light source 430 may be connected to a controller (e.g., controller 250) via a transparent conductor (e.g., transparent conductors 252, 254), not shown within the example environment 400. Alternatively, in some embodiments, each light source 430 may be connected to a controller (e.g., controller 250) via a translucent conductor (e.g., silver nano-traces), not shown within the example environment 400. Translucency may include materials that are opaque but thin enough to be imperceptible to the human eye. Each light source 430-440 may be placed within the air gap 414, as shown, embedded within the waveguide 415, embedded within the air gap 412, or a combination of each, such that each light source 430-440 is located within the lens 410 (e.g., transparent substrate) within the field of view of the eye 45. In summary, a plurality of light sources 430 may be placed at any interface in the stack/lens 410 (e.g., on one side of the tinted layer, behind the tinted CG, on the waveguide itself, or on the CG, as shown).
The light sources 430-440 may be dispersed within the lens 410 and throughout the lens in a particular spatial arrangement. Fig. 5A-5D illustrate examples of different spatial arrangements in which each light source 430-440 may be placed within a lens.
Fig. 5A-5D illustrate different spatial arrangements of multiple light sources within a transparent substrate of an HMD according to some implementations. Fig. 5A-5D illustrate different configuration embodiments 500A-500D, respectively, each including an example lens 510 (e.g., lens 215 for device 210) that includes a detector 520, a projector 540, a display 545, and a plurality of light sources (e.g., micro IR LEDs, mini IR LEDs, etc.) positioned in different spatial arrangements and embedded within each lens (e.g., as described herein and shown with reference to the stacked configuration of fig. 4). Not shown in each of fig. 5A-5D are a controller (e.g., controller 250) and transparent conductors (e.g., transparent conductors 252, 254) that connect to the controller and provide and control power to each light source. In an exemplary implementation, only one detector 520 is required to acquire light reflections from the plurality of light sources 530 for each of the spatial arrangements shown in fig. 5A-5D, as well as the other spatial arrangements discussed herein. That is, because the light sources 530 are embedded within the lens 510, only one detector 520 is required. However, if the light sources 530 were positioned around the edge of the lens, such as on the frame of the lens 510, at least a second detector might be required to obtain reflected light, because the reflection distance from a light source on the frame to the eye is greater. For example, if a camera in a difficult location (e.g., tilted) cannot see a sufficient number of glints, the estimate of an eye characteristic (such as gaze) becomes inaccurate, and a second camera would be required. For example, if the detector observes the eye from 90°, the detector will not be able to detect a glint on the far side of the cornea. In addition, if the user is looking in an extreme direction, it will also be difficult for one detector to detect a glint on the other side of the extreme gaze direction.
Fig. 5A shows a plurality of light sources 530 spatially arranged with an elliptical configuration, wherein a display 545 appears to be superimposed over a portion of the light sources 530. Because the light source 530 (e.g., micro-IR LED, mini-IR LED, etc.) appears transparent/translucent (e.g., undetectable LED) to the human eye due to the proximity of the lens 510 to the human eye when worn, the light source is not visible, but the display 545 is visible to the human eye. As discussed herein, as shown, display 545 represents a location where a user will view XR content when wearing a device including lenses 510, however, content is first projected to a different area within the waveguide, and the waveguide then bends and projects light to that location at display 545. Accordingly, the illustration of display 545 is for illustrative purposes.
Fig. 5B shows a plurality of light sources 530 spatially arranged with an elliptical configuration, wherein the display 545 appears to be superimposed over a portion of the light sources 530. Fig. 5C shows a plurality of light sources 530 in a spatial arrangement having a grid configuration, wherein the display 545 appears to be beside the plurality of light sources 530. In addition, the spatial arrangement of the grid configuration of fig. 5C shows that each light source is uniformly spaced apart such that each light source is equidistant from each adjacent light source. For example, the distance between light sources 532 and 534 is the same as the distance between light sources 534 and 536. Fig. 5D shows a plurality of light sources 530 having a non-uniform spatial arrangement, wherein no light source overlaps the display 545.
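For illustration only, the following sketch generates source coordinates for the elliptical and equidistant-grid arrangements shown in figs. 5A-5C; the function names, the coordinate frame (lens center at the origin), and the dimensions are assumptions.

```python
import math

def elliptical_arrangement(n, a_mm, b_mm):
    """n sources evenly spaced in angle on an ellipse (cf. figs. 5A/5B)."""
    return [(a_mm * math.cos(2 * math.pi * k / n),
             b_mm * math.sin(2 * math.pi * k / n)) for k in range(n)]

def grid_arrangement(rows, cols, pitch_mm):
    """Uniform grid: each source equidistant from its neighbors (cf. fig. 5C)."""
    return [(c * pitch_mm, r * pitch_mm)
            for r in range(rows) for c in range(cols)]

print(grid_arrangement(3, 3, 7.0))  # 3 x 3 grid with a 7 mm pitch
```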
Fig. 6 illustrates different spatial arrangements of clusters of light sources of the plurality of light sources of fig. 5A-5D, according to some implementations. Fig. 6 shows the example configuration embodiment 500A of fig. 5A, which includes a plurality of light sources 530 in a spatial arrangement having an elliptical configuration. Fig. 6 also shows different cluster configurations that may be used for one or more of the plurality of light sources 530. For example, cluster configurations 610, 620, 630, 640 are exemplary cluster configurations that may represent light source 534, as indicated by region 602. Cluster configuration 610 shows an example single-LED cluster including one light source 612 (e.g., a single 75 μm x 50 μm LED). Cluster configuration 620 shows an example multi-LED cluster that includes a plurality of light sources 622 (e.g., 622a, 622c, etc.) in a grid/panel form (e.g., eighty-one 12 μm LEDs in a 250 μm x 250 μm grid). Cluster configuration 630 shows an example multi-LED cluster that includes four light sources 632 (e.g., 632a-632d) in a 2 x 2 array (e.g., four 75 μm x 50 μm LEDs in a 250 μm x 250 μm array). Cluster configuration 640 shows an example multi-LED cluster that includes nine light sources 634 (e.g., 634a-634i) in a 3 x 3 array (e.g., nine 75 μm x 50 μm mLEDs in a 250 μm x 250 μm array). In some implementations, power may be adjusted for each light source in each of the different array forms (e.g., cluster configurations 610, 620, 630, 640, etc.). For example, the power to a center LED (e.g., light source 634e in the middle of the 3 x 3 array) may be higher than that of the adjacent LEDs within the array, making it the brightest light source in order to improve localization of the centroid of the glint.
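A minimal sketch of such a cluster, including the brighter center LED, follows; the pitch, boost factor, and function name are illustrative assumptions rather than values from the disclosure.

```python
def led_cluster(center_um, grid=3, pitch_um=100.0, center_boost=1.5):
    """(x, y, relative drive power) for an n x n LED cluster (cf. fig. 6)."""
    cx, cy = center_um
    half = (grid - 1) / 2.0
    leds = []
    for r in range(grid):
        for c in range(grid):
            # Drive the middle LED brighter to sharpen the glint centroid.
            power = center_boost if (r == half and c == half) else 1.0
            leds.append((cx + (c - half) * pitch_um,
                         cy + (r - half) * pitch_um,
                         power))
    return leds

cluster = led_cluster((0.0, 0.0))  # 3 x 3 cluster; center LED driven brightest
```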
Fig. 7 is a flow chart illustrating an exemplary method 700. In some implementations, a device (e.g., device 110 of fig. 1 or device 210 of fig. 2) performs the techniques of method 700 to evaluate an eye characteristic of a user based on reflected light from a plurality of light sources on a transparent substrate. In some implementations, the techniques of method 700 are performed on a mobile device, desktop computer, laptop computer, HMD, or server device. In some implementations, the method 700 is performed on processing logic (including hardware, firmware, software, or a combination thereof). In some implementations, the method 700 is performed on a processor executing code stored in a non-transitory computer readable medium (e.g., memory). In some implementations, the method 700 is performed in a combination of one or more devices as described herein. For example, sensor data from multiple light sensors may be acquired at an HMD (e.g., device 210), but processing of the data (e.g., evaluating eye characteristics) may be performed at a separate device (e.g., a mobile device such as device 110).
At block 702, the method 700 generates light from a plurality of light sources configured in a spatial arrangement within or on a surface of a transparent substrate, wherein the plurality of light sources are configured to direct light toward an eye. In some implementations, the waveguide is coupled to the transparent substrate and configured to display the projected image. In some implementations, the transparent substrate is configured to display content. For example, a user is wearing an HMD (such as device 210) that includes multiple light sources (e.g., micro IR LEDs, mini IR LEDs, etc., such as light sources 230, 232, etc.).
The light sources produce glints (e.g., specular reflections) by generating light that reflects from a portion of the eye. In some implementations, the glint may be a specular reflection. In some implementations, if a light source is used to illuminate both the specular and diffuse portions of an object (e.g., eye 45 of user 25), the specular "glint" must be in saturation in order to detect the diffuse region of the object. For example, as shown in fig. 3, a light source 230 (e.g., micro-IR LED, mini-IR LED, etc.) shines light at the eye 45, and a detector 220 (e.g., an image sensor, such as an IR camera) detects the glints, such as reflected light rays from the eye 45 (e.g., reflected light rays 314 and 316 from rays 304 and 306, respectively).
In some implementations, the light is IR light. In some implementations, the light source is an LED. In some implementations, the light source is a very small IR LED, such as a micro IR LED or a mini IR LED. Alternatively, another type of light source may be used that sufficiently provides a glint when light from the light source is projected onto the eye (i.e., the Point Spread Function (PSF) of the glint can be detected by the eye tracking system at the detector), while being small enough to appear transparent (e.g., undetectable) to the human eye. In some implementations, the plurality of IR light sources are imperceptible to a human eye having average visual acuity when viewed from a distance of 1 cm to 5 cm. For example, the average vertex distance between the eye and the lenses of a pair of eyeglasses is approximately between 14 mm and 24 mm, so when the apparatus described herein (e.g., an HMD) is worn by a user, the IR light sources within the transparent substrate will not be visible. In some implementations, each of the plurality of light sources has a diameter of less than 200 microns. In some implementations, each of the plurality of light sources has a diameter of less than 100 microns (or even smaller, such as 75 μm, 50 μm, 25 μm, 5 μm, etc.).
In some implementations, the transparent substrate includes a bias layer (e.g., bias (-), bias (+)). For example, for a prescription lens, the bias layer may be modified based on the prescription. In some implementations, the plurality of IR light sources are configured in a spatial arrangement on a surface of the bias layer. For example, as shown in fig. 4, the plurality of IR light sources (430-440) are attached to the bias (+) 424 layer of the lens 410.
At block 704, the method 700 receives sensor data from an image sensor, the sensor data corresponding to a plurality of reflections of light reflected from the eye. For example, the sensor (e.g., detector 220) may be an IR image sensor/detector that receives reflections (e.g., glints) of light from the eye, such as reflected light rays 314 and 316 from light rays 304 and 306, respectively, as shown in fig. 3.
In some implementations, the method 700 determines the locations of glints based on the reflected light received at the sensor. For example, determining the location of a glint may include determining a centroid of the received light. In some implementations, multiple glints may be generated and localized by a sensor (e.g., detector 220). For example, the centroid may be determined based on the unsaturated perimeter (e.g., halo).
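A minimal sketch of the halo-based centroid follows. Saturated pixels carry no usable intensity gradient, so only unsaturated pixels are weighted; the 8-bit saturation level and function name are assumptions.

```python
import numpy as np

def halo_centroid(patch, saturation=255):
    """Sub-pixel glint location from the unsaturated halo around a clipped core."""
    patch = np.asarray(patch, dtype=float)
    weights = np.where(patch >= saturation, 0.0, patch)  # drop the clipped core
    rows, cols = np.indices(patch.shape)
    total = weights.sum()  # assumes the patch contains some unsaturated halo
    return (rows * weights).sum() / total, (cols * weights).sum() / total
```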
At block 706, method 700 determines an eye characteristic based on the sensor data. In some implementations, determining the eye characteristic may be based on the determined location of the glints. For example, for an eye tracking system, the eye characteristics may include gaze direction, eye orientation, identifying the iris of the eye, and so forth. For example, if the electronic device is an HMD, an eye tracking system for the HMD may track a user's gaze direction, eye orientation, iris recognition, and the like.
In some implementations, determining the orientation of the eye is based on identifying a pattern of glints/light reflections in the image. In one example, the gaze direction may be determined using sensor data to identify two points on the eye, e.g., a cornea center and an eyeball center. In another example, the gaze direction may be determined using sensor data (e.g., a pattern of flashes) to directly predict the gaze direction. For example, a machine learning model may be trained to directly predict gaze direction based on sensor data.
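As a minimal sketch of the two-point approach, assuming the cornea center and eyeball center have already been estimated (the substantive step, which is omitted here):

```python
import numpy as np

def gaze_direction(eyeball_center, cornea_center):
    """Optical axis as the unit vector from eyeball center through cornea center."""
    v = np.asarray(cornea_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return v / np.linalg.norm(v)

print(gaze_direction([0.0, 0.0, 0.0], [0.5, 1.0, 5.0]))  # unit gaze vector
```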
In some implementations, for iris recognition, the user may be uniquely identified based on a registration process or previous iris evaluations. For example, method 700 may include evaluating a characteristic of the eye by performing an authentication process. The authentication process may include identifying an iris of the eye. For example, the pattern of glint/light reflections in the image is matched to a unique pattern associated with the user. In some implementations, iris recognition techniques (e.g., matching patterns) may be used for anti-spoofing. For example, there may be multiple enrollment patterns that may be changed and used to authenticate the user's iris against a pre-enrolled biometric template and confirm that the user is the correct person, an actual person, and is authenticating in real time. Iris recognition may be used as the primary authentication mode or as part of a multi-factor or step-wise enhanced authentication. The matching pattern may be stored in a database located on the HMD (e.g., device 210), on another device communicatively coupled to the HMD (e.g., a mobile device in electronic communication with the HMD), on an external device or server (e.g., via a network connection), or on a combination of these or other devices.
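For illustration, a Daugman-style sketch of matching a binary iris code against a pre-enrolled template follows. The disclosure does not prescribe this particular algorithm, and the 0.32 decision threshold is an assumption drawn from the iris-recognition literature.

```python
import numpy as np

def iris_match(code, template, mask=None, threshold=0.32):
    """Compare binary iris codes by normalized Hamming distance."""
    code = np.asarray(code, dtype=bool)
    template = np.asarray(template, dtype=bool)
    valid = np.ones_like(code) if mask is None else np.asarray(mask, dtype=bool)
    # Fraction of valid bits that disagree; lower means a closer match.
    distance = np.count_nonzero((code ^ template) & valid) / np.count_nonzero(valid)
    return distance <= threshold, distance
```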
In some implementations, determining the eye characteristic includes determining positions of a plurality of portions of the eye based on determining the positions of the plurality of glints. For example, as shown in fig. 3, a light source 230 (e.g., micro-IR LED, mini-IR LED, etc.) may illuminate the eye 45 in multiple regions, producing more than one glint, each of which may be detected at the detector 220.
In some implementations, evaluating the characteristic of the eye is based on sensor data from a single sensor. For example, because the plurality of light sources are located directly on a transparent substrate (although transparent) within the user's field of view, only one sensor or camera (e.g., detector 220) is required to capture the light reflections. If the light sources were located around the frame 212 of the device 210, two or more sensors would be required to pick up the glints.
In some implementations, the sensor (e.g., detector 220) includes an image sensor, and receiving the reflected light (e.g., light rays 314 and 316) includes receiving the reflected light according to image data from the sensor. For example, detector 220 is an image sensor that acquires image data of light rays 314 and 316.
In some implementations, a device (e.g., device 210) that performs the techniques of method 700 includes a frame, an image sensor, a transparent substrate coupled to the frame, and the transparent substrate includes a plurality of IR light sources (e.g., micro IR LEDs, mini IR LEDs, etc.). In some implementations, the transparent substrate is configured to display content. The plurality of IR light sources may be configured in a spatial arrangement on a surface of the transparent substrate, and the processor is coupled to the plurality of IR light sources. In some implementations, the processor is configured to receive sensor data from the image sensor, the sensor data corresponding to a plurality of reflections of light generated by the plurality of IR light sources and reflected from the eye.
In some implementations, the transparent substrate includes a waveguide. In some implementations, the plurality of light sources are embedded within the transparent substrate. For example, as shown in fig. 4, light sources 430, 432, etc. are embedded within a transparent substrate (e.g., lens 410) and positioned in the air gap 414 between the waveguide 415 and the bias (+) 424. In some implementations, the light sources 430, 432, etc., and the transparent conductors connecting the light sources to the controller 250 (e.g., power source and processor) are embedded within or on top of the waveguide 415.
In some implementations, a processor (e.g., controller 250) is coupled to the plurality of IR light sources via transparent conductors (e.g., transparent conductors 252, 254 of fig. 2B). For example, the controller 250 includes a processor and a power source that control the emission of light from the light sources 234, 236 via the transparent conductors 252, 254, respectively. In some implementations, the plurality of light sources are individually addressable. For example, a processor within the controller 250 may individually control each light source 230, 232, 234, 236, etc. For example, a pattern of IR glints may be formed based on the spatial arrangement of the light sources, and the controller may control each light source to form the desired pattern. For example, for the spatial arrangement of fig. 5A, the controller 250 may pulse each individual light source at a different frequency, or at the same frequency but with a different offset (e.g., to form a chronologically sequenced ring of IR glints around the elliptical arrangement).
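Reusing the hypothetical `GlintController` sketched earlier, the following illustrates the same-frequency, offset-phase idea: one LED is lit per time slot, so each captured frame contains a glint whose identity is known from its slot. The timing values are illustrative assumptions.

```python
import time

def ring_sequence(controller, frames, period_s=0.008):
    """Pulse one LED at a time around the ring (same rate, staggered offsets)."""
    for k in range(frames):
        controller.show_pattern([k % controller.num_leds])  # one glint per slot
        time.sleep(period_s)
```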
In some implementations, each light source is equidistant from adjacent light sources, as shown in fig. 5C. For example, the spatial arrangement of the plurality of IR light sources is a uniformly spaced 3 x 3 grid, 4 x 4 grid, etc. In some implementations, each light source is spaced apart from each adjacent light source based on a minimum distance constraint. For example, the micro-LEDs may be constrained to be spaced at least about 7 mm apart.
In some implementations, the plurality of light sources is divided into subgroups, and each subgroup includes two or more light sources of the plurality of light sources. In some implementations, the subgroups of the plurality of light sources are dispersed throughout the transparent substrate. For example, the light sources may be grouped into subgroups of three light sources each, and each subgroup may be deployed in any of the spatial arrangements discussed herein (e.g., equidistant grid, ellipse, box, etc.).
In some implementations, the spatial arrangement includes a geometric shape. In some implementations, the geometric shape includes shapes such as a parabola, ellipse, hyperbola, cycloid, and the like. In some implementations, the geometric shape is based on a transcendental curve or an algebraic curve. For example, as shown in figs. 5A-5D, several different spatial arrangements may be provided for the plurality of light sources. Each spatial arrangement may provide improved sensitivity and improved performance for eye tracking and evaluation of eye characteristics as described herein.
Fig. 8 is a block diagram of an example device 800. Device 800 illustrates an exemplary device configuration for devices 110 and 210. While certain specific features are shown, those of ordinary skill in the art will appreciate from the disclosure that various other features are not shown for brevity and so as not to obscure more pertinent aspects of the implementations disclosed herein. To this end, as a non-limiting example, in some implementations, the device 800 includes one or more processing units 802 (e.g., microprocessors, ASICs, FPGAs, GPUs, CPUs, processing cores, and the like), one or more input/output (I/O) devices and sensors 806, one or more communication interfaces 808 (e.g., USB, FIREWIRE, THUNDERBOLT, IEEE 802.3x, IEEE 802.11x, IEEE 802.16x, GSM, CDMA, TDMA, GPS, IR, BLUETOOTH, ZIGBEE, SPI, I2C, and/or similar types of interfaces), one or more programming (e.g., I/O) interfaces 810, one or more displays 812, one or more inwardly and/or outwardly facing image sensor systems 814, a memory 820, and one or more communication buses 804 to interconnect these components and various other components.
In some implementations, one or more communication buses 804 include circuitry that interconnects and controls communications between system components. In some implementations, the one or more I/O devices and sensors 806 include at least one of: an Inertial Measurement Unit (IMU), accelerometer, magnetometer, gyroscope, thermometer, one or more physiological sensors (e.g., blood pressure monitor, heart rate monitor, blood oxygen sensor, blood glucose sensor, etc.), one or more microphones, one or more speakers, a haptic engine, or one or more depth sensors (e.g., structured light, time of flight, etc.), and the like.
In some implementations, the one or more displays 812 are configured to present a view of the physical environment or a graphical environment to the user. In some implementations, the one or more displays 812 correspond to holographic, digital light processing (DLP), liquid crystal display (LCD), liquid crystal on silicon (LCoS), organic light-emitting field-effect transistor (OLET), organic light-emitting diode (OLED), surface-conduction electron-emitter display (SED), field-emission display (FED), quantum-dot light-emitting diode (QD-LED), microelectromechanical system (MEMS), and/or similar display types. In some implementations, the one or more displays 812 correspond to diffractive, reflective, polarizing, holographic, etc. waveguide displays. In one example, device 800 includes a single display. In another example, device 800 includes a display for each eye of the user (e.g., device 210).
In some implementations, the one or more image sensor systems 814 are configured to obtain image data corresponding to at least a portion of the physical environment 5. For example, the one or more image sensor systems 814 include one or more RGB cameras (e.g., with Complementary Metal Oxide Semiconductor (CMOS) image sensors or Charge Coupled Device (CCD) image sensors), monochrome cameras, IR cameras, depth cameras, event-based cameras, and the like. In various implementations, the one or more image sensor systems 814 also include an illumination source that emits light, such as a flash of light. In various implementations, the one or more image sensor systems 814 further include an on-camera Image Signal Processor (ISP) configured to perform a plurality of processing operations on the image data.
Memory 820 includes high-speed random access memory such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices. In some implementations, the memory 820 includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 820 optionally includes one or more storage devices remotely located from the one or more processing units 802. Memory 820 includes a non-transitory computer-readable storage medium.
In some implementations, memory 820 or a non-transitory computer-readable storage medium of memory 820 stores an optional operating system 830 and one or more instruction sets 840. Operating system 830 includes procedures for handling various basic system services and for performing hardware related tasks. In some implementations, the instruction set 840 includes executable software defined by binary information stored in the form of electrical charges. In some implementations, the instruction set 840 is software that is executable by the one or more processing units 802 to implement one or more of the techniques described herein.
The instruction set 840 includes a glint analysis instruction set 842, a physiological tracking instruction set 844, and an LED driver instruction set 846. The instruction set 840 may be embodied as a single software executable or as multiple software executables.
In some implementations, the glint analysis instruction set 842 is executable by the processing unit 802 to determine a location of a glint based on reflected light received at a sensor. The glint analysis instruction set 842 (e.g., the registration instruction set 420 of fig. 4) may be configured to receive reflected light at a sensor (e.g., an IR image sensor/detector) after the reflected light passes through a multi-zone lens having a first zone (e.g., a halo-generating zone) and a second zone (e.g., a zone of normal curvature), wherein the first zone and the second zone have different energy-spreading characteristics (e.g., the first zone has a different curvature, tilt, etc.). Additionally, the glint analysis instruction set 842 may be configured to determine the location of the glint based on the reflected light received at the sensor. To these ends, in various implementations, the instructions include instructions and/or logic therefor, as well as heuristics and metadata therefor.
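To give a concrete sense of glint localization (a generic brightness-thresholding sketch under assumed parameters, not the specific method of the disclosure), the specular reflections in an IR eye image can be segmented and reduced to centroids:

import numpy as np
from scipy import ndimage

def find_glint_centers(ir_image: np.ndarray,
                       threshold: float = 0.9,
                       min_area: int = 3) -> list[tuple[float, float]]:
    """Return (row, col) centroids of bright glint candidates.

    ir_image is assumed to be a 2-D array normalized to [0, 1]; the
    threshold and minimum blob area are illustrative values only.
    """
    mask = ir_image >= threshold                 # keep only very bright pixels
    labels, count = ndimage.label(mask)          # connected components
    centers = []
    for i in range(1, count + 1):
        blob = labels == i
        if blob.sum() >= min_area:               # reject single-pixel noise
            centers.append(ndimage.center_of_mass(blob))
    return centers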
In some implementations, the physiological tracking (e.g., eye gaze characteristics) instruction set 844 is executable by the processing unit 802 to track the eye gaze characteristics or other physiological attributes of the user based on the determined glint locations (e.g., according to the glint analysis instruction set 842) using one or more of the techniques discussed herein or other techniques as may be appropriate. To these ends, in various implementations, the instructions include instructions and/or logic therefor, as well as heuristics and metadata therefor.
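One widely used family of techniques for this step (named here only as an illustrative stand-in, not as the method of the disclosure) maps the pupil-center-to-glint vector to a gaze point through a calibration polynomial fitted by least squares:

import numpy as np

def fit_gaze_mapping(pupil_glint_vecs: np.ndarray,
                     gaze_targets: np.ndarray) -> np.ndarray:
    """Fit a quadratic mapping from pupil-glint vectors (N, 2) to known
    calibration targets (N, 2). Returns coefficients of shape (6, 2)."""
    x, y = pupil_glint_vecs[:, 0], pupil_glint_vecs[:, 1]
    # Design matrix for a second-order polynomial in x and y.
    A = np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, gaze_targets, rcond=None)
    return coeffs

def predict_gaze(vec: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    x, y = vec
    A = np.array([1.0, x, y, x * y, x**2, y**2])
    return A @ coeffs

In practice the calibration targets would come from a brief user calibration routine; the quadratic form here is a common but assumed choice.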
In some implementations, the LED driver instruction set 846 is executable by the processing unit 802 to activate and control a light source (e.g., an IR LED), such as the light source 530 of figs. 5A-5D, using one or more of the techniques discussed herein or other techniques as may be appropriate. To these ends, in various implementations, the instructions include instructions and/or logic therefor, as well as heuristics and metadata therefor.
While the instruction set 840 is shown as residing on a single device, it should be understood that in other implementations any combination of the elements may reside on separate computing devices. Moreover, fig. 8 is intended more as a functional description of the various features present in a particular implementation than as a structural schematic of the implementations described herein. As recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. The actual number of instruction sets, and how features are allocated among them, will vary from one implementation to another and may depend in part on the particular combination of hardware, software, and/or firmware chosen for a particular implementation.
It should be understood that the implementations described above are cited by way of example, and that the present disclosure is not limited to what has been particularly shown and described hereinabove. Rather, the scope includes both combinations and subcombinations of the various features described hereinabove as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
As described above, one aspect of the present technology is to collect and use physiological data to improve the user's electronic device experience in interacting with electronic content. The present disclosure contemplates that in some cases, the collected data may include personal information data that uniquely identifies a particular person or that may be used to identify an interest, characteristic, or predisposition of a particular person. Such personal information data may include physiological data, demographic data, location-based data, telephone numbers, email addresses, home addresses, device characteristics of personal devices, or any other personal information.
The present disclosure recognizes that the use of such personal information data in the present technology can be used to the benefit of users. For example, the personal information data may be used to improve the interaction and control capabilities of the electronic device. Accordingly, use of such personal information data enables calculated control of the electronic device. In addition, the present disclosure contemplates other uses for personal information data that are beneficial to the user.
The present disclosure also contemplates that entities responsible for the collection, analysis, disclosure, transmission, storage, or other use of such personal information and/or physiological data will adhere to established privacy policies and/or privacy practices. In particular, such entities should exercise and adhere to privacy policies and practices that are recognized as meeting or exceeding industry or government requirements for maintaining the privacy and security of personal information data. For example, personal information from a user should be collected for legal and legitimate uses of an entity and not shared or sold outside of those legal uses. In addition, such collection should be done only after the user's informed consent. In addition, such entities should take any required steps to secure and protect access to such personal information data and to ensure that other people who are able to access the personal information data adhere to their privacy policies and procedures. In addition, such entities may subject themselves to third party evaluations to prove compliance with widely accepted privacy policies and practices.
Despite the foregoing, the present disclosure also contemplates embodiments in which a user selectively blocks the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of content delivery services tailored to the user, the present technology can be configured to allow users to "opt in" or "opt out" of participation in the collection of personal information data during registration for services. In another example, users can select not to provide personal information data for targeted content delivery services. In yet another example, users can select not to provide personal information, but permit the transfer of anonymous information for the purpose of improving the functioning of the device.
Therefore, although the present disclosure broadly covers the use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.
In some embodiments, the data is stored using a public/private key system that only allows the owner of the data to decrypt the stored data. In some other implementations, the data may be stored anonymously (e.g., without identifying and/or personal information about the user, such as legal name, user name, time and location data, etc.). Thus, other users, hackers, or third parties cannot determine the identity of the user associated with the stored data. In some implementations, a user may access stored data from a user device other than the user device used to upload the stored data. In these cases, the user may need to provide login credentials to access their stored data.
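As a minimal sketch of such a public/private-key storage scheme (the disclosure names no particular algorithm or library; the Python cryptography package and RSA-OAEP below are assumptions for illustration), data can be encrypted under the owner's public key so that only the private-key holder can decrypt it:

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The data owner generates a key pair; only the owner holds the private key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone holding the public key can store data for the owner...
stored = public_key.encrypt(b"anonymized gaze record", oaep)

# ...but only the owner can decrypt what was stored.
recovered = private_key.decrypt(stored, oaep)
assert recovered == b"anonymized gaze record"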
Numerous specific details are set forth herein to provide a thorough understanding of the claimed subject matter. However, it will be understood by those skilled in the art that the claimed subject matter may be practiced without these specific details. In other instances, methods, devices, or systems known by those of ordinary skill have not been described in detail so as not to obscure the claimed subject matter.
Unless specifically stated otherwise, it is appreciated that throughout the description, discussions utilizing terms such as "processing," "computing," "calculating," "determining," or "identifying" or the like, refer to the action or processes of a computing device, such as one or more computers or similar electronic computing devices, that manipulate or transform data represented as physical, electronic, or magnetic quantities within the computing platform's memory, registers, or other information storage device, transmission device, or display device.
The one or more systems discussed herein are not limited to any particular hardware architecture or configuration. The computing device may include any suitable arrangement of components that provide results conditioned on one or more inputs. Suitable computing devices include a multi-purpose microprocessor-based computer system that accesses stored software that programs or configures the computing system from a general-purpose computing device to a special-purpose computing device that implements one or more implementations of the subject invention. The teachings contained herein may be implemented in software for programming or configuring a computing device using any suitable programming, scripting, or other type of language or combination of languages.
Implementations of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the above examples may be varied, e.g., the blocks may be reordered, combined, or divided into sub-blocks. Some blocks or processes may be performed in parallel.
The use of "adapted" or "configured to" herein is meant to be an open and inclusive language that does not exclude devices adapted or configured to perform additional tasks or steps. In addition, the use of "based on" is intended to be open and inclusive in that a process, step, calculation, or other action "based on" one or more of the stated conditions or values may be based on additional conditions or beyond the stated values in practice. Headings, lists, and numbers included herein are for ease of explanation only and are not intended to be limiting.
It will also be understood that, although the terms "first," "second," etc. may be used herein to describe various objects, these objects should not be limited by these terms. These terms are only used to distinguish one object from another. For example, a first node could be termed a second node, and, similarly, a second node could be termed a first node, without changing the meaning of the description, so long as all occurrences of the "first node" are renamed consistently and all occurrences of the "second node" are renamed consistently. The first node and the second node are both nodes, but they are not the same node.
The terminology used herein is for the purpose of describing particular implementations only and is not intended to be limiting of the claims. As used in the description of this specification and the appended claims, the singular forms "a," "an," and "the" are intended to cover the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or groups thereof.
As used herein, the term "if" may be construed to mean "when" or "upon" or "in response to determining" or "in accordance with a determination" or "in response to detecting" that a stated condition precedent is true, depending on the context. Similarly, the phrase "if it is determined [that a stated condition precedent is true]" or "if [a stated condition precedent is true]" or "when [a stated condition precedent is true]" may be construed to mean "upon determining" or "in response to determining" or "in accordance with a determination" or "upon detecting" or "in response to detecting" that the stated condition precedent is true, depending on the context.
The foregoing description and summary of the invention should be understood to be in every respect illustrative and exemplary, but not limiting, and the scope of the invention disclosed herein is to be determined not by the detailed description of illustrative implementations, but by the full breadth permitted by the patent laws. It is to be understood that the specific implementations shown and described herein are merely illustrative of the principles of this invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.

Claims (88)

1. An electronic device, the electronic device comprising:
a frame;
an image sensor;
a transparent substrate coupled to the frame, the transparent substrate comprising a plurality of Infrared (IR) light sources, wherein the plurality of IR light sources are configured in a spatial arrangement within or on a surface of the transparent substrate;
a waveguide coupled to the transparent substrate, wherein the waveguide is configured to display a projected image; and
a processor coupled to the plurality of IR light sources, the processor configured to:
receive sensor data from the image sensor, the sensor data corresponding to a plurality of reflections of light generated by the plurality of IR light sources and reflected from an eye.
2. The electronic device of claim 1, wherein the processor is coupled to the plurality of IR light sources via a transparent conductor.
3. The electronic device of claim 1 or 2, wherein the transparent substrate is configured to display content.
4. The electronic device of any of claims 1-3, wherein the transparent substrate comprises a deflection layer, and the plurality of IR light sources are arranged in a spatial arrangement on a surface of the deflection layer.
5. The electronic device of claim 1 or 2, wherein the transparent substrate comprises the waveguide.
6. The electronic device of any of claims 1-5, wherein each light source is equidistant from adjacent light sources.
7. The electronic device of any of claims 1-6, wherein each light source is spaced apart from each adjacent light source based on a minimum distance constraint.
8. The electronic device of any of claims 1-7, wherein the plurality of light sources are embedded within the transparent substrate.
9. The electronic device of any one of claims 1-8, wherein the plurality of light sources are connected to a power source via a transparent conductor.
10. The electronic device of any of claims 1-9, wherein the plurality of light sources are less than 200 microns in diameter.
11. The electronic device of any of claims 1-9, wherein the plurality of light sources are less than 100 microns in diameter.
12. The electronic device of any of claims 1-11, wherein the plurality of light sources are individually addressable.
13. The electronic device of any of claims 1-12, wherein the plurality of light sources are micro Light Emitting Diodes (LEDs).
14. The electronic device of any of claims 1-12, wherein the plurality of light sources are micro-Infrared (IR) LEDs.
15. The electronic device of any of claims 1-12, wherein the plurality of light sources are miniature light emitting diodes (mini LEDs).
16. The electronic device of any of claims 1-15, wherein the plurality of light sources are divided into subgroups, each subgroup comprising two or more of the plurality of light sources.
17. The electronic device of claim 16, wherein the sub-groups of the plurality of light sources are distributed throughout the transparent substrate.
18. The electronic device of any of claims 1-17, wherein the spatial arrangement comprises a geometric shape.
19. The electronic device of claim 18, wherein the geometric shape comprises:
a parabola,
an ellipse,
a hyperbola, or
a cycloid.
20. The electronic device of claim 18, wherein the geometry is based on a transcendental curve or an algebraic curve.
21. The electronic device of claim 18, wherein the plurality of IR light sources are imperceptible to a human eye having an average visual acuity when viewed from a distance of 1cm to 5 cm.
22. The electronic device of any of claims 1-21, wherein the electronic device is a Head Mounted Device (HMD).
23. A method, the method comprising:
at an electronic device having a processor:
generating light from a plurality of light sources configured in a spatial arrangement within or on a surface of a transparent substrate, wherein the plurality of light sources are configured to direct the light toward an eye, and wherein a waveguide is coupled to the transparent substrate and configured to display a projected image;
receiving sensor data from an image sensor, the sensor data corresponding to a plurality of reflections of the light reflected from the eye; and
evaluating a characteristic of the eye based on the sensor data.
24. The method of claim 23, wherein evaluating the characteristic of the eye comprises determining an orientation of the eye based on identifying a pattern of the plurality of reflections of the light reflected from the eye.
25. The method of claim 23, wherein evaluating the characteristic of the eye comprises determining a gaze direction of the eye based on the plurality of reflections of the light reflected from the eye.
26. The method of claim 23, wherein evaluating the characteristic of the eye comprises performing authentication.
27. The method of claim 26, wherein performing the authentication comprises identifying an iris of the eye.
28. The method of any one of claims 23 to 27, wherein evaluating the characteristic of the eye is based on sensor data from a single sensor.
29. The method of any of claims 23-28, wherein the transparent substrate is configured to display content.
30. The method of any one of claims 23 to 29, wherein the transparent substrate comprises a waveguide.
31. The method of any one of claims 23 to 30, wherein the plurality of light sources are embedded within the transparent substrate.
32. The method of any one of claims 23 to 31, wherein the plurality of light sources are located on a surface of the transparent substrate.
33. The method of any one of claims 23 to 32, wherein the transparent substrate comprises a deflection layer and the plurality of IR light sources are arranged in a spatial arrangement on a surface of the deflection layer.
34. The method of any one of claims 23 to 33, wherein the plurality of light sources are connected to a power source via a transparent conductive material.
35. The method of any one of claims 23 to 34, wherein the plurality of light sources are less than 200 microns in diameter.
36. The method of any one of claims 23 to 34, wherein the plurality of light sources are less than 100 microns in diameter.
37. The method of any one of claims 23 to 36, wherein the plurality of light sources are individually addressable.
38. The method of any one of claims 23 to 37, wherein the plurality of light sources are divided into subgroups, each subgroup comprising two or more light sources of the plurality of light sources.
39. The method of claim 38, wherein the sub-groups of the plurality of light sources are distributed throughout the transparent substrate.
40. The method of any one of claims 23 to 39, wherein the plurality of IR light sources are imperceptible to a human eye having an average visual acuity when viewed from a distance of 1cm to 5 cm.
41. The method of any of claims 23-38, wherein evaluating the characteristic of the eye comprises determining locations of portions of the eye based on determining locations of a plurality of glints.
42. The method of any one of claims 23 to 41, wherein the light is Infrared (IR) light.
43. The method of any one of claims 23 to 42, wherein the sensor comprises an image sensor and receiving the reflected light comprises receiving the reflected light in accordance with image data from the sensor.
44. The method of any one of claims 23 to 43, wherein the electronic device is a Head Mounted Device (HMD).
45. An electronic device, the electronic device comprising:
a transparent substrate;
a waveguide coupled to the transparent substrate and configured to display a projected image;
a plurality of light sources arranged in a spatial arrangement within or on a surface of the transparent substrate;
an image sensor;
a non-transitory computer-readable storage medium; and
one or more processors coupled to the non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium includes program instructions that, when executed on the one or more processors, cause the electronic device to perform operations comprising:
generating light from the plurality of light sources on the transparent substrate, wherein the plurality of light sources are configured to direct the light toward an eye of a user;
receiving sensor data from the image sensor, the sensor data corresponding to a plurality of reflections of the light reflected from the eye of the user; and
evaluating a characteristic of the eye based on the sensor data.
46. The electronic device of claim 45, wherein evaluating the characteristic of the eye comprises determining an orientation of the eye based on identifying a pattern of the plurality of reflections of the light reflected from the eye.
47. The electronic device of claim 45, wherein evaluating the characteristic of the eye comprises determining a gaze direction of the eye based on the plurality of reflections of the light reflected from the eye.
48. The electronic device of claim 45, wherein evaluating the characteristic of the eye comprises performing authentication.
49. The electronic device of claim 48, wherein performing the authentication comprises identifying an iris of the eye.
50. The electronic device of any of claims 45-49, wherein evaluating the characteristic of the eye is based on sensor data from a single sensor.
51. The electronic device of any of claims 45-50, wherein the transparent substrate is configured to display content.
52. The electronic device of any of claims 45-51, wherein the transparent substrate comprises a waveguide.
53. The electronic device of any of claims 45-52, wherein the plurality of light sources are embedded within the transparent substrate.
54. The electronic device of any of claims 45-53, wherein the plurality of light sources are located on a surface of the transparent substrate.
55. The electronic device of any of claims 45-54, wherein the transparent substrate comprises a deflection layer and the plurality of IR light sources are arranged in a spatial arrangement on a surface of the deflection layer.
56. The electronic device of any of claims 45-55, wherein the plurality of light sources are connected to a power source via a transparent conductive material.
57. The electronic device of any of claims 45-56, wherein a diameter of the plurality of light sources is less than 200 microns.
58. The electronic device of any of claims 45-56, wherein a diameter of the plurality of light sources is less than 100 microns.
59. The electronic device of any of claims 45-58, wherein the plurality of light sources are individually addressable.
60. The electronic device of any of claims 45-59, wherein the plurality of light sources are divided into subgroups, each subgroup comprising two or more light sources of the plurality of light sources.
61. The electronic device of claim 60, wherein the sub-groups of the plurality of light sources are distributed throughout the transparent substrate.
62. The electronic device of any of claims 45-61, wherein the plurality of IR light sources are imperceptible to a human eye having an average visual acuity when viewed from a distance of 1cm to 5 cm.
63. The electronic device of any of claims 45-62, wherein evaluating the characteristic of the eye comprises determining locations of portions of the eye based on determining locations of a plurality of glints.
64. The electronic device of any of claims 45-63, wherein the light is Infrared (IR) light.
65. The electronic device of any of claims 45-64, wherein the sensor comprises an image sensor and receiving reflected light comprises receiving reflected light in accordance with image data from the sensor.
66. The electronic device of any of claims 45-65, wherein the electronic device is a Head Mounted Device (HMD).
67. A non-transitory computer readable storage medium storing program instructions executable by one or more processors on a device to perform operations comprising:
generating light from a plurality of light sources configured in a spatial arrangement within or on a surface of a transparent substrate, wherein the plurality of light sources are configured to direct the light toward an eye, and wherein a waveguide is coupled to the transparent substrate and configured to display a projected image;
receiving sensor data from an image sensor, the sensor data corresponding to a plurality of reflections of the light reflected from the eye; and
evaluating a characteristic of the eye based on the sensor data.
68. The non-transitory computer-readable storage medium of claim 67, wherein evaluating the characteristic of the eye comprises determining an orientation of the eye based on identifying a pattern of the plurality of reflections of the light reflected from the eye.
69. The non-transitory computer-readable storage medium of claim 67, wherein evaluating the characteristic of the eye comprises determining a gaze direction of the eye based on the plurality of reflections of the light reflected from the eye.
70. The non-transitory computer-readable storage medium of claim 67, wherein evaluating the characteristic of the eye comprises performing authentication.
71. The non-transitory computer-readable storage medium of claim 70, wherein performing the authentication comprises identifying an iris of the eye.
72. The non-transitory computer-readable storage medium of any one of claims 67-71, wherein evaluating the characteristic of the eye is based on sensor data from a single sensor.
73. The non-transitory computer-readable storage medium of any one of claims 67-72, wherein the transparent substrate is configured to display content.
74. The non-transitory computer-readable storage medium of any one of claims 67-73, wherein the transparent substrate comprises a waveguide.
75. The non-transitory computer-readable storage medium of any one of claims 67-74, wherein the plurality of light sources are embedded within the transparent substrate.
76. The non-transitory computer-readable storage medium of any one of claims 67-75, wherein the plurality of light sources are located on a surface of the transparent substrate.
77. The non-transitory computer-readable storage medium of any one of claims 67-76, wherein the transparent substrate comprises a deflection layer, and the plurality of IR light sources are configured in a spatial arrangement on a surface of the deflection layer.
78. The non-transitory computer-readable storage medium of any one of claims 67-77, wherein the plurality of light sources are connected to a power source via a transparent conductive material.
79. The non-transitory computer-readable storage medium of any one of claims 67-78, wherein a diameter of the plurality of light sources is less than 200 microns.
80. The non-transitory computer-readable storage medium of any one of claims 67-78, wherein a diameter of the plurality of light sources is less than 100 microns.
81. The non-transitory computer-readable storage medium of any one of claims 67-80, wherein the plurality of light sources are individually addressable.
82. The non-transitory computer-readable storage medium of any one of claims 67-81, wherein the plurality of light sources are divided into subgroups, each subgroup comprising two or more light sources of the plurality of light sources.
83. The non-transitory computer-readable storage medium of claim 82, wherein the sub-groups of the plurality of light sources are dispersed throughout the transparent substrate.
84. The non-transitory computer-readable storage medium of any one of claims 67-83, wherein the plurality of IR light sources are not perceptible to a human eye having an average visual acuity when viewed from a distance of 1cm to 5 cm.
85. The non-transitory computer-readable storage medium of any one of claims 67-84, wherein determining the eye characteristic comprises determining a position of portions of the eye based on determining a position of a plurality of flashes.
85. The non-transitory computer-readable storage medium of any one of claims 67-84, wherein evaluating the characteristic of the eye comprises determining locations of portions of the eye based on determining locations of a plurality of glints.
87. The non-transitory computer-readable storage medium of any one of claims 67-86, wherein the sensor comprises an image sensor, and receiving reflected light comprises receiving reflected light according to image data from the sensor.
88. The non-transitory computer-readable storage medium of any one of claims 67-87, wherein the electronic device is a Head Mounted Device (HMD).
CN202280064175.1A 2021-09-24 2022-09-19 Eye reflection using IR light sources on transparent substrates Pending CN117980795A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163248201P 2021-09-24 2021-09-24
US63/248,198 2021-09-24
US63/248,201 2021-09-24
PCT/US2022/043956 WO2023049065A1 (en) 2021-09-24 2022-09-19 Eye reflections using ir light sources on a transparent substrate

Publications (1)

Publication Number Publication Date
CN117980795A true CN117980795A (en) 2024-05-03

Family

ID=90863673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280064175.1A Pending CN117980795A (en) 2021-09-24 2022-09-19 Eye reflection using IR light sources on transparent substrates

Country Status (1)

Country Link
CN (1) CN117980795A (en)

Similar Documents

Publication Publication Date Title
US11295551B2 (en) Accumulation and confidence assignment of iris codes
US20220269333A1 (en) User interfaces and device settings based on user identification
CN110968189A (en) Pupil modulation as a cognitive control signal
US20220262080A1 (en) Interfaces for presenting avatars in three-dimensional environments
CN117980795A (en) Eye reflection using IR light sources on transparent substrates
WO2023049065A1 (en) Eye reflections using ir light sources on a transparent substrate
WO2022010659A1 (en) Display calibration
US20230324587A1 (en) Glint analysis using multi-zone lens
US20230359273A1 (en) Retinal reflection tracking for gaze alignment
US20230309824A1 (en) Accommodation tracking based on retinal-imaging
US20230288985A1 (en) Adjusting image content to improve user experience
US20240005537A1 (en) User representation using depths relative to multiple surface points
US20240212291A1 (en) Attention control in multi-user environments
US20230351676A1 (en) Transitioning content in views of three-dimensional environments using alternative positional constraints
US20230273985A1 (en) Devices, methods, and graphical user interfaces for authorizing a secure operation
WO2024006107A1 (en) Gaze behavior detection
KR20240041257A (en) User interface response based on gaze-holding event assessment
CN117980867A (en) Interactive event based on physiological response to illumination
WO2023224878A1 (en) Eye tracking using camera lens-aligned retinal illumination
TW202318180A (en) Systems and methods for communicating model uncertainty to users
CN118119915A (en) System and method for communicating model uncertainty to a user
CN117742482A (en) User interface response based on gaze retention event evaluation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination