CN113677619A - Substrate modification by femtosecond laser in dry etching to achieve variable etch depth - Google Patents

Substrate modification by femtosecond laser in dry etching to achieve variable etch depth

Info

Publication number
CN113677619A
CN113677619A, CN202080028185.0A
Authority
CN
China
Prior art keywords
substrate
display
eye
etch mask
etching
Prior art date
Legal status
Pending
Application number
CN202080028185.0A
Other languages
Chinese (zh)
Inventor
谢罗尔·图尔克耶尔马兹
朱塞佩·卡拉菲奥雷
Current Assignee
Meta Platforms Technologies LLC
Original Assignee
Facebook Technologies LLC
Priority date
Filing date
Publication date
Application filed by Facebook Technologies LLC filed Critical Facebook Technologies LLC
Publication of CN113677619A

Classifications

    • C - CHEMISTRY; METALLURGY
    • C03 - GLASS; MINERAL OR SLAG WOOL
    • C03C - CHEMICAL COMPOSITION OF GLASSES, GLAZES OR VITREOUS ENAMELS; SURFACE TREATMENT OF GLASS; SURFACE TREATMENT OF FIBRES OR FILAMENTS MADE FROM GLASS, MINERALS OR SLAGS; JOINING GLASS TO GLASS OR OTHER MATERIALS
    • C03C 23/00 - Other surface treatment of glass not in the form of fibres or filaments
    • C03C 23/0005 - Other surface treatment of glass not in the form of fibres or filaments by irradiation
    • C03C 23/0025 - Other surface treatment of glass not in the form of fibres or filaments by irradiation by a laser beam
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K - SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 26/00 - Working by laser beam, e.g. welding, cutting or boring
    • B23K 26/02 - Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K 26/06 - Shaping the laser beam, e.g. by masks or multi-focusing
    • B23K 26/062 - Shaping the laser beam, e.g. by masks or multi-focusing by direct control of the laser beam
    • B23K 26/0622 - Shaping the laser beam, e.g. by masks or multi-focusing by direct control of the laser beam by shaping pulses
    • B23K 26/0624 - Shaping the laser beam, e.g. by masks or multi-focusing by direct control of the laser beam by shaping pulses using ultrashort pulses, i.e. pulses of 1ns or less
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K - SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 26/00 - Working by laser beam, e.g. welding, cutting or boring
    • B23K 26/36 - Removing material
    • B23K 26/362 - Laser etching
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/017 - Head mounted
    • G02B 27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/09 - Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B 27/0933 - Systems for active beam shaping by rapid movement of an element
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/42 - Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B 27/4272 - Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having plural diffractive elements positioned sequentially along the optical path
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 - Optical elements other than lenses
    • G02B 5/18 - Diffraction gratings
    • G02B 5/1847 - Manufacturing methods
    • G02B 5/1857 - Manufacturing methods using exposure or etching means, e.g. holography, photolithography, exposure to electron or ion beams
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 - Optical elements other than lenses
    • G02B 5/18 - Diffraction gratings
    • G02B 5/1866 - Transmission gratings characterised by their structure, e.g. step profile, contours of substrate or grooves, pitch variations, materials
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0101 - Head-up displays characterised by optical features
    • G02B 2027/0138 - Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0101 - Head-up displays characterised by optical features
    • G02B 2027/014 - Head-up displays characterised by optical features comprising information/image processing systems
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B 6/0001 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
    • G02B 6/0011 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form

Abstract

Artificial reality systems, such as Head-Mounted Display (HMD) or Head-Up Display (HUD) systems, typically include a near-eye display (e.g., a head-mounted device or a pair of eyeglasses) configured to present content to a user via an electronic or optical display, e.g., within about 10-20 mm in front of the user's eyes. Such a system may include a variable etch depth structure. The invention provides a method of fabricating a variable etch depth structure in a substrate, the method comprising: exposing the substrate surface to femtosecond laser pulses, wherein an exposure dose of the femtosecond laser pulses on the substrate surface varies across the substrate surface; depositing an etch mask layer on the substrate; forming a two-dimensional pattern in a photoresist layer on the etch mask layer; transferring the two-dimensional pattern into the etch mask layer to form an etch mask; and etching the substrate using the etch mask to form a variable etch depth structure in the substrate.

Description

Substrate modification by femtosecond laser in dry etching to achieve variable etch depth
Cross Reference to Related Applications
This application claims priority from U.S. Application No. 62/834,289, filed April 15, 2019, the contents of which are incorporated herein by reference in their entirety for all purposes.
Appendix
Appendix a is filed as part of the present application. The contents of appendix a are part of the present application. Appendix a is also incorporated herein by reference in its entirety for all purposes.
Background
Artificial reality systems, such as Head-Mounted Display (HMD) or Head-Up Display (HUD) systems, typically include a near-eye display (e.g., a headset or a pair of glasses) configured to present content to a user via an electronic or optical display, for example, within about 10-20 mm in front of the user's eyes. The near-eye display may display virtual objects or combine images of real objects with virtual objects, as in Virtual Reality (VR), Augmented Reality (AR), or Mixed Reality (MR) applications. For example, in an AR system, a user may view both images of virtual objects (e.g., Computer-Generated Images (CGIs)) and the surrounding environment, for example, through transparent display glasses or lenses, commonly referred to as optical see-through.
One example optical see-through AR system may use a waveguide-based optical display, where light of a projected image may be coupled into a waveguide (e.g., a transparent substrate), propagate within the waveguide, and be coupled out of the waveguide at different locations. In some implementations, a diffractive optical element (e.g., a grating) may be used to couple light of the projected image into or out of the waveguide. Various techniques may be used to fabricate the grating or to fabricate a mold for imprinting the grating. However, these techniques are generally not capable of etching grating structures having a desired height or depth profile that is not uniform over the grating area.
Summary of the Invention
The present disclosure relates generally to techniques for fabricating variable etch depth structures. According to a first aspect of the present invention, there is provided a method of fabricating a variable etch depth structure in a substrate, the method comprising: exposing the substrate surface to femtosecond laser pulses, wherein an exposure dose of the femtosecond laser pulses on the substrate surface varies across the substrate surface; depositing an etch mask layer on the substrate; forming a two-dimensional pattern in a photoresist layer on the etch mask layer; transferring the two-dimensional pattern into the etch mask layer to form an etch mask; and etching the substrate using the etch mask to form a variable etch depth structure in the substrate.
The variable etch depth structure may be a variable etch depth grating structure for a waveguide-based near-eye display system.
The substrate may comprise a dielectric material and/or a semiconductor material.
The etch mask layer may include one or more of a metal, molybdenum silicide, a polymer, a metal oxide, and combinations thereof. The metal may include chromium, platinum, palladium, and/or titanium.
An etch mask layer may be deposited on the substrate prior to exposing the substrate surface to the femtosecond laser pulses.
The exposure to the femtosecond laser pulses may occur from the backside of the substrate. The backside may be the side or surface opposite to the side or surface on which the etch mask layer is deposited.
The two-dimensional pattern may be formed in the photoresist layer using electron beam lithography, photolithography, and/or nanoimprint lithography.
The pattern in the photoresist layer may be transferred into the etch mask layer using dry etching techniques, wet etching techniques, physical etching techniques, and/or chemical etching techniques.
The substrate may be etched using dry etching using the etch mask. The substrate may be etched using an etch mask using inductively coupled plasma etching, capacitively coupled plasma etching, reactive ion etching, ion beam etching, ion milling, and/or chemical reactive ion etching.
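To make the sequence of the first aspect easier to follow, the sketch below expresses it as an ordered list of process steps in Python. This is an illustrative sketch only: the step names, the choice of chromium as the mask material, the layer thickness, and the Gaussian dose-map helper are assumptions added for illustration, not values or recipes disclosed in this application.

```python
# Illustrative sketch only: the sequence of operations from the first aspect expressed as data.
# Step names, materials, and the dose-map helper are assumptions, not the disclosed recipe.
import math

def gaussian_dose_map(nx, ny, peak_dose):
    """Hypothetical spatially varying femtosecond exposure dose over the substrate surface."""
    cx, cy = nx / 2.0, ny / 2.0
    sigma2 = nx * ny / 8.0
    return [[peak_dose * math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / sigma2)
             for x in range(nx)]
            for y in range(ny)]

process_flow = [
    ("expose_femtosecond_laser", {"dose_map": gaussian_dose_map(64, 64, peak_dose=2.0)}),
    ("deposit_etch_mask_layer",  {"material": "chromium", "thickness_nm": 50}),
    ("pattern_photoresist",      {"method": "nanoimprint_lithography"}),
    ("transfer_pattern_to_mask", {"method": "dry_etch"}),
    ("etch_substrate",           {"method": "reactive_ion_etching"}),
]

for name, params in process_flow:
    print(name)  # walk the traveler in order
```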
According to a second aspect of the present invention there is provided a variable etch depth structure manufactured using the method of the first aspect.
According to a third aspect of the invention, there is provided a system for performing the method of the first aspect.
This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used in isolation to determine the scope of the claimed subject matter. The subject matter should be understood with reference to appropriate portions of the entire specification of this disclosure, any or all of the drawings, and each claim. The foregoing and other features and examples will be described in more detail below in the following specification, claims, and drawings.
Brief Description of Drawings
Illustrative embodiments are described in detail below with reference to the following drawings.
Fig. 1 is a simplified block diagram of an example of an artificial reality system environment including a near-eye display, in accordance with some embodiments.
Fig. 2 is a perspective view of an example of a near-eye display in the form of a head-mounted display (HMD) device for implementing some examples disclosed herein.
Fig. 3 is a perspective view of an example of a near-eye display in the form of a pair of eyeglasses for implementing some examples disclosed herein.
Fig. 4 illustrates an example of an optical see-through augmented reality system using a waveguide display, in accordance with certain embodiments.
FIG. 5 illustrates an example of a tilted variable etch depth grating coupler in a waveguide display according to some embodiments.
Fig. 6 illustrates an example of a method of fabricating a variable etch depth structure according to some embodiments.
FIG. 7 is a simplified block diagram of an example electronic system of an example near-eye display, in accordance with certain embodiments.
The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated may be employed without departing from the principles and advantages of the present disclosure.
In the drawings, similar components and/or features may have the same reference numerals. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
Detailed Description
The present disclosure relates generally to techniques for fabricating variable etch depth structures (e.g., variable etch depth grating structures for waveguide-based near-eye display systems). Various inventive embodiments and examples are described herein, including devices, systems, methods, and so on.
The grating may be used in a waveguide-based near-eye display system for coupling light into or out of a waveguide. Achieving spatially variable etch depth (VED) gratings with precise control of the pattern height is very useful in waveguide technology for tuning the diffraction efficiency, diffraction angle, and/or spectral response of diffraction gratings. For example, in some waveguide-based near-eye display systems, the depth of the grating may need to vary across the area of the grating to improve the performance of the grating, such as achieving different diffraction characteristics (e.g., diffraction efficiency and/or diffraction angle) at different regions of the grating. Achieving spatially varying three-dimensional structures is also very useful in non-diffractive (e.g., refractive or reflective) devices. However, it is difficult to pattern nanoscale features and to spatially control the geometry of nanoscale features with high precision.
While conventional lithographic techniques (e.g., photolithography, e-beam lithography, etc.) can produce gratings with highly customizable duty cycles and/or grating periods, these lithographic techniques typically cannot modulate the vertical dimension of the grating (i.e., the etch depth relative to the surface normal of the substrate) across the area of the substrate. Techniques such as using a movable blade to spatially control the etch time (and hence depth) during etching, using dual masks, gray-tone lithography, and the like can be used to fabricate variable etch depth structures. However, these techniques may require long development cycles and may have difficulty controlling different variations in etch depth over a two-dimensional area to achieve a complete three-dimensional VED structure.
According to certain embodiments, a method of fabricating a VED structure may include modifying the etch selectivity of a surface region of a substrate (e.g., quartz) using femtosecond laser pulses prior to etching, and then etching the modified substrate using a dry etch process to achieve a spatially varying geometry with nanoscale or microscale features.
In the following description, for purposes of explanation, specific details are set forth in order to provide a thorough understanding of the disclosed examples. It may be evident, however, that the various examples may be practiced without these specific details. For example, devices, systems, structures, components, methods, and other elements may be shown as components in block diagram form in order not to obscure the examples in unnecessary detail. In other instances, well-known devices, processes, systems, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the examples. The drawings and description are not intended to be limiting. The terms and expressions that have been employed in the present disclosure are used as terms of description and not of limitation, and there is no intention in the use of such terms and expressions of excluding any equivalents of the features shown and described or portions thereof. The word "example" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or design described herein as an "example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.
Fig. 1 is a simplified block diagram of an example of an artificial reality system environment 100 including a near-eye display 120, according to some embodiments. The artificial reality system environment 100 shown in fig. 1 may include a near-eye display 120, an optional external imaging device 150, and an optional input/output interface 140, which may each be coupled to an optional console 110. Although fig. 1 illustrates the example artificial reality system environment 100 including one near-eye display 120, one external imaging device 150, and one input/output interface 140, any number of these components may be included in the artificial reality system environment 100 or any components may be omitted. For example, there may be multiple near-eye displays 120, with these near-eye displays 120 being monitored by one or more external imaging devices 150 in communication with the console 110. In some configurations, the artificial reality system environment 100 may not include an external imaging device 150, an optional input/output interface 140, and an optional console 110. In alternative configurations, different or additional components may be included in the artificial reality system environment 100.
The near-eye display 120 may be a head-mounted display that presents content to a user. Examples of content presented by the near-eye display 120 include one or more images, video, audio, or some combination thereof. In some embodiments, the audio may be presented via an external device (e.g., speakers and/or headphones) that receives audio information from the near-eye display 120, the console 110, or both, and presents audio data based on the audio information. The near-eye display 120 may include one or more rigid bodies, which may be rigidly or non-rigidly coupled to each other. A rigid coupling between the rigid bodies may cause the coupled rigid bodies to act as a single rigid entity. A non-rigid coupling between the rigid bodies may allow the rigid bodies to move relative to each other. In various embodiments, the near-eye display 120 may be implemented in any suitable form factor, including a pair of glasses. Some embodiments of the near-eye display 120 are further described below with reference to fig. 2-4. Additionally, in various embodiments, the functionality described herein may be used in a head mounted device that combines images of the environment external to the near-eye display 120 and artificial reality content (e.g., computer generated images). Accordingly, the near-eye display 120 may augment the image of the physical, real-world environment external to the near-eye display 120 with the generated content (e.g., images, video, sound, etc.) to present augmented reality to the user.
In various embodiments, the near-eye display 120 may include one or more of display electronics 122, display optics 124, and an eye tracking unit 130. In some embodiments, the near-eye display 120 may also include one or more positioners 126, one or more position sensors 128, and an Inertial Measurement Unit (IMU) 132. In various embodiments, the near-eye display 120 may omit any of these elements, or may include additional elements. Additionally, in some embodiments, the near-eye display 120 may include elements that combine the functionality of the various elements described in conjunction with fig. 1.
Display electronics 122 may display or facilitate the display of images to a user based on data received from, for example, console 110. In various embodiments, the display electronics 122 may include one or more display panels, such as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an Inorganic Light Emitting Diode (ILED) display, a micro light emitting diode (mLED) display, an active matrix OLED display (AMOLED), a transparent OLED display (TOLED), or some other display. For example, in one embodiment of the near-eye display 120, the display electronics 122 may include a front TOLED panel, a rear display panel, and an optical component (e.g., an attenuator, polarizer, or diffractive or spectral film) between the front and rear display panels. The display electronics 122 may include pixels that emit light of a primary color (e.g., red, green, blue, white, or yellow). In some implementations, the display electronics 122 can display a three-dimensional (3D) image through a stereoscopic effect produced by a two-dimensional panel to create a subjective perception of image depth. For example, the display electronics 122 may include a left display and a right display positioned in front of the user's left and right eyes, respectively. The left and right displays may present copies of the image that are horizontally offset relative to each other to produce a stereoscopic effect (i.e., the perception of image depth by a user viewing the image).
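The horizontal offset between the left-eye and right-eye image copies mentioned above can be related to an intended virtual object distance by simple triangulation. The following is a minimal illustrative sketch under assumed parameter values (interpupillary distance, optics focal length, panel pixel pitch); it is a first-order approximation, not a relation taken from this application.

```python
# Illustrative sketch only: first-order triangulation relating the horizontal offset (disparity)
# between the left- and right-eye image copies to an intended virtual object distance.
# All parameter values below are assumptions, not values from the patent.
import math

def disparity_pixels(ipd_m, virtual_distance_m, focal_length_m, pixel_pitch_m):
    """Per-eye pixel offset on the display panel for a target at virtual_distance_m."""
    vergence_angle = 2.0 * math.atan((ipd_m / 2.0) / virtual_distance_m)  # radians
    offset_m = focal_length_m * math.tan(vergence_angle / 2.0)            # per-eye shift on panel
    return offset_m / pixel_pitch_m

# Example: 63 mm IPD, object rendered 2 m away, 40 mm effective focal length, 10 um pixels.
print(disparity_pixels(ipd_m=0.063, virtual_distance_m=2.0,
                       focal_length_m=0.04, pixel_pitch_m=10e-6))  # ~63 pixels per eye
```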
In some embodiments, the display optics 124 may optically display image content (e.g., using optical waveguides and couplers), or magnify image light received from the display electronics 122, correct optical errors associated with the image light, and present the corrected image light to a user of the near-eye display 120. In various embodiments, display optics 124 may include one or more optical elements such as, for example, a substrate, an optical waveguide, an aperture (aperture), a fresnel lens, a convex lens, a concave lens, a filter, an input/output coupler, or any other suitable optical element that may affect image light emitted from display electronics 122. Display optics 124 may include a combination of different optical elements and mechanical couplings to maintain the relative spacing and orientation of the optical elements in the combination. One or more optical elements in display optics 124 may have an optical coating, such as an anti-reflective coating, a filter coating, or a combination of different optical coatings.
The magnification of the image light by display optics 124 may allow display electronics 122 to be physically smaller, lighter in weight, and consume less power than larger displays. Additionally, the magnification may increase the field of view of the display content. The magnification of image light by display optics 124 may be changed by adjusting optical elements, adding optical elements, or removing optical elements from display optics 124. In some embodiments, the display optics 124 may project the displayed image to one or more image planes, which may be further from the user's eye than the near-eye display 120.
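As a rough illustration of why magnification enlarges the field of view, the standard first-order magnifier relation below (not quoted from this application; the symbols are assumptions for illustration) relates the panel width and the effective focal length of the display optics to the apparent field of view:

```latex
\[
\mathrm{FOV} \;\approx\; 2\arctan\!\left(\frac{w_{\mathrm{panel}}}{2\, f_{\mathrm{eff}}}\right)
\]
% w_panel: width of the display panel; f_eff: effective focal length of display optics 124.
% A shorter effective focal length (stronger magnification) yields a larger apparent field of
% view from the same physically small, lightweight panel.
```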
Display optics 124 may also be designed to correct one or more types of optical errors, such as two-dimensional optical errors, three-dimensional optical errors, or a combination thereof. The two-dimensional error may include an optical aberration (optical aberration) occurring in two dimensions. Example types of two-dimensional errors may include barrel distortion, pincushion distortion, longitudinal chromatic aberration, and lateral chromatic aberration. The three-dimensional errors may include optical errors occurring in three dimensions. Example types of three-dimensional errors may include spherical aberration (spherical aberration), coma (comatic aberration), field curvature (field) and astigmatism (astigmatism).
The locators 126 may be objects that are located at particular positions on the near-eye display 120 relative to each other and relative to a reference point on the near-eye display 120. In some implementations, the console 110 can identify the locator 126 in images captured by the external imaging device 150 to determine the position, orientation, or both of the artificial reality headset. The locators 126 may be Light Emitting Diodes (LEDs), pyramidal prisms (corner prisms), reflective markers, a type of light source that contrasts with the environment in which the near-eye display 120 operates, or some combination thereof. In embodiments where the locator 126 is an active component (e.g., an LED or other type of light emitting device), the locator 126 may emit light in the visible band (e.g., about 380nm to 750nm), Infrared (IR) band (e.g., about 750nm to 1mm), ultraviolet band (e.g., about 10nm to about 380nm), light in another portion of the electromagnetic spectrum, or light in any combination of portions of the electromagnetic spectrum.
The external imaging device 150 may generate slow calibration data based on the calibration parameters received from the console 110. The slow calibration data may include one or more images showing the viewing position of the positioner 126, which may be detected by the external imaging device 150. The external imaging device 150 may include one or more cameras, one or more video cameras, any other device capable of capturing images including one or more locators 126, or some combination thereof. Additionally, the external imaging device 150 may include one or more filters (e.g., to improve signal-to-noise ratio). The external imaging device 150 may be configured to detect light emitted or reflected from the locators 126 in the field of view of the external imaging device 150. In embodiments where the locators 126 include passive elements (e.g., retro-reflectors), the external imaging device 150 may include a light source that illuminates some or all of the locators 126, and the locators 126 may retroreflect light to the light source in the external imaging device 150. The slow calibration data may be communicated from the external imaging device 150 to the console 110, and the external imaging device 150 may receive one or more calibration parameters from the console 110 for adjusting one or more imaging parameters (e.g., focal length, focus, frame rate, sensor temperature, shutter speed, aperture, etc.).
The position sensor 128 may generate one or more measurement signals in response to movement of the near-eye display 120. Examples of position sensors 128 may include accelerometers, gyroscopes, magnetometers, other motion detection or error correction sensors, or some combination thereof. For example, in some embodiments, the position sensors 128 may include multiple accelerometers that measure translational motion (e.g., forward/backward, up/down, or left/right) and multiple gyroscopes that measure rotational motion (e.g., pitch, yaw, or roll). In some embodiments, the various position sensors may be oriented orthogonal to one another.
The IMU132 may be an electronic device that generates fast calibration data based on measurement signals received from one or more position sensors 128. The position sensor 128 may be located external to the IMU132, internal to the IMU132, or some combination thereof. Based on the one or more measurement signals from the one or more position sensors 128, the IMU132 may generate fast calibration data indicative of an estimated position of the near-eye display 120 relative to an initial position of the near-eye display 120. For example, the IMU132 may integrate the measurement signals received from the accelerometers over time to estimate a velocity vector, and integrate the velocity vector over time to determine an estimated position of a reference point on the near-eye display 120. Alternatively, the IMU132 may provide sampled measurement signals to the console 110, and the console 110 may determine fast calibration data. While the reference point may be generally defined as a point in space, in various embodiments, the reference point may also be defined as a point within the near-eye display 120 (e.g., the center of the IMU 132).
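A minimal sketch of the double integration described above, assuming a fixed sample period and simple Euler integration; the function name, sample rate, and values are illustrative assumptions, not taken from this application.

```python
# Illustrative sketch only: dead-reckoning integration of accelerometer samples, first to a
# velocity vector and then to a position estimate, as the passage describes.

def integrate_position(accel_samples, dt, v0=(0.0, 0.0, 0.0), p0=(0.0, 0.0, 0.0)):
    """Integrate acceleration (m/s^2) to velocity, then velocity to position, with time step dt."""
    vx, vy, vz = v0
    px, py, pz = p0
    for ax, ay, az in accel_samples:
        vx, vy, vz = vx + ax * dt, vy + ay * dt, vz + az * dt   # velocity vector estimate
        px, py, pz = px + vx * dt, py + vy * dt, pz + vz * dt   # position estimate
    return (px, py, pz), (vx, vy, vz)

# Example: constant 0.5 m/s^2 forward acceleration for 1 s, sampled at 1 kHz.
samples = [(0.5, 0.0, 0.0)] * 1000
position, velocity = integrate_position(samples, dt=0.001)
print(position, velocity)  # roughly (0.25, 0, 0) m and (0.5, 0, 0) m/s
```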
The eye tracking unit 130 may comprise one or more eye tracking systems. Eye tracking may refer to determining the position of the eye relative to the near-eye display 120, including the orientation and position of the eye. The eye tracking system may include an imaging system that images one or more eyes, and may optionally include a light emitter that may generate light directed at the eye such that light reflected by the eye may be captured by the imaging system. For example, the eye tracking unit 130 may include an incoherent or coherent light source (e.g., a laser diode) that emits light in the visible or infrared spectrum, and a camera that captures light reflected by the user's eye. As another example, eye tracking unit 130 may capture reflected radio waves emitted by a miniature radar unit. The eye tracking unit 130 may use a low power light emitter that emits light at a frequency and intensity that does not harm the eyes or cause physical discomfort. The eye tracking unit 130 may be arranged to improve contrast in an eye image captured by the eye tracking unit 130 while reducing the total power consumed by the eye tracking unit 130 (e.g., reducing the power consumed by the light emitters and imaging system included in the eye tracking unit 130). For example, in some embodiments, the eye tracking unit 130 may consume less than 100 milliwatts of power.
The near-eye display 120 may use the orientation of the eyes to, for example, determine the user's interpupillary distance (IPD), determine gaze direction, introduce depth cues (e.g., blur images outside of the user's main line of sight), collect heuristics about user interaction in the VR media (e.g., time spent on any particular subject, object, or frame as a function of the stimuli experienced), implement some other functionality based in part on the orientation of at least one of the user's eyes, or some combination thereof. Because the orientation of the user's eyes can be determined, the eye tracking unit 130 can determine where the user is looking. For example, determining the direction of the user's gaze may include determining a point of convergence based on the determined orientations of the user's left and right eyes. The point of convergence may be the point where the two foveal axes of the user's eyes intersect. The direction of the user's gaze may be the direction of a line passing through the point of convergence and the midpoint between the pupils of the user's eyes.
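A minimal sketch of the gaze geometry described above, assuming the foveal axes are available as rays starting at the pupil positions. The helper function, the closest-point formulation (used because measured axes rarely intersect exactly), and the example numbers are illustrative assumptions, not part of this application.

```python
# Illustrative sketch only: estimate a convergence point as the closest point between the two
# foveal-axis rays, then take the gaze direction through the midpoint between the pupils.
import numpy as np

def convergence_point(p_left, d_left, p_right, d_right):
    """Closest point between two (possibly skew) rays with origins p_* and directions d_*."""
    d_left = d_left / np.linalg.norm(d_left)
    d_right = d_right / np.linalg.norm(d_right)
    w0 = p_left - p_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w0, d_right @ w0
    denom = a * c - b * b                    # zero only for parallel axes
    s = (b * e - c * d) / denom              # parameter along the left ray
    t = (a * e - b * d) / denom              # parameter along the right ray
    return 0.5 * ((p_left + s * d_left) + (p_right + t * d_right))

# Pupils 63 mm apart, both eyes verging on a point about 0.5 m ahead and slightly to the left.
p_l, p_r = np.array([-0.0315, 0.0, 0.0]), np.array([0.0315, 0.0, 0.0])
target = np.array([-0.05, 0.0, 0.5])
cp = convergence_point(p_l, target - p_l, p_r, target - p_r)
gaze_dir = cp - 0.5 * (p_l + p_r)
print(cp, gaze_dir / np.linalg.norm(gaze_dir))
```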
The input/output interface 140 may be a device that allows a user to send action requests to the console 110. The action request may be a request to perform a particular action. For example, the action request may be to start or end an application, or to perform a particular action within an application. The input/output interface 140 may include one or more input devices. Example input devices may include a keyboard, mouse, game controller, gloves, buttons, touch screen, or any other suitable device for receiving an action request and communicating the received action request to the console 110. The action request received by the input/output interface 140 may be communicated to the console 110, and the console 110 may perform an action corresponding to the requested action. In some embodiments, the input/output interface 140 may provide haptic feedback to the user in accordance with instructions received from the console 110. For example, the input/output interface 140 may provide haptic feedback when an action request is received, or when the console 110 has performed the requested action and transmitted instructions to the input/output interface 140.
The console 110 may provide content to the near-eye display 120 for presentation to the user based on information received from one or more of the external imaging device 150, the near-eye display 120, and the input/output interface 140. In the example shown in fig. 1, the console 110 may include an application storage 112, a head-mounted device tracking module 114, an artificial reality engine 116, and an eye tracking module 118. Some embodiments of console 110 may include different or additional modules than those described in conjunction with FIG. 1. The functions described further below may be distributed among the components of the console 110 in a manner different than that described herein.
In some embodiments, the console 110 may include a processor and a non-transitory computer readable storage medium storing instructions executable by the processor. A processor may include multiple processing units that execute instructions in parallel. The non-transitory computer-readable storage medium may be any memory, such as a hard disk drive, a removable memory, or a solid state drive (e.g., flash memory or Dynamic Random Access Memory (DRAM)). In various embodiments, the modules of the console 110 described in connection with fig. 1 may be encoded as instructions in a non-transitory computer-readable storage medium that, when executed by a processor, cause the processor to perform the functions described further below.
The application storage 112 may store one or more applications for execution by the console 110. The application may include a set of instructions that, when executed by the processor, generate content for presentation to the user. The content generated by the application may be responsive to input received from the user via movement of the user's eyes or input received from the input/output interface 140. Examples of applications may include: a gaming application, a conferencing application, a video playback application, or other suitable application.
The headset tracking module 114 may use slow calibration information from the external imaging device 150 to track movement of the near-eye display 120. For example, the head mounted device tracking module 114 may determine the location of the reference point of the near-eye display 120 using the observed locator from the slow calibration information and the model of the near-eye display 120. The head mounted device tracking module 114 may also use the position information from the fast calibration information to determine the position of the reference point of the near-eye display 120. Additionally, in some embodiments, the head mounted device tracking module 114 may use a portion of the fast calibration information, the slow calibration information, or some combination thereof to predict a future position of the near-eye display 120. The head mounted device tracking module 114 may provide the estimated or predicted future position of the near-eye display 120 to the artificial reality engine 116.
The head-mounted device tracking module 114 may calibrate the artificial reality system environment 100 using the one or more calibration parameters, and may adjust the one or more calibration parameters to reduce errors in determining the position of the near-eye display 120. For example, the headset tracking module 114 may adjust the focus of the external imaging device 150 to obtain a more accurate position of the localizer viewed on the near-eye display 120. In addition, the calibration performed by the headset tracking module 114 may also take into account information received from the IMU 132. Additionally, if tracking of the near-eye display 120 is lost (e.g., the external imaging device 150 loses at least a threshold number of line of sight of the localizer 126), the headset tracking module 114 may recalibrate some or all of the calibration parameters.
The artificial reality engine 116 may execute an application within the artificial reality system environment 100 and receive, from the headset tracking module 114, position information of the near-eye display 120, acceleration information of the near-eye display 120, velocity information of the near-eye display 120, a predicted future position of the near-eye display 120, or some combination thereof. The artificial reality engine 116 may also receive estimated eye position and orientation information from the eye tracking module 118. Based on the received information, the artificial reality engine 116 may determine content to provide to the near-eye display 120 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the artificial reality engine 116 may generate content for the near-eye display 120 that mirrors the movement of the user's eyes in the virtual environment. Additionally, the artificial reality engine 116 may perform actions within the application executing on the console 110 in response to action requests received from the input/output interface 140 and provide feedback to the user indicating that the actions have been performed. The feedback may be visual or auditory feedback via the near-eye display 120, or tactile feedback via the input/output interface 140.
The eye tracking module 118 may receive eye tracking data from the eye tracking unit 130 and determine a position of the user's eye based on the eye tracking data. The position of the eye may include an orientation, a position, or both of the eye relative to the near-eye display 120 or any element thereof. Because the axis of rotation of the eye changes according to the position of the eye in the orbit, determining the position of the eye in the orbit may allow the eye tracking module 118 to more accurately determine the orientation of the eye.
In some embodiments, the eye tracking module 118 may store a mapping between the images captured by the eye tracking unit 130 and the eye positions to determine a reference eye position from the images captured by the eye tracking unit 130. Alternatively or additionally, the eye tracking module 118 may determine an updated eye position relative to the reference eye position by comparing the image from which the reference eye position is determined and the image from which the updated eye position is determined. The eye tracking module 118 may use measurements from different imaging devices or other sensors to determine the eye position. For example, the eye tracking module 118 may determine a reference eye position using measurements from the slow eye tracking system and then determine an updated position from the fast eye tracking system relative to the reference eye position until a next reference eye position is determined based on measurements from the slow eye tracking system.
The eye tracking module 118 may also determine eye calibration parameters to improve the accuracy and precision of eye tracking. The eye calibration parameters may include parameters that change each time the user wears or adjusts the near-eye display 120. Example eye calibration parameters may include an estimated distance between components of the eye tracking unit 130 and one or more portions of the eye (e.g., the center of the eye, the pupil, the corneal boundary, or a point on the surface of the eye). Other example eye calibration parameters may be specific to a particular user, and may include an estimated average eye radius, an average corneal radius, an average scleral radius, a feature map on the surface of the eye, and an estimated eye surface profile. In embodiments where light from outside the near-eye display 120 may reach the eye (as in some augmented reality applications), the calibration parameters may include correction factors for intensity and color balance due to variations in light from outside the near-eye display 120. The eye tracking module 118 may use the eye calibration parameters to determine whether the measurements captured by the eye tracking unit 130 allow the eye tracking module 118 to determine an accurate eye position (also referred to herein as "valid measurements"). Invalid measurements from which the eye tracking module 118 may not be able to determine an accurate eye position may be caused by the user blinking, adjusting or removing the head-mounted device, and/or may be caused by the near-eye display 120 experiencing a change in illumination greater than a threshold due to external light. In some embodiments, at least some of the functions of the eye tracking module 118 may be performed by the eye tracking unit 130.
Fig. 2 is a perspective view of an example of a near-eye display in the form of a head-mounted display (HMD) device 200 for implementing some examples disclosed herein. The HMD device 200 may be part of, for example, a Virtual Reality (VR) system, an Augmented Reality (AR) system, a Mixed Reality (MR) system, or some combination thereof. The HMD device 200 may include a main body 220 and a headband 230. Fig. 2 shows a top side 223, a front side 225, and a right side 227 of the main body 220 in a perspective view. The headband 230 may have an adjustable or extendable length. There may be sufficient space between the main body 220 and the headband 230 of the HMD device 200 to allow the user to mount the HMD device 200 on the user's head. In various embodiments, the HMD device 200 may include additional, fewer, or different components. For example, in some embodiments, the HMD device 200 may include eyeglass temples and temple tips (e.g., as shown in Fig. 3) instead of the headband 230.
The HMD device 200 may present media to the user that includes virtual and/or enhanced views of a physical, real-world environment with computer-generated elements. Examples of media presented by the HMD device 200 may include images (e.g., two-dimensional (2D) or three-dimensional (3D) images), video (e.g., 2D or 3D video), audio, or some combination thereof. The images and video may be presented to each eye of the user by one or more display components (not shown in fig. 2) housed in the body 220 of the HMD device 200. In various embodiments, the one or more display components may include a single electronic display panel or multiple electronic display panels (e.g., one display panel for each eye of the user). Examples of electronic display panels may include, for example, a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an Inorganic Light Emitting Diode (ILED) display, a micro light emitting diode (mLED) display, an Active Matrix Organic Light Emitting Diode (AMOLED) display, a Transparent Organic Light Emitting Diode (TOLED) display, some other display, or some combination thereof. The HMD device 200 may include two viewport regions.
In some implementations, the HMD device 200 may include various sensors (not shown), such as depth sensors, motion sensors, position sensors, and eye tracking sensors. Some of these sensors may use structured light patterns for sensing. In some implementations, the HMD device 200 may include an input/output interface for communicating with a console. In some implementations, the HMD device 200 may include a virtual reality engine (not shown) that may execute applications within the HMD device 200 and receive depth information, position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the HMD device 200 from various sensors. In some implementations, information received by the virtual reality engine can be used to generate signals (e.g., display instructions) to one or more display components. In some implementations, the HMD device 200 may include locators (not shown, e.g., locators 126) that are located at fixed positions on the body 220 relative to each other and relative to a reference point. Each locator may emit light that is detectable by an external imaging device.
Fig. 3 is a perspective view of an example of a near-eye display 300 in the form of a pair of eyeglasses for implementing some examples disclosed herein. The near-eye display 300 may be a specific embodiment of the near-eye display 120 of fig. 1 and may be configured to function as a virtual reality display, an augmented reality display, and/or a mixed reality display. Near-eye display 300 may include a frame 305 and a display 310. The display 310 may be configured to present content to a user. In some embodiments, display 310 may include display electronics and/or display optics. For example, as described above with reference to the near-eye display 120 of fig. 1, the display 310 may include an LCD display panel, an LED display panel, or an optical display panel (e.g., a waveguide display assembly).
The near-eye display 300 may also include various sensors 350a, 350b, 350c, 350d, and 350e on the frame 305 or within the frame 305. In some embodiments, the sensors 350a-350e may include one or more depth sensors, motion sensors, position sensors, inertial sensors, or ambient light sensors. In some embodiments, sensors 350a-350e may include one or more image sensors configured to generate image data representing different fields of view in different directions. In some embodiments, the sensors 350a-350e may be used as input devices to control or affect the display content of the near-eye display 300 and/or to provide an interactive VR/AR/MR experience to a user of the near-eye display 300. In some embodiments, sensors 350a-350e may also be used for stereo imaging.
In some embodiments, the near-eye display 300 may further include one or more illuminators 330 to project light into the physical environment. The projected light may be associated with different frequency bands (e.g., visible light, infrared light, ultraviolet light, etc.) and may serve various purposes. For example, the illuminator 330 may project light in a dark environment (or in an environment with low intensity infrared light, ultraviolet light, etc.) to help the sensors 350a-350e capture images of different objects in the dark environment. In some embodiments, the illuminator 330 may be used to project a particular pattern of light onto objects in the environment. In some embodiments, the illuminator 330 may be used as a locator, such as the locator 126 described above with reference to FIG. 1.
In some embodiments, the near-eye display 300 may also include a high-resolution camera 340. Camera 340 may capture an image of the physical environment in the field of view. The captured image may be processed, for example, by a virtual reality engine (e.g., artificial reality engine 116 of fig. 1) to add virtual objects to the captured image or to modify physical objects in the captured image, and the processed image may be displayed to the user by display 310 for AR or MR applications.
Fig. 4 illustrates an example of an optical see-through augmented reality system 400 using a waveguide display, in accordance with certain embodiments. Augmented reality system 400 may include a projector 410 and a combiner 415. Projector 410 may include a light source or image source 412 and projector optics 414. In some embodiments, the image source 412 may include a plurality of pixels displaying virtual objects, such as an LCD display panel or an LED display panel. In some embodiments, image source 412 may include a light source that generates coherent or partially coherent light. For example, the image source 412 may include a laser diode, a vertical-cavity surface-emitting laser, and/or a light-emitting diode. In some embodiments, image source 412 may include a plurality of light sources, each light source emitting monochromatic image light corresponding to a primary color (e.g., red, green, or blue). In some embodiments, image source 412 may include an optical pattern generator, such as a spatial light modulator. Projector optics 414 may include one or more optical components that may condition light from image source 412, such as expanding, collimating, scanning, or projecting light from image source 412 to combiner 415. The one or more optical components may include, for example, one or more lenses, liquid lenses, mirrors, apertures, and/or gratings. In some embodiments, projector optics 414 may include a liquid lens (e.g., a liquid crystal lens) with multiple electrodes, allowing scanning of light from image source 412.
Combiner 415 can include an input coupler 430 for coupling light from projector 410 into substrate 420 of combiner 415. The input coupler 430 may include a volume holographic grating, a Diffractive Optical Element (DOE) (e.g., a surface-relief grating), or a refractive coupler (e.g., an optical wedge or a prism). For visible light, the input coupler 430 may have a coupling efficiency of greater than 30%, 50%, 75%, 90%, or more. As used herein, visible light may refer to light having a wavelength between about 380 nm and about 750 nm. Light coupled into substrate 420 may propagate within substrate 420 by, for example, Total Internal Reflection (TIR). The substrate 420 may be in the form of a lens of a pair of eyeglasses. Substrate 420 may have a flat or curved surface and may include one or more types of dielectric materials, such as glass, quartz, plastic, polymer, polymethyl methacrylate (PMMA), crystal, or ceramic. The thickness of the substrate 420 may range, for example, from less than about 1 mm to about 10 mm or more. The substrate 420 may be transparent to visible light. A material may be "transparent" to a light beam if the light beam can pass through the material with high transmittance (e.g., greater than 50%, 40%, 75%, 80%, 90%, 95%, or higher), where a small portion of the light beam (e.g., less than 50%, 40%, 25%, 20%, 10%, 5%, or less) may be scattered, reflected, or absorbed by the material. The transmittance (i.e., the degree of transmission) may be represented by either a photopically weighted or an unweighted average transmittance over a wavelength range, or by the lowest transmittance over a wavelength range, such as the visible wavelength range.
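For reference, the total-internal-reflection guiding mentioned above follows the standard critical-angle relation below; the relation and the example index value are textbook optics added for illustration, not figures quoted from this application.

```latex
\[
\theta_c \;=\; \arcsin\!\left(\frac{n_{\mathrm{air}}}{n_{\mathrm{substrate}}}\right),
\qquad \text{e.g. } n_{\mathrm{substrate}} = 1.5 \;\Rightarrow\; \theta_c \approx 41.8^{\circ}.
\]
% Light propagating inside substrate 420 at internal angles (measured from the surface normal)
% larger than the critical angle remains guided by total internal reflection until it reaches
% an output coupler.
```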
Substrate 420 may include or may be coupled to a plurality of output couplers 440, output couplers 440 configured to extract at least a portion of light from substrate 420 that is guided by substrate 420 and propagates within substrate 420 and direct extracted light 460 to window 495 where an eye 490 of a user of augmented reality system 400 may be located when augmented reality system 400 is in use. Like the input coupler 430, the output coupler 440 may include a grating coupler (e.g., a volume holographic grating or a surface relief grating), other DOEs, prisms, and the like. The output coupler 440 may have different coupling (e.g., diffraction) efficiencies at different locations. Substrate 420 may also allow light 450 from the environment in front of combiner 415 to pass through with little or no loss. The output coupler 440 may also allow the light 450 to pass through with little loss. For example, in some implementations, the output coupler 440 may have a low diffraction efficiency for the light 450, such that the light 450 may be refracted or otherwise pass through the output coupler 440 with little loss, and thus may have a higher intensity than the extracted light 460. In some implementations, the output coupler 440 can have a high diffraction efficiency for the light 450 and can diffract the light 450 into certain desired directions (i.e., diffraction angles) with little loss. As a result, the user may view a combined image of the environment in front of combiner 415 and the virtual object projected by projector 410.
In many applications, to diffract light in a desired direction toward the user's eye, to achieve a desired diffraction efficiency for certain diffraction orders, and to increase the field of view and reduce rainbow artifacts of the waveguide display, the grating coupler (e.g., input coupler 430 or output coupler 440) may include a blazed or slanted grating, such as a slanted surface-relief grating, in which the grating ridges and grooves may be tilted with respect to the surface normal of the grating coupler or waveguide. Furthermore, in some embodiments, it may be desirable for the grating to have a non-uniform height or depth profile across the grating area, and/or a grating period or duty cycle that varies across the grating, in order to improve the performance of the grating, for example, to achieve different diffraction characteristics (e.g., diffraction efficiency and/or diffraction angle) at different areas of the grating.
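For context, the directions into which such a grating coupler diffracts light follow the standard grating equation below (general diffraction theory, not quoted from this application): the grating period largely sets the diffraction angles, while the depth, slant, and duty cycle mainly govern how efficiently each order is excited, which is why a spatially varying depth profile is useful.

```latex
\[
n_2 \sin\theta_m \;=\; n_1 \sin\theta_i \;+\; m\,\frac{\lambda}{\Lambda}
\]
% \Lambda: grating period; \lambda: free-space wavelength; \theta_i: incidence angle in the
% medium of index n_1; \theta_m: diffraction angle of order m in the medium of index n_2.
```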
FIG. 5 illustrates an example of a tilted grating 520 used in an example waveguide display 500 according to some embodiments. The waveguide display 500 may include a tilted grating 520 on a waveguide 510, such as the substrate 420. The tilted grating 520 may act as a grating coupler for coupling light into the waveguide 510 or out of the waveguide 510. In some embodiments, the tilted grating 520 may include a structure having a period p, which may be constant or may vary over the entire area of the tilted grating 520. The tilted grating 520 may include a plurality of ridges 522 and a plurality of grooves 524 between the ridges 522. Each period of the tilted grating 520 may include a ridge 522 and a groove 524, and the groove 524 may be an air gap or a region filled with a material having a refractive index different from that of the ridge 522. The ratio between the width of the ridge 522 and the grating period p may be referred to as the duty cycle. The tilted grating 520 may have a duty cycle of, for example, from about 30% to about 70%, or from about 10% to about 90% or more. In some embodiments, the duty cycle may vary from cycle to cycle or from region to region. In some embodiments, the period p of the tilted grating may vary from one region to another over the tilted grating 520, or may vary from one period to another over the tilted grating 520 (i.e., chirped).
Ridges 522 may be formed of a material such as a silicon-containing material (e.g., SiO2, Si3N4, SiC, SiOxNy, or amorphous silicon), an organic material (e.g., spin-on carbon (SOC), an amorphous carbon layer (ACL), or diamond-like carbon (DLC)), or an inorganic metal oxide layer (e.g., TiOx, AlOx, TaOx, HfOx, etc.). Each ridge 522 may include a leading edge 530 having a tilt angle α and a trailing edge 540 having a tilt angle β. In some embodiments, the leading edge 530 and trailing edge 540 of each ridge 522 may be parallel to each other. In some embodiments, the tilt angle α may be different from the tilt angle β. In some embodiments, the tilt angle α may be approximately equal to the tilt angle β. For example, the difference between the tilt angle α and the tilt angle β may be less than 20%, 10%, 5%, 1%, or less. In some embodiments, the tilt angle α and the tilt angle β may range, for example, from about 30° or less to about 70° or more. In some embodiments, the tilt angle α and/or the tilt angle β may also vary from ridge to ridge in the tilted grating 520.
Each groove 524 may have a depth d in the z direction, which may be constant or may vary over the area of the tilted grating 520. In some embodiments, the depth of the grooves 524 may vary across the area of the tilted grating 520 according to a pattern or depth profile 550. In some embodiments, the depth of the grooves 524 may include multiple depth levels, such as 8 depth levels, 16 depth levels, 32 depth levels, or more. In some embodiments, the depth of the grooves 524 may vary from 0 to about 100 nm, 200 nm, 300 nm, or more. In some embodiments, the grooves 524 between the ridges 522 may be over-coated or filled with a material having a refractive index higher or lower than the refractive index of the material of the ridges 522. For example, in some embodiments, a high-refractive-index material, such as hafnium oxide (hafnia), titanium oxide (titania), tantalum oxide, tungsten oxide, zirconium oxide, gallium sulfide, gallium nitride, gallium phosphide, silicon, or a high-refractive-index polymer, may be used to fill the grooves 524. In some embodiments, a low-refractive-index material, such as silicon oxide, aluminum oxide, porous silicon dioxide, or a fluorinated low-index monomer (or polymer), may be used to fill the grooves 524. As a result, the difference between the refractive index of the ridges and the refractive index of the grooves may be greater than 0.1, 0.2, 0.3, 0.5, 1.0, or higher.
Thus, the tilted grating 520 may have a three-dimensional structure whose physical dimensions may vary in the x, y, and/or z directions. For example, the grating period or duty cycle of the tilted grating 520 may vary in the x-y plane, and may also vary in the z-direction if the tilt angle α is different from the tilt angle β. The depth of the grooves 524 in the z-direction may vary in the x- and/or y-directions. In some embodiments, the tilt angles α and/or β relative to the z-direction may also vary along the x and/or y directions in the tilted grating 520.
It can be challenging to fabricate the tilted grating shown and described above with reference to FIG. 5. For example, many grating etch processes may only etch the substrate uniformly, producing gratings with a uniform thickness or depth. In some etching processes, the etch rate and depth of the grating may depend on the duty cycle of the grating to be etched. Thus, even though a non-uniform depth may be obtained using such an etching process, the etch depth may depend on other physical dimensions of the grating (e.g., the duty cycle or period), and thus the grating may not have the desired three-dimensional profile.
While conventional lithographic techniques (e.g., photolithography, electron beam lithography, etc.) can produce gratings with highly customizable duty cycles and/or grating periods, these lithographic techniques are generally unable to modulate the vertical dimension of the grating (i.e., the etch depth relative to the surface normal of the substrate) across the area of the substrate. Techniques such as using a movable blade to spatially control the etch time (and hence the etch depth) during etching, using dual masks, and using gray-scale lithography can be used to fabricate variable etch depth (VED) structures. However, these techniques may require long development cycles and may have difficulty controlling variations in etch depth over a two-dimensional area to achieve a fully three-dimensional VED structure.
According to certain embodiments, a method of fabricating a VED structure may include modifying the etch selectivity of a surface region of a substrate (e.g., quartz) using femtosecond laser pulses prior to etching, and then etching the modified substrate using a dry etch process to achieve spatially varying geometries with nanoscale or microscale features. The modification of the etch selectivity of the substrate surface region using femtosecond laser pulses can be performed before or after the photolithography process.
Femtosecond laser pulses are used to directly and locally modify the etch selectivity of the substrate and define the VED function or profile in the substrate, which allows maskless gray-scale etching. Nanoscale or microscale lithography may be used to define two-dimensional diffraction, refraction, or reflection patterns of optics used, for example, in near-eye display devices. Due to the different femtosecond laser modifications of different two-dimensional regions of the substrate, a two-dimensional pattern can be transferred into the substrate at different depth levels in the different regions depending on the femtosecond laser pulse energy to which each respective region of the substrate has been exposed and the etch selectivity of the respective region.
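As an illustrative first-order model (not a formula from the disclosure; the symbols below are introduced here for illustration), the local etch depth obtained after the dry etch may be approximated as the product of a dose-dependent etch rate and the etch time:

    d(x, y) \approx R\big(D(x, y)\big)\, t_{\mathrm{etch}},

where D(x, y) is the femtosecond laser exposure dose delivered at location (x, y) of the substrate, R is a monotonically increasing etch-rate function determined by the locally modified etch selectivity, and t_etch is the dry-etch time.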
FIG. 6 is a flowchart 600 illustrating an example of a method of fabricating a variable etch depth grating according to some embodiments. The operations described in flowchart 600 are for illustration purposes only and are not intended to be limiting. In various embodiments, flowchart 600 may be modified to add additional operations or to omit some operations. The operations described in flowchart 600 may be performed by, for example, one or more semiconductor manufacturing systems including patterning systems, deposition systems, etching systems, or any combination thereof.
At block 610, a surface of a substrate may be exposed to femtosecond laser pulses. As mentioned above, the substrate may include one or more types of dielectric or semiconductor materials, such as glass, quartz, plastic, polymer, polymethyl methacrylate (PMMA), crystal, Si3N4, SiC, Al2O3, or ceramic. The exposure time, and thus the exposure dose, may differ at different areas of the substrate surface, which may modify the etch selectivity of the substrate differently at different areas.
At block 620, an etch mask layer (e.g., a hard mask layer) may be deposited on the substrate. The etch mask layer may comprise, for example, Cr, Pt, Pd, Ti, MoSi, a polymer, another metallic material, a metal oxide, or any combination thereof. In some embodiments, an etch mask layer may be deposited on the substrate prior to exposing the substrate surface to the femtosecond laser pulses, wherein the exposing to the femtosecond laser pulses may occur from the backside of the substrate.
At block 630, a two-dimensional pattern may be formed in a photolithographic layer on top of the etch mask layer. The pattern may include a nanoscale or microscale grating. The photolithographic layer may include, for example, photoresist. The pattern may be formed in the photolithographic layer using, for example, electron beam lithography (EBL), photolithography, nanoimprint lithography, or the like.
At block 640, the pattern in the photolithographic layer may be transferred into the etch mask layer to form an etch mask, using, for example, various dry etching, wet etching, physical etching, or chemical etching techniques.
At block 650, the substrate may be etched by dry etching, such as Inductively Coupled Plasma (ICP) etching, Capacitively Coupled Plasma (CCP) etching, Reactive Ion Etching (RIE), Ion Beam Etching (IBE), ion milling, chemical RIE, or the like, using the etch mask. Areas of the substrate exposed to higher femtosecond laser pulse doses can be etched faster than areas of the substrate exposed to lower femtosecond laser pulse doses. Thus, a variable etch depth profile may be formed in the substrate using dry etching.
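The following sketch illustrates how such a spatially varying dose could translate into a variable etch depth profile. It is a minimal sketch under stated assumptions: the function names, the linear dose-to-rate model, and all numeric values are hypothetical and are not taken from the disclosure.

    import numpy as np

    def etch_rate(dose, base_rate_nm_per_s=1.0, sensitivity=0.5):
        # Assumed monotonic model: regions exposed to a higher (normalized)
        # femtosecond laser dose etch faster than unexposed regions.
        return base_rate_nm_per_s * (1.0 + sensitivity * np.asarray(dose, dtype=float))

    def predict_depth_map(dose_map, etch_time_s):
        # Expected etch depth (in nm) at each (x, y) location of the substrate.
        return etch_rate(dose_map) * etch_time_s

    # Example: a dose map that ramps linearly along x, "programming" a
    # linearly increasing etch depth (a simple one-dimensional VED profile).
    nx, ny = 64, 64
    dose_map = np.tile(np.linspace(0.0, 1.0, nx), (ny, 1))
    depth_map = predict_depth_map(dose_map, etch_time_s=200.0)
    print(depth_map.min(), depth_map.max())  # 200.0 nm and 300.0 nm for this model

In practice the dose-to-rate relationship would be calibrated for the particular substrate material, laser parameters, and etch chemistry rather than assumed to be linear.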
Embodiments of the invention may be used to implement components of an artificial reality system or may be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that is adjusted in some way before being presented to a user, which may include, for example, Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), hybrid reality, or some combination and/or derivative thereof. The artificial reality content may include fully generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of them may be presented in a single channel or multiple channels (e.g., stereoscopic video that produces a three-dimensional effect to a viewer). Further, in some embodiments, the artificial reality may also be associated with an application, product, accessory, service, or some combination thereof for creating content, for example, in the artificial reality, and/or otherwise used in the artificial reality (e.g., performing an activity in the artificial reality). An artificial reality system that provides artificial reality content may be implemented on a variety of platforms, including a Head Mounted Display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
FIG. 7 is a simplified block diagram of an example electronic system 700 for implementing an example near-eye display (e.g., HMD device) of some examples disclosed herein. The electronic system 700 may be used as an electronic system for the HMD device described above or other near-eye displays. In this example, electronic system 700 may include one or more processors 710 and memory 720. The processor 710 may be configured to execute instructions for performing operations at various components and may be, for example, a general purpose processor or a microprocessor suitable for implementation within a portable electronic device. Processor 710 may be communicatively coupled with various components within electronic system 700. To achieve this communicative coupling, processor 710 may communicate with the other illustrated components over a bus 740. Bus 740 may be any subsystem adapted to transfer data within electronic system 700. Bus 740 may include multiple computer buses and additional circuits for transferring data.
A memory 720 may be coupled to the processor 710. In some embodiments, memory 720 may provide both short-term and long-term storage, and may be divided into several units. Memory 720 may be volatile (e.g., Static Random Access Memory (SRAM) and/or Dynamic Random Access Memory (DRAM)) and/or nonvolatile (e.g., Read Only Memory (ROM), flash memory, etc.). Further, memory 720 may include a removable memory device, such as a Secure Digital (SD) card. Memory 720 may provide storage of computer readable instructions, data structures, program modules and other data for electronic system 700. In some embodiments, memory 720 may be distributed among different hardware modules. A set of instructions and/or code may be stored in memory 720. The instructions may take the form of executable code, which may be executable by the electronic system 700 and/or may take the form of source code and/or installable code, which may be in the form of executable code when compiled and/or installed on the electronic system 700 (e.g., using any of a variety of commonly available compilers, installation programs, compression/decompression utilities, etc.).
In some embodiments, memory 720 may store a plurality of application modules 722-724, and application modules 722-724 may include any number of applications. Examples of applications may include: a gaming application, a conferencing application, a video playback application, or other suitable applications. These applications may include depth sensing functions or eye tracking functions. Application modules 722-724 may include specific instructions to be executed by the processor 710. In some embodiments, certain applications or portions of application modules 722-724 may be executed by other hardware modules 780. In certain embodiments, memory 720 may additionally include secure memory, which may include additional security controls to prevent copying or other unauthorized access to secure information.
In some embodiments, the memory 720 may include an operating system 725 loaded therein. The operating system 725 may be operable to initiate execution of instructions provided by application modules 722-724 and/or to manage other hardware modules 780 and interfaces with the wireless communication subsystem 730, which may include one or more wireless transceivers. The operating system 725 may be adapted to perform other operations on the components of the electronic system 700, including thread management (threading), resource management, data storage control, and other similar functions.
The wireless communication subsystem 730 may include, for example, an infrared communication device, a wireless communication device, and/or a chipset (e.g., a Bluetooth® device, an IEEE 802.11 device, a Wi-Fi device, a WiMax device, a cellular communication facility, etc.), and/or the like. Electronic system 700 may include one or more antennas 734 for wireless communication, either as part of wireless communication subsystem 730 or as a separate component coupled to any portion of the system. Depending on the desired functionality, the wireless communication subsystem 730 may include separate transceivers to communicate with base transceiver stations and other wireless devices and access points, which may include communicating with different data networks and/or network types, such as Wireless Wide Area Networks (WWANs), Wireless Local Area Networks (WLANs), or Wireless Personal Area Networks (WPANs). The WWAN may be, for example, a WiMax (IEEE 802.16) network. The WLAN may be, for example, an IEEE 802.11x network. The WPAN may be, for example, a Bluetooth network, IEEE 802.15x, or some other type of network. The techniques described herein may also be used for any combination of WWAN, WLAN, and/or WPAN. Wireless communication subsystem 730 may allow data to be exchanged with a network, other computer systems, and/or any other devices described herein. The wireless communication subsystem 730 may include a transceiver for transmitting or receiving data (e.g., an identifier of an HMD device, location data, a geographic map, a heat map, photos, or videos) using the antenna 734 and the wireless link 732. The wireless communication subsystem 730, processor 710, and memory 720 may together comprise at least a portion of one or more of the means for performing some of the functions disclosed herein.
The electronic system 700 may include one or more sensors 790. Sensor 790 may include, for example, an image sensor, an accelerometer, a pressure sensor, a temperature sensor, a proximity sensor, a magnetometer, a gyroscope, an inertial sensor (e.g., a module combining an accelerometer and a gyroscope), an ambient light sensor, or any other similar module operable to provide a sensing output and/or receive a sensing input, such as a depth sensor or a position sensor. For example, in some embodiments, the sensors 790 may include one or more Inertial Measurement Units (IMUs) and/or one or more position sensors. Based on the measurement signals received from the one or more position sensors, the IMU may generate calibration data indicative of an estimated position of the HMD device relative to an initial position of the HMD device. The position sensor may generate one or more measurement signals in response to motion of the HMD device. Examples of position sensors may include, but are not limited to, one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor to detect motion, one type of sensor for IMU error correction, or some combination thereof. The position sensor may be located external to the IMU, internal to the IMU, or some combination thereof. At least some of the sensors may sense using a structured light pattern.
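As a simplified illustration of how an IMU of this kind might combine gyroscope and accelerometer samples into an estimated position and orientation (a naive dead-reckoning sketch; the class and method names are hypothetical and not part of the disclosure):

    import numpy as np

    class SimpleImu:
        # Naive dead reckoning: integrate gyroscope samples for orientation and
        # accelerometer samples for velocity and position, relative to the
        # initial (reference) pose of the HMD device.
        def __init__(self):
            self.orientation = np.zeros(3)  # roll, pitch, yaw in radians
            self.velocity = np.zeros(3)     # meters per second
            self.position = np.zeros(3)     # meters, relative to initial position

        def update(self, accel, gyro, dt):
            # accel: gravity-compensated acceleration in m/s^2; gyro: rad/s; dt: s
            self.orientation += np.asarray(gyro, dtype=float) * dt
            self.velocity += np.asarray(accel, dtype=float) * dt
            self.position += self.velocity * dt
            return self.position.copy(), self.orientation.copy()

    imu = SimpleImu()
    # One 10 ms sample of a small head motion (values are made up).
    pos, ori = imu.update(accel=[0.0, 0.1, 0.0], gyro=[0.0, 0.0, 0.02], dt=0.01)

A real IMU implementation would also correct for sensor bias and drift, which this sketch omits.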
Electronic system 700 may include a display module 760. Display module 760 may be a near-eye display and may graphically present information from electronic system 700, such as images, videos, and various instructions, to a user. Such information may be derived from one or more application modules 722-724, a virtual reality engine 726, one or more other hardware modules 780, a combination thereof, or any other suitable means for resolving graphical content for the user (e.g., via the operating system 725). The display module 760 may use Liquid Crystal Display (LCD) technology, Light Emitting Diode (LED) technology (including, for example, OLED, ILED, mLED, AMOLED, TOLED, etc.), light emitting polymer display (LPD) technology, or some other display technology.
Electronic system 700 may include a user input/output module 770. User input/output module 770 may allow a user to send action requests to electronic system 700. The action request may be a request to perform a particular action. For example, the action request may be to start or end an application, or to perform a particular action within an application. User input/output module 770 may include one or more input devices. Example input devices may include a touch screen, touch pad, microphone, button, dial, switch, keyboard, mouse, game controller, or any other suitable device for receiving an action request and communicating the received action request to the electronic system 700. In some embodiments, the user input/output module 770 may provide haptic feedback to the user according to instructions received from the electronic system 700. For example, haptic feedback may be provided when an action request is received or has been performed.
The electronic system 700 may include a camera 750, and the camera 750 may be used to take pictures or videos of the user, for example, to track the user's eye position. The camera 750 may also be used to take pictures or video of an environment, for example, for VR, AR, or MR applications. The camera 750 may include, for example, a Complementary Metal Oxide Semiconductor (CMOS) image sensor having millions or tens of millions of pixels. In some implementations, the camera 750 may include two or more cameras, which may be used to capture 3D images.
In some embodiments, electronic system 700 may include a plurality of other hardware modules 780. Each other hardware module 780 may be a physical module within electronic system 700. While each of the other hardware modules 780 may be permanently configured as a structure, some of the other hardware modules 780 may be temporarily configured to perform a particular function or temporarily activated. Examples of other hardware modules 780 may include, for example, audio output and/or input modules (e.g., microphone or speaker), Near Field Communication (NFC) modules, rechargeable batteries, battery management systems, wired/wireless battery charging systems, and so forth. In some embodiments, one or more functions of the other hardware modules 780 may be implemented in software.
In some embodiments, the memory 720 of the electronic system 700 may also store a virtual reality engine 726. The virtual reality engine 726 may execute applications within the electronic system 700 and receive location information, acceleration information, velocity information, predicted future locations, or some combination thereof, of the HMD device from various sensors. In some embodiments, information received by virtual reality engine 726 may be used to generate signals (e.g., display instructions) to display module 760. For example, if the received information indicates that the user has looked to the left, the virtual reality engine 726 may generate content for the HMD device that reflects the user's movements in the virtual environment. Further, virtual reality engine 726 may perform actions within the application and provide feedback to the user in response to action requests received from user input/output module 770. The feedback provided may be visual feedback, auditory feedback, or tactile feedback. In some implementations, processor 710 may include one or more GPUs that may execute virtual reality engine 726.
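A minimal sketch of the kind of update loop a component such as virtual reality engine 726 might run to turn a tracked head pose into display instructions (all names and structures here are hypothetical and introduced only for illustration):

    from dataclasses import dataclass

    @dataclass
    class Pose:
        yaw_deg: float    # head rotation to the left/right
        pitch_deg: float  # head rotation up/down

    def display_instruction(pose: Pose) -> dict:
        # Map the tracked head pose to a view direction for the display module.
        return {"camera_yaw": pose.yaw_deg, "camera_pitch": pose.pitch_deg}

    def engine_step(tracked_pose: Pose, display_queue: list) -> None:
        # One engine update: generate content reflecting the user's movement
        # and hand it to the display pipeline.
        display_queue.append(display_instruction(tracked_pose))

    frames = []
    engine_step(Pose(yaw_deg=-15.0, pitch_deg=2.0), frames)  # the user looked to the left
    print(frames)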
In various embodiments, the hardware and modules described above may be implemented on a single device or on multiple devices that may communicate with each other using wired or wireless connections. For example, in some implementations, some components or modules, such as the GPU, the virtual reality engine 726, and applications (e.g., tracking applications), may be implemented on a console separate from the head-mounted display device. In some implementations, one console may be connected to or support more than one HMD.
In alternative configurations, different and/or additional components may be included in electronic system 700. Similarly, the functionality of one or more components may be distributed among the components in a manner different from that described above. For example, in some embodiments, electronic system 700 may be modified to include other system environments, such as an AR system environment and/or an MR environment.
The methods, systems, and devices discussed above are examples. Various embodiments may omit, substitute, or add various procedures or components as appropriate. For example, in alternative configurations, the described methods may be performed in an order different than that described, and/or stages may be added, omitted, and/or combined. Furthermore, features described with reference to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner. Furthermore, technology is constantly evolving and thus many elements are examples and do not limit the scope of the disclosure to those specific examples.
In the description, specific details are given to provide a thorough understanding of the embodiments. However, embodiments may be practiced without these specific details. For example, well-known circuits, processes, systems, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the embodiments. This description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the invention. Rather, the foregoing description of the embodiments will provide those skilled in the art with an enabling description for implementing various embodiments. Various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure.
Furthermore, some embodiments and examples are described as a process which is depicted as a flowchart or a block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. The process may have additional steps not included in the figure. Furthermore, embodiments of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the associated tasks may be stored in a computer-readable medium such as a storage medium. The processor may perform the associated tasks.
It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized or dedicated hardware might also be used, and/or particular elements might be implemented in hardware, software, including portable software (e.g., applets, etc.), or both. In addition, connections to other computing devices, such as network input/output devices, may be employed.
Referring to the figures, components that may include memory may include a non-transitory machine-readable medium. The terms "machine-readable medium" and "computer-readable medium" may refer to any storage medium that participates in providing data that causes a machine to operate in a specific fashion. In the embodiments and examples provided above, various machine-readable media may be involved in providing instructions/code to a processing unit and/or other device for execution. Additionally or alternatively, a machine-readable medium may be used to store and/or carry such instructions/code. In many implementations, the computer-readable medium is a physical and/or tangible storage medium. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Common forms of computer-readable media include, for example, magnetic and/or optical media (e.g., Compact Discs (CDs) or Digital Versatile Discs (DVDs)), punch cards, paper tape, any other physical medium with patterns of holes, RAM, Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), flash-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code. A computer program product may include code and/or machine-executable instructions, which may represent a procedure, a function, a subprogram, a program, a routine, an application (app), a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements.
Those of skill in the art would understand that the information and signals used to convey the messages described herein may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
The terms "and" or "as used herein may include a variety of meanings that are also intended to depend, at least in part, on the context in which the terms are used. Typically, if an "or" is used to associate a list (e.g., A, B or C), it is intended to mean A, B and C (used herein in an inclusive sense), and A, B or C (used herein in an exclusive sense). Furthermore, the term "one or more" as used herein may be used to describe any feature, structure, or characteristic in the singular or may be used to describe some combination of features, structures, or characteristics. It should be noted, however, that this is merely an illustrative example and that claimed subject matter is not limited to this example. Furthermore, if at least one of the terms (at least one of) is used to associate a list (e.g., A, B or C), the term may be interpreted to mean any combination of A, B and/or C, e.g., a, AB, AC, BC, AA, ABC, AAB, AABBCCC, etc.
Furthermore, while certain embodiments and examples have been described using a particular combination of hardware and software, it should be recognized that other combinations of hardware and software are possible. Certain embodiments and examples may be implemented solely in hardware, solely in software, or using a combination thereof. In one example, the software may be implemented in a computer program product comprising computer program code or instructions executable by one or more processors for performing any or all of the steps, operations, or processes described in the present disclosure, wherein the computer program may be stored on a non-transitory computer readable medium. The various processes described herein may be implemented on the same processor, or on different processors in any combination.
Where a device, system, component, or module is described as being configured to perform certain operations or functions, such configuration may be accomplished, for example, by designing an electronic circuit to perform the operations, by programming a programmable electronic circuit (such as a microprocessor) to perform the operations (such as by executing computer instructions or code), or by a processor or core programmed to execute code or instructions stored on a non-transitory storage medium, or by any combination thereof. Processes may communicate using various techniques, including but not limited to conventional techniques for inter-process communication, and different pairs of processes may use different techniques, or the same pair of processes may use different techniques at different times.
The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense. It will, however, be evident that additions, deletions, as well as other modifications and changes may be made thereto without departing from the broader spirit and scope set forth in the claims. Thus, while particular embodiments and examples have been described, these embodiments are not intended to be limiting. Various modifications and equivalents are within the scope of the appended claims.
Appendix A
Motivation
In waveguide (WG) technology, achieving a spatially variable etch depth (VED) with highly controlled pattern height is very important for tuning the diffraction efficiency and the angular/spectral response of diffraction gratings. In non-diffractive (refractive/reflective) solutions, achieving spatially varying 3D structures is also very important. Unfortunately, there is no simple way to pattern nanoscale features and spatially control their geometry on a millimeter scale. Currently explored solutions include the use of movable blades in the etching process to spatially control the etching time (and depth), the use of dual masks, grayscale lithography, etc. However, these techniques require long development times and are difficult to control when a 2D VED is required.
Invention
The invention includes the use of femtosecond laser pulses to modify the substrate surface, in conjunction with conventional photolithography steps and dry etching, to achieve any spatially varying geometry with nanoscale or microscale patterns.
Femtosecond laser pulses are used to locally modify the etch selectivity of the substrate and to "program" the VED function directly in the substrate. This allows maskless gray scale etching. Nano-scale or micro-scale lithography is used to define the diffraction/refraction/reflection patterns used by the optics. As the substrate is modified by the femtosecond laser, the pattern will be etched deeper or shallower depending on the local energy to which the substrate is exposed.
Process steps
1) Exposing a substrate surface to femtosecond laser pulses
2) A hard mask is deposited. A hard mask (Cr, Pt, Pd, Ti, MoSi, a polymer, metal oxides, or any combination thereof) may already be present on the substrate. In this case, exposure to the femtosecond laser may occur from the backside.
3) Nanogratings and/or microgratings are patterned in the photoresist layer on top of the hard mask (EBL, photolithography, NIL, etc.).
4) The pattern is transferred into the hard mask.
5) The quartz is etched through the hard mask. The areas of the substrate that are subjected to the highest dose are expected to etch faster than the unmodified areas. Thus, the VED profile can be programmed on the substrate and implemented using dry etching (ICP, CCP, RIE, IBE, ion milling, chemical RIE, etc.).

Claims (13)

1. A method of fabricating a variable etch depth structure in a substrate, the method comprising:
exposing a surface of the substrate to femtosecond laser pulses, wherein an exposure dose of the femtosecond laser pulses on the surface of the substrate varies across the surface of the substrate;
depositing an etch mask layer on the substrate;
forming a two-dimensional pattern in a photolithographic layer on the etch mask layer;
transferring the two-dimensional pattern into the etch mask layer to form an etch mask; and
etching the substrate using the etch mask to form the variable etch depth structure in the substrate.
2. The method of claim 1, wherein the variable etch depth structure is a variable etch depth grating structure for a waveguide-based near-eye display system.
3. The method of claim 1, wherein the substrate comprises a dielectric material and/or a semiconductor material.
4. The method of claim 1, wherein the etch mask layer comprises one or more of a metal, molybdenum silicide, a polymer, a metal oxide, and combinations thereof.
5. The method of claim 4, wherein the metal comprises chromium, platinum, palladium, and/or titanium.
6. The method of claim 1, wherein the etch mask layer is deposited on the substrate prior to exposing the surface of the substrate to femtosecond laser pulses.
7. The method of claim 6, wherein exposing to femtosecond laser pulses occurs at a backside of the substrate.
8. The method of claim 1, wherein the two-dimensional pattern is formed in the photolithographic layer using electron beam lithography, photolithography, and/or nanoimprint lithography.
9. The method of claim 1, wherein the pattern in the photolithographic layer is transferred into the etch mask layer using a dry etching technique, a wet etching technique, a physical etching technique, and/or a chemical etching technique.
10. The method of claim 1, wherein the substrate is etched using dry etching using the etch mask.
11. The method of claim 10, wherein the substrate is etched using the etch mask using inductively coupled plasma etching, capacitively coupled plasma etching, reactive ion etching, ion beam etching, ion milling, and/or chemically reactive ion etching.
12. A variable etch depth structure fabricated using the method of claim 1.
13. A system for performing the method of claim 1.
CN202080028185.0A 2019-04-15 2020-04-14 Substrate modification by femtosecond laser in dry etching to achieve variable etch depth Pending CN113677619A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962834289P 2019-04-15 2019-04-15
US62/834,289 2019-04-15
PCT/US2020/028147 WO2020214608A1 (en) 2019-04-15 2020-04-14 Substrate modification by femto-second laser to achieve variable etch depth in dry etching

Publications (1)

Publication Number Publication Date
CN113677619A true CN113677619A (en) 2021-11-19

Family

ID=70482905

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080028185.0A Pending CN113677619A (en) 2019-04-15 2020-04-14 Substrate modification by femtosecond laser in dry etching to achieve variable etch depth

Country Status (3)

Country Link
EP (1) EP3956258A1 (en)
CN (1) CN113677619A (en)
WO (1) WO2020214608A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114273790B (en) * 2022-02-15 2023-07-28 山东大学 Femtosecond laser processing device and method for etching gallium nitride in liquid phase
WO2023249702A1 (en) * 2022-06-23 2023-12-28 Google Llc Waveguide grating depth and filling factor dual modulation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1455950A (en) * 2000-08-12 2003-11-12 应用材料有限公司 Integrated shallow trench isolation process
US20040137372A1 (en) * 2003-01-15 2004-07-15 Livingston Frank E. Photosensitive glass variable laser exposure patterning method
JP2004351494A (en) * 2003-05-30 2004-12-16 Seiko Epson Corp Drilling method for material transparent to laser
CN1710705A (en) * 2005-07-05 2005-12-21 华中科技大学 Silicon wet-etching technology
CN102741010A (en) * 2010-02-05 2012-10-17 株式会社藤仓 Surface microstructure formation method and substrate having surface microstructure

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4791745B2 (en) * 2005-03-28 2011-10-12 パナソニック電工株式会社 Method of processing light incident / exit part of optical medium

Also Published As

Publication number Publication date
EP3956258A1 (en) 2022-02-23
WO2020214608A1 (en) 2020-10-22

Similar Documents

Publication Publication Date Title
US11662584B2 (en) Gradient refractive index grating for display leakage reduction
US20220082739A1 (en) Techniques for manufacturing variable etch depth gratings using gray-tone lithography
CN113302542B (en) Angle-selective grating coupler for waveguide display
US11550083B2 (en) Techniques for manufacturing slanted structures
US11667059B2 (en) Techniques for reducing surface adhesion during demolding in nanoimprint lithography
US20220206232A1 (en) Layered waveguide fabrication by additive manufacturing
US10976483B2 (en) Variable-etch-depth gratings
CN113348386B (en) Increasing the duty cycle range of a waveguide combiner
CN113302431A (en) Volume Bragg grating for near-eye waveguide displays
US11709422B2 (en) Gray-tone lithography for precise control of grating etch depth
CN113677619A (en) Substrate modification by femtosecond laser in dry etching to achieve variable etch depth
TW202343080A (en) Suppression of first-order diffraction in a two-dimensional grating of an output coupler for a head-mounted display
WO2022146904A1 (en) Layered waveguide fabrication by additive manufacturing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: California, USA

Applicant after: Yuan Platform Technology Co.,Ltd.

Address before: California, USA

Applicant before: Facebook Technologies, LLC
