WO2023014185A1 - Augmented reality device and method for detecting a user's gaze
- Publication number: WO2023014185A1 (PCT/KR2022/011693)
- Authority: WIPO (PCT)
- Prior art keywords: light, lens, user, augmented reality, vision correcting
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
Definitions
- the present disclosure relates to an augmented reality apparatus and method for detecting a user's gaze. More particularly, the present disclosure relates to a gaze tracking sensor and method for detecting a gaze of a user in an augmented reality device including a vision correcting lens.
- Augmented reality (AR) technology synthesizes virtual objects or information with a real environment so that they appear to be objects existing in the real physical environment.
- Modern computing and display technologies have enabled the development of systems for augmented reality experiences, in which digitally reproduced images, or parts thereof, can be presented to the user in such a way that they are perceived as real.
- Many augmented reality devices take the form of a head-mounted display (HMD), which is inconvenient to use while wearing glasses for vision correction.
- The uncorrected vision of a person who wears spectacles may be impaired by myopia, hyperopia, astigmatism, or a combination thereof.
- Accordingly, augmented reality devices that include vision correcting lenses are being developed.
- Gaze direction information can be used for various operations, such as building a user interface, optimizing the rendering of an image provided to the user (e.g., foveated rendering), or determining the distance to an object the user is looking at.
- Gaze direction information may be generated by a sensor that tracks the position of the user's eyeballs (hereinafter referred to as an eye-tracking (ET) sensor).
- There is therefore a need for a gaze tracking method capable of increasing the accuracy of detecting a user's gaze in an augmented reality device that includes a vision correcting lens.
- One embodiment of the present disclosure may provide a gaze tracking method and apparatus capable of increasing the accuracy of gaze detection in such a device.
- An augmented reality device disclosed as a technical means for achieving the above-described technical problem may include a light guide plate; a vision correcting lens disposed to overlap the light guide plate in the gaze direction of a user wearing the device; a support for fixing the device to the user's face; a gaze tracking sensor, including a light emitter and a light receiver, installed on a part of the support; a light reflector for reflecting light for gaze tracking; and at least one processor.
- The processor may obtain lens characteristic information of the vision correcting lens, control the light emitter to emit light for gaze tracking toward the light reflector, obtain an eye image of the user based on the light received through the light receiver, compensate the eye image based on the lens characteristic information, and obtain the user's gaze information from the compensated eye image.
- Light emitted toward the light reflector may be reflected by the light reflector toward the user's eyes, and the light received by the light receiver may include light that was directed toward the user's eyes and reflected by them.
- The light reflector may include a pattern.
- The processor may control the light emitter to emit light for obtaining the lens characteristic information of the vision correcting lens toward the light reflector, identify a distorted pattern based on the light received through the light receiver, and obtain the lens characteristic information based on the distorted pattern.
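- The distorted-pattern approach can be illustrated with a toy model. Assuming, purely for illustration (the disclosure does not specify a distortion model), that the lens introduces a single-coefficient radial distortion r_obs = r_ref * (1 + k * r_ref^2), the coefficient k, a stand-in for the lens characteristic information, can be fit by least squares from the known reflector pattern and its observed image:

```python
import math

def estimate_radial_distortion(ref_pts, obs_pts):
    """Fit one radial-distortion coefficient k such that
    r_obs ~= r_ref * (1 + k * r_ref**2), by least squares.
    Points are (x, y) tuples measured from the optical centre."""
    num = 0.0
    den = 0.0
    for (xr, yr), (xo, yo) in zip(ref_pts, obs_pts):
        r_ref = math.hypot(xr, yr)
        r_obs = math.hypot(xo, yo)
        if r_ref == 0.0:
            continue  # the centre point carries no radial information
        # least-squares fit of (r_obs/r_ref - 1) against k * r_ref**2
        num += (r_obs / r_ref - 1.0) * r_ref ** 2
        den += r_ref ** 4
    return num / den if den else 0.0
```

A real device would use a richer model (tangential terms, per-zone coefficients for progressive lenses), but the principle is the same: known pattern geometry in, lens characteristic out.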
- A method, disclosed as a technical means for achieving the above-described technical problem, by which an augmented reality device including a vision correcting lens detects a user's gaze, may include: obtaining lens characteristic information of the vision correcting lens, which is disposed to overlap, in the direction of the user's gaze, a light guide plate that displays an image output from the device; emitting light for gaze tracking toward the light reflector through a light emitter installed on a part of the support of the device, the emitted light being reflected by the light reflector toward the eyes of the user wearing the device; receiving the light reflected by the user's eyes through a light receiver installed on the support; obtaining an eye image of the user based on the received light; compensating the eye image based on the lens characteristic information; and obtaining gaze information of the user based on the compensated eye image.
- a computer-readable recording medium disclosed as a technical means for achieving the above-described technical problem may store a program for executing at least one of the embodiments of the disclosed method in a computer.
- FIG. 1 is a schematic diagram of a method of tracking a user's gaze in an augmented reality device including a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 2 is a flowchart of a method of detecting a gaze of a user by an augmented reality device according to an embodiment of the present disclosure.
- FIG. 3 is a block diagram of an augmented reality device according to an embodiment of the present disclosure.
- FIG. 4 is a diagram for explaining an operation of detecting a gaze of a user by an augmented reality device according to an embodiment of the present disclosure.
- FIG. 5 is a diagram for explaining an operation of obtaining lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 6 is a diagram for explaining an operation of compensating an eye image based on lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 7 is a diagram for explaining an operation of obtaining gaze information of a user from a compensated eye image based on lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 8 is a diagram for explaining an operation of obtaining lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 9 is a diagram for explaining an operation of obtaining lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 10 is a diagram for explaining control parameters of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an augmented reality device according to an embodiment of the present disclosure.
- The terms "~unit" and "~module" described in this specification refer to a unit that processes at least one function or operation, and may be implemented as hardware, software, or a combination of hardware and software.
- A processor configured (or set) to perform A, B, and C may mean a dedicated processor (e.g., an embedded processor) for performing those operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing them by executing one or more software programs stored in memory.
- An 'augmented reality (AR) system' refers to a system that shows a virtual image together with a physical, real-world environment or a real object.
- An 'augmented reality device (AR device)' is a device capable of expressing augmented reality, such as a glasses-shaped augmented reality glasses device worn on the user's face, or a head-mounted display (HMD) or virtual reality headset (VRH) worn on the head.
- a 'real scene' is a scene in the real world that a user sees through an augmented reality device, and may include a real world object.
- a 'virtual image' is an image generated through an optical engine, and may include both static and dynamic images.
- the virtual image may be observed together with a real scene, and may be an image representing information about a real object in the real scene, information about an operation of an augmented reality device, or a control menu.
- An augmented reality device includes an optical engine for generating a virtual image from light produced by a light source, and a waveguide formed of a transparent material that guides the virtual image generated by the optical engine to the user's eyes while allowing a scene of the real world to be viewed through it. Since the device must also let the user observe the real-world scene, an optical element is needed to change the path of the inherently straight light so that the light generated by the optical engine is guided to the user's eyes through the light guide plate.
- The light path may be changed using reflection by a mirror or the like, or through diffraction by a diffractive element such as a diffractive optical element (DOE) or a holographic optical element (HOE), but is not limited thereto.
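- The diffractive elements mentioned above redirect light according to the grating equation. As a hedged numeric illustration (the wavelength and grating period below are arbitrary examples, not values from this disclosure):

```python
import math

def diffraction_angle_deg(wavelength_nm, period_nm, incident_deg=0.0, order=1):
    """Grating equation: sin(theta_m) = sin(theta_i) + m * lambda / d.
    Returns the diffracted angle of the given order, in degrees."""
    s = math.sin(math.radians(incident_deg)) + order * wavelength_nm / period_nm
    if abs(s) > 1.0:
        raise ValueError("this order does not propagate (evanescent)")
    return math.degrees(math.asin(s))
```

For example, 532 nm light hitting a 1064 nm-period grating at normal incidence leaves at 30 degrees in the first order, which is how a DOE can bend the optical engine's output toward the eye.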
- FIG. 1 is a schematic diagram of a method of tracking a user's gaze in an augmented reality device (AR device) including a prescription lens according to an embodiment of the present disclosure.
- An augmented reality device is a device capable of expressing augmented reality, and may display an image including a virtual object and a physical object existing in reality.
- the augmented reality device is a glasses-type display device and may include a glasses-type body configured to be worn by a user.
- the glasses-type body may include a support for fixing the augmented reality device to a user's face.
- the support may include a temple 131 and a bridge 132 .
- the temple 131 may be used to secure the augmented reality device to the user's head at the side of the spectacle body.
- The bridge 132 may be used to seat the augmented reality device on the user's nose, and may include, for example, a nose bridge and nose pads, but is not limited thereto.
- the eyeglass-shaped body may further include a frame.
- a light guide plate 110, a vision correcting lens 120, and a light reflector 150 may be disposed in the frame.
- the frame may be formed to surround outer circumferential surfaces of the light guide plate 110 and the vision correcting lens 120 .
- The light guide plate 110 may be configured to receive projected light in an input area and output at least a part of that light from an output area. The vision correcting lens 120 is disposed between the light guide plate 110 and the user's eyes, and may correct the user's eyesight when the user views a scene displayed on the light guide plate 110 or a real scene through it.
- The light guide plate 110 may include a light guide plate for the left eye and a light guide plate for the right eye.
- the vision correction lens 120 may include a vision correction lens for the left eye and a vision correction lens for the right eye.
- The vision correcting lens 120 for the left eye, the light reflector 150 for the left eye, and the light guide plate 110 for the left eye may be disposed at positions corresponding to the user's left eye, and the vision correcting lens, light reflector, and light guide plate for the right eye may be disposed at positions corresponding to the user's right eye.
- The light reflector 150 for the left eye may be disposed between the vision correcting lens 120 for the left eye and the light guide plate 110 for the left eye, and may be coated on one surface of either of them, but is not limited thereto.
- the light reflector for the right eye may be disposed between the vision correcting lens for the right eye and the light guide plate for the right eye, and may be coated on one surface of the light guide plate for the right eye or the vision correcting lens for the right eye, but is not limited thereto.
- The optical engine of the projector, which projects display light containing an image, may include a left-eye optical engine and a right-eye optical engine.
- the left eye optical engine and the right eye optical engine may be located on either side of the augmented reality device.
- Alternatively, one optical engine may be located in a central portion around the bridge 132 of the augmented reality device. Light emitted from the optical engine may be displayed through the light guide plate 110.
- the gaze tracking sensor may include a light emitting unit 141 and a light receiving unit 143 .
- The light emitter 141 and the light receiver 143 may be disposed on the inner surface of the support of the augmented reality device, i.e., the side facing the user's eyes.
- the light emitting unit 141 and the light receiving unit 143 may be disposed to face the light reflecting unit 150 in the support of the augmented reality device.
- So that they can emit and receive IR light without being obstructed by the user's hair, the light emitter 141 and the light receiver 143 may be disposed on the side surface of the temple 131 of the augmented reality device, at a position spaced about 2 mm to 25 mm from the light guide plate 110.
- the augmented reality device may include a vision correcting lens 120 for correcting the eyesight of a user wearing the augmented reality device.
- the vision correcting lens 120 may have a predetermined fixed refractive characteristic or may have a variable refractive characteristic that can be changed as needed.
- FIG. 2 is a flowchart of a method of detecting a gaze of a user by an augmented reality device according to an embodiment of the present disclosure.
- the augmented reality device may acquire lens characteristic information of a vision correcting lens.
- the augmented reality device may emit light for gaze tracking towards the light reflector through the light emitter.
- the augmented reality device may receive light reflected by the user's eyes through the light receiving unit.
- the augmented reality device may obtain an eye image of the user based on the received light.
- the augmented reality device may compensate the user's eye image based on the lens characteristic information of the vision correcting lens.
- the augmented reality device may acquire information on the gaze of the user based on the compensated eye image.
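- The compensation step in the flow above (undoing the distortion that the vision correcting lens adds before gaze features are extracted) can be sketched for a single detected feature point, under the illustrative assumption (not specified in this disclosure) that the lens characteristic information reduces to a single radial-distortion coefficient k with r_obs = r * (1 + k * r^2):

```python
import math

def compensate_point(observed_xy, k, iters=30):
    """Invert the assumed radial model r_obs = r * (1 + k * r**2) for one
    image point via the fixed-point iteration r <- r_obs / (1 + k * r**2).
    A real device would apply the same inverse to every pixel of the eye
    image before pupil/glint features are extracted."""
    x_o, y_o = observed_xy
    r_obs = math.hypot(x_o, y_o)
    if r_obs == 0.0:
        return observed_xy  # the optical centre is not displaced
    r = r_obs  # initial guess: the observed radius
    for _ in range(iters):
        r = r_obs / (1.0 + k * r * r)
    scale = r / r_obs
    return (x_o * scale, y_o * scale)
```

For small k the iteration converges geometrically, so a few dozen iterations recover the undistorted coordinate to floating-point precision.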
- FIG. 3 is a block diagram of an augmented reality device 300 according to an embodiment of the present disclosure.
- the augmented reality device 300 includes a display unit 310, a vision correcting lens 320, a support unit 330, an eye tracking sensor 340, a light reflecting unit ( 350), a processor 360, and a storage unit 370.
- the display unit 310 displays and outputs information processed by the augmented reality device 300 .
- The display unit 310 may display a user interface for capturing the surroundings of the augmented reality device 300 and information related to a service provided based on the captured image of those surroundings.
- the display unit 310 may provide augmented reality (AR) images.
- the display unit 310 may include a light guide plate (or waveguide) 311 and an optical engine 312 .
- the light guide plate 311 may be made of a transparent material in which a portion of the rear surface is visible.
- the light guide plate 311 may be formed of a single- or multi-layer flat plate made of a transparent material through which light is reflected and propagated therein.
- the light guide plate 311 may face the output surface of the optical engine 312 to receive light of a virtual image projected from the optical engine 312 .
- Here, a transparent material means a material through which light can pass; its transparency need not be 100%, and it may have a predetermined color.
- Because the light guide plate 311 is made of a transparent material, the user can view not only a virtual object of a virtual image through the display unit 310 but also the external real scene, and so the light guide plate 311 may be referred to as a see-through display.
- the display unit 310 may provide an augmented reality image by outputting a virtual object of a virtual image through the light guide plate 311 .
- the display unit 310 may include a left display unit and a right display unit.
- the vision correction lens 320 may correct the eyesight of a user wearing the augmented reality device 300 .
- the vision correcting lens 320 may be disposed between the light guide plate 311 and the user's eyes, and may correct the eyesight of a user who recognizes a real scene and a virtual image through the light guide plate 311 .
- the vision correcting lens 320 may have a predetermined fixed refractive characteristic or may have a variable refractive characteristic that can be changed as needed.
- The lens characteristics of the vision correcting lens 320 refer to the characteristics of the lens that cause the eye-image distortion arising when, with the light reflector 350 disposed between the vision correcting lens 320 and the light guide plate 311, the eye image is acquired along the path: light emitter 341 - vision correcting lens 320 - light reflector 350 - vision correcting lens 320 - user's eye - vision correcting lens 320 - light reflector 350 - vision correcting lens 320 - light receiver 343.
- the lens characteristics of the vision correcting lens 320 may indicate the degree to which the vision correcting lens 320 converges or spreads light.
- Lens characteristics of the vision correcting lens 320 may be determined based on a refractive index and a curvature of a refractive surface of the lens.
- The refractive power of the vision correcting lens 320 increases as the refractive index or the curvature of the refracting surface increases.
- The refractive index represents the degree to which light waves are refracted when passing through an interface (refracting surface) between different media.
- the refractive index may be determined based on the material of the vision correcting lens 320 .
- the lens characteristic information of the vision correcting lens 320 may include various numerical values expressed to represent lens characteristics in addition to the refractive index or the curvature of a refracting surface.
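- The relationship stated above between refractive index, surface curvature, and refractive power is captured by the standard thin-lens maker's equation, sketched here with illustrative values (signs follow the usual convention; none of the numbers come from this disclosure):

```python
def lens_power_diopters(n, r1_m, r2_m):
    """Thin-lens maker's equation: P = (n - 1) * (1/R1 - 1/R2).
    n: refractive index of the lens material; R1, R2: signed radii of
    curvature of the two refracting surfaces, in metres; returns the
    refractive power in diopters (1/m)."""
    return (n - 1.0) * (1.0 / r1_m - 1.0 / r2_m)
```

For example, a biconvex lens with n = 1.5 and 0.1 m radii on both surfaces has a power of +10 D; raising the index or tightening the curvature raises the power, consistent with the statement above.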
- the light reflector 350 may reflect light emitted from the light emitter 341 to be described later.
- the light reflector 350 and the light guide plate 311 may be disposed at positions facing the user's eyes, and the light reflector 350 and the light guide plate 311 may be attached to each other.
- the light reflection part 350 may be coated on at least a portion of the light guide plate 311 .
- The light reflector 350 may also be attached to, or coated on, a cover glass installed to protect other components of the glasses-type augmented reality device, for example the vision correcting lens 320 or the light guide plate 311.
- the light reflector 350 may be formed of a material capable of reflecting light such as IR light emitted from the light emitter 341 .
- The light reflector 350 may be formed of, for example, silver, gold, copper, or a material including one or more of these metals, but is not limited thereto. Accordingly, light emitted from the light emitter 341 may be reflected by the light reflector 350 toward the user's eyes, and light reflected back from the user's eyes may be reflected by the light reflector 350 again toward the light receiver 343.
- the light reflection part 350 may be coated on the light guide plate 311 to have a predetermined pattern.
- the pattern formed on the light reflector 350 may include, for example, a dot pattern, a line pattern, a grid pattern, a 2D marker, and the like, but is not limited thereto.
- The pattern formed on the light reflector 350 may be formed, for example, on a part of the light guide plate 311 toward which the user's gaze is directed only infrequently.
- the pattern formed on the light reflector 350 may be formed on a portion of the light guide plate 311 that does not interfere with photographing or scanning the user's eyes.
- The predetermined pattern may refer to a pattern formed by portions of the light reflector 350 that reflect the light emitted from the light emitter 341 and portions that do not. Because light emitted toward a non-reflective portion is not reflected by the light reflector 350, the light receiver 343 does not receive that light. Accordingly, the pattern formed by the reflective and non-reflective portions of the light reflector 350 can be detected from the light received by the light receiver 343.
- When the light emitted from the light emitter 341 is IR light, the predetermined pattern may be formed of a material that reflects IR light but is invisible to the user's eyes. Since most of the real-world light or real scene observed by the user through the augmented reality device 300 consists of visible light, the user can observe the real-world light or real scene without being disturbed by the light reflector 350 having the predetermined pattern.
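- Recovering the pattern from the received light can be sketched as a simple intensity threshold on the IR frame (the frame layout and threshold below are illustrative assumptions, not part of this disclosure):

```python
def detect_pattern(ir_frame, threshold=128):
    """Binarize a received IR frame: cells at or above the threshold are
    treated as reflective (1), the rest as non-reflective (0).  The
    resulting bit grid can then be compared with the known reflector
    layout to identify the distortion introduced by the lens."""
    return [[1 if px >= threshold else 0 for px in row] for px_row in [] or ir_frame for row in [px_row]]
```

In practice the comparison step would locate pattern features (dots, grid crossings) at sub-pixel precision rather than work cell-by-cell.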
- the gaze tracking sensor 340 may include a light emitting unit 341 and a light receiving unit 343 .
- The gaze tracking sensor 340 includes a light emitter 341 that emits light for detecting the user's gaze and a light receiver 343 that receives light reflected from the user's eyes, and can detect data related to the gaze of the user wearing the augmented reality device 300.
- the light emitter 341 of the gaze tracking sensor 340 may emit light toward the light reflector 350 so that the light reflected by the light reflector 350 may be directed toward the user's eyes.
- the light emitting unit 341 may emit light toward the light reflecting unit 350 , the emitted light may be reflected by the light reflecting unit 350 , and the reflected light may be directed toward the user's eyes.
- the light emitting unit 341 may be disposed at a position capable of emitting light toward the light reflecting unit 350 in the augmented reality device 300 .
- the light emitting unit 341 may be located on the support unit 330 supporting the augmented reality device 300 on the user's face, such as the temple 331 and the bridge 332 .
- light reflected from the user's eyes may be reflected by the light reflector 350 and received by the light receiver 343 of the gaze tracking sensor 340 .
- Light directed toward the user's eyes is reflected by them; the light reflected from the user's eyes may be reflected again by the light reflector 350, and the light receiver 343 may receive the light reflected by the light reflector 350.
- the light receiving unit 343 may be disposed at a position capable of receiving light reflected from the light reflecting unit 350 in the augmented reality device 300 .
- the light receiving unit 343 may be located on the support unit 330 supporting the augmented reality device 300 on the user's face, such as the temple 331 and the bridge 332 .
- The bridge 332 may include a nose bridge and nose pads, which may be integrally formed, but are not limited thereto.
- the light emitting unit 341 may be an IR LED emitting IR light
- the light receiving unit 343 may be an IR camera capturing IR light.
- the IR camera may capture the user's eyes using the IR light reflected by the light reflector 350 .
- When the light emitter 341 is an IR LED and the light receiver 343 is an IR camera, the light emitter 341 may emit IR light, in the form of at least one of planar light and point light, toward the light reflector 350, and the light receiver 343 may receive the IR light after it has been sequentially reflected by the light reflector 350, the user's eye, and the light reflector 350 again.
- Planar light is light emitted in the form of a surface; the planar light emitted from the light emitter 341 may be directed to at least a part of the entire area of the light reflector 350, and that part may be set so that the reflected planar light covers the user's eyes.
- the light emitting unit 341 and the light receiving unit 343 may be disposed on the temple 331 of the augmented reality device 300 .
- the light emitting unit 341 and the light receiving unit 343 may be disposed on the inner side of the temple 331 of the augmented reality device 300, which is between the temple 331 and the user's eyes.
- the light emitting unit 341 and the light receiving unit 343 may be disposed at a position spaced apart from the light guide plate 311 by about 2 mm to 25 mm from the side of the temple 331 of the augmented reality device 300.
- the light emitting unit 341 and the light receiving unit 343 may be disposed to face the light reflecting unit 350 at the temple 331 of the augmented reality device 300 .
- the light emitting unit 341 and the light receiving unit 343 may be disposed on the bridge 332 of the augmented reality device 300 .
- the light emitting unit 341 and the light receiving unit 343 may be disposed on the inner side of the bridge 332 of the augmented reality device 300 between the bridge 332 and the user's eyes.
- the light emitting unit 341 and the light receiving unit 343 may be disposed at a distance of about 2 mm to 25 mm from the light guide plate 311 on the side of the bridge 332 of the augmented reality device 300.
- the light emitting unit 341 and the light receiving unit 343 may be disposed to face the light reflecting unit 350 in the bridge 332 of the augmented reality device 300 .
- the gaze tracking sensor 340 may provide data related to the gaze of the user's eyes to the processor 360, and the processor 360 may obtain gaze information of the user based on the data related to the gaze of the user's eyes.
- the data related to the gaze of the user's eyes is data acquired by the gaze tracking sensor 340, and may include data on the wavelength, emission area, characteristics, and emission direction of the light emitted from the light emitting unit 341, as well as data representing characteristics of the reflected light received by the light receiving unit 343.
- the user's gaze information is information related to the user's gaze and may be generated by analyzing the data related to the user's gaze; for example, it may include information about the position of the user's pupil, the position of the pupil's central point, the position of the user's iris, the center of the user's eyes, the position of the glint feature point of the user's eyes, the user's gaze point, and the user's gaze direction, but is not limited thereto.
- the direction of the user's gaze may be, for example, a direction of the gaze from the center of the user's eyes to the gaze point at which the user gazes.
- the user's gaze direction may be represented by a vector value from the center of the user's left eye to the gaze point and a vector value from the center of the user's right eye to the gaze point, but is not limited thereto.
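As a loose illustration of this vector representation (a sketch only; the function name and the coordinates are hypothetical, not taken from the disclosure), a gaze direction can be computed as the unit vector from an eye's central point to the gaze point:

```python
import math

def gaze_direction(eye_center, gaze_point):
    """Unit vector from the eye's central point toward the gaze point."""
    dx = gaze_point[0] - eye_center[0]
    dy = gaze_point[1] - eye_center[1]
    dz = gaze_point[2] - eye_center[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)

# Hypothetical left-eye center and gaze point in a real-space coordinate
# system (millimeters): the user looks straight ahead at a point 500 mm away.
left_dir = gaze_direction((-32.0, 0.0, 0.0), (-32.0, 0.0, 500.0))
```

Computing this vector separately for the left and right eyes yields the two per-eye gaze vectors the text describes.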
- the gaze tracking sensor 340 may detect data related to the gaze of the user wearing the augmented reality apparatus 300 at predetermined time intervals.
- the storage unit 370 may store programs to be executed by the processor 360, described later, and may store data input to or output from the augmented reality device 300.
- the storage unit 370 may include at least one of an internal memory (not shown) or an external memory (not shown).
- the built-in memory may include, for example, at least one of volatile memory (e.g., DRAM (Dynamic RAM), SRAM (Static RAM), SDRAM (Synchronous Dynamic RAM), etc.), non-volatile memory (e.g., OTPROM (One Time Programmable ROM), PROM (Programmable ROM), EPROM (Erasable and Programmable ROM), EEPROM (Electrically Erasable and Programmable ROM), mask ROM, flash ROM, etc.), a hard disk drive (HDD), or a solid state drive (SSD).
- the processor 360 may load and process a command or data received from at least one of a non-volatile memory or other components into a volatile memory. Also, the processor 360 may store data received or generated from other components in a non-volatile memory.
- the external memory may include, for example, at least one of CF (Compact Flash), SD (Secure Digital), Micro-SD (Micro Secure Digital), Mini-SD (Mini Secure Digital), xD (extreme Digital), or Memory Stick.
- Programs stored in the storage unit 370 may be classified into a plurality of modules according to their functions.
- for example, the programs may include a lens characteristic acquisition module 371, an eye image acquisition module 372, a distortion compensation module 373, a feature point detection module 374, and a gaze detection module 375.
- a memory (not shown) may be included in the gaze tracking sensor 340; in this case, the eye image acquisition module 372 may be stored as firmware in the memory (not shown) included in the gaze tracking sensor 340.
- the processor 360 controls overall operations of the augmented reality device 300 .
- the processor 360 may execute programs stored in the storage unit 370 to control the overall operation of the display unit 310, the vision correcting lens 320, the gaze tracking sensor 340, and the storage unit 370.
- the processor 360 may execute the lens characteristic acquisition module 371, the eye image acquisition module 372, the distortion compensation module 373, the feature point detection module 374, and the gaze detection module 375 stored in the storage unit 370 to determine the user's gaze point and gaze direction.
- the augmented reality device 300 may include a plurality of processors 360, and the lens characteristic acquisition module 371, the eye image acquisition module 372, the distortion compensation module 373, the feature point detection module 374, and the gaze detection module 375 may be executed by the plurality of processors 360.
- some of the lens characteristic acquisition module 371, the eye image acquisition module 372, the distortion compensation module 373, the feature point detection module 374, and the gaze detection module 375 may be executed by a first processor (not shown), and the rest may be executed by a second processor (not shown), but this is not limited thereto.
- the gaze tracking sensor 340 may include another processor (not shown) and a memory (not shown); the other processor (not shown) may execute the eye image acquisition module 372 stored in that memory, and the processor 360 may execute the lens characteristic acquisition module 371, the distortion compensation module 373, the feature point detection module 374, and the gaze detection module 375 stored in the storage unit 370.
- the processor 360 may acquire lens characteristic information of the vision correcting lens 320 by executing the lens characteristic obtaining module 371 stored in the storage unit 370 .
- the light reflection part 350 may include a pattern.
- the processor 360 may execute the lens characteristic acquisition module 371 stored in the storage unit 370 to emit light for obtaining lens characteristic information toward the light reflector 350 through the light emitter 341, identify a pattern based on the light received through the light receiver 343, and obtain lens characteristic information of the vision correcting lens 320 based on the identified pattern.
- the pattern may be formed to correspond to a partial area on the light guide plate 311 .
- a predetermined pattern may be formed on a portion of the light guide plate 311 to reflect the light emitted from the light emitting unit 341 and direct the reflected light toward the light receiving unit 343 .
- the pattern may be attached to or coated on the vision correcting lens 320 for correcting the user's eyesight, or on a cover glass installed to protect the light guide plate 311.
- in an embodiment, the vision correcting lens 320 includes an indicator used to acquire its lens characteristic information, and the processor 360 may execute the lens characteristic acquisition module 371 stored in the storage unit 370 to identify the indicator and thereby obtain the lens characteristic information of the vision correcting lens 320.
- the indicator may include a barcode, a label such as a QR code, text, and the like.
- the vision correcting lens 320 may include a coupling unit for coupling to the support unit 330 of the augmented reality device 300, and the processor 360 may execute the lens characteristic acquisition module 371 stored in the storage unit 370 to obtain an electromagnetic signal through the coupling unit and obtain lens characteristic information of the vision correcting lens 320 based on the obtained electromagnetic signal.
- the processor 360 may apply an electromagnetic signal to the vision correcting lens 320 through the coupling unit and obtain a corresponding electromagnetic return signal from the vision correcting lens 320, thereby obtaining lens characteristic information of the vision correcting lens 320.
- the coupling unit may be disposed at a portion where the vision correcting lens 320 contacts the support unit 330 .
- the coupler may be disposed along an outer surface of the vision correcting lens 320 , where the vision correcting lens 320 contacts the frame of the support 330 .
- vision correction lens 320 may include a variable focus lens.
- a variable focus lens is a lens whose focus can be varied, and may include, for example, a liquid crystal lens (LC lens), a liquid membrane lens, an electrowetting lens, or an Alvarez lens.
- the variable focus lens may be implemented in a form in which a flexible plastic membrane surrounds a transparent fluid.
- the refractive index (diopter) of the variable focus lens may be changed by moving the fluid in the variable focus lens according to the electric signal applied to the variable focus lens.
- the processor 360 may execute the lens characteristic acquisition module 371 stored in the storage unit 370 to identify lens characteristic information of the vision correcting lens 320 from the refractive parameters set in the variable focus lens.
- the processor 360 may obtain an eye image from the reflected light received through the light receiving unit 343 by executing the eye image acquisition module 372 stored in the storage unit 370 .
- the reflected light received through the light receiver 343 may be light that was reflected from the user's eyes and then reflected again by the light reflector 350.
- for example, when the light emitting unit 341 is an IR LED and the corresponding light receiving unit 343 is an IR camera, the obtained eye image may be an IR image.
- the processor 360 may compensate the user's eye image based on the lens characteristic information of the vision correction lens 320 by executing the distortion compensation module 373 stored in the storage unit 370 .
- the processor 360 may detect features related to the line of sight of the user's eyes by executing the feature point detection module 374 stored in the storage unit 370 .
- the processor 360 may detect the position of the pupil feature point and the central point of the user's eyes by executing the feature point detection module 374 .
- the pupil feature point may include, for example, a glint feature point of the eye.
- the glint feature point may be a part of the detected eye region whose brightness is greater than or equal to a predetermined value.
- the position of the feature point of the pupil and the position of the central point may be identified by, for example, a coordinate value indicating a position in the coordinate system of the light receiving unit 343 .
- the coordinate system of the light receiving unit 343 may be the coordinate system of the IR camera, and the coordinate values in the coordinate system of the light receiving unit 343 may be 2D coordinate values.
- the processor 360 may detect features related to the gaze of the eyes by analyzing the light received by the light receiving unit 343. For example, when the light receiving unit 343 is an IR camera, the processor 360 may identify the positions of the glint feature point and the central point of the pupil in an image captured by the IR camera. When these positions are detected, they may be corrected by reflecting the lens characteristic information of the vision correcting lens 320. The positions of the feature point and the central point of the pupil corrected by reflecting the lens characteristic information of the vision correcting lens 320 are described in more detail with reference to FIG. 8, described later.
- the processor 360 may obtain a coordinate value indicating a location of a feature point of the pupil and a coordinate value indicating a location of a center point of the pupil by analyzing the light received by the light receiving unit 343 .
- the processor 360 may obtain coordinate values of a feature point and a coordinate value of a center point of the pupil in the coordinate system of the camera.
- the coordinate system of the camera may be used to indicate the location of the feature point and the central point of the pupil, and for example, coordinate values on the camera coordinate system corresponding to pixels of an image photographed by the camera may be preset.
- coordinate values corresponding to eye feature points may be identified based on properties (eg, brightness) of light received through the camera.
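A minimal sketch of identifying a glint feature point from the brightness of received light (the function name, threshold, and sample frame are hypothetical illustrations, not part of the disclosure): scan the IR image for the brightest pixel at or above a brightness threshold and return its camera-coordinate value.

```python
def find_glint(image, threshold=200):
    """Return the (x, y) camera coordinate of the brightest pixel whose
    brightness is at least `threshold`, or None if no pixel qualifies."""
    best, best_val = None, threshold - 1
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value > best_val:
                best, best_val = (x, y), value
    return best

# Hypothetical 4x3 grayscale IR frame; 250 is a specular reflection
# (glint) off the cornea caused by the IR LED.
ir_frame = [
    [10, 12, 11, 10],
    [11, 40, 250, 12],
    [10, 35, 60, 11],
]
glint = find_glint(ir_frame)
```

In practice the glint would be localized to sub-pixel precision (e.g., by a brightness-weighted centroid), but the coordinate-value idea is the same.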
- the processor 360 may acquire information about the user's gaze by executing the gaze detection module 375 stored in the storage unit 370 .
- the processor 360 may calculate the location of the center of the eye of the user by executing the gaze detection module 375 .
- the center of the user's eye may indicate the center of the region corresponding to the iris in the user's eyeball, that is, the position of the pupil.
- the position of the point of gaze at which the user gazes and the direction of the user's gaze may be obtained from the position of the pupil.
- the location on the light guide plate 311 corresponding to the central point of the user's pupil may be identified as the location of the gaze point at which the user gazes, and the direction from the user's pupil toward the gaze point may be identified as the user's gaze direction.
- the processor 360 may calculate the position of the user's pupil through lens characteristic information of the lens and feature points included in the user's eye image.
- the processor 360 may calculate the position of the user's gaze point by executing the gaze detection module 375 .
- the processor 360 may previously generate a mapping function for calculating the position of the gaze point from the characteristics of the user's eyes in order to calculate the position of the user's gaze point.
- the mapping function is a function for calculating the position of the user's gazing point in consideration of the characteristics of the user's eyes and the lens characteristic information of the vision correcting lens 320, and may be generated through a previously performed calibration process.
- the position of the gaze point may have a 3D coordinate value in a real space coordinate system, but is not limited thereto.
- alternatively, the position of the gaze point may be the position on the light guide plate 311 corresponding to the central point of the user's pupil and may have a coordinate value in the light guide plate coordinate system, but is not limited thereto.
- the processor 360 may execute the gaze detection module 375 to correct the features related to the gaze of the user's eyes, obtained from the feature point detection module 374, based on the lens characteristic information of the vision correcting lens 320.
- the processor 360 may calculate the location of the user's gaze point by applying features related to the gaze of the user's eyes, corrected based on the lens characteristics, to a mapping function.
- the direction of the user's gaze may be determined based on the position of the central point of the eye calculated by the gaze detection module 375 and the gaze point of the user.
- the processor 360 may perform a calibration operation in advance before performing an operation of detecting the user's gaze.
- the mapping function may be calibrated based on lens characteristic information of the vision correcting lens 320 .
- the processor 360 may calibrate the mapping function so as to obtain the user's gaze point based on the lens characteristics and eye feature points. For example, in the calibration operation, the degree of distortion of the eye image according to the lens characteristic information of the vision correcting lens 320 may be calculated to obtain compensation values for the positions of the feature points and central points of the eyes.
- the processor 360 may calibrate the mapping function so that a target gaze direction value is output when the positions of the user's eye feature points and the lens characteristic information of the vision correcting lens 320 are input to the mapping function.
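As a rough sketch of such a calibration (the function names, the linear per-axis model, and the numeric points are hypothetical simplifications; a real mapping function would also take the lens characteristic information into account), the mapping from (distortion-compensated) pupil position to gaze point can be fitted from pairs observed while the user fixates known calibration targets:

```python
def fit_axis(feature_vals, target_vals):
    """Least-squares fit of target = a * feature + b along one axis."""
    n = len(feature_vals)
    mf = sum(feature_vals) / n
    mt = sum(target_vals) / n
    var = sum((f - mf) ** 2 for f in feature_vals)
    cov = sum((f - mf) * (t - mt) for f, t in zip(feature_vals, target_vals))
    a = cov / var
    return a, mt - a * mf

def calibrate(pupil_points, target_points):
    """Build a mapping function from pupil position to gaze point, using
    pupil centers observed while the user fixates known targets."""
    ax, bx = fit_axis([p[0] for p in pupil_points], [t[0] for t in target_points])
    ay, by = fit_axis([p[1] for p in pupil_points], [t[1] for t in target_points])
    return lambda p: (ax * p[0] + bx, ay * p[1] + by)

# Hypothetical calibration: four pupil centers (camera coordinates)
# observed while the user looked at four on-screen targets.
mapping = calibrate([(10, 10), (30, 10), (10, 30), (30, 30)],
                    [(0, 0), (100, 0), (0, 100), (100, 100)])
```

After calibration, `mapping` converts a compensated pupil position into an estimated gaze point in the target coordinate system.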
- gaze detection by an augmented reality device including a vision correcting lens may include a user calibration process and an eye tracking process.
- in the user calibration process, lens characteristic information of the vision correcting lens may be identified, and the degree of distortion of the user's eye image acquired by the augmented reality device including the vision correcting lens may be estimated.
- in the eye tracking process, the gaze of the user wearing the augmented reality device may be tracked by reflecting the degree of distortion estimated in the user calibration process.
- FIG. 5 is a diagram for explaining an operation of obtaining lens characteristic information of a vision correcting lens 620 according to an embodiment of the present disclosure.
- the augmented reality device may include a display unit 610, a vision correcting lens 620, and a light reflector 650 disposed between the display unit 610 and the vision correcting lens 620.
- the light reflection part 650 may include a pattern RP.
- the pattern RP may be formed to correspond to a partial area on the display unit 610 .
- a predetermined pattern RP may be formed on a portion of the display unit 610 to direct light emitted from the light emitting unit to be reflected toward the light receiving unit.
- the gaze tracking sensor 640 may emit light for acquiring lens characteristics of the vision correction lens 620 toward the light reflector 650 through the light emitter.
- Light emitted from the eye tracking sensor 640 through the light emitter may be directed to the light reflector 650 through the vision correcting lens 620 .
- the light may be reflected by the light reflector 650, pass through the vision correcting lens 620, and reach the light receiver of the gaze tracking sensor 640.
- the augmented reality device may identify a distorted pattern DP, formed by distortion of the pattern RP included in the light reflection unit 650, based on the light received through the light receiver, and may obtain lens characteristic information of the vision correcting lens 620 based on the distorted pattern DP.
- information related to the actual pattern RP included in the light reflector 650 may be pre-stored in the augmented reality device.
- the augmented reality device may identify a distorted pattern DP from light received through the light receiving unit of the gaze tracking sensor 640 .
- since the gaze tracking sensor 640 captures the actual pattern RP through the vision correcting lens 620, the distorted pattern DP may be formed as light reflected from the actual pattern RP is refracted while passing through the vision correcting lens 620.
- the augmented reality device may obtain lens characteristic information of the vision correcting lens 620 by comparing the actual pattern RP included in the light reflector 650 with the distorted pattern DP identified from the light received through the light receiver. For example, the augmented reality device may compare the positions of vertices on the actual pattern RP with the positions of vertices on the distorted pattern DP, and compare the length, position, direction, degree of curvature, and the like of edges on the actual pattern RP with those of edges on the distorted pattern DP, to obtain lens characteristic information of the vision correcting lens 620.
- lens characteristic information of the vision correcting lens 620 may include difference value information for each vertex or characteristic change information for each edge in the pattern.
- the difference value information for a specific vertex may indicate information related to the difference between the position of that vertex in the actual pattern RP and the position of the corresponding vertex in the distorted pattern DP. For example, considering the vector from the position of a specific vertex in the actual pattern RP to the position of the corresponding vertex in the distorted pattern DP, the difference value information may include the magnitude and direction of that vector.
- the characteristic change information for a specific edge may indicate information related to a difference between a characteristic of a specific edge in the real pattern RP and a characteristic of a corresponding edge in the distorted pattern DP.
- for example, the characteristic change information may include information related to the difference between the length of a specific edge in the actual pattern RP and the length of the corresponding edge in the distorted pattern DP, or the difference between the degree of curvature of a specific edge in the actual pattern RP and the degree of curvature of the corresponding edge in the distorted pattern DP.
- from the difference value information for each vertex, the degree of deformation of each point in an image captured through the vision correcting lens 620, compared with the actual image, may be obtained. Likewise, from the characteristic change information for each edge, the degree of deformation of each line in the captured image may be obtained. Therefore, the lens characteristic information of the vision correcting lens 620 indicates how much an image captured through the lens is deformed from the actual image, and the degree of distortion of the acquired (distorted) image can be compensated to reproduce the actual image.
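A small sketch of computing the per-vertex difference value information described above (function and variable names are illustrative, not from the disclosure): for each vertex, take the vector from its position in the actual pattern RP to the corresponding position in the distorted pattern DP, and record the vector's magnitude and direction.

```python
import math

def vertex_difference_info(real_vertices, distorted_vertices):
    """Per-vertex displacement between the actual pattern RP and the
    distorted pattern DP: magnitude and direction (radians) of the vector
    from each RP vertex to its corresponding DP vertex."""
    info = []
    for (rx, ry), (dx_, dy_) in zip(real_vertices, distorted_vertices):
        vx, vy = dx_ - rx, dy_ - ry
        info.append({"magnitude": math.hypot(vx, vy),
                     "direction": math.atan2(vy, vx)})
    return info

# Hypothetical vertex positions: the lens shifts the second vertex by (3, 4).
rp = [(0, 0), (10, 0), (0, 10)]
dp = [(0, 0), (13, 4), (0, 10)]
diffs = vertex_difference_info(rp, dp)
```

Collected over the whole grid, such vectors describe how much each point in an image captured through the lens is displaced from the actual image.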
- the operation of acquiring the lens characteristic information of the vision correcting lens 620 by the augmented reality device is not limited to the above operation, and may be obtained through various operations.
- the acquired lens characteristic information of the vision correcting lens 620 may be used in an operation of compensating for distortion of an eye image and an operation of compensating for a gaze tracking result accordingly.
- FIG. 6 is a diagram for explaining an operation of compensating an eye image based on lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- the user's eye image is obtained from light traveling along the path: light emitter → vision correcting lens → light reflector → vision correcting lens → user's eye → vision correcting lens → light reflector → vision correcting lens → light receiver; accordingly, it may include distortion due to the vision correcting lens.
- the obtained user's gaze information may therefore differ from the actual direction of the user's gaze. To increase the accuracy of eye tracking, the degree of distortion of the user's eye image needs to be compensated.
- a distortion compensation function H(x,y) may be used to compensate for the degree of distortion of the image.
- Lens characteristic information of a vision correcting lens may be used to generate the distortion compensation function H(x,y).
- in a calibration setup, a grid pattern image i may be placed at the position where the user's eyes would be when the user wears the augmented reality device.
- a distorted grid pattern image d may then be obtained from light traveling along the path: light emitter → vision correcting lens → light reflector → vision correcting lens → grid pattern image → vision correcting lens → light reflector → vision correcting lens → light receiver.
- the degree of distortion of the user's eye image may be calculated using lens characteristic information of the vision correcting lens.
- i(x,y) represents the coordinate value of the vertex (x,y), measured from the central point (or a preset origin), in the actual grid pattern image i.
- d(x,y) represents the coordinate value of the vertex (x,y), measured from the central point (or a preset origin), in the distorted grid pattern image d.
- before the degree of distortion of the image is compensated, the coordinate value i(x,y) on the actual grid pattern image and the coordinate value d(x,y) on the distorted grid pattern image differ for at least one vertex (x,y).
- the coordinate value D(x,y) on the compensated grid pattern image D may be obtained by multiplying d(x,y) by the distortion compensation function H(x,y). That is, Equation 1 below may be satisfied: D(x,y) = H(x,y) · d(x,y) (Equation 1).
- Lens characteristic information of the vision correcting lens may be obtained through the method described above with reference to FIG. 5 or the like.
- the lens characteristic information may include information about which points in an image captured through a vision correcting lens have moved in which direction and by how much distance when compared to an actual image.
- the lens characteristic information may include information on how much the length or degree of curvature of each line in the image captured through the vision correcting lens is deformed when compared with the actual image.
- from the lens characteristic information of the vision correcting lens, it is possible to know how much an image captured through the vision correcting lens is deformed from the actual image, and the distortion compensation function H(x,y) for compensating the degree of distortion of the acquired (distorted) image can be obtained.
- the distortion of the eye image is compensated using the obtained distortion compensation function H(x,y), and the user's gaze is detected from the compensated eye image, so the accuracy of the gaze tracking result may increase.
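The relation D(x,y) = H(x,y) · d(x,y) can be sketched as follows (a simplified illustration, not the patented implementation: H is modeled here as a per-vertex componentwise scale factor derived from the actual and distorted grid images; names and numbers are hypothetical):

```python
def build_compensation(i_coords, d_coords):
    """Per-vertex compensation factors H(x,y) = i(x,y) / d(x,y),
    computed componentwise from the actual grid image i and the
    distorted grid image d."""
    h = {}
    for key in i_coords:
        ix, iy = i_coords[key]
        dx_, dy_ = d_coords[key]
        h[key] = (ix / dx_, iy / dy_)
    return h

def compensate(h, key, point):
    """D(x,y) = H(x,y) * d(x,y): undo the lens distortion at one vertex."""
    hx, hy = h[key]
    return (hx * point[0], hy * point[1])

# Hypothetical single grid vertex (1, 1): the lens magnified its
# coordinates by 25%, so the actual position (8, 8) appears at (10, 10).
h = build_compensation({(1, 1): (8.0, 8.0)}, {(1, 1): (10.0, 10.0)})
restored = compensate(h, (1, 1), (10.0, 10.0))
```

A dense grid of such factors, interpolated between vertices, would approximate the continuous distortion compensation function over the whole image.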
- FIG. 7 is a diagram for explaining an operation of obtaining user gaze information from a compensated eye image based on lens characteristic information of a vision correcting lens 820 according to an embodiment of the present disclosure.
- the augmented reality device may include a display unit 810, a vision correcting lens 820, and a light reflector 850 disposed between the display unit 810 and the vision correcting lens 820.
- the gaze tracking sensor 840 may emit light for eye gaze tracking of a user wearing an augmented reality device toward the light reflector 850 through the light emitter.
- Light emitted from the eye tracking sensor 840 through the light emitter may be directed to the light reflector 850 through the vision correcting lens 820 . Thereafter, the light may be sequentially reflected from the light reflector 850, the user's eyes, and again from the light reflector 850, and reach the light receiver of the gaze tracking sensor 840.
- the augmented reality device may acquire an eye image of the user based on the light received through the light receiving unit.
- the user's eye image obtained from the light traveling along this path may include distortion due to the vision correcting lens 820. Therefore, to increase the accuracy of eye tracking, the degree of distortion of the user's eye image needs to be compensated.
- the augmented reality device may acquire an eyeball image d(x,y) through a light receiving unit.
- the obtained eye image d(x,y) includes distortion due to the vision correcting lens 820 .
- if the eye feature point is extracted directly from the eye image d(x,y) to obtain the user's gaze direction, that is, the user's pupil position information, the obtained pupil position may not correspond to the actual user's gaze direction because of the image distortion.
- the augmented reality device compensates the eye image d(x,y) using the distortion compensation function H(x,y) obtained by the method described above with reference to FIG. 6, and detects the user's pupil position information and gaze information from the compensated eye image. That is, according to an embodiment of the present disclosure, since the user's gaze is detected from the eye image compensated with H(x,y), which is obtained from the lens characteristic information of the vision correcting lens 820, the accuracy of the gaze tracking result can be increased.
- FIG. 8 is a diagram for explaining an operation of obtaining lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- FIG. 9 is a diagram for explaining an operation of obtaining lens characteristic information of a vision correcting lens according to an embodiment of the present disclosure.
- an indicator 922 used to acquire lens characteristic information of the vision correcting lens 920 may be displayed on the vision correcting lens 920 .
- the processor of the augmented reality device may obtain lens characteristic information of the vision correcting lens 920 by identifying the indicator 922 .
- the processor may control the light emitter included in the gaze tracking sensor 940 to emit light toward the indicator 922, and may identify the indicator 922 based on the light received through the light receiver included in the gaze tracking sensor 940.
- the vision correcting lens 1020 may include a coupler 1022 coupled to the support of the augmented reality device.
- the processor of the augmented reality device may apply an electromagnetic signal to the vision correcting lens 1020 through the coupler 1022 and obtain a corresponding electromagnetic return signal from the vision correcting lens 1020, thereby acquiring lens characteristic information of the vision correcting lens 1020.
- the augmented reality device includes a display unit 1110, a light reflection unit 1150, and a vision correcting lens, and the vision correcting lens may include a variable focus lens 1123.
- the augmented reality device of FIG. 10 may correspond to the augmented reality device of FIG. 3 .
- the variable focus lens 1123 may be controlled using, as control parameters, the refractive index at each position and the position of the aperture center.
- the processor may set a control parameter for controlling the variable focus lens 1123 and obtain lens characteristic information of the vision correcting lens based on the set control parameter.
- FIG. 11 is a diagram illustrating an augmented reality device 1200 according to an embodiment of the present disclosure.
- an augmented reality device 1200 for detecting a user's gaze is shown.
- the augmented reality device 1200 may include glasses-shaped AR glasses worn on the user's face, a head mounted display (HMD) worn on the head, a virtual reality headset (VRH), an AR helmet, and the like.
- since the augmented reality device 1200 places a display in front of the user's eyes and the screen moves with the user's movement, it can provide a realistic virtual image together with the real scene.
- a user may wear the augmented reality device 1200, which is capable of displaying visual augmented reality content.
- the augmented reality device 1200 may include an audio module capable of providing audio augmented reality content to a user.
- the augmented reality device 1200 may include one or more cameras capable of capturing images and video of the surroundings.
- the augmented reality device 1200 may include an eye tracking system to determine a user's vergence distance.
- the augmented reality device 1200 may include a lightweight head-mounted display (HMD) (e.g., goggles, glasses, a visor, etc.).
- the augmented reality device 1200 may also include a non-HMD device, such as laser projection eyeglasses capable of projecting a low-powered laser onto the user's retina to display images or depth content to the user.
- the augmented reality apparatus 1200 may provide an AR service in which at least one virtual object is output so as to overlap an area determined as a user's field of view (FOV).
- the area determined as the user's field of view is an area determined to be perceptible, through the augmented reality device 1200, by a user wearing the augmented reality device 1200, and may be a region including all or at least part of the display of the augmented reality device 1200.
- the augmented reality device 1200 may include a plurality of transparent members (eg, displays 1220 and 1230) corresponding to both eyes of the user.
- the augmented reality device 1200 may include a display module 1214, a camera, an audio output unit, and support units 1221 and 1222.
- the camera may capture an image corresponding to the user's field of view or measure a distance to an object.
- the camera may be used for head tracking and spatial awareness. Also, the camera may recognize the user's movement.
- in addition to the camera 1213 used for capturing an image corresponding to the user's field of view, detecting the motion of an object, or spatial recognition, the device may further include an eye tracking (ET) camera 1212.
- the ET camera 1212 may be used to detect and track the pupil of the user.
- the ET camera 1212 may be used for adjusting the center of a virtual image projected on the augmented reality device 1200 to be positioned according to the direction in which the eyes of the user wearing the augmented reality device 1200 gaze.
- the ET camera 1212 may correspond to the light receiver 343 of FIG. 3 described above.
- a global shutter (GS) camera may be used in the ET camera 1212 to detect pupils and track fast pupil movements without delay.
- the ET camera 1212 may separately include a left-eye camera 1212-1 and a right-eye camera 1212-2.
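The pupil detection performed by the ET camera pipeline can be sketched minimally as below. This is an illustrative assumption, not the patent's algorithm: under IR illumination the pupil appears as the darkest region, so a simple threshold-and-centroid step locates it; a production tracker would add glint detection and ellipse fitting.

```python
# Minimal sketch of pupil detection: threshold a grayscale eye image and take
# the centroid of the dark pixels. Illustrative only; real ET pipelines use
# ellipse fitting and corneal glint detection for robustness.

def pupil_center(image, threshold=50):
    """image: 2D list of grayscale values (0-255). Returns the (row, col)
    centroid of pixels darker than `threshold`, or None if none are found."""
    rows = cols = count = 0
    for r, line in enumerate(image):
        for c, px in enumerate(line):
            if px < threshold:  # dark pixel -> candidate pupil pixel
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)

# Synthetic 5x5 eye image: bright background, dark 2x2 "pupil" at rows 1-2, cols 2-3
img = [[200] * 5 for _ in range(5)]
for r in (1, 2):
    for c in (2, 3):
        img[r][c] = 10
print(pupil_center(img))  # (1.5, 2.5)
```

A global-shutter camera, as mentioned above, helps because every pixel of the thresholded frame is sampled at the same instant, so fast pupil motion does not smear the centroid.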
- the display module 1214 may include a first display 1220 and a second display 1230 .
- the display module 1214 may correspond to the display unit 310 of FIG. 3 described above.
- the virtual object output through the display module 1214 may include information related to an application program executed on the augmented reality device 1200 or information related to an external object located in a real space corresponding to the area determined as the user's field of view.
- the augmented reality apparatus 1200 may check an external object included in at least a part corresponding to a region determined as the user's field of view among image information related to the real space obtained through the camera 1213 .
- the augmented reality device 1200 may output a virtual object related to an external object checked at least in part through an area determined to be the user's field of view among display areas of the augmented reality device 1200 .
- External objects may include objects existing in a real space.
- the displays 1220 and 1230 may include a condensing lens, a corrective lens, or a waveguide in a transparent member.
- the light guide plate may correspond to the above-described light guide plate 311 of FIG. 3 .
- the transparent member may be formed of a glass plate, plastic plate, or polymer, and may be made completely transparent or translucent.
- the transparent member may include a first transparent member (eg, the second display 1230) facing the right eye of the user wearing the augmented reality device 1200 and a second transparent member (eg, the first display 1220) facing the left eye of the user. If the display is transparent, it may be disposed at a position facing the user's eyes to display a screen.
- the light guide plate may transmit light generated from a light source of the display to the user's eyes.
- the light guide plate may be at least partially positioned on a portion of a transparent member (eg, the display 1220 or 1230).
- light emitted from the display may be incident to one end of the light guide plate, and the incident light may be transmitted to the user's eyes through total internal reflection within the light guide plate.
- the light guide plate may be made of a transparent material such as glass, plastic, or polymer, and may include a nanopattern formed on an inner or outer surface, for example, a polygonal or curved grating structure.
- the incident light may be propagated or reflected inside the light guide plate by the nanopattern and provided to the user's eyes.
- the light guide plate may include at least one of at least one diffractive element (eg, a diffractive optical element (DOE) or a holographic optical element (HOE)) or a reflective element (eg, a mirror).
- the light guide plate may guide the display light emitted from the light source unit to the user's eyes by using at least one diffractive element or reflective element.
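The guiding condition described above rests on total internal reflection, which can be stated compactly. The refractive indices below are typical textbook values for a plastic guide in air, not figures from this document:

```latex
% Total internal reflection condition inside the light guide plate:
% light injected at one end stays guided as long as its internal angle of
% incidence \theta exceeds the critical angle \theta_c given by Snell's law.
\[
  \theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right), \qquad n_1 > n_2,
\]
% e.g., for a plastic guide ($n_1 \approx 1.5$) surrounded by air ($n_2 = 1.0$),
% $\theta_c \approx 41.8^\circ$; rays striking the surface at $\theta > \theta_c$
% are totally reflected and propagate until a diffractive or reflective
% out-coupling element redirects them toward the user's eye.
```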
- the displays 1220 and 1230 may include a display panel or lens (eg, glass).
- the display panel may include a transparent material such as glass or plastic.
- the display may be composed of a transparent element, and the user may perceive the real space behind the display by passing through the display.
- the display may display the virtual object on at least a partial region of the transparent element so that the user sees that the virtual object is added to at least a portion of the real space.
- the supports 1221 and 1222 may include printed circuit boards (PCBs) 1231-1 and 1231-2 for transmitting electrical signals to each component of the augmented reality device 1200, speakers 1232-1 and 1232-2 for outputting audio signals, or batteries 1233-1 and 1233-2 for supplying power.
- the support portions 1221 and 1222 may correspond to the aforementioned support portion 330 of FIG. 3 .
- the support parts 1221 and 1222 may be disposed on the temple parts of the glasses.
- the support units 1221 and 1222 may include hinge units 1240 - 1 and 1240 - 2 coupled to the main body of the augmented reality device 1200 .
- the speakers 1232-1 and 1232-2 may include a first speaker 1232-1 for transmitting an audio signal to the user's left ear and a second speaker 1232-2 for transmitting an audio signal to the user's right ear.
- the augmented reality device 1200 may include a microphone 1241 for receiving a user's voice and ambient sounds.
- the augmented reality device 1200 may include at least one illumination LED 1242 to increase the accuracy of at least one camera (eg, the ET camera 1212, the outward facing camera 1213, or the recognition cameras 1211-1 and 1211-2).
- the light emitting device 1242 may be used as an auxiliary means for increasing accuracy when photographing the user's pupil with the ET camera 1212, and may use an IR LED emitting at an infrared wavelength rather than a visible-light wavelength.
- the light emitting device 1242 may correspond to the light emitting portion 341 of FIG. 3 described above.
- the light emitting device 1242 may be used as an auxiliary means when it is not easy to detect a subject due to a dark environment when a user's gesture is photographed by the recognition cameras 1211-1 and 1211-2.
- the display module 1214 may include a first light guide plate corresponding to the left eye (eg, the first display 1220) and a second light guide plate corresponding to the right eye (eg, the second display 1230), and may provide visual information to the user through the first and second light guide plates.
- the display module 1214 may include a display panel and a lens (eg, a glass lens, a plastic lens, or an LC lens).
- the display panel may include a transparent material such as glass or plastic.
- the display module 1214 may be formed of a transparent element, and the user may perceive the real space behind the display module 1214 by looking through it.
- the display module 1214 may display the virtual object on at least a partial region of the transparent device so that the user sees that the virtual object is added to at least a portion of the real space.
- the augmented reality device 1200 may identify an external object included in at least a part corresponding to the area determined by the user's field of view (FoV) from the image information related to the real space acquired through the outward facing camera 1213.
- the augmented reality device 1200 may output (or display) a virtual object related to an external object checked at least in part through an area determined by a user's viewing angle among display areas of the augmented reality device 1200 .
- External objects may include objects existing in a real space.
- a display area where the augmented reality device 1200 displays a virtual object may include a portion of the display module 1214 (eg, at least a portion of the display panel).
- the display area may correspond to at least a portion of the first light guide plate (eg, the first display 1220) and the second light guide plate (eg, the second display 1230).
- the augmented reality device 1200 may measure a distance to a physical object located in the front direction of the augmented reality device 1200 by using the outward facing camera 1213 .
- the outward facing camera 1213 may include a high resolution (HR) camera such as a photo video (PV) camera.
- the augmented reality device 1200 is not limited to the above-described configuration, and may include various components in various locations and in various numbers.
- Various embodiments of the present disclosure may be implemented or supported by one or more computer programs, and the computer programs may be formed from computer readable program codes and recorded in a computer readable medium.
- the terms "application" and "program" mean one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or parts thereof suitable for implementation in computer readable program code.
- Computer readable program code may include various types of computer code, including source code, object code, and executable code.
- a computer readable medium may include various types of media that can be accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive (HDD), a compact disc (CD), a digital video disc (DVD), or various other types of memory.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- a 'non-transitory storage medium' is a tangible device and may exclude wired, wireless, optical, or other communication links that transmit transitory electrical or other signals. The term does not distinguish between a case in which data is semi-permanently stored in the storage medium and a case in which data is temporarily stored in the storage medium.
- the 'non-transitory storage medium' may include a buffer in which data is temporarily stored.
- Computer readable media can be any available media that can be accessed by a computer and can include both volatile and nonvolatile media, removable and non-removable media. Computer readable media include media on which data can be permanently stored and media on which data can be stored and later overwritten, such as rewritable optical discs or removable memory devices.
- the method according to various embodiments disclosed in this document may be provided by being included in a computer program product.
- Computer program products may be traded between sellers and buyers as commodities.
- a computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online, either through an application store or directly between two user devices (eg, smartphones).
- in the case of online distribution, at least part of the computer program product (eg, a downloadable app) may be at least temporarily stored in, or temporarily created in, a device-readable storage medium such as the memory of a manufacturer's server, an application store server, or a relay server.
Claims (15)
- An augmented reality device comprising: a light guide plate; a vision correcting lens overlapping the light guide plate in the gaze direction of a user wearing the augmented reality device; a support for fixing the augmented reality device to the face of the user of the augmented reality device; a gaze tracking sensor including a light emitter and a light receiver disposed on a portion of the support; a light reflector for reflecting light for gaze tracking; and at least one processor, wherein the at least one processor obtains lens characteristic information of the vision correcting lens, emits light toward the light reflector by controlling the light emitter, obtains an image of the user's eye based on light received through the light receiver, compensates the image of the user's eye based on the lens characteristic information, and obtains gaze information of the user from the compensated eye image, wherein the light emitted toward the light reflector is reflected by the light reflector toward the user's eye, and the light received by the light receiver includes light that was directed toward the user's eye and reflected by the user's eye, wherein the light reflector includes a pattern, and the at least one processor emits, by controlling the light emitter, light for obtaining the lens characteristic information of the vision correcting lens toward the light reflector, identifies a distorted pattern based on the light received through the light receiver, and obtains the lens characteristic information of the vision correcting lens based on the distorted pattern.
- The augmented reality device of claim 1, wherein the processor obtains the lens characteristic information of the vision correcting lens by comparing the pattern included in the light reflector with the distorted pattern identified from the light reflected by the light reflector and received through the light receiver.
- The augmented reality device of claim 1, wherein an indicator used to obtain the lens characteristic information of the vision correcting lens is displayed on the vision correcting lens, and the processor identifies the indicator to obtain the lens characteristic information of the vision correcting lens.
- The augmented reality device of claim 3, wherein the processor emits light toward the indicator by controlling the light emitter, and identifies the indicator based on the light received through the light receiver.
- The augmented reality device of claim 1, wherein the vision correcting lens includes a coupling portion for coupling to the support of the augmented reality device, and the processor obtains the lens characteristic information of the vision correcting lens by applying an electromagnetic signal to the vision correcting lens through the coupling portion and obtaining, from the vision correcting lens, an electromagnetic return signal corresponding to a response to the applied electromagnetic signal.
- The augmented reality device of claim 1, wherein the vision correcting lens includes a variable focus lens, and the processor sets a control parameter for controlling the variable focus lens and obtains the lens characteristic information of the vision correcting lens based on the set control parameter.
- The augmented reality device of claim 1, wherein the at least one processor obtains, from the compensated eye image, positions of at least one preset feature point and a position of a center point, and obtains the gaze information of the user based on the positions of the feature points and the position of the center point.
- A method by which an augmented reality device including a vision correcting lens detects a user's gaze, the method comprising: obtaining lens characteristic information of the vision correcting lens, which overlaps, in the gaze direction of the user, a light guide plate for displaying an image output from the augmented reality device; emitting light for gaze tracking toward a light reflector through a light emitter disposed on a portion of a support of the augmented reality device, wherein the emitted light is reflected by the light reflector toward the eye of the user wearing the augmented reality device; receiving the light reflected by the user's eye through a light receiver disposed on the support; obtaining an image of the user's eye based on the light received through the light receiver; compensating the image of the user's eye based on the lens characteristic information of the vision correcting lens; and obtaining gaze information of the user based on the compensated eye image.
- The method of claim 8, wherein the light reflector includes a pattern, and obtaining the lens characteristic information of the vision correcting lens comprises: emitting light for obtaining lens characteristic information toward the light reflector through the light emitter; identifying a distorted pattern based on the light received through the light receiver; and obtaining the lens characteristic information of the vision correcting lens based on the distorted pattern.
- The method of claim 9, wherein obtaining the lens characteristic information of the vision correcting lens based on the distorted pattern comprises obtaining the lens characteristic information of the vision correcting lens by comparing the pattern included in the light reflector with the distorted pattern identified from the light reflected by the light reflector and received through the light receiver.
- The method of claim 8, wherein an indicator used to obtain the lens characteristic information of the vision correcting lens is displayed on the vision correcting lens, and obtaining the lens characteristic information of the vision correcting lens comprises: identifying the indicator; and obtaining the lens characteristic information of the vision correcting lens based on the identified indicator.
- The method of claim 8, wherein the vision correcting lens includes a coupling portion for coupling to a support of the augmented reality device, and obtaining the lens characteristic information of the vision correcting lens comprises: applying an electromagnetic signal to the vision correcting lens through the coupling portion; obtaining, from the vision correcting lens, an electromagnetic return signal corresponding to a response to the applied electromagnetic signal; and obtaining the lens characteristic information of the vision correcting lens based on the obtained electromagnetic return signal.
- The method of claim 8, wherein the vision correcting lens includes a variable focus lens, and obtaining the lens characteristic information of the vision correcting lens comprises: setting a control parameter for controlling the variable focus lens; and obtaining the lens characteristic information of the vision correcting lens based on the set control parameter.
- The method of claim 8, wherein obtaining the gaze information of the user based on the compensated eye image comprises: obtaining, from the compensated eye image, positions of at least one preset feature point and a position of a center point; and obtaining the gaze information of the user based on the positions of the feature points and the position of the center point.
- A computer-readable recording medium having recorded thereon a program for performing the method of any one of claims 8 to 14 on a computer.
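The compensation step claimed above (correcting the eye image using the lens characteristic information before gaze estimation) can be sketched in miniature. This is a hedged illustration, not the claimed implementation: the lens characteristics are reduced to a single first-order radial distortion coefficient `k1`, and the function name is an assumption for the example.

```python
# Illustrative sketch of the claimed compensation: once the vision correcting
# lens's characteristics are known (reduced here to one radial distortion
# coefficient k1, an assumption), an observed eye-feature coordinate can be
# corrected before computing gaze information.

def undistort_point(x_d, y_d, k1):
    """Approximately invert first-order radial distortion
    (x_d, y_d) = (x, y) * (1 + k1 * r^2), adequate for small k1."""
    r2 = x_d * x_d + y_d * y_d       # radius^2 of the distorted point
    scale = 1.0 + k1 * r2            # one fixed-point iteration of the inverse
    return (x_d / scale, y_d / scale)

# A pupil feature observed at (0.3, 0.4) through a lens with k1 = 0.1
print(undistort_point(0.3, 0.4, 0.1))
```

Real systems would estimate a full distortion model (several radial and tangential coefficients) from the distorted reflector pattern described in claims 1 and 9, then undistort every tracked feature point the same way.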
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280052797.2A CN117730271A (zh) | 2021-08-05 | 2022-08-05 | 用于检测用户注视的增强现实装置和方法 |
EP22853560.5A EP4321923A1 (en) | 2021-08-05 | 2022-08-05 | Augmented reality device and method for detecting gaze of user |
US17/903,616 US11983315B2 (en) | 2021-08-05 | 2022-09-06 | Augmented reality device and method for detecting gaze of user |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20210103476 | 2021-08-05 | ||
KR10-2021-0103476 | 2021-08-05 | ||
KR1020210152571A KR20230021551A (ko) | 2021-08-05 | 2021-11-08 | 사용자의 시선을 검출하는 증강 현실 장치 및 방법 |
KR10-2021-0152571 | 2021-11-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/903,616 Continuation US11983315B2 (en) | 2021-08-05 | 2022-09-06 | Augmented reality device and method for detecting gaze of user |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023014185A1 true WO2023014185A1 (ko) | 2023-02-09 |
Family
ID=85154683
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/011693 WO2023014185A1 (ko) | 2021-08-05 | 2022-08-05 | 사용자의 시선을 검출하는 증강 현실 장치 및 방법 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023014185A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20170041862A (ko) * | 2014-08-11 | 2017-04-17 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | 유저 안경 특성을 결정하는 눈 추적용 디바이스를 갖는 헤드업 디스플레이 |
JP2019516204A (ja) * | 2016-04-06 | 2019-06-13 | 北京七▲シン▼易▲維▼信息技▲術▼有限公司Beijing 7Invensun Technology Co.,Ltd. | ビデオメガネの眼球追跡モジュール |
KR20190069563A (ko) * | 2016-10-26 | 2019-06-19 | 밸브 코포레이션 | 광학적 렌즈 왜곡을 교정하기 위한 동공 위치의 사용 |
KR20200136449A (ko) * | 2018-04-23 | 2020-12-07 | 칼 자이스 비전 인터내셔널 게엠베하 | 사용자에 의한 개별 착용 상황에 대해 광학 렌즈를 측정하기 위한 방법 및 장치 |
KR20210023921A (ko) * | 2021-02-18 | 2021-03-04 | 주식회사 레티널 | 시력 보정 기능을 구비하는 증강 현실용 광학 장치 |
- 2022-08-05 WO PCT/KR2022/011693 patent/WO2023014185A1/ko active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20170041862A (ko) * | 2014-08-11 | 2017-04-17 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | 유저 안경 특성을 결정하는 눈 추적용 디바이스를 갖는 헤드업 디스플레이 |
JP2019516204A (ja) * | 2016-04-06 | 2019-06-13 | 北京七▲シン▼易▲維▼信息技▲術▼有限公司Beijing 7Invensun Technology Co.,Ltd. | ビデオメガネの眼球追跡モジュール |
KR20190069563A (ko) * | 2016-10-26 | 2019-06-19 | 밸브 코포레이션 | 광학적 렌즈 왜곡을 교정하기 위한 동공 위치의 사용 |
KR20200136449A (ko) * | 2018-04-23 | 2020-12-07 | 칼 자이스 비전 인터내셔널 게엠베하 | 사용자에 의한 개별 착용 상황에 대해 광학 렌즈를 측정하기 위한 방법 및 장치 |
KR20210023921A (ko) * | 2021-02-18 | 2021-03-04 | 주식회사 레티널 | 시력 보정 기능을 구비하는 증강 현실용 광학 장치 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10031579B2 (en) | Automatic calibration for reflective lens | |
US9711072B1 (en) | Display apparatus and method of displaying using focus and context displays | |
US9711114B1 (en) | Display apparatus and method of displaying using projectors | |
EP3228072B1 (en) | Virtual focus feedback | |
JP6089705B2 (ja) | 表示装置、および、表示装置の制御方法 | |
US10048750B2 (en) | Content projection system and content projection method | |
EP2499962B1 (en) | Optical measuring device and method for capturing at least one parameter of at least one eye wherein an illumination characteristic is adjustable | |
US10002293B2 (en) | Image collection with increased accuracy | |
WO2018100242A1 (en) | Display apparatus and method of displaying using optical combiners and context and focus image renderers | |
WO2022010152A1 (ko) | 사용자의 시력을 교정하고 캘리브레이션을 수행하는 디바이스 및 방법 | |
US11983315B2 (en) | Augmented reality device and method for detecting gaze of user | |
KR20220046494A (ko) | 시선 방향을 결정하는 방법 및 시선 추적 센서 | |
WO2023014185A1 (ko) | 사용자의 시선을 검출하는 증강 현실 장치 및 방법 | |
US11934571B2 (en) | Methods and systems for a head-mounted device for updating an eye tracking model | |
WO2023282524A1 (ko) | 시력 측정 및 시력 교정을 제공하는 증강 현실 장치 및 방법 | |
KR20230021551A (ko) | 사용자의 시선을 검출하는 증강 현실 장치 및 방법 | |
US10859832B1 (en) | Mitigating light exposure to elements of a focus adjusting head mounted display | |
CN117730271A (zh) | 用于检测用户注视的增强现实装置和方法 | |
US20190347833A1 (en) | Head-mounted electronic device and method of utilizing the same | |
WO2022124638A1 (ko) | 사용자 렌즈가 삽입되는 렌즈 클립 및 렌즈 클립에 사용자 렌즈의 삽입 여부를 검출하는 머리 장착형 전자 장치 | |
WO2024111804A1 (ko) | 증강 현실에서 이미지 획득 장치 및 방법 | |
WO2022158795A1 (ko) | 사용자의 시선을 검출하는 증강 현실 장치 및 방법 | |
KR20220106011A (ko) | 사용자의 시선을 검출하는 증강 현실 장치 및 방법 | |
CN117452591A (zh) | 具有集成式可调谐透镜的镜筒 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22853560 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022853560 Country of ref document: EP Ref document number: 22853560.5 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022853560 Country of ref document: EP Effective date: 20231108 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280052797.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |