GB2611579A - Methods and systems for interpupillary distance measurement - Google Patents

Methods and systems for interpupillary distance measurement

Info

Publication number
GB2611579A
GB2611579A GB2114508.1A GB202114508A
Authority
GB
United Kingdom
Prior art keywords
subject
image
distance
fixation
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2114508.1A
Other versions
GB202114508D0 (en)
Inventor
Wilson Iain
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuel 3d Tech Ltd
Original Assignee
Fuel 3d Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuel 3d Tech Ltd filed Critical Fuel 3d Tech Ltd
Priority to GB2114508.1A priority Critical patent/GB2611579A/en
Publication of GB202114508D0 publication Critical patent/GB202114508D0/en
Priority to PCT/EP2022/077681 priority patent/WO2023061822A1/en
Publication of GB2611579A publication Critical patent/GB2611579A/en
Withdrawn legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/111Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring interpupillary distance
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0091Fixation targets for viewing direction
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14Arrangements specially adapted for eye photography
    • A61B3/15Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection ; with means for relaxing
    • A61B3/152Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection ; with means for relaxing for aligning

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A pupil distance on a subject's face, such as the interpupillary distance between the subject's two pupils, is measured by displaying a fixation image 100 which is modified by an overlay 31, 34 interposed between the fixation image 100 and the subject. The overlay directs light from the fixation image to different ones of the subject's eyes, so that the disparity between the respective images formed on the retina of each eye causes the subject to perceive the fixation image as being further away from the subject's eyes than it actually is; this ensures that the pupil distance is consistently measured while the eyes are looking into the far distance. The overlay 3 may be a parallax barrier 31 with an array of gaps 32 or pinholes in an opaque sheet 33, with interleaved portions 101 and 102 displaying first and second eye images visible only to the right eye 103 and left eye 105 respectively. The overlay may alternatively be a lenticular array 34.

Description

Methods and Systems for Interpupillary Distance Measurement
The present invention relates to methods and systems for measuring at least one distance between two points on the face of a human subject, in order to form a pair of spectacles for the subject based on the measured distance. At least one of the points on the face of the subject is the pupil of one of the subject's eyes, so this distance is referred to here as a "pupil distance".
Background of the invention
Conventionally, the fabrication of personalized spectacles involves the measurement of multiple pupil distances on the face of a subject who will wear the spectacles. Here the term "face" is used to include both the portion of the face which is composed of skin and the portion which is the subject's eyes. The measured pupil distances are used to select parameter(s) of the lenses which are included in the spectacles, such as parameters defining the shape of the lenses, e.g. the positions of their optical axes.
One such pupil distance is the interpupillary distance (IPD), which is the distance measured in millimetres between the centres of the pupils of the subject's two eyes. This measurement differs from person to person and, for a given person, depends on whether the person is looking (fixating) at a near object, in which case the IPD is known as "near IPD", or at a faraway object, in which case the IPD is known as "far IPD". More generally, a pupil distance measured when the subject is looking at a near object (e.g. less than about 80cm away) is referred to here as a "near pupil distance", and one measured when the subject is looking at a distant object (e.g. at least about 4 metres away) is referred to as a "far pupil distance".
Accurate and reliable measurement of IPDs is an integral part of spectacle fitting for corrective lenses. The wearers' pupil centres must align with the optical centre of the correcting lenses. Typically, an accuracy of ±1mm is required by opticians and lens manufacturers.
Opticians conventionally measure both the near and far IPD. The near IPD may be measured when the subject is looking at an object which is about 40 cm from the subject's face (a typical distance between the subject's face and a surface which the subject is reading). The far IPD is measured when the subject is looking at an object which is at a distance from the subject's face which is greater than or equal to 4 metres. The far IPD measurement is of the most interest. The optician will be careful to direct the subject to fixate at the desired distance and to monitor their compliance. Nevertheless, errors of measurement are common, e.g. when the subject does not move his eyes as expected.
Furthermore, opticians themselves sometimes make errors, so an automatic process is desirable.
Currently, near and far IPD can be measured in an automatic process (i.e. without supervision by a human operator, such as a trained eye care professional, except optionally for initiating the process) by a measurement device which includes a screen positioned a few tens of centimetres (cm) from the subject's face. When measuring near IPD, a fixation target is presented on the device's screen, and the subject is asked to look at it. This is easy for a subject to understand, resulting in reliable measurements. For the far IPD measurement, however, the subject is instructed to fixate on a target in the distance, not on the screen. Thus, the fixation target is highly dependent on the environment and outside the control of the measurement device. Confusion about what to focus on, and potential distractions (moving people etc.), can cause unstable fixations and inaccurate IPD measurements. A significant challenge for this measurement device is to provide clear and concise instructions that will ensure reliable measurements. In practice there is significant variability in far pupil distance measurements, resulting in incorrectly manufactured lenses which are discarded at the optician's expense.
The problem is particularly acute in the case of personalized varifocal lenses which are to be used for multiple tasks including reading. This is because they are typically designed based on the measured far IPD, which creates a problem when the subject uses them to concentrate on a near object. Only a respective narrow strip of the lens is associated with any given focal distance, so, unlike for reading glasses for which the viewer has some flexibility with regard to which part of the lens he or she uses for reading, the near IPD which is used to design this strip needs to be known with high accuracy (e.g. about 1mm accuracy). It is particularly expensive to have to discard varifocal lenses, which are much more expensive than conventional lenses.
The IPD is not the only pupil distance used by an optician. Another such distance commonly used to select a dimension of the lenses of a pair of spectacles is the monocular pupil distance (MPD), defined as the distance from a point on the subject's skin (typically, the centre of the bridge of the subject's nose) to the centre of one of the subject's pupils. This distance varies widely as the subject looks from side to side (more than the IPD does), and the monocular pupil distance should be measured when the subject is looking directly forward, i.e. a straight line joining the subject's pupils is parallel to a side-to-side direction of the subject's face (e.g. the direction between corresponding corners of the subject's respective eyes). Measuring the far MPD is a particular challenge.
Summary of the invention
In general terms, the present invention proposes that a pupil distance is measured by displaying a fixation image which is modified by an overlay interposed between the fixation image and the subject (that is, the overlay "covers the fixation image" from the point of view of the subject), and which directs light from the fixation image to different ones of the subject's eyes. The light directed by the overlay to each eye is referred to here as the "eye image" for the corresponding eye. The disparity between the respective images formed on the retina of each eye by the respective eye images causes the brain of the subject to perceive the fixation image as being at a certain distance from the eyes, and the system is configured to create the impression that the fixation image is further away from the subject's eyes than it actually is.
A mechanism is provided for determining the pupil distance when the subject looks at the fixation image through the overlay. The mechanism may be a camera for capturing an image of the subject's face. The camera may be a depth camera. Thus, the pupil distance can be measured when the subject is looking at an image which, to him or her, appears to be in the distance. In other words, a far pupil distance can be measured, even though the subject is looking at a fixation image displayed much closer to him. More generally, this enables a far pupil distance measurement to be performed without the subject having to look at an actually distant object and potentially being distracted. This allows an accurate measurement of the pupil distance to be made, for example in an automatic process (i.e. involving actions by the subject, but not supervised by a second person, e.g. a trained eyecare professional).
The fixation image may be considered as multiple interleaved image portions. The overlay directs light from a first subset of the interleaved image portions (collectively forming a first of the eye images) to a first of the subject's eyes, and light from a second subset of the interleaved image portions (collectively forming a second of the eye images) to a second of the subject's eyes. Each of the eye images may include a respective "target", i.e. a highlighted area distinguished by colour and/or intensity from a surrounding area of the eye image (the "background").
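By way of illustration, the interleaving of the two eye images into a single fixation image can be sketched in a few lines of Python/NumPy. This is a minimal sketch under assumed conventions (greyscale images and column-wise interleaving matching a vertical-slit overlay); the image sizes, target positions and the function name are illustrative rather than taken from the disclosure.

```python
import numpy as np

def interleave_eye_images(left_eye_img: np.ndarray, right_eye_img: np.ndarray) -> np.ndarray:
    """Compose a fixation image whose even pixel columns carry the left-eye
    image and whose odd columns carry the right-eye image (column-wise
    interleaving suited to a vertical-slit parallax barrier or lenticular array)."""
    assert left_eye_img.shape == right_eye_img.shape
    fixation = np.empty_like(left_eye_img)
    fixation[:, 0::2] = left_eye_img[:, 0::2]   # columns routed to the left eye
    fixation[:, 1::2] = right_eye_img[:, 1::2]  # columns routed to the right eye
    return fixation

# Example: a dark background with one highlighted target per eye image.
h, w = 480, 640
left = np.zeros((h, w), dtype=np.uint8)
right = np.zeros((h, w), dtype=np.uint8)
left[220:260, 190:230] = 255    # target visible only to the left eye
right[220:260, 410:450] = 255   # target visible only to the right eye
fixation_image = interleave_eye_images(left, right)
```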
The system may be configured such that, if the subject is looking directly at the fixation image, the subject is only able to obtain a clear view in each of his or her eyes of the corresponding eye image if the subject's eyes are at a certain distance from the fixation image. This distance may depend on the IPD. Thus, if the fixation image is displayed on a surface, the subject can ensure that both the subject's eyes see the correct corresponding eye image by positioning his eyes at that certain distance from the surface. If the surface is on a body which the subject holds in his hands, for example, the subject will naturally hold the body at that certain distance from the subject's face. Typically the "working window" of distances at which the subject can see the fixation image properly will include distances which are natural for subjects having normal arm length and IPD. If not, a warning message may be sent to the subject, as discussed below.
In order to measure the far IPD (or any other pupil distance which is to be measured when a subject is looking into the distance), the fixation image can be one such that the subject perceives it to be at a first distance (e.g. over 4 m) which may be referred to as a far distance. Thus, far IPD measurements (and other pupil distance measurements) can be performed reliably even by a non-specialist subject, e.g. using a body held in the subject's hands. To measure far IPD, the fixation image is rendered such that, viewed through the overlay, it appears to be at a far distance (e.g. over 4 m). This removes distractions and ambiguity for the subject, enabling a more reliable IPD measurement.
A first way for the overlay to modify the perceived distance of the fixation image from the eyes of the subject utilises autostereoscopy. Autostereoscopy means that the fixation image and overlay are designed such that, because of the discrepancies between the views the subject sees with his respective eyes, the subject can only bring these views into register by moving his eyes into a configuration in which the distance between the subject's pupils is the distance adopted when the subject is looking at an object which is further from the eyes than the surface where the fixation image is displayed.
A second, more sophisticated way of presenting the fixation image is as a light field display. Instead of, or in addition to, the autostereoscopy, the eye image for at least one of the eyes can be configured such that it can only be brought into focus by modifying the focal distance of the lens of the eye to a distance which is different from the distance between the eye and the fixation target. Fixation images of this kind are generated computationally.
Optionally, the focal distance of the eye which brings the eye image into focus may be substantially equal to the perceived distance of the fixation image from the eyes due to autostereoscopy. In this case, the subject will experience less discomfort because the brain of the subject will obtain a consistent (though incorrect) impression of the distance from the eyes to the fixation image both by autostereoscopy and from the focal distance of each eye.
Alternatively, the focal distance of the eye which brings the eye image into focus may be selected based on the range of focal distances of which the corresponding eye is capable. Note that the eyes of some subjects are unable to adopt a focal distance which is equal to the distance from the eyes to the fixation target as determined by autostereoscopy, or indeed the true distance from the eyes to the fixation target. For these subjects, each eye image may be selected such that it is in focus when the corresponding eye has a focal distance which is within the range of focal distances of which that eye is capable.
For example, if the eyes of a very short sighted subject are only able to focus on objects within a certain range (e.g. on objects up to 6cm away from the subject's eyes), the light field may be chosen such that the eye image can be brought into focus by the subject's eyes adopting a focal distance which is within the range. This selection may be desirable even when the apparent distance of the fixation image from the eyes as measured by autostereoscopy and/or the actual distance of the fixation image from the eyes, are outside this range.
Both autostereoscopy and light fields can be implemented if the overlay is either a parallax barrier (that is, a one- or two-dimensional array of transparent areas (e.g. gaps) in a less transparent (e.g. opaque) layer), or a lenticular array (that is, an array, e.g. a regular array, of refractive lenses, each of which refracts light passing through it).
In both approaches, the overlay is preferably a passive component. That is, it is not a component which is capable of performing a time-varying redirection of the light it receives, whether based on an internal processor (typically the overlay does not comprise electronic components) or on a control signal received by the overlay. All control of perceived distance and accommodation is achieved by modifying the fixation image, while the overlay passively redirects the light passing through it from the fixation image to the subject's eyes.
Alternatively, the overlay could be provided as a component controllable by an electronic control signal, for example by providing the parallax barrier as a liquid crystal display which has variable transparency at each of an array of pixels under the control of a control signal. This enables a context adaptive parallax barrier and the ability to "turn off" the 3D effect, e.g. by making the liquid crystal display transparent in the region which covers the fixation image.
The accuracy of the pupil distance measurements can be further improved by the system determining whether the (translational and/or angular) positioning of the subject's face meets one or more criteria for correct positioning of the face, and if not issuing instructions to the subject which enable the subject to more correctly position his or her face.
In principle, the body on which the fixation image is displayed could be one having the fixation image printed on it, and which is positioned with the overlay covering at least part of it. Providing the fixation image as a printed image has an advantage during manufacturing, since printing may be done with high accuracy and the placement of the parallax barrier or lenticular array can be very accurate relative to the fixation image.
If, in addition to or instead of the far pupil distance(s) such as the far IPD, it is desired to measure one or more near pupil distances, the body on which the fixation image is printed could be swapped for one having a second fixation image printed on it which, when viewed through the overlay, gives the subject the impression that the fixation image is at a second distance, less than the first distance, such as a distance in the range 40-80cm. Alternatively, the second fixation image could be printed on the reverse face of the body from the fixation image, and to measure the near pupil distances the body could be turned round, so that the reverse face of the body is directed towards the overlay.
Alternatively, the fixation image for measuring the far pupil distance could be displayed on an electronic screen which is at least partially covered by the overlay. If and when it is desired to measure one or more near pupil distances, the electronic screen could be controlled to display the second fixation image.
Conveniently, the electronic screen may be integrated in a single unit with the camera. For example, the electronic screen and camera may be two components provided and supported within a single housing of a mobile device, such as a mobile telephone or a tablet computer.
A first specific expression of the invention is a method of obtaining a measurement of a pupil distance, the pupil distance being measured between two points on the face of a human subject, at least one of the points being at a pupil of an eye of the subject, the method comprising: displaying a fixation image; when the subject directs his or her gaze to the fixation image, directing light from the fixation image to a first eye of the subject, and to a second eye of the subject, the directing of the light causing the distance perceived by the subject of the fixation image from the face of the subject to be different from (e.g. greater than) the actual distance of the fixation image from the face of the subject; capturing at least one image of the subject; locating the two points based on the at least one captured image of the subject; and measuring the pupil distance based on the locations of the two points.
Another expression of the invention is a system for measuring a pupil distance, the system comprising a computer system, the computer system comprising an electronic screen, an overlay positioned to cover at least a portion of the screen, a camera, a processor for controlling the electronic screen and the camera, and a data storage device storing a program comprising instructions which, when the program is executed by the processor cause the processor to: display a fixation image on the electronic screen comprising multiple image portions, the overlay being configured to direct light from a first subset of the image portions to a first eye of the subject, and light from a second subset of the image portions to a second eye of the subject, the two subsets of image portions being interleaved in the fixation image; capture using the camera at least one image of the subject, locate the two points based on the at least one captured image of the subject; and measure the pupil distance based on the locations of the two points.
A further expression of the invention is a computer program product (e.g. a program stored on a tangible recording medium, or a program downloadable over a communications network) comprising program instructions which, when executed by the processor of the system causes the processor to carry out the steps of the method.
Brief description of the figures
Embodiments of the invention will now be described for the sake of example only with reference to the following figures, in which: Fig. 1 shows schematically the use of a system which is a first embodiment of the invention; Fig. 2 shows pupil distances which can be measured using the system of Fig. 1; Fig. 3 shows schematically a first realization of the system of Fig. 1, in which the overlay is a parallax barrier; Fig. 4 shows schematically a second realization of the system of Fig. 1, in which the overlay is a lens array; Fig. 5 is composed of Fig. 5(a), which shows a fixation image for measuring near pupil distances, and Figs. 5(b)-(d), which show a fixation image for measuring far pupil distances and the corresponding eye images;
Fig. 6 shows the steps of a method according to the invention; Fig. 7 shows the construction of a measuring device in the system of Fig. 1; and Fig. 8 shows schematically the use of a system which is a second embodiment of the invention.
Elements having the same significance in different ones of the figures are denoted by the same reference numerals.
Detailed description of the embodiments
Referring firstly to Figs. 1 and 2, the use of a system which is an embodiment is illustrated. The system is to measure, on the face 1 of a subject, distances between various points. The points include the subject's pupils 11, 13. An interpupillary distance (IPD) is defined as the distance between the points 11, 13. The monocular pupil distance (MPD) for either of the pupils (e.g. pupil 11) is defined as the distance between the pupil 11 and a point on the subject's skin, such as the bridge 12 of the subject's nose.
Note that the IPD and the MPD vary according to the direction in which the subject is looking (particularly the MPD). For that reason, the IPD and MPD are defined at a time when the subject is looking directly forward, e.g. a line between the pupils 11, 13 is parallel to a line between two corresponding points on either side of the subject's face (e.g. the corners 14, 15 of the subject's eyes).
In this embodiment the pupil distances are measured by a measuring system comprising measuring device 2 (which may be a mobile computer such as a tablet computer or a mobile telephone) and an overlay 3. The measuring device 2 comprises an integrated electronic screen 21 and an integrated camera 22 for capturing image(s) of the subject's face 1. The camera 22 may be a depth camera which produces three-dimensional image(s) of the subject's face 1.
A processor of the measuring device is operative to identify the locations of points such as the points 11, 12 and 13 in the captured images (e.g. in a three-dimensional space with axes defined based on the position of the measuring device 2). Many methods are known for this. Some employ the techniques described in the present applicant's PCT patent application WO/2021/152144. The processor may further form a model of the three-dimensional head of the subject, including both the eyes and the skin of the face, e.g. using the method of the present applicant's PCT application WO/2017/077279 but using depth camera images rather than the photometry described there. The processor may be configured to determine the translational and rotational position of the face relative to the measuring device 2, e.g. by identifying landmarks on the three-dimensional head, such as the points 12, 14, 15, and comparing the positions of those landmarks. For example, if point 14 is closer to the measuring device 2 than point 15, the subject's head may be turned to the left as viewed from the measuring device 2; if point 12 is closer to the measuring device 2 than points 14 and 15, the subject's face 1 may be inclined forward.
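A minimal sketch of the landmark comparison just described, assuming the landmark coordinates are already available in the device's reference frame with z being the distance from the device in millimetres (the function name, tolerance and example coordinates are illustrative, not part of the disclosure):

```python
import numpy as np

def head_orientation_flags(p12, p14, p15, tol_mm=3.0):
    """Crude orientation check from three 3D landmarks in the measuring
    device's frame (z = distance from the device, in mm):
    p12 = bridge of the nose, p14 / p15 = outer corners of the two eyes."""
    p12, p14, p15 = (np.asarray(p, dtype=float) for p in (p12, p14, p15))
    flags = []
    if p14[2] < p15[2] - tol_mm or p15[2] < p14[2] - tol_mm:
        flags.append("head turned (one eye corner is nearer the device than the other)")
    if p12[2] < min(p14[2], p15[2]) - tol_mm:
        flags.append("face inclined forward (nose bridge nearer than the eye corners)")
    return flags or ["face approximately frontal"]

# Example usage with hypothetical landmark coordinates (mm):
print(head_orientation_flags((0.0, 2.0, 405.0), (45.0, 0.0, 420.0), (-44.0, 0.0, 433.0)))
```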
Note that the points 11, 13 (the points at the pupils of the subject) may be defined as the centre of the pupil (i.e. the centre of a circular outline of the pupil). In another possibility, the points 11, 13 may be defined as points on the pupil which are on the border of the pupil with the iris. The IPD obtained by measuring corresponding points on a pupil is called an anatomical IPD. Alternatively, the points 11, 13 may be defined as the positions of the corneal reflection ("corneal reflex"), which lies on the optical axis of the subject's eye. The IPD measured from the principal corneal reflexes (the "physiological IPD") is considered the most useful for forming spectacles which are most accurate.
Note that optionally, the processor which performs this method may not be located within the housing of the measuring device 2. It may be a distant processor with which the measuring device 2 is configured to communicate.
The overlay 3 may be a generally laminar body which covers at least part of the electronic screen 21 from the point of view of the subject (i.e. the overlay 3 is interposed between the subject's face 1 and the electronic screen 21). A plane of the overlay 3 may be parallel to a plane which is the front surface of the electronic screen 21. Thus, the subject views at least some of the electronic screen 21 through the overlay 3. The measuring device 2 is connected to the overlay 3 in a fixed positional relationship. The overlay 3 may be spaced from the front surface of the electronic screen 21.
The measuring device 2 may further include an integrated sound generation device 23, typically within the same housing as the electronic screen 21 and camera 22, for issuing instructions to the subject (e.g. telling the subject to hold the measuring device 2 with a different positional relationship to the face of the subject). In particular, if the subject holds his or her face 1 at a certain distance from the measuring device 2, the mobile computer measures the distance and may ask the subject to move his or her head until the distance is determined to be within a preferred distance range. The depth camera of many mobile devices (e.g. iPhones and iPads) cannot produce accurate depth images of any object which is less than 20cm from the depth camera, so if these mobile devices are employed as the measuring device 2 this may set a minimum limit for the distance from the subject's face 1 to the measuring device 2. Alternatively or additionally, the measuring device 2 may adjust the image displayed on the electronic screen 21 based on the measured distance.
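For example, the distance check and prompt might be sketched as follows (a minimal illustration; the 20 cm lower bound reflects the depth-camera limitation mentioned above, while the upper bound and the wording of the prompts are assumptions):

```python
def check_working_distance(face_distance_mm: float,
                           min_mm: float = 200.0,   # many mobile depth cameras lose accuracy below ~20 cm
                           max_mm: float = 500.0):  # assumed upper bound of the preferred range
    """Return an instruction for the subject if the measured face-to-device
    distance falls outside a preferred range, otherwise None."""
    if face_distance_mm < min_mm:
        return "Please move the device a little further from your face."
    if face_distance_mm > max_mm:
        return "Please bring the device a little closer to your face."
    return None
```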
Fig. 3 shows schematically the operation of the embodiment of Fig. 1 in the case that the overlay 3 is a parallax barrier 31. The parallax barrier 31 includes an array (e.g. a regular array) of gaps 32 formed in a generally laminar opaque sheet 33. The parallax barrier 31 may for example use vertical strips of opaque material mounted on a transparent laminar layer. Alternatively, the parallax barrier may comprise an array (e.g. a regular array) of pinholes formed in an opaque layer.
A fixation image 100 is displayed on the electronic screen 21 including portions 101 which are interleaved with portions 102. Both the portions 101, 102 are referred to as "interleaved portions". The interleaved portions 101 together form a first "eye image", which, due to the parallax barrier 31, is visible only to the right eye 103 ("R") of the subject. The interleaved portions 102 together form a second "eye image", which, due to the parallax barrier 31, is visible only to the left eye 105 ("L") of the subject.
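The geometry that makes each subset of columns visible to only one eye follows from similar triangles through each slit. The sketch below uses the textbook parallax-barrier relations rather than formulas taken from the disclosure, and the numerical values are assumptions: it computes the screen-to-barrier gap and the slit pitch from the pixel column pitch, the viewing distance and the viewer's eye separation.

```python
def parallax_barrier_geometry(pixel_pitch_mm: float,
                              viewing_distance_mm: float,
                              eye_separation_mm: float = 62.0):
    """Similar-triangles design for a vertical-slit parallax barrier.

    pixel_pitch_mm      -- width of one displayed pixel column (one eye's column)
    viewing_distance_mm -- distance from the subject's eyes to the screen
    eye_separation_mm   -- assumed interpupillary distance of the viewer

    Returns (gap, slit_pitch): screen-to-barrier gap and slit-to-slit pitch.
    """
    p, D, e = pixel_pitch_mm, viewing_distance_mm, eye_separation_mm
    gap = p * D / (e + p)             # barrier sits slightly in front of the screen
    slit_pitch = 2 * p * e / (e + p)  # slightly less than two pixel columns
    return gap, slit_pitch

# Example: 0.1 mm pixel columns viewed from 300 mm with 62 mm eye separation.
gap_mm, pitch_mm = parallax_barrier_geometry(0.1, 300.0)
# gap_mm is roughly 0.48 mm and pitch_mm roughly 0.1997 mm.
```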
Fig. 4 shows schematically the operation of the embodiment of Fig. 1 in the case that the overlay 3 is a lenticular array 34. The lenticular array 34 includes a regular array of lenses 35 which are refractive elements which refract light passing through them. The fixation image 100 is as in Fig. 3. Again, the interleaved portions 101 together constitute a first "eye image", which, due to the refraction caused by the lenticular array 34, is visible only to the right eye 103 of the subject. The interleaved portions 102 together constitute a second "eye image", which, due to the lenticular array 34, is visible only to the left eye 105 of the subject. It has been found experimentally that the lenticular array of Fig. 4 works better than the parallax barrier of Fig. 3, but it is also more expensive.
A description of stereoscopic and autostereoscopic displays is provided by H. Urey, K. V. Chellappan, E. Erden and P. Surman, "State of the Art in Stereoscopic and Autostereoscopic Displays," in Proceedings of the IEEE, vol. 99, pp. 540-555, April 2011, the disclosure of which is incorporated here by reference. In particular, the lenticular array may be the slanted lenticular array discussed there.
When it is desired to measure a near pupil distance, the fixation image may be as in Fig. 5(a). That is, the electronic screen 21 displays a fixation image including a target 37 (illustrated as two circles). The pixels of the target 37 are displayed with a colour which is different from the rest of the fixation image (the background). The pixels which are coloured differently from the background to generate the target 37 include pixels which are within both the subsets of interleaved portions 101, 102, so each of the two eye images (that is, the images seen by the respective eyes) includes the target 37. Thus, the subject's eyes will move so that they are directed towards the target image 37. That is, the eyes adopt their normal configuration when focusing on an object which is at a distance from the eyes equal to the distance of the electronic screen 21. This distance is the distance at which the near pupil distances are to be measured. The camera 22 can capture an image of the face resembling Fig. 2, and known techniques can be used to identify the points 11, 12, 13 in the image, so that the IPD and/or monocular pupil distance can be measured.
Conversely, when it is desired to measure a far pupil distance, the fixation image is as shown in Fig. 5(b). It is composed of two spaced target areas 38, 39 (each shown as two concentric circles). Again, each target area 38, 39 has a different colour from the rest of the fixation image (the background). The pixels which are used to display the target area 38 (i.e. the pixels which have a different colour from the background of the fixation image) are ones which are part of the subset of interleaved portions 102 of the fixation image 100, so that they are only visible to the subject's left eye 105. Conversely, the pixels which are used to display the target area 39 are ones which are part of the subset of interleaved portions 101 of the fixation image 100, so that they are only visible to the subject's right eye 103.
Thus, the eye image for the subject's left eye is as shown in Fig. 5(c), only including the target 38. The eye image for the subject's right eye is as shown in Fig. 5(d), only including the target 39.
The subject's brain interprets the two targets 38, 39 as being different images of a single virtual target (to encourage this, the two targets 38, 39 may be chosen to be substantially identical in size, shape and/or colour). Because of this, the subject's brain alters the configuration of the subject's eyes into a configuration in which the subject is looking at an object which is further from the subject than the distance separating the screen 21 from the subject. Thus, if the targets 38, 39 in the fixation image of Fig. 5(b) are far enough apart (e.g. 60mm apart, approximately equal to the spacing of the subject's eyes), the subject's brain and eyes are 'fooled' into behaving as if they are viewing something in the distance. This process is called autostereoscopy.
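A simple similar-triangles model (an illustrative approximation, not part of the disclosure) makes the effect quantitative: with the eyes separated by e at a distance D from the screen, and the two targets separated by s on the screen, the lines of sight cross behind the screen at a distance D·e/(e − s), which tends to infinity as s approaches e.

```python
def perceived_target_distance(screen_distance_mm: float,
                              eye_separation_mm: float,
                              target_separation_mm: float) -> float:
    """Similar-triangles estimate of where the fused virtual target appears.

    Each eye's line of sight passes through its own target; for target
    separations smaller than the eye separation the two lines cross behind
    the screen at D * e / (e - s)."""
    D, e, s = screen_distance_mm, eye_separation_mm, target_separation_mm
    if s >= e:
        return float("inf")  # parallel (or diverging) lines of sight
    return D * e / (e - s)

# Example: screen 300 mm away, 62 mm eye separation, targets 60 mm apart
# -> the fused target appears roughly 9.3 m away.
print(perceived_target_distance(300.0, 62.0, 60.0))  # ~9300 mm
```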
Thus, the image(s) captured by the camera 22 show the face of the subject in the configuration the subject uses for viewing distant images. The processor of the measuring device 2 identifies the points 11, 12, 13, and thus measures the IPD and/or the monocular pupil distance. Note that this is achieved in a way in which what the subject sees is carefully controlled, so that distractions are few and an accurate measurement of the pupil distances can be obtained.
One possible drawback of this method is that certain subjects experience discomfort due to a vergence-accommodation conflict. This conflict arises from the fact that although the positional relationship of the eyes when viewing the far fixation image is that of distance viewing (as explained above), the accommodation (the focus of the eye's lens) is as for a near object because the actual distance of the targets 38, 39 from the subject's eyes is only the actual distance of the electronic screen 21 from the eyes (e.g. about 20cm, and generally less than 50cm). Vergence accommodation conflict may thus arise due to the eyes being focused on a plane (the front surface of the electronic screen 21) closer to the eyes than the apparent distance of the "virtual" target.
One way of avoiding the discomfort is for the eye images to be such that they generate a "light field display". The term "light field" describes the amount of light flowing in every direction through every point in space. Using a light field display, one can recreate the optical properties of any arbitrary scene. In this context, one can display an image on a physically close screen which has the optical properties of an object much further away. In other words, the fixation image 100 is configured such that, for at least one of the subject's eyes, the corresponding eye image which the overlay 3 produces is a light field display in which the focal distance which the eye has to adopt for the corresponding eye image to be in focus is different from the actual distance from the eye to the fixation image.
Specific methods for generating a light field display which can be used in the embodiment are described in "Eyeglasses-free Display: Towards Correcting Visual Aberrations with Computational Light Field Displays", Fu-Chung Huang et al., ACM Transactions on Graphics, Volume 33, 2014, and in "Near-Eye Light Field Displays", D Lanman et al., ACM Transactions on Graphics, Volume 32, 2013.
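As a rough, one-dimensional illustration of the idea (a toy sketch, not the method of the papers cited above): with a pinhole-type overlay, each display pixel paired with its nearest pinhole emits light along essentially one ray, so the pixel can be assigned the intensity that a virtual target placed at the desired apparent depth would send along that ray. The names, dimensions and the nearest-neighbour sampling below are assumptions.

```python
import numpy as np

def render_pinhole_light_field_1d(virtual_scene: np.ndarray,
                                  scene_width_mm: float,
                                  virtual_depth_mm: float,
                                  n_pixels: int,
                                  display_width_mm: float,
                                  pinhole_pitch_mm: float,
                                  gap_mm: float) -> np.ndarray:
    """1D toy renderer for a pinhole-array light field display: each display
    pixel takes the intensity of the virtual scene where the backward
    extension of its pixel-to-pinhole ray meets a plane virtual_depth_mm
    behind the display."""
    pixel_x = (np.arange(n_pixels) + 0.5) * display_width_mm / n_pixels
    # Nearest pinhole centre for every pixel (pinholes at (k + 0.5) * pitch).
    hole_x = (np.round(pixel_x / pinhole_pitch_mm - 0.5) + 0.5) * pinhole_pitch_mm
    # Extend each pixel->pinhole ray backwards to the virtual plane.
    t = -virtual_depth_mm / gap_mm
    scene_x = pixel_x + t * (hole_x - pixel_x)
    # Sample the virtual scene (centred on the display) with nearest-neighbour lookup.
    u = (scene_x - display_width_mm / 2) / scene_width_mm + 0.5
    idx = np.clip((u * len(virtual_scene)).astype(int), 0, len(virtual_scene) - 1)
    out = virtual_scene[idx].astype(float)
    out[(u < 0) | (u > 1)] = 0.0
    return out

# Example: a bright bar rendered so that it appears ~4 m behind a 60 mm wide
# display whose pinhole mask sits 0.5 mm in front of it.
scene = np.zeros(1000)
scene[480:520] = 1.0
pattern = render_pinhole_light_field_1d(scene, scene_width_mm=1000.0,
                                        virtual_depth_mm=4000.0, n_pixels=600,
                                        display_width_mm=60.0,
                                        pinhole_pitch_mm=0.5, gap_mm=0.5)
```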
Thus, one possible advantage of using a light field display is that the accommodation conflict can be avoided: the fixation target can be optically corrected such that each target 38, 39 is in focus when viewed by the corresponding eye with a focal distance appropriate to view an object at a distance of 4 meters rather than 40 cm.
Alternatively or additionally, another possible advantage of using a light field array is that the control of accommodation can be used to correct any fault with the subject's eyes which could inhibit the use of the embodiment. For example, a myopic subject may not be able to focus on an object 4 meters away, and some cannot even focus on an object at a distance of 40cm or even 20cm. Conversely, hyperopia or presbyopia impacts on the subject's ability to focus on an object 40cm from the subject. A light field display can correct the refractive errors to enable these subjects to focus on the fixation target.
The discussion above assumes that each fixation image is displayed continuously as static images while the corresponding image(s) of the subject are captured. However, in a variation, one (or each) fixation image may be a moving (animated) image. This is advantageous for some subjects whose eyesight is so poor (e.g. due to age-related macular degeneration) that they find a static image hard to see. The target may not move translationally across the electronic screen 21, but may for example change shape or colour over time so that it is highlighted to a greater extent.
Figure 6 shows the steps of a method 600 according to the invention to measure a pupil distance using the first embodiment of Fig. 1. These steps are applicable also to the usage of the second embodiment described below with reference to Fig. 8.
In a first step 601, preferably performed at a time when the subject is holding the measuring device 2, a fixation image is displayed on the electronic screen 21 of the measuring device 2.
In optional step 602, an instruction may be given to the subject (e.g. using the sound generation device 23) to look properly at the target. The measuring device 2 is able to tell the subject to, for example, "stare at the dot" and align their face with the tablet. Note that this is a simple instruction which any user can follow. Note also that the MPD depends very strongly on the direction of gaze (more than the IPD does), so it is important for the user to be facing forward when the MPD is measured.
In step 603, while the subject's gaze is directed to the fixation image, light is directed by the overlay 3 from the fixation image to a first eye of the subject, and from the fixation image to a second eye of the subject, in the form of respective eye images. If the fixation image is one which is to be used to measure a far pupil distance, the distance perceived by the subject of the fixation image from the face 1 of the subject is greater than the distance of the electronic screen 21 from the face 1 of the subject. This may be due to the autostereoscopic effect described above. Furthermore, as described above, the fixation image may be selected such that the focal distance of the subject's eyes which brings targets in the fixation image into focus is different from the distance of the electronic screen 21 from the face 1 of the subject. For example, it may be greater than the distance of the electronic screen 21 from the face 1 of the subject, to reduce discomfort, or the focal distance for each eye image may be greater or less than the distance of the electronic screen 21 from the face 1 of the subject and within a range of focal distances which the subject's corresponding eye is able to produce.
In optional step 604, a check is carried out as to whether the subject is looking in the right place. Note that the eyes of a subject naturally move around when not looking at a specific object ("eye flutter"), but will be still when focusing on an object. Gaze recognition (though not accurate enough to be used for measurement of the spacing of the pupils when the subject is looking into the far distance) is used to check whether the eyes are still, and thus to determine whether the subject is focusing on the target, by checking whether there is movement above a threshold within a certain time window. The method may only proceed to step 605 when this criterion is met (i.e. the measured rate of eye movement is below a predefined level).
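The stillness criterion might be implemented along the following lines (a minimal sketch; the use of pixel units, the frame window and the threshold value are assumptions):

```python
import numpy as np

def gaze_is_stable(pupil_positions: np.ndarray,
                   movement_threshold_px: float = 2.0) -> bool:
    """Decide whether the eyes are still enough to assume fixation.

    pupil_positions -- array of shape (n_frames, 2) holding a pupil centre
    in image coordinates over a short time window (e.g. ~1 s of frames).
    Fixation is assumed when the largest frame-to-frame displacement stays
    below the threshold."""
    steps = np.linalg.norm(np.diff(pupil_positions, axis=0), axis=1)
    return bool(steps.size == 0 or steps.max() < movement_threshold_px)
```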
Alternatively or additionally, the check of step 604 may include determining whether the subject is looking directly at the screen 21 of the measuring device 2, by using the three-dimensional model of the subject's face to determine the direction of the subject's gaze. Alternatively or additionally, the distance of the subject's face 1 from the electronic screen 21 of the measuring device 2 may be measured. These rotational and/or translational characteristics of the position of the subject's face 1 relative to the measuring device 2 are compared to thresholds. If one or more criteria defined based on the thresholds are not met, a message may be transmitted to the subject to change the position of his/her face relative to the measuring device 2 and/or the method may not proceed to step 605.
In step 605, at least one image (e.g. a depth image) is captured by the camera 22.
In step 606, a processor of the measuring device 2 uses the captured image(s) to locate the two points associated with the pupil distance to be measured (e.g. in the case of the IPD, the processor locates the two pupils). This location process may be performed in three dimensions to identify a three-dimensional position associated with each point in a reference frame associated with the measuring device 2.
In step 607, the pupil distance is determined by the processor based on the locations of the two points.
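Given the two located points in three dimensions, this final computation reduces to a Euclidean distance. A minimal sketch (the coordinate values and variable names are hypothetical):

```python
import numpy as np

def pupil_distance_mm(point_a, point_b) -> float:
    """Euclidean distance between two located 3D points (millimetres),
    e.g. the two pupil centres for the IPD, or one pupil centre and the
    bridge of the nose for a monocular pupil distance."""
    return float(np.linalg.norm(np.asarray(point_a) - np.asarray(point_b)))

# Hypothetical located points in the device's reference frame (mm):
left_pupil = (31.2, 4.1, 412.0)
right_pupil = (-30.6, 3.8, 414.5)
nose_bridge = (0.4, 2.0, 405.0)
ipd = pupil_distance_mm(left_pupil, right_pupil)          # interpupillary distance
mpd_right = pupil_distance_mm(right_pupil, nose_bridge)   # monocular pupil distance
```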
The method may be performed multiple times, each time using a different fixation image in which the perceived distance of the fixation image from the subject's eyes is correspondingly different. Thus, the repeated performances of the method 600 may be used to obtain successively both near pupil distance measurement(s) and far pupil distance measurement(s).
Note that in the case of measuring a near pupil distance it may not be necessary to use autostereoscopy because the electronic screen 21 may be at an actual distance from the subject's eyes which is appropriate for a near pupil distance measurement.
Nevertheless, it may be desirable to use a light field for which, to bring the targets into focus, the subject's eyes should have a focal distance different from the actual distance of the fixation image from the subject's eyes. This may be useful for example to ensure that the focal distance the subject requires to bring the target into focus is within a range of focal distances which the subject's eyes can achieve. Thus, in this case too, the distance perceived by the subject of the fixation image from the face of the subject is different from the actual distance of the fixation image from the face of the subject.
Fig. 7 is a block diagram showing a technical architecture of the measuring device 2.
The technical architecture includes a processor 322 (which may be referred to as a central processor unit or CPU) that is in communication with memory devices including secondary storage 324 (such as disk drives or memory cards), read only memory (ROM) 326, random access memory (RAM) 328. The processor 322 may be implemented as one or more CPU chips. The technical architecture further comprises input/output (I/O) devices 330, and network connectivity devices 332.
The I/O devices comprise a user interface (UI) 330a, a camera 330b and a geolocation module 330c. The UI 330a may comprise a touch screen, keyboard, keypad or other known input device. The camera 330b allows a user to capture images and save the captured images in electronic form. The geolocation module 330c is operable to determine the geolocation of the communication device using signals from, for example, global positioning system (GPS) satellites.
The secondary storage 324 is typically comprised of a memory card or other storage device and is used for non-volatile storage of data and as an over-flow data storage device if RAM 328 is not large enough to hold all working data. Secondary storage 324 may be used to store programs which are loaded into RAM 328 when such programs are selected for execution.
In this embodiment, the secondary storage 324 has an order generation component 324a, comprising non-transitory instructions operative by the processor 322 to perform various operations of the method of the present disclosure. The ROM 326 is used to store instructions and perhaps data which are read during program execution. The secondary storage 324, the RAM 328, and/or the ROM 326 may be referred to in some contexts as computer readable storage media and/or non-transitory computer readable media.
The network connectivity devices 332 may take the form of modems, modem banks, Ethernet cards, universal serial bus (USB) interface cards, serial interfaces, token ring cards, fiber distributed data interface (FDDI) cards, wireless local area network (WLAN) cards, radio transceiver cards that promote radio communications using protocols such as code division multiple access (CDMA), global system for mobile communications (GSM), long-term evolution (LTE), worldwide interoperability for microwave access (WiMAX), near field communications (NFC), radio frequency identity (RFID), and/or other air interface protocol radio transceiver cards, and other well-known network devices.
These network connectivity devices 332 may enable the processor 322 to communicate with the Internet or one or more intranets. With such a network connection, it is contemplated that the processor 322 might receive information from the network, or might output information to the network in the course of performing the above-described method operations. Such information, which is often represented as a sequence of instructions to be executed using processor 322, may be received from and outputted to the network, for example, in the form of a computer data signal embodied in a carrier wave.
The processor 322 executes instructions, codes, computer programs, scripts which it accesses from hard disk, floppy disk, optical disk (these various disk based systems may all be considered secondary storage 324), flash drive, ROM 326, RAM 328, or the network connectivity devices 332. While only one processor 322 is shown, multiple processors may be present. Thus, while instructions may be discussed as executed by a processor, the instructions may be executed simultaneously, serially, or otherwise executed by one or multiple processors.
Turning to Fig. 8, a second embodiment of the method is illustrated. This embodiment is identical to that of Fig. 1 except that the fixation image is printed on a surface of a body 4 (which may be formed of paper). The surface of the body 4 is directed towards the subject, with the overlay 3 interposed between them. In this second embodiment also, the overlay element 3 may take either of the two forms illustrated in Figs. 3 and 4: a parallax barrier or a lenticular array. The overlay 3 is in a fixed positional relationship with respect to the body 4, so that the fixation image is split into the two eye images as illustrated above with reference to Figs. 3 and 4. In place of the camera 22 of the first embodiment, the second embodiment employs a camera 5 which may be separate from the body 4 and the overlay 3 (i.e. the camera 5 is an independently movable component which is not part of a unit formed by the body 4 and the overlay 3). In order to measure far pupil distances, the fixation image printed on the body 4 may for example be the fixation image of Fig. 5(b). If it is also desired to measure near pupil distances, the body 4 may be changed for one presenting a second, different fixation image which is suitable for measuring near pupil distances, such as the fixation image of Fig. 5(a), or the body 4 may be turned round with respect to the overlay, so that the surface of the body 4 which the subject views through the overlay 3 is a surface on which the second fixation image is printed.
The second embodiment of Fig. 8 has advantages over the first embodiment shown in Fig. 1, such as that a construction including the overlay 3 and the body 4 may be formed in a single, high-precision manufacturing process. Conversely, the first embodiment has the advantage that providing a single measuring device 2 which can both display the fixation image and perform depth imaging means that the relative position of the target and depth camera is known (i.e. not a parameter which has to be estimated, as in the second embodiment). As mentioned above, the electronic screen 21 of the first embodiment also makes it possible to use a dynamic (moving) target image which is useful for subjects with certain sorts of eye defect.

Claims (20)

  1. A method of obtaining a measurement of a pupil distance, the pupil distance being measured between two points on the face of a human subject, at least one of the points being at a pupil of an eye of the subject, the method comprising: displaying a fixation image; when the subject directs his or her gaze to the fixation image, directing light from the fixation image to a first eye of the subject, and to a second eye of the subject, the directing of the light causing the distance perceived by the subject of the fixation image from the face of the subject to be different from the actual distance of the fixation image from the face of the subject; capturing at least one image of the subject; locating the two points based on the at least one captured image of the subject; and measuring the pupil distance based on the locations of the two points.
  2. A method according to claim 1 in which the fixation image includes multiple image portions, the light directed from the fixation image to the first eye of the subject being a first subset of the image portions, the light directed from the fixation image to the second eye of the subject being a second subset of the image portions, the two subsets of image portions being interleaved in the fixation image.
  3. A method according to claim 1 or claim 2, wherein the pupil distance is an interpupillary distance.
  4. A method according to claim 1 or claim 2, wherein the pupil distance is a monocular pupil distance, the monocular pupil distance being a distance between the pupil of the eye of the subject and a bridge of a nose of the subject.
  5. A method according to any preceding claim wherein the directing of the light is performed by an overlay element in the form of a parallax barrier or a lenticular lens, the overlay element being located between the fixation image and the subject.
  6. A method according to any of claims 1 to 4 wherein the directing of the light comprises generating a light-field display.
  7. A method according to claim 6, wherein an aberration correction is applied to the light-field display in response to a determination that at least one eye of the subject exhibits a deficiency in its ability to focus.
  8. A method according to any preceding claim, wherein the fixation image is displayed on an electronic screen.
  9. A method according to claim 8, wherein the measuring of the pupil distance is performed by a measuring device comprising the electronic screen as an integrated display.
  10. A method according to claim 9 wherein the measuring device is a smartphone or a tablet computer.
  11. A method according to claim 9 or claim 10, wherein the capturing the at least one image comprises the measuring device capturing a depth image including the face of the subject, and the locating of the two points comprises determining a positional relationship of the face of the subject and the measuring device.
  12. A method according to claim 11, comprising determining whether the positional relationship of the face of the subject and the measuring device meets at least one criterion, and if the determination is negative to deliver a prompt to the subject to adjust the positional relationship between the face of the subject and the measuring device.
  13. A method according to claim 11 or 12, comprising altering the fixation image based on the positional relationship of the face of the subject and the measuring device.
  14. A method according to any of claims 8 to 13, wherein the fixation image is a moving image.
  15. A method according to any preceding claim, further comprising: determining whether the subject's gaze is directed towards the fixation image; and in response to determining that the subject's gaze is directed towards the fixation image, measuring the pupil distance of the subject.
  16. A method according to claim 15, wherein the determining step comprises determining whether an amount of movement of the at least one eye of the subject is below a threshold value within a set time frame.
  17. A method according to any preceding claim, wherein the method further comprises: displaying a second fixation image at a first position, wherein the second fixation image is an image configured to appear to the subject to be at the first position; and capturing at least one second image of the subject, locating two second points based on the at least one captured second image of the subject; and measuring a near distance pupil distance based on the locations of the two second points.
  18. A method according to any of claims 1 to 7, wherein the fixation image comprises a printed image.
  19. A method of manufacturing spectacles comprising measuring a pupil distance by a method according to any preceding claim, based on the pupil distance selecting at least one parameter of at least one lens, and forming the spectacles including the at least one lens having the selected at least one parameter.
  20. A system for measuring a pupil distance, the system comprising a computer system, the computer system comprising an electronic screen, an overlay positioned to cover at least a portion of the screen, a camera, a processor for controlling the electronic screen and the camera, and a data storage device storing a program comprising instructions which, when the program is executed by the processor cause the processor to: display a fixation image on the electronic screen comprising multiple image portions, the overlay being configured to direct light from a first subset of the image portions to a first eye of the subject, and light from a second subset of the image portions to a second eye of the subject, the two subsets of image portions being interleaved in the fixation image; capture using the camera at least one image of the subject; locate the two points based on the at least one captured image of the subject; and measure the pupil distance based on the locations of the two points.
GB2114508.1A 2021-10-11 2021-10-11 Methods and systems for interpupillary distance measurement Withdrawn GB2611579A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2114508.1A GB2611579A (en) 2021-10-11 2021-10-11 Methods and systems for interpupillary distance measurement
PCT/EP2022/077681 WO2023061822A1 (en) 2021-10-11 2022-10-05 Methods and systems for interpupillary distance measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2114508.1A GB2611579A (en) 2021-10-11 2021-10-11 Methods and systems for interpupillary distance measurement

Publications (2)

Publication Number Publication Date
GB202114508D0 GB202114508D0 (en) 2021-11-24
GB2611579A true GB2611579A (en) 2023-04-12

Family

ID=78595097

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2114508.1A Withdrawn GB2611579A (en) 2021-10-11 2021-10-11 Methods and systems for interpupillary distance measurement

Country Status (2)

Country Link
GB (1) GB2611579A (en)
WO (1) WO2023061822A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170200285A1 (en) * 2016-01-08 2017-07-13 Samsung Electronics Co., Ltd. Method and apparatus for determining interpupillary distance (ipd)
US20180152698A1 (en) * 2016-11-29 2018-05-31 Samsung Electronics Co., Ltd. Method and apparatus for determining interpupillary distance (ipd)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105093546A (en) * 2015-08-20 2015-11-25 京东方科技集团股份有限公司 3d display device and control method thereof
GB2544460A (en) 2015-11-03 2017-05-24 Fuel 3D Tech Ltd Systems and methods for generating and using three-dimensional images
JP2018205819A (en) * 2017-05-30 2018-12-27 富士通株式会社 Gazing position detection computer program, gazing position detection device, and gazing position detection method
US10222634B2 (en) * 2017-07-07 2019-03-05 Optinno B.V. Optical measurement aid device
CN109429060B (en) * 2017-07-07 2020-07-28 京东方科技集团股份有限公司 Pupil distance measuring method, wearable eye equipment and storage medium
US20210169322A1 (en) * 2017-09-05 2021-06-10 Neurolens, Inc. System for measuring binocular alignment with adjustable displays and eye trackers
WO2019171334A1 (en) * 2018-03-09 2019-09-12 Evolution Optiks Limited Vision correction system and method, light field display and light field shaping layer and alignment therefor
GB2591994B (en) 2020-01-31 2024-05-22 Fuel 3D Tech Limited A method for generating a 3D model

Also Published As

Publication number Publication date
GB202114508D0 (en) 2021-11-24
WO2023061822A1 (en) 2023-04-20

Similar Documents

Publication Publication Date Title
CN109964167B (en) Method for determining an eye parameter of a user of a display device
US10292581B2 (en) Display device for demonstrating optical properties of eyeglasses
US10048750B2 (en) Content projection system and content projection method
US9323075B2 (en) System for the measurement of the interpupillary distance using a device equipped with a screen and a camera
JP6212115B2 (en) Apparatus and method for measuring objective eye refraction and at least one geometrical form parameter of a person
EP3371781B1 (en) Systems and methods for generating and using three-dimensional images
IL298199B2 (en) Methods and systems for diagnosing and treating health ailments
EP2886041A1 (en) Method for calibrating a head-mounted eye tracking device
US10775647B2 (en) Systems and methods for obtaining eyewear information
JP2014515625A (en) Apparatus and method for examination, diagnosis or diagnostic assistance and treatment of functional problems in vision
US9110312B2 (en) Measurement method and equipment for the customization and mounting of corrective ophthalmic lenses
WO2018078411A1 (en) Method of determining an eye parameter of a user of a display device
JP5311601B1 (en) How to make a binocular loupe
CN111699432B (en) Method for determining the power of an eye using an immersive system and electronic device therefor
TWI832976B (en) Method and apparatus for measuring vision function
JP7165994B2 (en) Methods and devices for collecting eye measurements
JP2018047095A (en) Optometer
ES2942865T3 (en) Determination of values for myopia control of one eye of a user
JP2017191546A (en) Medical use head-mounted display, program of medical use head-mounted display, and control method of medical use head-mounted display
CN113138664A (en) Eyeball tracking system and method based on light field perception
GB2611579A (en) Methods and systems for interpupillary distance measurement
CN110414302A (en) Non-contact pupil distance measuring method and system
CN203882018U (en) 3D glasses and 3D display system
JP2018047096A (en) Optometric apparatus
KR102458553B1 (en) Method and device for virtual reality-based eye health measurement

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)