WO2013125707A1 - Eyeball rotation measuring device, eyeball rotation measuring method, and eyeball rotation measuring program - Google Patents
- Publication number
- WO2013125707A1 (PCT/JP2013/054609)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- blood vessel
- eyeball
- eyeball rotation
- image
- iris
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
Definitions
- the present invention relates to an eyeball rotation measuring device, an eyeball rotation measuring method, and an eyeball rotation measuring program.
- The eyeball of a living organism can rotate not only in the vertical and horizontal directions, but also about the line-of-sight axis. This rotation is referred to as “eyeball rotation”. Eyeball rotation is caused, for example, by the compensatory action of returning the eye toward its original orientation when the head is tilted. In addition, since eyeball rotation also occurs when a person is in a poor condition such as motion sickness or visually induced motion sickness, eyeball rotation measurement can be a useful technique in the fields of medical treatment and nursing care.
- A method has been disclosed that recognizes the shade pattern of the iris image (hereinafter referred to as the iris shade pattern) in an eyeball image and measures eyeball rotation based on the recognition result (see, for example, Non-Patent Literature 1).
- This document describes a technique that uses the fact that the deformation of the iris shade pattern is proportional to the pupil diameter, and corrects the deformed iris shade pattern, thereby suppressing the deterioration of eyeball rotation measurement accuracy caused by pupil diameter changes.
- An object of the present invention is to provide an eyeball rotation measuring device, an eyeball rotation measuring method, and an eyeball rotation measuring program capable of measuring eyeball rotation with higher accuracy.
- An eyeball rotation measuring device of the present invention includes: a blood vessel position recognition unit that recognizes the position of a blood vessel in the white eye region of an eyeball image and acquires information on the position of the blood vessel; and a first angle calculation unit that calculates an eyeball rotation angle based on first information on the position of a predetermined blood vessel acquired by the blood vessel position recognition unit at the time of actual measurement of eyeball rotation and second information on the position of the predetermined blood vessel in a reference state.
- the “white eye region” means a region including the sclera and the conjunctiva.
- The “reference state” refers to, for example, a state in which the subject is healthy and free of stress, and the rotation angle of the eyeball is regarded as zero degrees.
- The eyeball rotation measuring method of the present invention includes: recognizing the position of a predetermined blood vessel in the white eye region of an eyeball image acquired at the time of actual measurement of eyeball rotation and acquiring first information on the position of the predetermined blood vessel; and calculating the eyeball rotation angle based on the first information and second information on the position of the predetermined blood vessel in the reference state.
- The eyeball rotation measurement program of the present invention is a program for causing an information processing apparatus to execute each process of the eyeball rotation measuring method of the present invention.
- According to the present invention, eyeball rotation can be measured with higher accuracy.
- FIG. 1 is a hardware configuration diagram of an eyeball rotation measuring apparatus 1 according to the first embodiment of the present invention.
- FIG. 2 is a functional configuration diagram of the eyeball rotation measuring device 1 according to the first embodiment of the present invention.
- FIG. 3 is a diagram illustrating a pupil region, an iris region, a white eye region, a conjunctival blood vessel, and end points of the conjunctival blood vessel in an eyeball image.
- FIG. 4 is a diagram for explaining a method of generating vector data in a virtual ellipse.
- FIGS. 5A and 5B are diagrams showing the relationship among the distribution of the elements of the difference vector data f ref in the reference state, the distribution of the elements of the difference vector data f d at the time of actual measurement of eyeball rotation, and the cross-correlation function R (k).
- FIG. 6 is an example of the original image of the search area acquired in the eyeball rotation measurement method according to the first embodiment.
- FIG. 7 is an example of an image after the binarization process is performed on the original image in the search area.
- FIG. 8 is an example of an image after the noise removal processing is performed on the blood vessel portion in the search region subjected to the binarization processing by the Hilditch thinning algorithm.
- FIG. 9 is a diagram showing the relationship between the reference end point O ref of the conjunctival blood vessel and the target end point O tgt of the conjunctival blood vessel V selected during actual measurement.
- FIG. 10 is a diagram showing the relationship among the rotation-angle coordinate θtgt0 of the target end point O tgt0 of the conjunctival blood vessel in the reference state, the rotation-angle coordinate θtgt of the target end point O tgt of the conjunctival blood vessel selected at the time of actual measurement, and the eyeball rotation angle θ.
- FIG. 11 is a flowchart illustrating a processing procedure of the eyeball rotation measurement method according to the first embodiment.
- FIG. 12 is a diagram for explaining a process for determining a boundary between the iris region and the white eye region in the eyeball rotation measurement method according to the second embodiment of the present invention.
- FIG. 13 is a diagram for explaining a process for determining a boundary between an iris region and a white-eye region in the eyeball rotation measurement method according to the second embodiment of the present invention.
- FIG. 14 is a diagram illustrating a detection region when detecting end points of a plurality of conjunctival blood vessels in the second embodiment.
- FIGS. 15A to 15C are diagrams for explaining extraction processing of end points of conjunctival blood vessels in the second embodiment.
- FIG. 16 is a diagram illustrating the relationship between the end points of a plurality of conjunctival blood vessels detected in the reference state and the template image area of each conjunctival blood vessel.
- FIG. 17 is a diagram illustrating a relationship between an area (matching area) to be subjected to template matching processing set during actual measurement of eyeball rotation and a reference end point when the area is set.
- FIG. 18 is a diagram for explaining a method for setting an area (matching area) on which template matching processing is performed.
- FIG. 19 is a flowchart illustrating a processing procedure of an eyeball rotation measurement method according to the second embodiment.
- 20A and 20B are diagrams illustrating the results of an evaluation experiment performed using the eyeball rotation measurement method according to the second embodiment.
- FIG. 21 is a diagram for explaining a method for setting a region (matching area) to be subjected to template matching processing in the first modification.
- The iris shade pattern (iris pattern) has, for example, the following characteristics.
- The contrast of the iris shade pattern is low.
- The position of the iris pattern moves as the pupil constricts.
- When the pupil dilates, the central portion of the iris region, which has relatively high contrast, disappears. Since the contrast is low at the outer edge of the iris region, which remains even when the pupil dilates, sufficient resolution (accuracy) of the eyeball rotation angle often cannot be obtained.
- the present invention provides an eyeball rotation measurement technique that can accurately measure the eyeball rotation even when the pupil diameter of the eyeball changes greatly.
- FIG. 1 is a hardware configuration diagram of an eyeball rotation measuring apparatus 1 according to the first embodiment.
- The eyeball rotation measuring device 1 includes, for example, a CPU (Central Processing Unit) 10, a drive device 12, an auxiliary storage device 16, a memory device 18, an interface device 20, an input device 22, a display device 24, and an image input interface 26. These components are connected to each other via a bus, a serial line, or the like.
- the CPU 10 includes, for example, an arithmetic processing unit having a program counter, an instruction decoder, various arithmetic units, an LSU (Load Store Unit), a general-purpose register, and the like.
- the drive device 12 is a device that reads a program, data, and the like from the storage medium 14 mounted therein.
- the storage medium 14 is a portable storage medium such as a CD (Compact Disc), a DVD (Digital Versatile Disc), or a USB (Universal Serial Bus) memory.
- the auxiliary storage device 16 is configured by, for example, an HDD (Hard Disk Drive), a flash memory, or the like.
- When the storage medium 14 on which a program is recorded is loaded into the drive device 12, the program is installed from the storage medium 14 into the auxiliary storage device 16 via the drive device 12.
- the program installation method is not limited to this example.
- the interface device 20 may download a program from another computer via a network and install the downloaded program in the auxiliary storage device 16.
- the network includes the Internet, a LAN (Local Area Network), a wireless network, and the like.
- the program may be stored (implemented) in advance in the auxiliary storage device 16 or a ROM (Read Only Memory) (not shown) when the eyeball rotation measuring device 1 is shipped.
- The memory device 18 includes, for example, a storage device such as a RAM (Random Access Memory) or an EEPROM (Electrically Erasable Programmable Read-Only Memory).
- The interface device 20 is connected to the various networks described above, and inputs and outputs data and programs to and from various external devices via the networks.
- the input device 22 includes various input operation devices such as a keyboard, a mouse, a button, a touch pad, a touch panel, and a microphone.
- the display device 24 is configured by a display device such as an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube).
- the eyeball rotation measuring device 1 may include various output devices such as a printer and a speaker.
- the image input interface 26 is connected to the camera 30. Then, the image input interface 26 outputs the image data input from the camera 30 to the memory device 18 and the auxiliary storage device 16.
- the camera 30 is an imaging device such as a CCD (Charge Coupled Device) camera or a CMOS (Complementary Metal Oxide Semiconductor) camera, and outputs captured image data to the image input interface 26.
- the image may be a still image or a moving image.
- the eyeball of the subject is photographed by the camera 30.
- light emitted from each of an infrared LED (Light Emitting Diode) 32 and a blue LED 34 is irradiated to the eyeball.
- the infrared LED 32 irradiates the eyeball with infrared rays in order to emphasize the pupil and to accurately image the shade pattern of the iris.
- the blue LED 34 irradiates the white eye region (a region including the sclera and the conjunctiva) of the eyeball with blue light in order to increase the contrast of the conjunctival blood vessel (hereinafter simply referred to as a blood vessel) in the eyeball.
- FIG. 2 is a functional configuration diagram of the eyeball rotation measuring device 1 of the present embodiment.
- The eyeball rotation measuring device 1 includes, as functional blocks realized when the CPU 10 executes a program, a reference information acquisition unit 40, an iris pattern recognition processing unit 42 (second angle calculation unit), a blood vessel position recognition unit 44, and an angle calculation unit 46 (first angle calculation unit).
- Each of these functional blocks inputs / outputs various information (data) to / from the memory device 18 and the auxiliary storage device 16, for example.
- the memory device 18 and the auxiliary storage device 16 store various information output from each functional block.
- Each functional block may be realized by a program clearly separated from the others, or by a program called from another program, such as a subroutine or a function.
- Some or all of these functional blocks may be configured by hardware such as LSI (Large Scale Integrated Circuit), IC (Integrated Circuit), FPGA (Field Programmable Gate Array), and the like.
- As the predetermined blood vessel, one blood vessel located near the boundary between the iris region and the white eye region in the eyeball image in the reference state is selected, and the position information of the pupil-side end point of the selected blood vessel (target end point O tgt0 described later) is acquired.
- Next, a conventional measurement method using the iris shade pattern is applied to the eyeball image at the time of actual measurement, and an eyeball rotation angle (first eyeball rotation angle θ1 described later) is obtained with a certain degree of accuracy.
- Then, from among the end points of the plurality of blood vessels near the boundary between the iris region and the white eye region detected in the eyeball image at the time of actual measurement, the blood vessel corresponding to the predetermined blood vessel selected in the reference state is specified using the eyeball rotation angle calculated by the conventional measurement method based on the iris shade pattern (first eyeball rotation angle θ1 described later).
- Position information of the end point of the specified blood vessel (target end point O tgt described later) is acquired. Then, using the position information of the blood vessel end points acquired in the reference state and at the time of actual measurement, the eyeball rotation angle is determined with higher accuracy.
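The higher-accuracy angle ultimately comes from the angular coordinates of the two end points about the pupil center (the relationship of FIG. 10). A minimal Python sketch of that final step, assuming the pupil center and both end points are already known (all names are illustrative placeholders, not identifiers from the embodiment):

```python
import math

def angular_coordinate(center, point):
    """Angle of `point` around `center`, in degrees, measured
    counterclockwise from the positive X axis."""
    return math.degrees(math.atan2(point[1] - center[1],
                                   point[0] - center[0]))

def rotation_angle(center, end_ref, end_meas):
    """Eyeball rotation angle theta = theta_tgt - theta_tgt0,
    wrapped into (-180, 180] degrees."""
    theta = (angular_coordinate(center, end_meas)
             - angular_coordinate(center, end_ref))
    return (theta + 180.0) % 360.0 - 180.0

# Example: end point rotated 10 degrees about the pupil center.
center = (0.0, 0.0)
end_ref = (10.0, 0.0)
end_meas = (10.0 * math.cos(math.radians(10.0)),
            10.0 * math.sin(math.radians(10.0)))
theta = rotation_angle(center, end_ref, end_meas)   # ~10.0
```

Wrapping into (-180, 180] keeps the difference stable when the two angular coordinates straddle the ±180° boundary.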
- The reference information acquisition unit 40 operates before the actual measurement of eyeball rotation and acquires various reference information necessary for measuring the eyeball rotation.
- The acquisition of the various reference information by the reference information acquisition unit 40 is performed separately from the processing at the time of actual measurement of eyeball rotation (for example, when actually measuring the health condition of the subject).
- an eyeball is photographed using the above-described two types of illumination (infrared LED 32 and blue LED 34).
- Specifically, the pupil region and the iris region are irradiated with infrared rays, and the periphery of the blood vessels (the white eye region) is irradiated with blue light.
- The reference information acquisition unit 40 analyzes the eyeball image of the subject acquired in the reference state, and calculates the difference vector data f ref of the iris shade pattern in the reference state, the position information of the end point of the predetermined blood vessel (target end point O tgt0 ), and the ellipse parameters of the pupil (for example, information on the center point of the pupil contour, the major-axis value, the minor-axis value, and the inclination of the ellipse (rotation angle of the major axis)).
- the reference information acquisition unit 40 outputs the acquired various information to, for example, the memory device 18 or the auxiliary storage device 16.
- various reference information is acquired by the reference information acquisition unit 40 as described above, and the various reference information is stored in the memory device 18 or the auxiliary storage device 16.
- The position information of the end point (target end point O tgt0 ) of the predetermined blood vessel output from the reference information acquisition unit 40 is expressed as coordinates in an X-Y orthogonal coordinate system whose X axis and Y axis are the horizontal and vertical directions of the eyeball image in the reference state, respectively (see FIG. 3 described later).
- At the time of actual measurement of eyeball rotation, the iris pattern recognition processing unit 42 calculates an eyeball rotation angle θ1 (hereinafter referred to as the first eyeball rotation angle θ1) with a certain degree of accuracy, based on the iris shade pattern recognized in the eyeball image, captured by the camera 30, of the same subject as at the time of reference information acquisition.
- the processing content of the iris pattern recognition processing unit 42 will be described in more detail.
- FIG. 3 is a diagram illustrating an example of an eyeball image 100 captured by the camera 30 and a pupil region 101, an iris region 102, a white eye region 103, a blood vessel 104, and an end point 104 a of the blood vessel 104 in the eyeball image 100.
- The pupil region 101 and its contour (pupil contour) in the image can be easily extracted by binarizing the eyeball image 100.
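The binarization step can be illustrated with a small numpy sketch on a synthetic image; the intensity values and the threshold (60) are arbitrary illustrations, not parameters from the embodiment:

```python
import numpy as np

# Synthetic 100x100 grayscale eye image: bright background (200),
# dark pupil disc (intensity 20) centred at (50, 50), radius 15.
img = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:100, 0:100]
img[(yy - 50) ** 2 + (xx - 50) ** 2 <= 15 ** 2] = 20

# Binarize: pixels below the threshold are labelled as pupil,
# since the pupil is imaged darker than the iris and sclera.
pupil_mask = img < 60

# Centroid of the pupil region (row, col).
rows, cols = np.nonzero(pupil_mask)
center = (rows.mean(), cols.mean())    # ~ (50.0, 50.0)
```

The resulting mask and centroid are then the input to the ellipse fitting described next.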
- The iris pattern recognition processing unit 42 performs, for example, ellipse fitting processing on the extracted pupil contour, and acquires the ellipse parameters obtained by the processing (for example, the coordinates of the pupil center, the major-axis value, the minor-axis value, and the inclination of the ellipse). The iris pattern recognition processing unit 42 then outputs the acquired ellipse parameters of the pupil contour to the memory device 18 and the auxiliary storage device 16. As shown in FIG. 3, the inclination of the ellipse included in the ellipse parameters is the tilt angle of the major axis of the ellipse indicating the pupil contour with respect to the Y axis (zero degrees), measured counterclockwise.
- The procedure of the ellipse fitting process is as follows. First, the iris pattern recognition processing unit 42 prepares an ellipse having predetermined ellipse parameters. Next, it superimposes the ellipse on the pupil contour while changing the rotation angle (major-axis inclination), flatness, and size of the ellipse. Then, it searches for and extracts an ellipse whose divergence from the pupil contour is within a predetermined range. Various methods can be used for the ellipse extraction at this time; for example, a least-squares method can be used.
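One common way to realize such a least-squares ellipse fit is a direct conic fit; the sketch below is an assumption about implementation, not the patent's exact method. It recovers the ellipse center from sampled contour points; axis lengths and tilt can likewise be recovered from the quadratic coefficients a, b, c by an eigendecomposition.

```python
import numpy as np

def fit_ellipse_center(x, y):
    """Least-squares conic fit a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0,
    returning the ellipse center. The coefficients are the smallest
    right singular vector of the design matrix."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D)
    a, b, c, d, e, f = vt[-1]
    # The center solves the gradient equations:
    #   2a*x0 +  b*y0 + d = 0
    #    b*x0 + 2c*y0 + e = 0
    x0, y0 = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
    return x0, y0

# Sample points on a tilted ellipse: center (3, -2), semi-axes 5 and 2,
# rotated 30 degrees.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
phi = np.radians(30)
x = 3 + 5 * np.cos(t) * np.cos(phi) - 2 * np.sin(t) * np.sin(phi)
y = -2 + 5 * np.cos(t) * np.sin(phi) + 2 * np.sin(t) * np.cos(phi)
cx, cy = fit_ellipse_center(x, y)      # ~ (3.0, -2.0)
```

The same fit applied to binarized boundary points yields the pupil-contour parameters stored by the device.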
- the elliptical parameter of the pupil contour acquired when acquiring the reference information is also acquired in the same manner as the above method.
- The above processing may be performed by the reference information acquisition unit 40 to acquire the ellipse parameters of the pupil contour, or the processing may be performed by the iris pattern recognition processing unit 42 and the reference information acquisition unit 40 may acquire the calculation result from the iris pattern recognition processing unit 42.
- The iris pattern recognition processing unit 42 recognizes the boundary line between the iris region 102 and the white eye region 103 based on the binarized eyeball image 100, and performs ellipse fitting processing on the boundary line. Then, the iris pattern recognition processing unit 42 outputs the ellipse parameters of the boundary line between the iris region 102 and the white eye region 103 (the outer edge of the iris region 102) obtained by the ellipse fitting processing to the memory device 18 and the auxiliary storage device 16.
- the iris pattern recognition processing unit 42 sets a virtual ellipse at a position outside the pupil contour and inside the outer edge of the iris region 102.
- The iris pattern recognition processing unit 42 samples the pixel values of the pixels lying on the virtual ellipse at predetermined angular steps Δθs about the center of the virtual ellipse, and generates vector data f.
- a virtual ellipse E is set with a diameter slightly larger than the diameter (short axis or long axis) of the pupil when the subject is dark-adapted and the pupil is most open.
- the diameter of the virtual ellipse E may be set to about 7.5 mm.
- the ellipse parameters other than the diameter of the virtual ellipse E are the same as the corresponding ellipse parameters of the pupil contour.
- The reason the virtual ellipse E is set as described above is as follows. Since the contrast of the iris shade pattern increases in the region close to the pupil, it is desirable to extract pixel values (generate the vector data f) by setting the virtual ellipse E as close to the pupil as possible. However, the virtual ellipse E must not overlap the pupil when the pupil is most dilated. Therefore, as described above, it is preferable to set the diameter (minor axis or major axis) of the virtual ellipse E slightly larger than the diameter of the pupil when the subject's pupil is most dilated. The present invention is not limited to this; for example, the virtual ellipse E may be set at a substantially intermediate position between the pupil contour and the outer edge of the iris region 102.
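The sampling of vector data f can be sketched as follows; as simplifying assumptions, the virtual ellipse is an untilted circle, the lookup is nearest-pixel, and Δθs = 5° is an illustrative step:

```python
import numpy as np

def sample_virtual_ellipse(img, center, diameter, step_deg=5.0):
    """Vector data f: pixel values sampled every `step_deg` degrees
    along a circle of the given diameter (a virtual ellipse with
    equal axes), using nearest-pixel lookup."""
    angles = np.radians(np.arange(0.0, 360.0, step_deg))
    r = diameter / 2.0
    rows = np.rint(center[0] + r * np.sin(angles)).astype(int)
    cols = np.rint(center[1] + r * np.cos(angles)).astype(int)
    return img[rows, cols]

# Synthetic gradient image standing in for the iris region.
img = np.fromfunction(lambda i, j: (i + j) % 256, (100, 100))
f = sample_virtual_ellipse(img, center=(50, 50), diameter=30)
# 360 / 5 = 72 samples.
```

With an ellipse fit available, the circle could be replaced by the tilted ellipse parametrization without changing the structure of the loop.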
- The iris pattern recognition processing unit 42 obtains, for each pixel value included in the vector data f of the virtual ellipse E (each element Zi of the vector data f), the difference from the pixel value of the adjacent pixel separated from that pixel by the angle Δθs (the adjacent element of the vector data f), and generates difference vector data f d .
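Forming the difference vector data f d amounts to a cyclic neighbour difference over the full 360°, which can be sketched with a circular shift:

```python
import numpy as np

def difference_vector(f):
    """f_d[i] = f[i] - f[(i + 1) % N]: difference between each element
    of the sampled vector f and its neighbour one angular step
    (delta-theta-s) further along the virtual ellipse."""
    return f - np.roll(f, -1)

f = np.array([10.0, 13.0, 9.0, 9.0, 12.0])
f_d = difference_vector(f)     # [-3., 4., 0., -3., 2.]
```

Differencing removes the slowly varying illumination component, so the cross-correlation in the next step is driven by the pattern structure rather than absolute brightness.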
- the difference vector data f ref of the iris shading pattern acquired when the reference information is acquired is also calculated in the same manner as the above method.
- The calculation may be performed by the reference information acquisition unit 40 to acquire the difference vector data f ref of the iris shade pattern in the reference state, or the processing may be performed by the iris pattern recognition processing unit 42 and the calculation result may be acquired by the reference information acquisition unit 40 from the iris pattern recognition processing unit 42.
- the iris pattern recognition processing unit 42 calculates a cross-correlation function R (k) between the difference vector data f ref of the shade pattern of the iris in the reference state and the difference vector data f d calculated at the actual measurement of the eyeball rotation.
- The cross-correlation function R (k) is represented by the following formula (1):
  R(k) = Σ_m f_ref(m) × f_d(m + k) … (1)
- the parameter “m” is the index of each element Zi of the vector data f
- The parameter “k” is the relative shift amount of the index of each element of the difference vector data f d with respect to the difference vector data f ref .
- the sum “ ⁇ ” in the equation (1) is the sum for the parameter “m”.
- The cross-correlation function R (k) is obtained by multiplying each element of the difference vector data f ref by the corresponding element of the difference vector data f d whose index has been shifted by k, and summing the products. Therefore, the value of the cross-correlation function R (k) becomes maximum when the distribution of the elements of the difference vector data f d , with each index shifted by k, most closely approaches the distribution of the elements of the difference vector data f ref in the reference state.
- The iris pattern recognition processing unit 42 specifies the value k p of the parameter k that maximizes the cross-correlation function R (k), and calculates the angle corresponding to the specified value (k p × Δθs) as the first eyeball rotation angle θ1 (reference eyeball rotation angle).
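Formula (1) and the argmax step can be sketched as a circular cross-correlation. Here f d is simulated as f ref cyclically shifted by a known number of indices, and the peak of R(k) recovers that shift (Δθs = 5° is an illustrative step; all names are placeholders):

```python
import numpy as np

def first_rotation_angle(f_ref, f_d, step_deg=5.0):
    """R(k) = sum_m f_ref[m] * f_d[m + k] with cyclic indices; the k
    maximising R(k), times the angular step, gives the first eyeball
    rotation angle theta1."""
    n = len(f_ref)
    # np.roll(f_d, -k)[m] == f_d[(m + k) % n]
    R = np.array([np.dot(f_ref, np.roll(f_d, -k)) for k in range(n)])
    k_p = int(np.argmax(R))
    return k_p * step_deg

rng = np.random.default_rng(0)
f_ref = rng.normal(size=72)
true_shift = 7                        # 7 * 5 = 35 degrees
f_d = np.roll(f_ref, true_shift)      # measured pattern, rotated
theta1 = first_rotation_angle(f_ref, f_d)   # 35.0
```

The angular resolution of this step is limited to Δθs, which is why the blood-vessel end points are used afterwards to refine the angle.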
- the length (vertical axis) of the bar graph shown in each figure is the pixel value Zi of each element of the vector data, and the horizontal axis of the bar graph is the index m of each element.
- FIG. 5B is a diagram illustrating a relationship between the difference vector data when the relative shift amount of the index of each element of the difference vector data f d with respect to the difference vector data f ref is “k p ”.
- As shown in FIG. 5B, shifting the index of each element of the vector data by k means shifting the elements by k indices in a predetermined direction along the arrangement direction (rightward in the example shown in FIG. 5B).
- The blood vessel position recognition unit 44 recognizes the position of the blood vessel 104 in the white eye region 103, and outputs information on the position of the recognized blood vessel 104 to, for example, the memory device 18 or the auxiliary storage device 16.
- the white eye region 103 is a region outside the iris region 102 recognized by the iris pattern recognition processing unit 42 in the eyeball image 100.
- The blood vessel 104 recognized (specified) by the blood vessel position recognition unit 44 is the blood vessel corresponding to the predetermined blood vessel selected by the reference information acquisition unit 40 at the time of reference information acquisition (hereinafter also referred to as the corresponding blood vessel).
- The blood vessel position recognition unit 44 refers to the reference information stored in the memory device 18 or the auxiliary storage device 16, and acquires the position information of the end point (hereinafter referred to as the “target end point”) of the corresponding blood vessel existing near the outer edge of the iris region 102 in the eyeball image 100 at the time of actual measurement.
- the blood vessel position recognition unit 44 outputs the acquired position information (coordinate information) of the target end point of the corresponding blood vessel to, for example, the memory device 18 or the auxiliary storage device 16.
- the processing content performed by the blood vessel position recognition unit 44 will be described in more detail.
- The blood vessel position recognition unit 44 reads, from the memory device 18 or the auxiliary storage device 16, the position information (coordinate information) of the target end point O tgt0 of the predetermined blood vessel in the reference state acquired in advance by the reference information acquisition unit 40. Next, the blood vessel position recognition unit 44 sets, based on the read position information of the target end point O tgt0 of the predetermined blood vessel in the reference state, a region for searching for the position of the corresponding blood vessel in the eyeball image 100 acquired at the time of actual measurement.
- Specifically, the blood vessel position recognition unit 44 sets, as a reference end point O ref (assumed end point position), the point obtained by rotating the target end point O tgt0 of the predetermined blood vessel in the reference state by the first eyeball rotation angle θ1, and sets in the eyeball image 100 a search region of predetermined shape centered on the reference end point O ref (in this embodiment, a square shape as shown in FIGS. 6 to 8 described later).
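Setting the reference end point O ref and the square search region can be sketched as follows; the coordinates and the 41×41 region size are illustrative assumptions:

```python
import math

def rotate_about(point, center, angle_deg):
    """Rotate `point` counterclockwise by `angle_deg` about `center`."""
    a = math.radians(angle_deg)
    dx, dy = point[0] - center[0], point[1] - center[1]
    return (center[0] + dx * math.cos(a) - dy * math.sin(a),
            center[1] + dx * math.sin(a) + dy * math.cos(a))

def square_region(center_xy, half_size=20):
    """Inclusive (x_min, x_max, y_min, y_max) bounds of a square
    search region centred on the assumed end point."""
    cx, cy = int(round(center_xy[0])), int(round(center_xy[1]))
    return (cx - half_size, cx + half_size, cy - half_size, cy + half_size)

pupil_center = (320.0, 240.0)
o_tgt0 = (420.0, 240.0)                           # end point in reference state
o_ref = rotate_about(o_tgt0, pupil_center, 10.0)  # rotated by theta1
region = square_region(o_ref)
```

Because θ1 is only coarsely accurate, the region must be wide enough to absorb the residual error of the iris-pattern step.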
- the blood vessel position recognition unit 44 performs, for example, a smoothing process on the image of the search area to remove noise.
- Next, the blood vessel position recognition unit 44 performs binarization processing, according to pixel value, on the search-region image subjected to the smoothing processing, and recognizes connected groups of pixels having lower pixel values as blood vessels. At this time, a plurality of blood vessels are recognized in the search region (see FIG. 7 described later).
- the blood vessel position recognizing unit 44 applies noise removal processing to the sets of pixels recognized as blood vessels by applying the Hilditch thinning algorithm. Next, the blood vessel position recognition unit 44 measures the length of each blood vessel by performing a depth-first search on the thinned line portions. Then, the blood vessel position recognizing unit 44 extracts, from the plurality of recognized blood vessels, only the blood vessels whose length is equal to or greater than a predetermined value.
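- the grouping of thinned pixels into vessels and the length-based filtering described above can be sketched as follows (an illustrative implementation; function names are assumptions, and length is approximated by the pixel count of each 8-connected component found by depth-first search):

```python
def vessel_components(skeleton_pixels):
    """Group 8-connected thin-line pixels into individual vessels and
    measure each vessel's length (pixel count) by depth-first search."""
    pixels = set(skeleton_pixels)
    components = []
    while pixels:
        stack = [pixels.pop()]
        comp = []
        while stack:
            x, y = stack.pop()
            comp.append((x, y))
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (x + dx, y + dy)
                    if n in pixels:
                        pixels.remove(n)
                        stack.append(n)
        components.append(comp)
    return components

def long_vessels(skeleton_pixels, min_length):
    """Keep only the vessels whose measured length meets the threshold."""
    return [c for c in vessel_components(skeleton_pixels) if len(c) >= min_length]
```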
- a 3 × 3 window that refers to the pixel of interest in the search region and its surrounding 8 pixels is used as the basic processing unit of the noise removal process. The blood vessel position recognizing unit 44 then performs the thinning process by applying the noise removal with each pixel as the pixel of interest while raster-scanning the entire image data of the search region.
- the blood vessel position recognizing unit 44 determines, in the 3 × 3 window, whether or not the pixel of interest matches a deletion condition defined in advance for thinning.
- if the deletion condition is matched, the blood vessel position recognition unit 44 deletes the pixel of interest, that is, replaces the figure pixel (the blood vessel region pixel) with a background pixel (a white eye region pixel).
- the blood vessel position recognition unit 44 then sets the next pixel in raster scan order as the pixel of interest, and performs the pixel-of-interest determination processing and figure pixel replacement processing in a new 3 × 3 window.
- the blood vessel position recognition unit 44 repeats the above-described series of processes for all the pixels in the search region, and keeps raster-scanning the search region until one full cycle of raster scanning deletes no pixels. When a cycle of raster scanning deletes no pixels, the blood vessel position recognizing unit 44 ends the noise removal processing by the Hilditch thinning method.
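- the raster-scan-until-stable structure described above can be sketched as follows. Note that the deletion condition below (2 to 6 figure neighbours and exactly one 0→1 transition around the ring, so that removal does not disconnect the line) is a simplified stand-in; the full Hilditch condition set is not reproduced in this text:

```python
import numpy as np

# Offsets of the 8 neighbours in clockwise order starting above the pixel.
NEIGHBOURS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]

def thin_once(img):
    """One raster scan over a 0/1 image; deletes figure pixels matching
    the simplified condition in their 3x3 window. Deletion happens
    in scan order, as in the sequential Hilditch method.
    Returns the number of deleted pixels."""
    deleted = 0
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if img[y, x] != 1:
                continue
            ring = [img[y + dy, x + dx] for dy, dx in NEIGHBOURS]
            b = sum(ring)
            transitions = sum(1 for i in range(8)
                              if ring[i] == 0 and ring[(i + 1) % 8] == 1)
            if 2 <= b <= 6 and transitions == 1:
                img[y, x] = 0  # replace figure pixel with background pixel
                deleted += 1
    return deleted

def thin(img):
    """Repeat raster scans until a full cycle deletes no pixels."""
    while thin_once(img):
        pass
    return img
```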
- FIGS. 6 to 8 are examples of, respectively, the original image of the search region for the corresponding blood vessel, the image of the search region after the binarization process, and the image of the search region after the noise removal applying the Hilditch thinning algorithm.
- the blood vessel position recognition unit 44 specifies the position of the target end point of each extracted blood vessel based on the image data of the search region from which the noise has been removed. Next, the blood vessel position recognition unit 44 selects, from the specified target end points of the plurality of blood vessels, the target end point closest to the reference end point O_ref, and sets it as the target end point O_tgt of the corresponding blood vessel in the eyeball image 100 at the time of actual measurement.
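- the nearest-endpoint selection above amounts to a minimum-distance search (an illustrative sketch; the function name is an assumption):

```python
def select_target_end_point(candidate_end_points, o_ref):
    """Among the end points of the extracted vessels, pick the one
    nearest the reference end point O_ref (squared Euclidean distance)."""
    return min(candidate_end_points,
               key=lambda p: (p[0] - o_ref[0]) ** 2 + (p[1] - o_ref[1]) ** 2)
```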
- FIG. 9 is a diagram showing the relationship between the reference end point O_ref, which is the center of the search region, and the target end point O_tgt of the corresponding blood vessel V recognized (specified) by the blood vessel position recognition unit 44.
- the blood vessel position recognition unit 44 outputs the position information of the target end point O_tgt of the specified corresponding blood vessel V (first information regarding the position of the predetermined blood vessel) to, for example, the memory device 18 or the auxiliary storage device 16.
- the position information of the target end point O_tgt of the corresponding blood vessel V output from the blood vessel position recognition unit 44 is expressed as coordinate values in the X-axis (horizontal) direction and the Y-axis (vertical) direction of the eyeball image 100 (search region) at the time of actual measurement.
- the target end point O_tgt0 of the predetermined blood vessel selected in the reference state is obtained as follows. First, a predetermined search region is set near the outer edge of the iris region. Next, the image data of this predetermined search region is subjected to the series of processes described above, from the binarization processing performed by the blood vessel position recognition unit 44 to the noise removal processing by the Hilditch thinning method, and the end points of a plurality of blood vessels in the search region are extracted. Then, the target end point O_tgt0 of the predetermined blood vessel in the reference state is selected from the extracted end points of the plurality of blood vessels. These processes may be performed by the reference information acquisition unit 40, or they may be performed by the blood vessel position recognition unit 44, with the calculation result acquired by the reference information acquisition unit 40 from the blood vessel position recognition unit 44.
- the setting position and size of the search region used to select the predetermined blood vessel when acquiring the reference information can be set arbitrarily.
- the criterion for selecting the target end point O_tgt0 of the predetermined blood vessel from the end points of the plurality of blood vessels extracted in the search region is also arbitrary.
- for example, it is preferable to select, as the predetermined blood vessel, a blood vessel whose end point is in contact with the outer edge of the iris region of the eyeball.
- it is also preferable to select, as the predetermined blood vessel, a blood vessel whose end point has a Y-axis coordinate close (more preferably, equal) to the Y-axis coordinate of the pupil center.
- with such selection criteria, the tracking accuracy of the blood vessel can be improved, and the eyeball rotation angle can be measured more reliably and accurately.
- for the same reason, the search region in the reference state is preferably also set near the outer edge of the iris region, with the Y-axis coordinate of the center point of the search region close (more preferably, equal) to the Y-axis coordinate of the pupil center.
- the angle calculation unit 46 calculates the eyeball rotation angle θ based on the position information of the target end point O_tgt0 of the predetermined blood vessel selected in the reference state and the position information of the target end point O_tgt of the corresponding blood vessel (the blood vessel corresponding to the predetermined blood vessel) specified at the actual measurement of eyeball rotation. Specifically, the eyeball rotation angle θ is obtained as follows.
- first, an elliptic coordinate system (elliptic polar coordinate system) along the pupil contour (pupil ellipse) is set. Specifically, an elliptical coordinate system (w, h, θp) is set in which the position coordinates of the target end point of the blood vessel are expressed by a short-axis coordinate w (short-axis value) and a long-axis coordinate h (long-axis value) passing through the pupil center P (ellipse center) of the pupil contour, and a counterclockwise rotation angle coordinate θp with the major axis of the ellipse as zero degrees (the reference).
- the coordinates (second information regarding the position of the predetermined blood vessel) in the XY orthogonal coordinate system of the target end point O_tgt0 of the predetermined blood vessel acquired at the time of acquiring the reference information are denoted (Ix_tgt0, Iy_tgt0), and the coordinates of the pupil center P in the reference state in the XY orthogonal coordinate system are denoted (x0_tgt0, y0_tgt0).
- the short-axis value of the pupil contour in the reference state (the coordinate in the short-axis direction of the pupil contour in the elliptical coordinate system) is denoted a_tgt0, the long-axis value of the pupil contour (the coordinate in the long-axis direction of the pupil contour in the elliptical coordinate system) is denoted b_tgt0, and the inclination of the major-axis direction of the elliptic coordinate system with respect to the Y-axis direction of the XY orthogonal coordinate system is denoted γ_tgt0.
- the rotation angle coordinate θp of the target end point O_tgt0 of the predetermined blood vessel in the elliptic coordinate system is denoted θ_tgt0, the short-axis coordinate w is denoted w_tgt0, and the long-axis coordinate h is denoted h_tgt0.
- among these, the parameters other than the rotation angle coordinate θ_tgt0, the short-axis coordinate w_tgt0, and the long-axis coordinate h_tgt0 in the elliptic coordinate system of the target end point O_tgt0 are acquired in advance by the reference information acquisition unit 40 when the reference information is acquired, and are stored, for example, in the memory device 18 or the auxiliary storage device 16.
- the relational expressions of these parameters in the reference state are expressed by the following expressions (2), (3), and (4). The following equations (2), (3), and (4) form an equation system for converting the coordinates (w, h, θp) of the elliptic coordinate system into the coordinates (x, y) of the XY orthogonal coordinate system. Therefore, by calculating back the equation system of equations (2), (3), and (4), the angle calculation unit 46 can calculate the rotation angle coordinate θ_tgt0 of the target end point O_tgt0 of the predetermined blood vessel selected in the reference state.
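- since the formula images for equations (2) to (4) are not reproduced in this text, the back-calculation can only be sketched under an assumed concrete form of the transform (h = k·b·cosθp, w = k·a·sinθp, followed by a rotation by the major-axis inclination γ from the Y axis); the form and the function name below are assumptions consistent with the description:

```python
import math

def back_calc_rotation_angle(pt, center, a, b, gamma_deg):
    """Recover the elliptic rotation-angle coordinate theta_p of a point
    from its XY coordinates, assuming the forward transform
        x = x0 + h*sin(g) + w*cos(g)
        y = y0 + h*cos(g) - w*sin(g),  with h = k*b*cos(t), w = k*a*sin(t).
    Returns theta_p in degrees, counterclockwise from the major axis."""
    g = math.radians(gamma_deg)
    dx, dy = pt[0] - center[0], pt[1] - center[1]
    h = dx * math.sin(g) + dy * math.cos(g)   # long-axis coordinate
    w = dx * math.cos(g) - dy * math.sin(g)   # short-axis coordinate
    return math.degrees(math.atan2(w / a, h / b))
```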
- similarly, the coordinates (first information regarding the position of the predetermined blood vessel) in the XY orthogonal coordinate system of the target end point O_tgt of the corresponding blood vessel specified at the actual measurement of the eyeball rotation are denoted (Ix_tgt, Iy_tgt), and the coordinates in the XY orthogonal coordinate system of the pupil center P at the time of measurement are denoted (x0_tgt, y0_tgt).
- the short-axis value of the pupil contour is denoted a_tgt, the long-axis value of the pupil contour is denoted b_tgt, and the inclination of the major-axis direction of the elliptic coordinate system with respect to the Y-axis direction of the XY orthogonal coordinate system is denoted γ_tgt.
- the rotation angle coordinate θp of the target end point O_tgt of the corresponding blood vessel in the elliptic coordinate system is denoted θ_tgt, the short-axis coordinate w is denoted w_tgt, and the long-axis coordinate h is denoted h_tgt.
- among these, the parameters other than the rotation angle coordinate θ_tgt, the short-axis coordinate w_tgt, and the long-axis coordinate h_tgt in the elliptic coordinate system of the target end point O_tgt are calculated in advance by the processing of the iris pattern recognition processing unit 42 and the blood vessel position recognition unit 44, and are stored, for example, in the memory device 18 or the auxiliary storage device 16.
- by calculating back the equation system of the following formulas (5), (6), and (7), the angle calculation unit 46 can calculate the rotation angle coordinate θ_tgt of the target end point O_tgt of the corresponding blood vessel specified at the actual measurement of the eyeball rotation.
- the angle calculation unit 46 then calculates the eyeball rotation angle θ by subtracting the rotation angle coordinate θ_tgt0 of the target end point O_tgt0 of the predetermined blood vessel in the reference state from the rotation angle coordinate θ_tgt of the target end point O_tgt of the corresponding blood vessel calculated as described above. That is, the angle calculation unit 46 calculates the eyeball rotation angle θ based on the following formula (8), i.e., θ = θ_tgt − θ_tgt0.
- FIG. 10 is a diagram showing the relationship between the elliptic coordinate system and the XY orthogonal coordinate system set when calculating the eyeball rotation angle θ, and the relationship among the rotation angle coordinate θ_tgt0 of the target end point O_tgt0 of the predetermined blood vessel selected in the reference state, the rotation angle coordinate θ_tgt of the target end point O_tgt of the corresponding blood vessel specified at the actual measurement, and the eyeball rotation angle θ.
- note that in FIG. 10, for convenience, the virtual ellipse passing through the target end point O_tgt0 acquired in the reference state (when acquiring the reference information) and the virtual ellipse passing through the target end point O_tgt acquired during the actual measurement of eyeball rotation are shown as having the same size and flatness.
- FIG. 11 is a flowchart showing a processing procedure of an eyeball rotation measuring method executed by the eyeball rotation measuring apparatus 1 of the present embodiment.
- the eyeball rotation measuring device 1 determines whether or not the current measurement mode is the reference information acquisition mode (S100). This determination process is executed as follows, for example. First, a control flag indicating whether or not the current measurement mode is the reference information acquisition mode is prepared in advance. Consider a case where the control flag is turned on when some operation is performed on the eyeball rotation measuring device 1 by an operator or the like and the reference information acquisition mode is set as the measurement mode. In this case, the eyeball rotation measuring device 1 determines whether or not the current measurement mode is the reference information acquisition mode by determining the on / off state of this control flag in S100. Note that the timing of performing the determination process in S100 may be, for example, the time when the reference information acquisition mode is set by an operator or the like, or may be the time when the first image is input.
- if the current measurement mode is the reference information acquisition mode, the eyeball rotation measuring device 1 analyzes the eyeball image of the subject acquired in the reference state, and acquires reference information such as the difference vector data f_ref of the shade pattern of the iris in the reference state, the position information of the target end point O_tgt0 of the predetermined blood vessel, and the elliptical parameters of the pupil contour in the reference state (the pupil center position, the short-axis value of the ellipse, the long-axis value, the inclination (rotation angle) of the long axis, etc.) (S102).
- the eyeball rotation measuring device 1 outputs the acquired reference information to the memory device 18 or the auxiliary storage device 16, and then ends the eyeball rotation measuring process.
- the eyeball rotation measuring device 1 acquires the various reference information in the reference state according to the processing operation of the reference information acquisition unit 40 described above.
- if the current measurement mode is not the reference information acquisition mode, the eyeball rotation measuring device 1 analyzes the eyeball image at the time of actual measurement acquired from the same subject as when the reference information was acquired, and calculates the first eyeball rotation angle θ1 based on the shade pattern of the iris recognized in the eyeball image (S104). In the process of S104, the eyeball rotation measuring device 1 calculates the first eyeball rotation angle θ1 in accordance with the processing operation of the iris pattern recognition processing unit 42 described above.
- next, the eyeball rotation measuring device 1 recognizes the positions of a plurality of blood vessels in the white eye region in the eyeball image, and identifies the corresponding blood vessel (the blood vessel corresponding to the predetermined blood vessel selected at the time of acquiring the reference information) from the plurality of blood vessels. Then, the eyeball rotation measuring device 1 acquires the position information (information on the position of the blood vessel) of the target end point O_tgt (the blood vessel end on the pupil side) of the corresponding blood vessel (S106). In the process of S106, the eyeball rotation measuring device 1 acquires the position information of the target end point O_tgt according to the processing operation of the blood vessel position recognition unit 44 described above. In S106, the eyeball rotation measuring device 1 also acquires the elliptical parameters of the pupil contour at the time of actual measurement.
- next, using the position information of the target end point O_tgt of the corresponding blood vessel and the ellipse parameters of the pupil contour obtained in S106, the eyeball rotation measuring device 1 calculates the rotation angle coordinate θ_tgt of the target end point O_tgt of the corresponding blood vessel in the elliptic coordinate system, and further calculates the rotation angle coordinate θ_tgt0 of the target end point O_tgt0 of the predetermined blood vessel in the elliptic coordinate system using the reference information acquired in S102 for the same subject as the actual measurement subject (S108).
- the eyeball rotation measuring device 1 calculates the eyeball rotation angle θ by subtracting the rotation angle coordinate θ_tgt0 from the rotation angle coordinate θ_tgt calculated in S108 (S110). In the processes of S108 and S110, the eyeball rotation measuring device 1 calculates the eyeball rotation angle θ according to the processing operation of the angle calculation unit 46 described above.
- the eyeball rotation is measured as described above.
- the above-described eyeball rotation measurement process may be realized by mounting a corresponding eyeball rotation measurement program in the eyeball rotation measurement apparatus 1 and executing the eyeball rotation measurement program by the CPU 10.
- the eyeball rotation angle θ is calculated based on the position information of the blood vessel end (the target end point of the conjunctival blood vessel), which has a large contrast with its surroundings and is little affected by pupil contraction, so the eyeball rotation can be measured with higher accuracy. Specifically, according to this embodiment, the eyeball rotation angle θ can be measured with a resolution on the order of several degrees finer than that of the conventional method (the method using an iris shading pattern), so the eyeball rotation can be measured with high accuracy.
- in the present embodiment, the eyeball rotation angle θ is calculated based on the position information of the blood vessel end (target end point) located in the vicinity of the outer edge of the iris region. Since this blood vessel end lies at the position closest to the pupil within the white eye region, the influence of eyelid movement and the like on the detection of the blood vessel end (target end point) is minimized. Therefore, using the position information of the blood vessel end (target end point) located near the outer edge of the iris region, as in the present embodiment, allows the eyeball rotation to be measured with higher accuracy.
- furthermore, in the present embodiment, when the eyeball rotation angle θ is measured, the eyeball rotation angle (first eyeball rotation angle θ1) is first obtained with a certain degree of accuracy by the eyeball rotation measurement method using the shade pattern of the iris, and the target end point O_tgt of the blood vessel to be detected (the corresponding blood vessel) is then specified using this eyeball rotation angle. As a result, the target end point O_tgt of the corresponding blood vessel can be identified earlier during actual measurement, so the eyeball rotation angle θ can be calculated at higher speed.
- <Second Embodiment> In general, substances such as eyelashes, dust, and fine cosmetic powder easily enter the eye, and when these substances are present in the vicinity of the outer edge of the iris region, they may become noise sources that degrade the measurement accuracy of the eyeball rotation in the present invention. Therefore, in the present invention, in order to measure the eyeball rotation more correctly, it is preferable to automatically select (specify), at the time of measurement, from the plurality of conjunctival blood vessel end points existing near the outer edge (contour) of the iris region (hereinafter also referred to simply as blood vessel ends), a blood vessel end that is not shielded by a substance (noise) such as an eyelash.
- an example of an eyeball rotation measuring apparatus and an eyeball rotation measuring method capable of such processing will be described.
- the hardware configuration and functional configuration of the eyeball rotation measuring apparatus 1 according to the present embodiment are the same as those in the first embodiment (see FIGS. 1 and 2), and a description thereof is omitted.
- in the eyeball rotation measurement method of the present embodiment, first, the outer edge of the iris region is recognized, and a plurality of blood vessel ends existing in the vicinity of the outer edge are automatically detected. Next, pattern matching processing is performed on a predetermined image region including each detected blood vessel end (a matching area described later) using the corresponding reference information (a template image described later). Then, based on the matching result (the similarity described later), the blood vessel end with the least noise (the target end point of the blood vessel) is automatically specified from the plurality of blood vessel ends. By using such a method, the measurement accuracy of the eyeball rotation angle θ can be further improved.
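- the matching-based selection above can be sketched as follows; zero-mean normalized cross-correlation is used here as one concrete similarity measure (an assumption, since the text does not specify the measure), and the function names are illustrative:

```python
import numpy as np

def similarity(patch, template):
    """Zero-mean normalized cross-correlation between two equally sized
    patches; 1.0 means identical up to brightness/contrast changes."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def least_noisy_end(ends, patches, templates):
    """From candidate blood vessel ends, choose the one whose matching
    area agrees best with its reference template (highest similarity),
    i.e. the end least likely to be occluded by eyelashes or dust."""
    scores = [similarity(p, t) for p, t in zip(patches, templates)]
    return ends[int(np.argmax(scores))]
```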
- furthermore, in the present embodiment, the above-described automatic blood vessel end selection method is used together with the conventional eyeball rotation measurement method using the shade pattern of the iris described in the first embodiment. This makes it possible to measure the eyeball rotation at high speed and at low cost, and to measure the eyeball rotation angle with high accuracy and high resolution even when the pupil diameter changes.
- a moving image is used as the eyeball image.
- the infrared LED 32 is temporarily turned off during moving image shooting. This is to increase the contrast between the iris region 102 and the white-eye region 103 in the process of determining the contour of the iris region 102 described later.
- the reference information acquisition unit 40 analyzes the eyeball image of the subject acquired in the reference state, recognizes the boundary between the iris region 102 and the white eye region 103 (the outer edge of the iris region), and determines the coordinates of the boundary. FIGS. 12 and 13 are diagrams for explaining a method of determining the coordinates of the boundary between the iris region 102 and the white eye region 103 using an eyeball image.
- the reference information acquisition unit 40 first acquires an elliptic coordinate system along the pupil contour from the eyeball image. This process will be described more specifically. As described in the first embodiment, since the pupil region 101 has a lower pixel value than the other regions, the reference information acquisition unit 40 performs binarization processing on the original eyeball image to extract the region having low pixel values as the pupil region 101. Next, the reference information acquisition unit 40 performs ellipse approximation (an ellipse fitting process) on the extracted contour of the pupil region 101 by the least squares method.
- by this ellipse approximation, the reference information acquisition unit 40 acquires the elliptical parameters of the pupil contour (pupil ellipse) (for example, information on the center point of the pupil contour, the long-axis value, the short-axis value, the rotation angle of the long axis, etc.). Then, from the ellipse parameters obtained by the ellipse approximation, the reference information acquisition unit 40 sets an elliptic coordinate system in which the major axis of the ellipse is used as the reference (zero degrees) of the rotation direction coordinate.
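- the least-squares ellipse fitting step can be sketched as an algebraic conic fit (one standard formulation, offered as an illustration; the function names are assumptions, and the extraction of the axis lengths and rotation angle from the conic is omitted for brevity):

```python
import numpy as np

def fit_ellipse_conic(xs, ys):
    """Least-squares conic fit of contour points: solve
    x^2 + B*x*y + C*y^2 + D*x + E*y + F = 0 for (B, C, D, E, F)."""
    A = np.column_stack([xs * ys, ys ** 2, xs, ys, np.ones_like(xs)])
    coeffs, *_ = np.linalg.lstsq(A, -xs ** 2, rcond=None)
    return coeffs  # B, C, D, E, F

def ellipse_center(coeffs):
    """Center of the fitted conic (where the gradient of the quadratic
    form vanishes)."""
    B, C, D, E, _ = coeffs
    M = np.array([[2.0, B], [B, 2.0 * C]])
    return np.linalg.solve(M, [-D, -E])
```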
- the coordinates in the elliptic coordinate system are expressed, with the coordinates (x0, y0) of the center 101a of the pupil contour in the XY orthogonal coordinate system of the camera image (eyeball image) as the origin, by the coordinate h in the long-axis direction (long-axis value), the coordinate w in the short-axis direction (short-axis value), and the counterclockwise rotation angle coordinate θp with respect to the major axis. The inclination of the major axis of the elliptic coordinate system with respect to the Y axis (the vertical axis of the eyeball image) of the XY orthogonal coordinate system of the camera image is denoted γ.
- equation (9) is an equation system for converting the coordinates (w, h, θp) of the elliptic coordinate system into the coordinates (x, y) of the XY orthogonal coordinate system, where a is the short-axis value of the pupil contour (the coordinate in the short-axis direction of the pupil contour in the elliptical coordinate system) and b is the long-axis value of the pupil contour (the coordinate in the long-axis direction of the pupil contour in the elliptical coordinate system).
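- since the formula image for equation (9) is not reproduced in this text, one concrete form consistent with the description (an assumption; k denotes a common scale factor relative to the pupil ellipse, and γ the inclination of the major axis from the Y axis) is:

```latex
\begin{aligned}
h &= k\,b\cos\theta_p, \qquad w = k\,a\sin\theta_p,\\
x &= x_0 + h\sin\gamma + w\cos\gamma,\\
y &= y_0 + h\cos\gamma - w\sin\gamma.
\end{aligned}
```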
- the reference information acquisition unit 40 calculates the difference vector data of the shade pattern of the iris in the reference state, as in the first embodiment.
- the reference information acquisition unit 40 determines the contour of the iris region 102.
- the reference information acquisition unit 40 acquires an eyeball image when the infrared LED 32 is turned off.
- the reference information acquisition unit 40 calculates the coordinates of the center 101a of the pupil contour using the eyeball image one frame before the infrared LED 32 is turned off.
- therefore, in the present embodiment, at least two eyeball images, i.e., an eyeball image before the infrared LED 32 is turned off and an eyeball image after it is turned off, are required.
- as the eyeball image for calculating the coordinates of the center 101a of the pupil contour, an eyeball image two or more frames before the infrared LED 32 is turned off may be used.
- however, in order to more reliably track the blood vessel end against eye movements, including eyeball translation and eyeball rotation, the interval between the eyeball image used to calculate the coordinates of the center 101a of the pupil contour (captured before the infrared LED 32 is turned off) and the eyeball image captured when the infrared LED 32 is turned off is preferably as short as possible.
- as indicated by the dashed arrow in FIG. 12, the reference information acquisition unit 40 samples the pixel values in the eyeball image captured when the infrared LED 32 is extinguished, along the direction from the center 101a of the pupil region 101, calculated in advance from the eyeball image before the infrared LED 32 was extinguished, toward the white eye region 103 (the X-axis direction in the example of FIG. 12).
- FIG. 13 shows a sampling result of pixel values.
- the horizontal axis of the characteristics shown in FIG. 13 is the distance from the pupil center 101a of the sample point (pixel) on the sampling line (broken line arrow) in FIG. 12, and the vertical axis is the pixel value.
- the reference information acquisition unit 40 uses the intermediate value between the minimum and maximum pixel values obtained from the characteristics shown in FIG. 13 as a threshold for determining the boundary position between the iris region 102 and the white eye region 103. Then, the reference information acquisition unit 40 sequentially refers to the pixel values of the sample points (pixels) from the pupil center 101a along the direction of the broken-line arrow in FIG. 12, and acquires the coordinates of the first pixel whose pixel value exceeds the threshold as the boundary coordinates between the iris region 102 and the white eye region 103 on the sampling line (hereinafter referred to as the white eye-iris boundary coordinates).
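- the sampling and threshold step above can be sketched as follows (an illustrative implementation; the function name and the default +X sampling direction are assumptions):

```python
import numpy as np

def white_eye_iris_boundary(image, pupil_center, direction=(1, 0)):
    """Sample pixel values from the pupil center along `direction`
    (default: +X, as in FIG. 12), take the midpoint of the min and max
    sampled values as the threshold, and return the first sample point
    whose value exceeds it: the white eye-iris boundary coordinate."""
    h, w = image.shape
    x, y = pupil_center
    samples, coords = [], []
    while 0 <= x < w and 0 <= y < h:
        samples.append(image[int(y), int(x)])
        coords.append((int(x), int(y)))
        x += direction[0]
        y += direction[1]
    threshold = (min(samples) + max(samples)) / 2.0
    for (cx, cy), v in zip(coords, samples):
        if v > threshold:
            return (cx, cy)
    return None
```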
- next, using the white eye-iris boundary coordinates obtained as described above and the above equation (9), the reference information acquisition unit 40 determines an elliptical locus (the outer edge of the iris region) for determining the detection range of the blood vessel ends.
- specifically, the reference information acquisition unit 40 uses the white eye-iris boundary coordinates (x, y), the pupil contour center coordinates (x0, y0), and the short-axis and long-axis values of the pupil contour in the XY orthogonal coordinate system.
- FIG. 14 shows an example of a blood vessel end detection region set based on the elliptical locus E_orbit passing through the white eye-iris boundary coordinates.
- in the present embodiment, the region from the elliptical locus E_orbit to a position 10 pixels outside it (the white band-shaped region in FIG. 14) is defined as the detection region 50 for blood vessel ends.
- the detection region 50 may be set over the entire circumference of the ellipse indicating the boundary between the iris region 102 and the white-eye region 103, or may be set over a predetermined angular range along the circumferential direction of the ellipse (90 degrees, 180 degrees, etc.). That is, in the present embodiment, the detection region 50 may be provided over only part of the ellipse indicating the boundary between the iris region 102 and the white eye region 103.
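- a membership test for the band-shaped detection region can be sketched as follows; the approximation of "10 pixels outside" by enlarging both semi-axes by the band width is an illustrative simplification, not the patent's exact construction:

```python
import math

def in_detection_band(pt, center, a, b, gamma_deg, width_px=10):
    """Approximate test for whether a pixel lies between the elliptical
    locus E_orbit (semi-axes a, b, inclination gamma of the major axis
    from the Y axis) and a position width_px pixels outside it."""
    g = math.radians(gamma_deg)
    dx, dy = pt[0] - center[0], pt[1] - center[1]
    h = dx * math.sin(g) + dy * math.cos(g)   # long-axis coordinate
    w = dx * math.cos(g) - dy * math.sin(g)   # short-axis coordinate
    inside_outer = (w / (a + width_px)) ** 2 + (h / (b + width_px)) ** 2 <= 1.0
    outside_inner = (w / a) ** 2 + (h / b) ** 2 >= 1.0
    return inside_outer and outside_inner
```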
- FIGS. 15A to 15C are diagrams for explaining the flow of the image processing for extracting blood vessel ends in the detection region 50. FIG. 15A is an original image of a part of the detection region 50, FIG. 15B is an image obtained by binarizing the original image, and FIG. 15C is the image after thinning noise removal processing has further been applied to the binarized image.
- the reference information acquisition unit 40 performs a smoothing process on the original image with a range of 19 ⁇ 19 pixels as a basic processing unit.
- the reference information acquisition unit 40 calculates the difference between the original image shown in FIG. 15A and the image that has been subjected to the smoothing process.
- here, the pixel value of a blood vessel is lower than the pixel value of the white eye region serving as the background. Therefore, in the image subjected to the difference processing, the difference value is high in the blood vessel regions and low, and almost constant, in the background white eye region.
- next, the reference information acquisition unit 40 performs a smoothing process on the image obtained by the above difference processing, using a 3 × 3 pixel range as the basic processing unit, to remove salt-and-pepper noise. The reference information acquisition unit 40 then performs binarization processing on the smoothed image using a threshold value.
- as a result, a binary image as shown in FIG. 15B is obtained, and the blood vessel regions (the black portions in FIG. 15B) can be recognized from this binary image.
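- the background-difference binarization described above (19 × 19 smoothing, difference, 3 × 3 smoothing, threshold) can be sketched as follows; the mean filters stand in for the unspecified smoothing kernels, and the function names are assumptions:

```python
import numpy as np

def box_mean(img, k):
    """Mean filter with a k x k window (k odd), computed via an integral
    image; edges use edge-replicated padding."""
    r = k // 2
    p = np.pad(img.astype(np.float64), r, mode="edge")
    ii = np.cumsum(np.cumsum(p, axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))
    h, w = img.shape
    s = (ii[k:k + h, k:k + w] - ii[0:h, k:k + w]
         - ii[k:k + h, 0:w] + ii[0:h, 0:w])
    return s / (k * k)

def vessel_mask(img, threshold):
    """Subtract the image from its 19x19 local mean (the bright white-eye
    background), smooth the difference with a 3x3 mean to suppress
    salt-and-pepper noise, then threshold. Dark vessels give a positive
    difference (background - image)."""
    diff = box_mean(img, 19) - img.astype(np.float64)
    return box_mean(diff, 3) > threshold
```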
- further, as in the first embodiment, the reference information acquisition unit 40 performs noise removal processing using the Hilditch thinning algorithm on the blood vessel regions in the image binarized as described above. Thereby, as shown in FIG. 15C, an image in which a plurality of fine lines (blood vessel portions) are drawn is obtained.
- the reference information acquisition unit 40 obtains, for each thin line in the image shown in FIG. 15C, the position closest to the pupil center 101a, and sets this position as the position of the blood vessel end. Then, for each blood vessel end, the reference information acquisition unit 40 extracts an image of a 20 × 20 pixel region centered on the blood vessel end from the eyeball image subjected to noise removal processing by the Hilditch thinning algorithm, and acquires the image of the extracted region as a template image.
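- the template extraction step can be sketched as follows (an illustrative helper; the function name is an assumption, and boundary clipping for ends near the image edge is omitted):

```python
import numpy as np

def extract_template(image, end_point, size=20):
    """Cut out a size x size template image centered on a blood vessel
    end (x, y) from the noise-removed eyeball image."""
    x, y = end_point
    half = size // 2
    return image[y - half:y + half, x - half:x + half].copy()
```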
- FIG. 16 is a diagram showing a plurality of blood vessel ends 51 detected in the detection region 50 and a plurality of template image regions 52 corresponding to them. The circles in FIG. 16 are the blood vessel ends 51, and the regions surrounded by white square frames are the template image regions 52.
- the reference information acquisition unit 40 then obtains the elliptic coordinates (w, h, θp) of each of the blood vessel ends 51 detected in the detection region 50 as described above, and outputs, as reference information, the template images corresponding to the blood vessel ends 51, the difference vector data of the shade pattern of the iris in the reference state, and the elliptic parameters of the pupil contour (for example, the position of the pupil center 101a, the major axis value, the minor axis value, the rotation angle of the major axis, etc.) to the memory device 18 or the auxiliary storage device 16 (see FIG. 1).
- in this embodiment, the various reference information is acquired by the reference information acquisition unit 40 as described above and stored in the memory device 18 or the auxiliary storage device 16.
- the elliptic coordinates (w, h, θp) of each blood vessel end 51 are calculated using the above equation (9).
- the processing content of the iris pattern recognition processing unit 42 in this embodiment is the same as in the first embodiment. Specifically, as in the first embodiment, the iris pattern recognition processing unit 42 calculates the first eyeball rotation angle θ1 (denoted "θ_iris" in this embodiment) based on the difference vector data of the shade pattern of the iris recognized in the eyeball image acquired during the actual measurement of the eyeball rotation, and the difference vector data of the shade pattern of the iris in the reference state.
- in the first embodiment, a method was described in which, from the plurality of blood vessel ends recognized in the eyeball image acquired during the actual measurement of the eyeball rotation, the end point closest to the reference end point (O_ref) obtained using the first eyeball rotation angle θ_iris (reference eyeball rotation angle) is specified (selected) as the blood vessel end for eyeball rotation measurement (the target end point O_tgt of the corresponding blood vessel). In the present embodiment, by contrast, a blood vessel end for eyeball rotation measurement is specified (selected) in the eyeball image at the time of actual measurement using a method different from that of the first embodiment.
- the outline of the method for specifying (selecting) a blood vessel end for eyeball rotation measurement in the present embodiment is as follows. First, in the eyeball image at the time of actual measurement, matching processing (hereinafter, template matching processing) is performed between the image of a predetermined area (matching area), set based on the position information of each blood vessel end acquired in the reference state, and the corresponding template image acquired in the reference state. Next, the blood vessel end with the maximum similarity obtained by the template matching processing is selected as the blood vessel end for eyeball rotation measurement, and the eyeball rotation angle θ is measured based on the selected blood vessel end.
- that is, in the present embodiment, a template matching technique is used to select the blood vessel (blood vessel end) used to measure the eyeball rotation angle θ.
- the template matching processing described above is performed in the blood vessel position recognition unit 44. Since the processing of the blood vessel position recognition unit 44 other than the template matching processing is the same as that described in the first embodiment, only the template matching processing is described here.
- in the template matching processing, the matching area is narrowed down using the first eyeball rotation angle θ_iris calculated by the conventional eyeball rotation measurement method based on the shade pattern of the iris image (based on the above equation (1)).
- FIG. 17 is a diagram illustrating an example of the matching area 62 to which the template matching process is applied and the reference end point 61 for determining the matching area 62 in the present embodiment.
- a white square point in FIG. 17 is the reference end point 61, and a region surrounded by a square white frame is the matching area 62.
- the blood vessel position recognizing unit 44 obtains the reference end point 61 of each matching area 62 using the first eyeball rotation angle θ_iris calculated by the conventional eyeball rotation measurement method based on the shade pattern of the iris image.
- FIG. 18 is a diagram showing an outline of a method for calculating the reference end point 61 of the matching area 62.
- in FIG. 18, only the matching area 62 for one blood vessel end 63 and its reference end point 61 are shown, to simplify the description.
- first, the rotation angle coordinate (initial position θ_temp) in the elliptic coordinate system of each blood vessel end 63 in the reference state is obtained. Then, the blood vessel position recognizing unit 44 sets the matching area 62 at the position (θ_input) rotated further counterclockwise, with respect to the major axis of the ellipse, by the first eyeball rotation angle θ_iris from the initial position θ_temp.
- the shape of an ellipse passing through the position of each blood vessel end 63 is similar to the elliptical shape of the pupil contour.
- the coordinates (x, y) of the reference end point 61 in the XY orthogonal coordinate system of the camera image (eyeball image) can be obtained by substituting, into the above equation (9), the rotation angle coordinate θ_input of the reference end point 61 at the time of actual measurement of the eyeball rotation (the rotation component during eye movement), the center coordinates (x0_input, y0_input) of the pupil ellipse at the time of actual measurement, the inclination φ_input of the pupil ellipse at the time of actual measurement, and the major axis value h_input and minor axis value w_input of the ellipse passing through the reference end point 61.
- as in the first embodiment, these elliptic parameters of the pupil at the time of actual measurement of the eyeball rotation are acquired in advance, before the first eyeball rotation angle θ_iris is calculated by the iris pattern recognition processing unit 42.
- next, the blood vessel position recognition unit 44 sets an area of a predetermined range centered on each reference end point 61 as a matching area 62. Thereby, as shown in FIG. 17, a plurality of reference end points 61 and the corresponding matching areas 62 are set. In the examples shown in FIGS. 17 and 18, a range of 40×40 pixels centered on the reference end point 61 is set as the matching area 62; in this example, the size of the template image of each blood vessel end acquired at the time of reference information acquisition is 20×20 pixels.
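The reference end point computation described above (equation (10) followed by equation (9)) can be sketched as follows. Note the assumptions: the patent does not reproduce equations (9) and (10) here, so this sketch assumes equation (10) is θ_input = θ_temp + θ_iris and equation (9) is the standard rotated-ellipse parametrization; the helper names `ellipse_to_xy` and `reference_endpoint` are hypothetical.

```python
import math

def ellipse_to_xy(theta, x0, y0, h, w, phi):
    """Assumed form of equation (9): map a rotation-angle coordinate theta
    on an ellipse (semi-axes h and w, tilt phi, center (x0, y0)) to
    XY coordinates in the camera image."""
    xe = h * math.cos(theta)   # component along the major axis
    ye = w * math.sin(theta)   # component along the minor axis
    x = x0 + xe * math.cos(phi) - ye * math.sin(phi)
    y = y0 + xe * math.sin(phi) + ye * math.cos(phi)
    return x, y

def reference_endpoint(theta_temp, theta_iris, pupil_ellipse):
    """Assumed equation (10): rotate the stored initial position theta_temp
    counterclockwise by the first eyeball rotation angle theta_iris, then
    convert to image coordinates with the assumed equation (9).
    pupil_ellipse = (x0, y0, h, w, phi)."""
    theta_input = theta_temp + theta_iris
    return ellipse_to_xy(theta_input, *pupil_ellipse)
```

A 40×40 pixel matching area would then be cropped around the returned (x, y).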
- then, the blood vessel position recognizing unit 44 performs matching processing between the image of each matching area 62 (hereinafter, the input image) and the template image of the corresponding blood vessel end.
- as pre-processing for the template matching, the blood vessel position recognition unit 44 stretches the pixel values of both the template image and the input image so that the maximum value becomes 255 and the minimum value becomes 0, thereby increasing the contrast of the blood vessel end.
- the blood vessel position recognizing unit 44 then performs template matching using the contrast-enhanced images. Specifically, it calculates the similarity between the input image and the template image while moving (sliding) the corresponding template image within each matching area 62 one pixel at a time.
- hereinafter, the center position of the template image (the position of the blood vessel end) as it is shifted within each matching area 62 is referred to as the pixel position. An XY orthogonal coordinate system with an origin different from that of the eyeball image is set separately in each matching area 62, and the pixel position (the position of the blood vessel end) of the template image in that coordinate system is expressed by coordinates (x_t, y_t). Since each matching area 62 is 40×40 pixels and each template image is 20×20 pixels, x_t and y_t each vary in the range of 0 to 20.
- in this embodiment, ZNCC (Zero-mean Normalized Cross-Correlation) is used as the similarity measure, and the similarity R_zncc(x_t, y_t) at each pixel position is obtained by the following equations (11) to (13).
- in equations (11) to (13), T(i, j) is the pixel value at coordinates (i, j) in the template image, I(i, j) is the pixel value at coordinates (i, j) in the input image, and M and N are the vertical and horizontal sizes (numbers of pixels) of the template image, respectively.
- the blood vessel position recognizing unit 44 obtains the similarity R_zncc(x_t, y_t) at each pixel position while sliding each template image within the corresponding matching area 62 (changing the pixel position).
- the blood vessel position recognition unit 44 performs this process for all the matching areas 62 that have been set. That is, in this embodiment, template matching processing is performed on all blood vessel ends detected in the detection region 50 in the eyeball image in the reference state, and the similarity R_zncc is calculated.
- in ZNCC, the cross-correlation between the input image and the template image (the numerator of equation (11)) is divided by the product of the standard deviations (the square roots of the variances) of the pixel values of the input image and of the template image, so the similarity is normalized.
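The ZNCC similarity and the pixel-by-pixel sliding search described above can be sketched as follows. This is an illustrative NumPy implementation: equations (11)–(13) are assumed to be the standard ZNCC form, and `zncc`/`match` are hypothetical helper names, not the patent's own.

```python
import numpy as np

def zncc(template, patch):
    """Zero-mean Normalized Cross-Correlation (assumed form of equations
    (11)-(13)): subtract each image's mean, then divide the cross-correlation
    by the product of the standard deviations, so the result lies in [-1, 1]
    and is insensitive to brightness and contrast offsets."""
    t = template.astype(float) - template.mean()
    p = patch.astype(float) - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    if denom == 0.0:
        return 0.0
    return float((t * p).sum() / denom)

def match(template, area):
    """Slide the template over the matching area one pixel at a time and
    return the best similarity and its pixel position (x_t, y_t)."""
    M, N = template.shape          # vertical / horizontal template size
    H, W = area.shape
    best, best_pos = -2.0, (0, 0)
    for yt in range(H - M + 1):    # 0..20 for a 40x40 area and 20x20 template
        for xt in range(W - N + 1):
            r = zncc(template, area[yt:yt + M, xt:xt + N])
            if r > best:
                best, best_pos = r, (xt, yt)
    return best, best_pos
```

An eyelash or reflection inside either image lowers R_zncc, which is why the maximum over all matching areas tends to pick an unoccluded vessel end.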
- as a result, the similarity R_zncc is low when a blood vessel end image (input image and/or template image) contains noise such as eyelashes or illumination reflections, and relatively high when it does not.
- the blood vessel position recognition unit 44 then selects, based on the results of the template matching processing performed over all the matching areas 62, the blood vessel end (the center of the template image) with the maximum similarity R_zncc as the blood vessel end for eyeball rotation measurement, and specifies its pixel position (x_t, y_t). As a result, the position of the blood vessel end least affected by the noise described above can be specified.
- next, the blood vessel position recognizing unit 44 converts the pixel position coordinates (x_t, y_t) of the blood vessel end for eyeball rotation measurement (coordinates in the XY orthogonal coordinate system set separately in the matching area 62) into coordinates (x, y) in the XY orthogonal coordinate system of the camera image (eyeball image). Thereby, the position information of the blood vessel end for eyeball rotation measurement in the XY orthogonal coordinate system of the camera image is acquired.
- then, the blood vessel position recognizing unit 44 substitutes the position coordinates (x, y) of the blood vessel end for eyeball rotation measurement into the above equation (9) and solves equation (9) inversely, thereby calculating the rotation angle coordinate θ_out (first information related to the predetermined blood vessel position) of that blood vessel end in the elliptic coordinate system.
- in this embodiment, as described above, the blood vessel position recognizing unit 44 automatically selects, from the plurality of blood vessel ends detected in the reference state, the blood vessel end least affected by noise at the time of actual measurement, and acquires its position information.
- in the reference state, the rotation angle coordinate θp of the blood vessel end with the maximum similarity R_zncc selected by the blood vessel position recognition unit 44 (the blood vessel end for eyeball rotation measurement) is the initial position θ_temp.
- FIG. 19 is a flowchart showing a processing procedure of an eyeball rotation measuring method executed by the eyeball rotation measuring apparatus 1 of the present embodiment.
- in the flowchart of FIG. 19, the same reference numerals are given to the same processes (steps) as in the flowchart of the first embodiment shown in FIG. 11.
- the eyeball rotation measuring device 1 determines whether or not the current measurement mode is the reference information acquisition mode in the same manner as in the first embodiment (S100).
- the eyeball rotation measuring device 1 analyzes the eyeball image of the subject acquired in a state where there is no pupil change or eyeball rotation (reference state), and acquires various reference information (S202).
- in this embodiment, as in the first embodiment, the difference vector data of the iris shade pattern in the reference state is acquired as one piece of reference information at the time of reference information acquisition.
- in this process, the eyeball rotation measuring device 1 first performs binarization processing on the eyeball image in the reference state, and then fits an ellipse (ellipse fitting processing) to the pupil region in the binarized eyeball image. Thereby, the elliptic parameters of the pupil contour in the reference state (for example, the center point of the pupil contour, the major axis value, the minor axis value, the rotation angle of the major axis, etc.) are acquired.
- the eyeball rotation measuring device 1 sets an elliptic coordinate system (basic elliptic coordinate system) along the pupil contour using various elliptic parameters of the pupil contour calculated by ellipse approximation (S202a).
- the eyeball rotation measuring device 1 calculates the white eye-iris boundary coordinates based on the method described with reference to FIGS. 12 and 13, for example.
- the eyeball rotation measuring device 1 sets the detection region 50 of the blood vessel end in the vicinity of the outer edge of the iris region based on the ellipse (outer edge of the iris region) passing through the white eye-iris boundary coordinates.
- the eyeball rotation measuring apparatus 1 automatically detects a plurality of blood vessel ends 51 in the set detection region 50 based on the method described with reference to FIGS. 15A to 15C, for example (S202b). In the process of S202b, position information of each blood vessel end 51 is acquired.
- next, the eyeball rotation measuring apparatus 1 acquires, as a template image, the image of a region 52 of predetermined size centered on each blood vessel end 51 detected in S202b (S202c).
- then, the eyeball rotation measuring apparatus 1 stores the information acquired by the above processing, namely the position information of each blood vessel end 51, the template image corresponding to each blood vessel end 51, the difference vector data of the iris shade pattern, and the elliptic parameters of the pupil contour (for example, the center point of the pupil contour, the major axis value, the minor axis value, and the rotation angle of the major axis), as reference information in, for example, the memory device 18 or the auxiliary storage device 16 (S202d).
- at the time of actual measurement, the eyeball rotation measuring device 1 analyzes the eyeball image acquired from the same subject as at the time of reference information acquisition in S202, and calculates the first eyeball rotation angle θ_iris based on the shade pattern of the iris recognized in that image (S204). In the process of S204, the eyeball rotation measuring device 1 calculates the first eyeball rotation angle θ_iris in accordance with the processing operation of the iris pattern recognition processing unit 42 described above.
- next, using the first eyeball rotation angle θ_iris calculated in S204 and the position information (initial position θ_temp) of each blood vessel end 51 acquired as reference information, the eyeball rotation measuring device 1 calculates, by the method described with reference to FIGS. 17 and 18, the reference end point 61 for setting the matching area 62 of each blood vessel end 51 (S208). Specifically, the eyeball rotation measuring device 1 calculates the rotation angle coordinate θ_input in the elliptic coordinate system of the reference end point 61 of each matching area 62 according to the above equation (10).
- the eyeball rotation measuring device 1 then sets a matching area 62 of a predetermined range (40×40 pixels in the examples shown in FIGS. 17 and 18) centered on each reference end point 61 (S208).
- next, the eyeball rotation measuring device 1 performs template matching processing between the image in each matching area 62 and the corresponding template image (S210).
- the eyeball rotation measuring apparatus 1 calculates the similarity R_zncc using the above equations (11) to (13) in all the matching areas 62 (all the extracted blood vessel ends 51).
- next, from the plurality of similarities R_zncc calculated in S210, the eyeball rotation measuring device 1 selects (specifies) the blood vessel end 51 with the maximum similarity as the blood vessel end for eyeball rotation measurement (S212). By this processing, the position coordinates (x, y) of the blood vessel end 51 for eyeball rotation measurement in the XY orthogonal coordinate system are obtained.
- next, the eyeball rotation measuring device 1 converts the position coordinates (x, y), in the XY orthogonal coordinate system, of the blood vessel end 51 for eyeball rotation measurement selected in S212 into the rotation angle coordinate θ_out in the elliptic coordinate system (S214). This coordinate conversion of the blood vessel end 51 is performed using the above equation (9).
- the rotation angle coordinate θ_out calculated in S214 is the angular position, after the eyeball rotation, of the blood vessel end 51 that best matches the corresponding blood vessel end in the initial (reference) image.
- then, the eyeball rotation measuring device 1 calculates the final eyeball rotation angle θ using the rotation angle coordinate in the reference state (initial position θ_temp) of the blood vessel end 51 for eyeball rotation measurement selected in S212, and the rotation angle coordinate θ_out of that blood vessel end at the time of actual measurement calculated in S214 (S216).
- the eyeball rotation is measured as described above.
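Steps S214 and S216 above (back-calculating θ_out by inverting equation (9), then taking the difference from the initial position θ_temp) can be sketched as follows. As before, this assumes equation (9) is the standard rotated-ellipse parametrization; the helper names are hypothetical.

```python
import math

def xy_to_ellipse_angle(x, y, x0, y0, h, w, phi):
    """Assumed inverse of equation (9): recover the rotation-angle
    coordinate theta of a point (x, y) on an ellipse with center
    (x0, y0), semi-axes h and w, and tilt phi."""
    dx, dy = x - x0, y - y0
    # un-rotate by -phi into the ellipse's own axes
    xe = dx * math.cos(phi) + dy * math.sin(phi)
    ye = -dx * math.sin(phi) + dy * math.cos(phi)
    # atan2 of the normalized components gives the parametric angle
    return math.atan2(ye / w, xe / h)

def eyeball_rotation_angle(theta_out, theta_temp):
    """S216: the final eyeball rotation angle is the change of the selected
    vessel end's angular coordinate from its initial (reference) position."""
    return theta_out - theta_temp
```

For example, a vessel end stored at θ_temp = π/4 and found at θ_out = π/2 during actual measurement yields a rotation of π/4.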
- the above-described eyeball rotation measurement process may be realized by installing a corresponding eyeball rotation measurement program in the eyeball rotation measurement apparatus 1 and having the CPU 10 execute it.
- in this embodiment as well, the eyeball rotation angle θ is calculated based on the position of a blood vessel end (an end point of a conjunctival blood vessel) that has high contrast against its surroundings and is little affected by pupil contraction.
- the eyeball rotation angle ⁇ is calculated based on the position of the blood vessel end existing near the outer edge of the iris region that is not easily affected by the movement of the eyelids. Therefore, also in the present embodiment, the eyeball rotation can be measured with higher accuracy as in the first embodiment.
- moreover, in this embodiment, the range of the matching area is narrowed down using the first eyeball rotation angle θ_iris calculated by the eyeball rotation measurement method based on the shade pattern of the iris. Therefore, in this embodiment as well, as in the first embodiment, eyeball rotation measurement can be performed at high speed and low cost, and the eyeball rotation angle can be measured with high accuracy and high resolution even when the pupil diameter changes.
- furthermore, in this embodiment, the blood vessel end least affected by noise can be automatically detected from among a plurality of blood vessel ends as the blood vessel end for eyeball rotation measurement. Therefore, in this embodiment, the eyeball rotation can be measured with even higher accuracy and reliability.
- in this evaluation experiment, in order to generate a pupil change, the subject was kept in a dark room for about 5 to 6 minutes to dilate the pupil. Capture of a moving image was then started from the dilated (dark-adapted) state, and white illumination was directed toward the subject's pupil partway through the recording to constrict the pupil. Moreover, in order to generate eyeball rotation, the subject's head was tilted sideways simultaneously with the moving image acquisition, inducing eyeball rotation by the vestibulo-ocular reflex.
- in the evaluation, the average error and the standard deviation between the true value of the eyeball rotation angle θ obtained by visual measurement and the measurement value of θ calculated by the eyeball rotation measurement system shown in FIG. were determined.
- the image resolution of the eyeball image used in this evaluation experiment was 740×320 pixels, and the number of blood vessel ends 51 detected in the detection region 50 near the outer edge of the iris region was ten. The resolution of the eyeball rotation angle θ, which depends on the distance from the pupil center position to the conjunctival blood vessel end, was about 0.25 degrees. FIGS. 20A and 20B show the results of the evaluation experiment.
- FIG. 20A shows the average error of the eyeball rotation angle θ (labeled "rotation angle error" in FIG. 20A) and its standard deviation, measured at each pupil size, with the pupil size varied in three stages (small, medium, large) according to the length of the major axis of the pupil.
- FIG. 20B compares the average error of the eyeball rotation angle θ obtained by the eyeball rotation measurement method of this embodiment with that obtained by the conventional eyeball rotation measurement method based on the shade pattern of the iris. The vertical axis of the bar graph in FIG. 20B is the average error of θ (the rotation angle error in FIG. 20B), and the horizontal axis is the pupil size.
- as is clear from FIG. 20B, with the eyeball rotation measurement method of this embodiment, even when the pupil diameter (pupil size) changes, the change in the rotation angle error is small, and the rotation angle error is 0.24 degrees or less. That is, it can be seen that in this embodiment the eyeball rotation can be measured with high accuracy without being affected by changes in the pupil diameter.
- it can also be seen that the eyeball rotation measurement technique of this embodiment can measure the eyeball rotation with higher accuracy than the conventional method (the method using the iris shade pattern), whose resolution is on the order of several degrees.
- the eyeball rotation measuring device, the eyeball rotation measuring method, and the eyeball rotation measuring program according to the present invention are not limited to the examples described in the various embodiments. Various other modifications are also included in the present invention without departing from the gist of the present invention described in the claims. For example, the following various modifications and application examples are also included in the present invention.
- in the second embodiment, the first eyeball rotation angle θ_iris is calculated using the conventional eyeball rotation measurement method based on the shade pattern of the iris image, and the range of the matching area for specifying the blood vessel end for eyeball rotation measurement is narrowed down using θ_iris (see FIG. 18); however, the present invention is not limited to this. For example, the matching area including a blood vessel end may be set directly from the position information of that blood vessel end detected at the time of reference information acquisition, without calculating the first eyeball rotation angle θ_iris. In this case, a matching area wider than the matching area 62 set in the second embodiment must be set. An example of such a matching area setting method is shown in FIG. 21.
- FIG. 21 is a diagram showing an outline of a method for setting the matching area 72 in the first modification.
- FIG. 21 shows only one blood vessel end 71 and the matching area 72 set for it, for the sake of simplicity.
- in this modification, a region centered on each blood vessel end 71 (target end point) on the ellipse passing through the blood vessel ends 71 detected near the outer edge of the iris region at the time of reference information acquisition, and extending with a predetermined width (2Δh) in the radial direction, is set as a matching area 72.
- the angular width Δθ, which defines the range of the matching area 72 in the rotational direction on the ellipse, and the width Δh, which defines its range in the radial direction of the ellipse, can be set as appropriate in consideration of, for example, the maximum movement amount of the blood vessel end 71 assumed during eye movement including rotation. For example, the angular width Δθ can be about 20 degrees and the width Δh about 20 pixels.
- the setting range of the matching area 72 in this example is not limited to the example shown in FIG. 21. For example, a region extending with a predetermined width (2Δh) over the entire circumference of the ellipse passing through each blood vessel end 71 detected near the outer edge of the iris region at the time of reference information acquisition may be set as the matching area.
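The band-shaped matching area of this modification (±Δθ along the ellipse, ±Δh in the radial direction) can be sketched as a membership test. This is a rough illustration under the same assumed ellipse model as above, not the patent's formula; the approximate radial-distance test and the helper name `in_band` are assumptions.

```python
import math

def in_band(px, py, x0, y0, h, w, phi, theta_c, half_angle, dh):
    """Rough test of whether image point (px, py) lies in the modification's
    matching band: within +/-half_angle of the vessel end's angular
    coordinate theta_c, and within +/-dh of the ellipse (semi-axes h, w,
    tilt phi, center (x0, y0)) in the radial direction."""
    dx, dy = px - x0, py - y0
    xe = dx * math.cos(phi) + dy * math.sin(phi)    # un-rotate into ellipse axes
    ye = -dx * math.sin(phi) + dy * math.cos(phi)
    theta = math.atan2(ye / w, xe / h)              # parametric angle of the point
    # approximate radial offset: point radius minus ellipse radius at theta
    r_ellipse = math.hypot(h * math.cos(theta), w * math.sin(theta))
    r_point = math.hypot(xe, ye)
    # wrap the angular difference into (-pi, pi]
    dang = (theta - theta_c + math.pi) % (2 * math.pi) - math.pi
    return abs(dang) <= half_angle and abs(r_point - r_ellipse) <= dh
```

Setting `half_angle` to π (the whole circumference) reproduces the full-ring variant mentioned above.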
- in this example as well, the eyeball rotation angle θ is calculated based on the position of a blood vessel end (conjunctival blood vessel end point) that has high contrast against its surroundings and is little affected by pupil contraction. Therefore, also with the eyeball rotation measurement technique of this example, the eyeball rotation can be measured with higher accuracy, as in the second embodiment.
- however, since in this example the matching area including a blood vessel end is set directly from the position information of the blood vessel end detected at the time of reference information acquisition, the matching area is wider, as described above. Therefore, the second embodiment is superior from the viewpoint of high-speed eyeball rotation measurement.
- in the first embodiment, when the corresponding blood vessel (the blood vessel corresponding to the predetermined blood vessel selected in the eyeball image in the reference state) is specified from the plurality of blood vessels recognized in the eyeball image acquired during the actual measurement of the eyeball rotation, the blood vessel having the target end point closest to the reference end point (O_ref) obtained using the first eyeball rotation angle θ1 is selected as the corresponding blood vessel; however, the present invention is not limited to this.
- the template matching processing described in the second embodiment may be applied to the first embodiment, and the corresponding blood vessel may be specified in the eyeball image acquired during actual measurement of eyeball rotation.
- in this case, a template image of a predetermined size centered on the target end point O_tgt0 of the predetermined blood vessel (the blood vessel for eyeball rotation measurement) selected in the eyeball image in the reference state is acquired. Then, a matching area of a predetermined size is set with the reference end point (O_ref) of each blood vessel end, obtained using the first eyeball rotation angle θ1, as its center point. Matching processing between each matching area and the template image is then performed, and the corresponding blood vessel is specified from the plurality of blood vessels recognized in the eyeball image acquired at the time of actual measurement.
- alternatively, the corresponding blood vessel may be specified from the plurality of blood vessels recognized in the eyeball image acquired during the actual measurement without using the first eyeball rotation angle θ1. In this case, a template image of a predetermined size centered on the target end point O_tgt0 of the predetermined blood vessel (the blood vessel for eyeball rotation measurement) selected in the eyeball image in the reference state is acquired. Then, a matching area of a predetermined size is set with the target end point O_tgt0 selected in the reference state as its center point; this matching area is provided along the ellipse passing through O_tgt0, as in the matching area of the first modification described above. Matching processing between the matching area and the template image is then performed, and the corresponding blood vessel is specified in the eyeball image acquired at the time of actual measurement.
- in the above embodiments, the example was described in which the region for detecting blood vessel ends (the pupil-side end points of the conjunctival blood vessels) is set outside the iris region (including the outer edge of the iris region); however, the present invention is not limited to this. The detection region of the blood vessel ends can be set to any region in the vicinity of the outer edge of the iris region. For example, the detection region may be set so that part of it includes an area inside (on the pupil side of) the outer edge of the iris region.
- further, the eyeball rotation angle θ may be calculated by comparing the extension direction of a predetermined blood vessel in the eyeball image in the reference state with the extension direction of the corresponding blood vessel in the eyeball image at the time of actual measurement.
- DESCRIPTION OF REFERENCE SIGNS: 1 ... eyeball rotation measuring apparatus, 10 ... CPU, 16 ... auxiliary storage device, 18 ... memory device, 26 ... image input interface, 30 ... camera, 32 ... infrared LED, 34 ... blue LED, 40 ... reference information acquisition unit, 42 ... iris pattern recognition processing unit, 44 ... blood vessel position recognition unit, 46 ... angle calculation unit
Abstract
Description
First, an eyeball rotation measuring apparatus, an eyeball rotation measuring method, and an eyeball rotation measuring program according to a first embodiment of the present invention will be described.
FIG. 1 is a hardware configuration diagram of the eyeball rotation measuring apparatus 1 according to the first embodiment. The eyeball rotation measuring apparatus 1 includes, for example, a CPU (Central Processing Unit) 10, a drive device 12, an auxiliary storage device 16, a memory device 18, an interface device 20, an input device 22, a display device 24, and an image input interface 26. These components are connected to one another via a bus, serial lines, and the like.
FIG. 2 is a functional configuration diagram of the eyeball rotation measuring apparatus 1 of this embodiment. As functional blocks realized by the CPU 10 executing a program, the eyeball rotation measuring apparatus 1 includes a reference information acquisition unit 40, an iris pattern recognition processing unit 42 (second angle calculation unit), a blood vessel position recognition unit 44, and an angle calculation unit 46 (first angle calculation unit). Each of these functional blocks inputs and outputs various information (data) to and from, for example, the memory device 18 and the auxiliary storage device 16, and the memory device 18 and the auxiliary storage device 16 store the various information output from each functional block.
(1) Overview of the eyeball rotation measurement method
Next, the content of the eyeball rotation measurement method of this embodiment will be described concretely while explaining the various processes performed in the functional blocks shown in FIG. 2; first, an overview of the method is given.
In the eyeball rotation measurement method of this embodiment, the reference information acquisition unit 40 operates before the actual measurement of the eyeball rotation and acquires the various reference information necessary for measuring the eyeball rotation. Note that the reference information acquisition processing performed by the reference information acquisition unit 40 differs from the processing at the time of actual measurement of the eyeball rotation (for example, when actually measuring the health condition of a subject).
The iris pattern recognition processing unit 42 calculates, with a certain degree of accuracy, an eyeball rotation angle θ1 (hereinafter, first eyeball rotation angle θ1) based on the shade pattern of the iris recognized in an eyeball image captured by the camera 30 during the actual measurement of the eyeball rotation, from the same subject as at the time of reference information acquisition. The processing of the iris pattern recognition processing unit 42 is described in more detail below.
The blood vessel position recognition unit 44 recognizes the position of a blood vessel 104 in the white eye region 103 and outputs information on the recognized position of the blood vessel 104 to, for example, the memory device 18 or the auxiliary storage device 16. As shown in FIG. 3, the white eye region 103 is the region of the eyeball image 100 outside the iris region 102 recognized by the iris pattern recognition processing unit 42. The blood vessel 104 recognized (specified) by the blood vessel position recognition unit 44 at this time is the blood vessel corresponding to the predetermined blood vessel selected by the reference information acquisition unit 40 at the time of reference information acquisition (hereinafter also referred to as the corresponding blood vessel).
The angle calculation unit 46 calculates the eyeball rotation angle θ based on the position information of the target end point O_tgt0 of the predetermined blood vessel selected in the reference state and the position information of the target end point O_tgt of the corresponding blood vessel (the blood vessel corresponding to the predetermined blood vessel) specified (recognized) during the actual measurement of the eyeball rotation. Specifically, the eyeball rotation angle θ is obtained as follows.
Next, a concrete processing procedure for performing eyeball rotation measurement with the eyeball rotation measurement device 1 of this embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart showing the processing procedure of the eyeball rotation measurement technique executed by the eyeball rotation measurement device 1 of this embodiment.
In general, matter such as eyelashes, dust, and fine cosmetic particles easily enters the eye, and when such matter is present near the outer edge of the iris region it can become a noise source that degrades the eyeball rotation measurement accuracy in the present invention. Therefore, to measure the eyeball rotation movement more accurately, it is preferable to automatically select (identify), at measurement time, from among the endpoints of the multiple conjunctival blood vessels present near the outer edge (contour) of the iris region (hereinafter also abbreviated as vessel ends), a vessel end that is not occluded by such matter (noise). The second embodiment describes an example of an eyeball rotation measurement device and technique capable of such processing.
(1) Overview of the eyeball rotation measurement technique
The eyeball rotation measurement technique of this embodiment will be described in concrete terms along with the processing performed in each functional block shown in FIG. 2, but first an overview of the technique is given.
At reference information acquisition, the eyeball is first photographed using two types of illumination (an infrared LED 32 and a blue LED 34), as in the first embodiment. At this time, the pupil and iris regions are irradiated with infrared light, and the surroundings of the blood vessels (the sclera region) are irradiated with blue light.
The processing of the iris pattern recognition processing unit 42 in this embodiment is the same as in the first embodiment. Specifically, as in the first embodiment, the iris pattern recognition processing unit 42 calculates the first eyeball rotation angle θ1 (denoted "θ_iris" in this embodiment) based on the difference vector data of the iris shading pattern recognized in the eyeball image acquired at the time of actual measurement of eyeball rotation and the difference vector data of the iris shading pattern in the reference state.
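The exact difference-vector formulation is not reproduced here; as a hedged sketch of the general idea, an iris-based rotation angle can be estimated by circularly correlating two 1-D angular intensity profiles sampled along a ring inside the iris. The function name and sampling scheme are assumptions for illustration:

```python
def estimate_theta_iris(ref_profile, cur_profile):
    """Estimate an iris-based rotation angle by finding the circular
    shift of cur_profile that best matches ref_profile.  Each profile
    is a list of N intensity samples taken at equal angular steps along
    a ring inside the iris; the returned angle is in degrees."""
    n = len(ref_profile)
    best_shift, best_score = 0, float("-inf")
    for s in range(n):  # exhaustively try every circular shift
        score = sum(ref_profile[i] * cur_profile[(i + s) % n] for i in range(n))
        if score > best_score:
            best_score, best_shift = score, s
    # shifts past half a turn correspond to small negative rotations
    if best_shift > n // 2:
        best_shift -= n
    return 360.0 * best_shift / n
```

With 36 samples per turn the angular resolution of this sketch is 10 degrees; a real implementation would sample far more densely or interpolate around the correlation peak.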
The first embodiment described a technique of identifying (selecting), from among the multiple vessel ends recognized in the eyeball image acquired at the time of actual measurement of eyeball rotation, the endpoint closest to the vessel-end reference endpoint (Oref) obtained using the first eyeball rotation angle θ_iris (reference eyeball rotation angle) as the vessel end for eyeball rotation measurement (the target endpoint Otgt of the corresponding blood vessel). In this embodiment, however, the vessel end for eyeball rotation measurement is identified (selected) in the eyeball image at actual measurement time using a different technique from the first embodiment.
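The first-embodiment selection recalled above can be sketched as follows: rotate the stored endpoint about the rotation center by θ_iris to predict Oref, then pick the nearest detected endpoint. The function name, tuple-based point representation, and use of the pupil center are assumptions for illustration:

```python
import math

def predict_and_select(center, endpoint0, theta_iris_deg, detected):
    """Rotate the stored endpoint Otgt0 about `center` by the iris-based
    angle to get the predicted reference endpoint Oref, then return the
    detected endpoint nearest to Oref as the measurement endpoint Otgt."""
    t = math.radians(theta_iris_deg)
    dx, dy = endpoint0[0] - center[0], endpoint0[1] - center[1]
    oref = (center[0] + dx * math.cos(t) - dy * math.sin(t),
            center[1] + dx * math.sin(t) + dy * math.cos(t))
    # nearest-neighbour choice among all candidate vessel ends
    return min(detected, key=lambda p: math.hypot(p[0] - oref[0], p[1] - oref[1]))
```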
In this embodiment, the eyeball rotation angle θ is defined as the change in rotation angle coordinate when the rotation angle coordinate θp of the vessel end selected by the blood vessel position recognition unit 44 as having the maximum similarity R_zncc (the vessel end for eyeball rotation measurement) changes from its initial position θ_temp to the rotation angle coordinate θ_out at actual measurement time. The angle calculation unit 46 therefore calculates the eyeball rotation angle θ (= θ_out − θ_temp) by subtracting the initial position θ_temp (second information on the position of the predetermined blood vessel) from the rotation angle coordinate θ_out of the rotation-measurement vessel end at actual measurement time (first information on the position of the predetermined blood vessel).
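The similarity R_zncc referred to above is a zero-mean normalized cross-correlation. A minimal self-contained sketch (the function name and list-of-lists patch representation are assumptions for illustration):

```python
def zncc(template, patch):
    """Zero-mean normalized cross-correlation R_zncc between two equally
    sized grayscale patches (lists of rows).  It is 1.0 for a perfect
    match and insensitive to uniform brightness/contrast changes."""
    t = [v for row in template for v in row]
    p = [v for row in patch for v in row]
    n = len(t)
    mt, mp = sum(t) / n, sum(p) / n
    num = sum((a - mt) * (b - mp) for a, b in zip(t, p))
    den = (sum((a - mt) ** 2 for a in t) * sum((b - mp) ** 2 for b in p)) ** 0.5
    return num / den if den else 0.0
```

Because the means are subtracted and the result is normalized, a patch that differs from the template only by a gain and offset (e.g. an illumination change) still scores 1.0, which is why this measure suits matching vessel-end patches across frames.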
Next, a concrete processing procedure for performing eyeball rotation measurement with the eyeball rotation measurement device 1 of this embodiment will be described with reference to FIG. 19. FIG. 19 is a flowchart showing the processing procedure of the eyeball rotation measurement technique executed by the eyeball rotation measurement device 1 of this embodiment. In the flowchart of FIG. 19, processes (steps) that are the same as in the first-embodiment flowchart of FIG. 11 are denoted by the same reference signs.
Here, the various effects obtained by the eyeball rotation measurement technique of this embodiment will be explained more concretely based on the results of an evaluation experiment actually conducted. In this evaluation experiment, the eyeball rotation angle θ was calculated for three subjects in a state where pupil diameter change and eyeball rotation movement occurred simultaneously, and the mean error and standard deviation were measured.
The eyeball rotation measurement device, eyeball rotation measurement technique, and eyeball rotation measurement program according to the present invention are not limited to the examples described in the above embodiments. Various other modifications are included in the present invention as long as they do not depart from the gist of the present invention as set forth in the claims. For example, the following modifications and applications are also included in the present invention.
The second embodiment described an example (see FIG. 18) in which the first eyeball rotation angle θ_iris is calculated using a conventional eyeball rotation measurement technique based on the shading pattern of the iris image, and θ_iris is then used to narrow down the range of the matching area for identifying the vessel end for eyeball rotation measurement; however, the present invention is not limited to this. For example, a matching area containing a vessel end may be set directly from the position information of the vessel end detected at reference information acquisition, without calculating the first eyeball rotation angle θ_iris. In this case, a matching area larger than the matching area 62 set in the second embodiment needs to be set. FIG. 21 shows an example of such a matching area setting technique.
In the first embodiment, when identifying the corresponding blood vessel (the blood vessel corresponding to the predetermined blood vessel selected in the reference-state eyeball image) from among the multiple blood vessels recognized in the eyeball image acquired at actual measurement time, the blood vessel having the target endpoint closest to the vessel-end reference endpoint (Oref) obtained using the first eyeball rotation angle θ1 was selected as the corresponding blood vessel; however, the present invention is not limited to this. The template matching process described in the second embodiment may be applied to the first embodiment to identify the corresponding blood vessel in the eyeball image acquired at actual measurement time.
In the above embodiments and modifications, an example was described in which the region for detecting vessel ends (the pupil-side endpoints of the conjunctival blood vessels) is set outside the iris region (including the outer edge of the iris region); however, the present invention is not limited to this. The vessel-end detection region can be set to any region as long as it is near the outer edge of the iris region. For example, the detection region may be set so that part of it includes a region inside (on the pupil side of) the outer edge of the iris region.
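As a hedged sketch of such a detection region (the function name and margin parameters are illustrative assumptions), the region can be modeled as an annulus straddling the outer edge of the iris, with the inner radius optionally reaching slightly inside that edge:

```python
import math

def in_detection_region(point, center, iris_radius, inner_margin, outer_margin):
    """True if `point` lies in an annular detection region straddling the
    outer edge of the iris: from (iris_radius - inner_margin), which may
    reach slightly inside the iris edge, out to (iris_radius + outer_margin)."""
    r = math.hypot(point[0] - center[0], point[1] - center[1])
    return iris_radius - inner_margin <= r <= iris_radius + outer_margin
```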
The technique of the present invention described above is applicable not only to eyeball rotation measurement but also to measurement techniques for eye movements in general, such as gaze detection, with similar effects.
Claims (8)
- An eyeball rotation measurement device comprising: a blood vessel position recognition unit that recognizes, in an eyeball image, the position of a blood vessel in the sclera region and acquires information on the position of the blood vessel; and a first angle calculation unit that calculates an eyeball rotation angle based on first information on the position of a predetermined blood vessel acquired by the blood vessel position recognition unit at the time of actual measurement of eyeball rotation and second information on the position of the predetermined blood vessel in a reference state.
- The eyeball rotation measurement device according to claim 1, wherein the blood vessel position recognition unit acquires, as the information on the position of the blood vessel, position information of an endpoint of the blood vessel.
- The eyeball rotation measurement device according to claim 1 or 2, wherein the blood vessel position recognition unit recognizes the outer edge of the iris region in the eyeball image and acquires, as the information on the position of the blood vessel, position information of the endpoint of the blood vessel on the outer-edge side of the iris region.
- The eyeball rotation measurement device according to any one of claims 1 to 3, further comprising a second angle calculation unit that calculates a reference eyeball rotation angle based on the shading pattern of the iris in the eyeball image, wherein the blood vessel position recognition unit determines an expected endpoint position of the predetermined blood vessel based on the reference eyeball rotation angle calculated by the second angle calculation unit and the second information on the position of the predetermined blood vessel in the reference state, and acquires, as the first information on the position of the predetermined blood vessel, position information of the blood vessel endpoint located closest to the expected endpoint position among one or more blood vessel endpoints recognized in the eyeball image at the time of actual measurement of eyeball rotation.
- The eyeball rotation measurement device according to claim 3, wherein the blood vessel position recognition unit detects one or more blood vessel endpoints present in a detection region that is provided along the outer edge of the iris region and includes the sclera region, sets, for each detected blood vessel endpoint, a matching area including that endpoint, calculates the similarity between the image of the matching area and a template image of a predetermined size of the corresponding blood vessel in the reference state, and acquires, as the first information on the position of the predetermined blood vessel, the position information of the endpoint with the maximum similarity.
- The eyeball rotation measurement device according to claim 5, wherein the blood vessel position recognition unit sets, for each detected blood vessel endpoint, a predetermined region centered on that endpoint as the matching area.
- An eyeball rotation measurement method comprising: recognizing, in an eyeball image acquired at the time of actual measurement of eyeball rotation, the position of a predetermined blood vessel in the sclera region and acquiring first information on the position of the predetermined blood vessel; and calculating an eyeball rotation angle based on the first information and second information on the position of the predetermined blood vessel in a reference state.
- An eyeball rotation measurement program causing an information processing device to execute: a process of recognizing, in an eyeball image acquired at the time of actual measurement of eyeball rotation, the position of a predetermined blood vessel in the sclera region and acquiring first information on the position of the predetermined blood vessel; and a process of calculating an eyeball rotation angle based on the first information and second information on the position of the predetermined blood vessel in a reference state.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014500963A JP6048844B2 (ja) | 2012-02-24 | 2013-02-22 | 眼球回旋測定装置、眼球回旋測定方法、及び、眼球回旋測定プログラム |
US14/380,912 US9433345B2 (en) | 2012-02-24 | 2013-02-22 | Cycloduction measurement device, cycloduction measurement method, and cycloduction measurement program |
EP13751309.9A EP2818099B1 (en) | 2012-02-24 | 2013-02-22 | Cycloduction measurement device, cycloduction measurement method, and cycloduction measurement program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-039403 | 2012-02-24 | ||
JP2012039403 | 2012-02-24 | ||
JP2012224622 | 2012-10-09 | ||
JP2012-224622 | 2012-10-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013125707A1 true WO2013125707A1 (ja) | 2013-08-29 |
Family
ID=49005883
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/054609 WO2013125707A1 (ja) | 2012-02-24 | 2013-02-22 | 眼球回旋測定装置、眼球回旋測定方法、及び、眼球回旋測定プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US9433345B2 (ja) |
EP (1) | EP2818099B1 (ja) |
JP (1) | JP6048844B2 (ja) |
WO (1) | WO2013125707A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016024616A (ja) * | 2014-07-18 | 2016-02-08 | 国立大学法人静岡大学 | 眼球計測システム、視線検出システム、眼球計測方法、眼球計測プログラム、視線検出方法、および視線検出プログラム |
WO2016167091A1 (ja) * | 2015-04-13 | 2016-10-20 | 株式会社クリュートメディカルシステムズ | 視覚検査装置、視覚検査装置の視標補正方法、および表示装置 |
JP2016198301A (ja) * | 2015-04-10 | 2016-12-01 | 株式会社豊田中央研究所 | 虹彩パタン比較装置及びプログラム |
WO2016195066A1 (ja) * | 2015-06-05 | 2016-12-08 | 聖 星野 | 眼球の運動を検出する方法、そのプログラム、そのプログラムの記憶媒体、及び、眼球の運動を検出する装置 |
EP3111421A1 (en) * | 2014-02-25 | 2017-01-04 | EyeVerify Inc. | Eye gaze tracking |
WO2019240157A1 (ja) * | 2018-06-12 | 2019-12-19 | 国立大学法人筑波大学 | 眼球運動測定装置、眼球運動測定方法及び眼球運動測定プログラム |
CN113175904A (zh) * | 2021-04-13 | 2021-07-27 | 西安交通大学 | 一种基于旋量模型的键槽特征公差建模方法及*** |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160029945A1 (en) * | 2013-03-15 | 2016-02-04 | Massachusetts Eye And Ear Infirmary | Vestibular testing |
KR102322029B1 (ko) * | 2015-03-27 | 2021-11-04 | 삼성전자주식회사 | 생체 정보 획득 방법 및 이를 위한 장치 |
EP3626161A1 (en) * | 2018-09-24 | 2020-03-25 | Christie Medical Holdings, Inc. | Ir/nir imaging with discrete scale comparator objects |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02164335A (ja) * | 1988-12-16 | 1990-06-25 | Konan Camera Kenkyusho:Kk | 眼球運動解析装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3718156B2 (ja) * | 2001-10-24 | 2005-11-16 | 独立行政法人科学技術振興機構 | 眼球動特性の測定装置 |
US20050137586A1 (en) * | 2003-12-23 | 2005-06-23 | Gray Gary P. | Hybrid eye tracking system and associated methods |
JP5092120B2 (ja) | 2006-10-17 | 2012-12-05 | 国立大学法人東京工業大学 | 眼球運動計測装置 |
DE102010032193A1 (de) | 2010-07-24 | 2012-01-26 | Chronos Vision Gmbh | Verfahren und Vorrichtung zur Bestimmung der Augentorsion |
2013
- 2013-02-22 EP EP13751309.9A patent/EP2818099B1/en active Active
- 2013-02-22 JP JP2014500963A patent/JP6048844B2/ja active Active
- 2013-02-22 US US14/380,912 patent/US9433345B2/en active Active
- 2013-02-22 WO PCT/JP2013/054609 patent/WO2013125707A1/ja active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02164335A (ja) * | 1988-12-16 | 1990-06-25 | Konan Camera Kenkyusho:Kk | 眼球運動解析装置 |
Non-Patent Citations (1)
Title |
---|
TSUTOMU HASHIMOTO; YOSHIO MAKI; YUSUKE SAKASHITA; JUNPEI NISHIYAMA; HIRONOBU FUJIYOSHI; YUTAKA HIRATA: "A Model of the Iris Pattern Stretches in Relation to Pupil Diameter and Its Application to Measurement of Roll Eye Movements", INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS TRANSACTION, vol. J93-D, no. 1, 2010, pages 39 - 46 |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3111421A1 (en) * | 2014-02-25 | 2017-01-04 | EyeVerify Inc. | Eye gaze tracking |
JP2017515182A (ja) * | 2014-02-25 | 2017-06-08 | アイベリファイ インコーポレイテッド | 視線追跡 |
JP2016024616A (ja) * | 2014-07-18 | 2016-02-08 | 国立大学法人静岡大学 | 眼球計測システム、視線検出システム、眼球計測方法、眼球計測プログラム、視線検出方法、および視線検出プログラム |
JP2016198301A (ja) * | 2015-04-10 | 2016-12-01 | 株式会社豊田中央研究所 | 虹彩パタン比較装置及びプログラム |
WO2016167091A1 (ja) * | 2015-04-13 | 2016-10-20 | 株式会社クリュートメディカルシステムズ | 視覚検査装置、視覚検査装置の視標補正方法、および表示装置 |
WO2016195066A1 (ja) * | 2015-06-05 | 2016-12-08 | 聖 星野 | 眼球の運動を検出する方法、そのプログラム、そのプログラムの記憶媒体、及び、眼球の運動を検出する装置 |
JPWO2016195066A1 (ja) * | 2015-06-05 | 2018-03-22 | 聖 星野 | 眼球の運動を検出する方法、そのプログラム、そのプログラムの記憶媒体、及び、眼球の運動を検出する装置 |
WO2019240157A1 (ja) * | 2018-06-12 | 2019-12-19 | 国立大学法人筑波大学 | 眼球運動測定装置、眼球運動測定方法及び眼球運動測定プログラム |
JPWO2019240157A1 (ja) * | 2018-06-12 | 2021-07-08 | 国立大学法人 筑波大学 | 眼球運動測定装置、眼球運動測定方法及び眼球運動測定プログラム |
JP7320283B2 (ja) | 2018-06-12 | 2023-08-03 | 国立大学法人 筑波大学 | 眼球運動測定装置、眼球運動測定方法及び眼球運動測定プログラム |
CN113175904A (zh) * | 2021-04-13 | 2021-07-27 | 西安交通大学 | 一种基于旋量模型的键槽特征公差建模方法及*** |
CN113175904B (zh) * | 2021-04-13 | 2022-10-25 | 西安交通大学 | 一种基于旋量模型的键槽特征公差建模方法及*** |
Also Published As
Publication number | Publication date |
---|---|
US9433345B2 (en) | 2016-09-06 |
EP2818099B1 (en) | 2019-09-18 |
EP2818099A1 (en) | 2014-12-31 |
US20150029461A1 (en) | 2015-01-29 |
JPWO2013125707A1 (ja) | 2015-07-30 |
JP6048844B2 (ja) | 2016-12-27 |
EP2818099A4 (en) | 2015-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6048844B2 (ja) | 眼球回旋測定装置、眼球回旋測定方法、及び、眼球回旋測定プログラム | |
US11775056B2 (en) | System and method using machine learning for iris tracking, measurement, and simulation | |
JP6651074B2 (ja) | 眼球の運動を検出する方法、そのプログラム、そのプログラムの記憶媒体、及び、眼球の運動を検出する装置 | |
US10048749B2 (en) | Gaze detection offset for gaze tracking models | |
EP2665406B1 (en) | Automated determination of arteriovenous ratio in images of blood vessels | |
JP5657494B2 (ja) | シワ検出方法、シワ検出装置およびシワ検出プログラム、並びに、シワ評価方法、シワ評価装置およびシワ評価プログラム | |
JP6368709B2 (ja) | 3次元身体データを生成する方法 | |
CN110807427B (zh) | 一种视线追踪方法、装置、计算机设备和存储介质 | |
US10176614B2 (en) | Image processing device, image processing method, and program | |
US20160162673A1 (en) | Technologies for learning body part geometry for use in biometric authentication | |
KR102393298B1 (ko) | 홍채 인식 방법 및 장치 | |
Odstrcilik et al. | Thickness related textural properties of retinal nerve fiber layer in color fundus images | |
JP2007207009A (ja) | 画像処理方法及び画像処理装置 | |
US20200401841A1 (en) | Apparatus for diagnosing glaucoma | |
JP2008022928A (ja) | 画像解析装置及び画像解析プログラム | |
Satriya et al. | Robust pupil tracking algorithm based on ellipse fitting | |
Kolar et al. | Registration of retinal sequences from new video-ophthalmoscopic camera | |
Malinowski et al. | An iris segmentation using harmony search algorithm and fast circle fitting with blob detection | |
KR20030066512A (ko) | 노이즈에 강인한 저용량 홍채인식 시스템 | |
Parikh et al. | Effective approach for iris localization in nonideal imaging conditions | |
JP6452236B2 (ja) | 眼球識別装置及び眼球識別方法 | |
Hoshino et al. | Measurement of eyeball rotational movements in the dark environment | |
KR101276792B1 (ko) | 눈 검출 장치 및 방법 | |
JP5087151B2 (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
Sharma et al. | A comprehensive study of optic disc detection in artefact retinal images using a deep regression neural network for a fused distance-intensity map |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13751309 Country of ref document: EP Kind code of ref document: A1 |
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
ENP | Entry into the national phase |
Ref document number: 2014500963 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 14380912 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 2013751309 Country of ref document: EP |