WO2015080003A1 - Pupil detection device, line-of-sight detection device, and pupil detection method - Google Patents

Pupil detection device, line-of-sight detection device, and pupil detection method Download PDF

Info

Publication number
WO2015080003A1
Authority
WO
WIPO (PCT)
Prior art keywords
pupil
region
unit
center
center position
Prior art date
Application number
PCT/JP2014/080682
Other languages
French (fr)
Japanese (ja)
Inventor
修二 箱嶋
首藤 勝行
賢 二宮
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Jvcケンウッド
Publication of WO2015080003A1

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 - Eye characteristics, e.g. of the iris
    • G06V 40/193 - Preprocessing; Feature extraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30196 - Human being; Person
    • G06T 2207/30201 - Face

Definitions

  • the present invention relates to a pupil detection technique.
  • A line-of-sight detection device has been proposed that detects, from a face image captured by a camera, the position at which a subject is gazing on an observation surface such as a monitor screen.
  • Early techniques fixed the subject's head or attached a detection device to the head; to reduce the burden on the subject, non-contact types that require no head-mounted equipment have been developed, and a gaze detection device of high accuracy is required.
  • gaze detection is performed based on the positional relationship between the pupil on the camera image and the corneal reflection. For this reason, it is important to accurately obtain the pupil center coordinates and the corneal reflection center coordinates on the camera image.
  • Patent Document 1 proposes a technique capable of stably detecting a pupil by actively using information from the corneal reflection image, even when a large portion of the pupil is hidden by corneal reflection.
  • the present invention has been made in view of the above, and an object of the present invention is to provide a pupil detection device, a line-of-sight detection device, and a pupil detection method capable of detecting a pupil with a smaller calculation amount and higher accuracy.
  • the present invention includes: a specifying unit that specifies a pupil region and a corneal reflection region from an image obtained by imaging an eye; a first estimation unit that estimates the center position of the pupil using a first region included in the pupil region; a second estimation unit that estimates the center position of the pupil using a second region that is included in the pupil region and is different from the first region; and a pupil position detection unit that detects the center position of the pupil based on the center position of the pupil estimated by the first estimation unit and the center position of the pupil estimated by the second estimation unit.
  • the pupil detection device, line-of-sight detection device, and pupil detection method according to the present invention can detect the pupil with higher accuracy and a smaller amount of calculation.
  • FIG. 1 is a diagram illustrating an arrangement of a display unit, a stereo camera, and a light source used in the first embodiment.
  • FIG. 2 is a diagram illustrating an outline of functions of the diagnosis support apparatus.
  • FIG. 3 is a block diagram showing detailed functions of the respective units shown in FIG.
  • FIG. 4 is a diagram showing eye and distance detection when two cameras are used.
  • FIG. 5 is a diagram illustrating a captured image captured by the stereo camera.
  • FIG. 6 is a diagram illustrating an example of an eye image cut out from the captured image of FIG.
  • FIG. 7 is a diagram illustrating an example of a luminance change of an image near the cornea reflection region.
  • FIG. 8 is a diagram illustrating an example of luminance change of an image near the pupil region.
  • FIG. 9 is a diagram schematically showing the eye image of FIG. 6.
  • FIG. 10 is a diagram schematically showing the eye image of FIG. 6.
  • FIG. 11 is a diagram schematically showing the eye image of FIG. 6.
  • FIG. 12 is a diagram schematically showing the eye image of FIG. 6.
  • FIG. 13 is a diagram schematically showing the eye image of FIG. 6.
  • FIG. 14 is a diagram schematically showing the eye image of FIG. 6.
  • FIG. 15 is a flowchart illustrating pupil detection processing according to the first embodiment.
  • FIG. 16 is a diagram illustrating an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject according to the second embodiment.
  • FIG. 17 is a diagram illustrating an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject according to the second embodiment.
  • FIG. 18 is a diagram illustrating an outline of functions of the diagnosis support apparatus.
  • FIG. 19 is a block diagram illustrating an example of detailed functions of the respective units illustrated in FIG.
  • FIG. 20 is a diagram illustrating an outline of processing executed by the diagnosis support apparatus according to the second embodiment.
  • FIG. 21 is an explanatory diagram showing the difference between the method using two light sources and the second embodiment using one light source.
  • FIG. 22 is a diagram for explaining calculation processing for calculating the distance between the pupil center position and the corneal curvature center position.
  • FIG. 23 is a flowchart illustrating an example of calculation processing according to the second embodiment.
  • FIG. 24 is a diagram illustrating a method of calculating the position of the corneal curvature center using the distance obtained in advance.
  • FIG. 25 is a flowchart illustrating an example of a line-of-sight detection process according to the second embodiment.
  • FIG. 26 is a diagram for explaining a calculation process of the modification.
  • FIG. 27 is a flowchart illustrating an example of a modification calculation process.
  • non-contact gaze point detection is known as one application of the pupil detection device.
  • As a specific method, there is a method of irradiating the eye with a near-infrared point light source (such as an LED) and estimating the line of sight from the image of the light source reflected by the cornea and the position of the pupil.
  • In such a method, the gaze point detection performance depends largely on the detection accuracy of the center coordinates of the pupil and the corneal reflection.
  • the pupil detection device (diagnosis support device) of the first embodiment detects the pupil center position in the eye region image by setting individual regions in each of the X direction and the Y direction, for example, and obtaining the luminance centroid of each region. This eliminates the influence of corneal reflection and enables highly accurate detection with a small amount of computation.
  • the pupil detection device of the present embodiment can be used in a non-contact line-of-sight detection device or the like. Improving the accuracy of pupil detection improves the overall performance of the line-of-sight detection device.
  • the pupil detection device of the present embodiment can also be used in a diagnosis support device that supports the diagnosis of developmental disabilities using the pupil detection result. An example in which the pupil detection device is used in such a diagnosis support device is described below. Applicable devices are not limited to the line-of-sight detection device and the diagnosis support device.
  • FIG. 1 is a diagram illustrating an example of an arrangement of a display unit, a stereo camera, and a light source used in the first embodiment.
  • a set of stereo cameras 102 is arranged below the display screen 101.
  • the stereo camera 102 is an imaging unit that can perform stereo shooting with infrared rays, and includes a right camera 202 and a left camera 204.
  • In front of the right camera 202 and the left camera 204, infrared LED (Light Emitting Diode) light sources 203 and 205 are arranged circumferentially, respectively.
  • the infrared LED light sources 203 and 205 are light sources that irradiate near infrared rays having a wavelength of 850 nm, for example.
  • the pupils of the subject are detected by the infrared LED light sources 203 and 205. Details of the pupil detection method will be described later.
  • In the following description, positions in space are expressed by coordinates.
  • the center position of the display screen 101 is the origin; up and down correspond to the Y coordinate (up is +), left and right to the X coordinate (right is +), and depth to the Z coordinate (front is +).
  • FIG. 2 is a diagram showing an outline of functions of the diagnosis support apparatus 100.
  • FIG. 2 shows a part of the configuration shown in FIG. 1 and a configuration used for driving the configuration.
  • the diagnosis support apparatus 100 includes a right camera 202, a left camera 204, infrared LED light sources 203 and 205, a speaker 105, a drive / IF (interface) unit 208, and a control unit 300.
  • In FIG. 2, the display screen 101 is shown to clarify the positional relationship between the right camera 202 and the left camera 204; the display screen 101 is a screen displayed on the display unit 210.
  • the drive unit and the IF unit may be integrated or separate.
  • the speaker 105 functions as an audio output unit that outputs audio or the like for alerting the subject during calibration or the like.
  • the drive / IF unit 208 drives each unit included in the stereo camera 102.
  • the drive / IF unit 208 serves as an interface between each unit included in the stereo camera 102 and the control unit 300.
  • the control unit 300 is, for example, a computer including a control device such as a CPU (Central Processing Unit), storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory), a communication I/F that communicates via a network connection, and a bus connecting these units.
  • the storage unit 150 stores various information such as a control program, a measurement result, and a diagnosis support result.
  • the storage unit 150 stores, for example, an image to be displayed on the display unit 210.
  • the display unit 210 displays various information such as a target image for diagnosis.
  • FIG. 3 is a block diagram showing an example of detailed functions of each unit shown in FIG. As shown in FIG. 3, a display unit 210 and a drive / IF unit 208 are connected to the control unit 300.
  • the drive / IF unit 208 includes camera IFs 314 and 315, an LED drive control unit 316, and a speaker drive unit 322.
  • the right camera 202 and the left camera 204 are connected to the drive / IF unit 208 via the camera IFs 314 and 315, respectively.
  • the driving / IF unit 208 drives these cameras to image the subject.
  • the infrared LED light source 203 and the infrared LED light source 205 are light sources that irradiate near-infrared rays of 850 nm, for example.
  • the wavelength of the infrared rays to be irradiated is not limited to the above.
  • the speaker driving unit 322 drives the speaker 105.
  • the diagnosis support apparatus 100 may include an interface (printer IF) for connecting to a printer as a printing unit.
  • the printer may be provided inside the diagnosis support apparatus 100.
  • the control unit 300 controls the entire diagnosis support apparatus 100.
  • the control unit 300 includes a specifying unit 351, a first estimation unit 352, a second estimation unit 353, a gaze detection unit 354, a viewpoint detection unit 355, an output control unit 356, an evaluation unit 357, and pupil position detection. Part 358.
  • the pupil detection device may include at least the specifying unit 351, the first estimation unit 352, and the second estimation unit 353.
  • Each element included in the control unit 300 (the specifying unit 351, first estimation unit 352, second estimation unit 353, gaze detection unit 354, viewpoint detection unit 355, output control unit 356, evaluation unit 357, and pupil position detection unit 358) may be realized by software (a program), by a hardware circuit, or by a combination of software and a hardware circuit.
  • When realized by a program, the program is recorded as a file in an installable or executable format on a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory), a flexible disk (FD), a CD-R (Compact Disc Recordable), or a DVD (Digital Versatile Disc), and is provided as a computer program product.
  • the program may be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network.
  • the program may be provided or distributed via a network such as the Internet.
  • the program may be provided by being incorporated in advance in a ROM or the like.
  • the specifying unit 351 specifies the pupil region and the corneal reflection region from the captured image (an image obtained by imaging the eyes) captured by the imaging unit (stereo camera 102).
  • As the specifying method used by the specifying unit 351, any conventionally used method can be applied; for example, a method of specifying a low-luminance (dark) region in the image as the pupil region and a high-luminance (bright) region as the corneal reflection region.
  • the first estimation unit 352 estimates the center position of the pupil using the region (first region) included in the pupil region.
  • the second estimation unit 353 estimates the center position of the pupil using a region (second region) that is included in the pupil region and is different from the first region.
  • the first estimation unit 352 estimates the center position of the pupil in the first tangential direction using a region (first region) that is surrounded by a tangent line (first tangent line) in contact with the corneal reflection region and by the outer peripheral line of the pupil region, and that does not include the corneal reflection region.
  • the second estimation unit 353 uses a region (second region) that is in contact with the corneal reflection region and is surrounded by a tangent (second tangent) orthogonal to the first tangent and the outer peripheral line and does not include the corneal reflection region. Thus, the center position of the pupil in the second tangential direction is estimated.
  • the pupil position detection unit 358 detects the center position of the pupil based on the center position of the pupil estimated by the first estimation unit 352 and the center position of the pupil estimated by the second estimation unit 353.
  • For example, the pupil position detection unit 358 detects the center position of the pupil as the intersection of a straight line that passes through the position estimated by the first estimation unit 352 and is orthogonal to the first tangent line, and a straight line that passes through the position estimated by the second estimation unit 353 and is orthogonal to the second tangent line.
  • the first tangent can be a straight line extending in the horizontal direction (X direction), and the second tangent can be a straight line extending in the vertical direction (Y direction).
  • the directions of the first tangent line and the second tangent line are not limited to the horizontal direction and the vertical direction, and can be any direction as long as they are orthogonal to each other.
  • a case where the first tangent is in the horizontal direction (X direction) and the second tangent is in the vertical direction (Y direction) will be described as an example.
  • the center position of the first region in the first tangential direction may be estimated as the center position of the pupil in the first tangential direction.
  • the center position of the second region in the second tangent direction may be estimated as the center position of the pupil in the second tangent direction.
  • the corneal reflection diameter needs to be smaller than the pupil diameter, but this condition is usually satisfied.
  • the gaze detection unit 354 detects the gaze (gaze direction) of the subject using the detected center position of the pupil.
  • the viewpoint detection unit 355 detects the viewpoint of the subject using the detected gaze direction. For example, the viewpoint detection unit 355 detects a viewpoint (gaze point) that is a point that the subject gazes out of the target images displayed on the display screen 101.
  • As the method of detecting the line of sight and the viewpoint, any conventionally used method can be applied.
  • Here, a case in which the gaze direction and the gaze point of the subject are detected using a stereo camera is described as an example.
  • the line-of-sight detection unit 354 calculates the position (eye position) of the subject's pupil in the three-dimensional world coordinate system using a stereo vision technique.
  • the line-of-sight detection unit 354 calculates the position of the subject's corneal reflection using images taken by the left and right cameras. Then, the gaze detection unit 354 calculates a gaze vector representing the gaze direction of the subject from the position of the pupil of the subject and the position of corneal reflection.
  • the method for detecting the subject's line of sight is not limited to this.
  • the subject's line of sight may be detected by analyzing an image captured using visible light instead of infrared light.
  • the viewpoint detection unit 355 detects, for example, the intersection of the line-of-sight vector represented in the coordinate system as shown in FIG. 1 and the XY plane as the gaze point of the subject.
  • the gaze point may also be measured as the intersection of the lines of sight of the subject's left and right eyes.
  • FIG. 4 is a diagram showing an example of eye and distance detection when two cameras (the right camera 202 and the left camera 204) are used.
  • a camera calibration theory based on a stereo calibration method is applied in advance to obtain camera parameters.
  • As the stereo calibration method, any conventionally used method, such as one using Tsai's camera calibration theory, can be applied.
  • Using the images from the two calibrated cameras, the three-dimensional coordinates of the eye in the world coordinate system are obtained. Thereby, for example, the distance between the eyes and the stereo camera 102 can be estimated.
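  • As a hedged sketch of this triangulation step, assuming 3x4 projection matrices for the two calibrated cameras are available (the matrix names and pixel values below are illustrative, and OpenCV's triangulation routine stands in for whatever the device actually uses):

```python
# Minimal sketch, assuming P_right and P_left come from a prior stereo
# calibration (e.g. Tsai's method). Pixel coordinates are placeholders.
import cv2
import numpy as np

def triangulate_eye(P_right, P_left, pt_right, pt_left):
    """3D world coordinates of one point (e.g. a pupil center) seen by both cameras."""
    pr = np.asarray(pt_right, dtype=np.float64).reshape(2, 1)
    pl = np.asarray(pt_left, dtype=np.float64).reshape(2, 1)
    X = cv2.triangulatePoints(P_right, P_left, pr, pl)  # homogeneous, shape (4, 1)
    return X[:3, 0] / X[3, 0]                           # -> array([X, Y, Z])

# Hypothetical usage, once the pupil center pixel is found in each image:
# eye_xyz = triangulate_eye(P_right, P_left, (312.4, 188.9), (355.1, 190.2))
# distance = np.linalg.norm(eye_xyz)  # distance from the world origin (screen center)
```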
  • the output control unit 356 controls the output of various information to the display unit 210, the speaker 105, and the like.
  • the output control unit 356 controls the output to the display unit 210 such as the diagnostic image and the evaluation result by the evaluation unit 357.
  • the diagnostic image may be an image corresponding to the evaluation process based on the detection results of the pupil, the line of sight, the viewpoint, and the like.
  • a diagnostic image including an image (such as a geometric pattern image) preferred by a subject with a developmental disorder and other images (such as a person image) may be used.
  • the evaluation unit 357 performs an evaluation process based on the diagnostic image and the gazing point detected by the viewpoint detection unit 355. For example, in the case of diagnosing a developmental disorder, the evaluation unit 357 analyzes the diagnostic image and the gazing point, and evaluates whether or not the image preferred by the subject with the developmental disorder has been gazed.
  • FIG. 5 is a diagram illustrating an example of a captured image captured by the stereo camera 102.
  • the captured image in FIG. 5 is an example of an image obtained by capturing the face of the subject, and includes an eye region 501.
  • FIG. 6 is a diagram illustrating an example of an eye image obtained by cutting out the eye region 501 from the captured image of FIG.
  • the eye image includes a pupil 601, an iris 602, and a corneal reflection 603.
  • the image may be defocused due to the influence of the depth of field of the photographing lens.
  • FIG. 7 is a diagram showing an example of the luminance change of the image near the cornea reflection region.
  • FIG. 8 is a diagram illustrating an example of luminance change of an image near the pupil region.
  • the corneal reflection 603 has a conical luminance distribution 701 that becomes brighter toward the center, as shown in FIG. 7. For this reason, it is common practice to obtain the center coordinates easily and accurately using the luminance centroid. That is, when the pixels in a certain region (the corneal reflection region or the pupil region) are numbered 1, 2, 3, ..., n, the X coordinate and the Y coordinate of the luminance centroid of the region can be calculated using the following equations (1) and (2), respectively.
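  • The bodies of equations (1) and (2) do not survive in this text; a standard luminance-weighted centroid consistent with the surrounding description would be

\[
x_c = \frac{\sum_{i=1}^{n} B_i\, x_i}{\sum_{i=1}^{n} B_i} \quad (1)
\qquad
y_c = \frac{\sum_{i=1}^{n} B_i\, y_i}{\sum_{i=1}^{n} B_i} \quad (2)
\]

  where $(x_i, y_i)$ are the coordinates of pixel $i$ in the region and $B_i$ is its weight: the luminance itself for the bright corneal reflection, and an inverted luminance (for example, a threshold minus the pixel luminance) for the dark pupil. The exact weighting is an assumption, not quoted from this document.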
  • the pupil region has a conical luminance distribution 801 that becomes darker toward the center, as shown in FIG. 8. For this reason, as in the case of corneal reflection, a method of obtaining the luminance centroid of the pupil using equations (1) and (2) is conceivable.
  • In the present embodiment, the pupil region used for calculating the luminance centroid is selected separately for the X direction and the Y direction, and in each case is limited to a range with no missing portion due to corneal reflection. The influence of corneal reflection can thereby be avoided, and the pupil can be detected with higher accuracy.
  • As the formulas for calculating the pupil center, the above equations (1) and (2) can be applied, for example, so the pupil position can be calculated with a small amount of computation.
  • FIGS. 9 to 14 are diagrams schematically showing the eye image of FIG. 6. For ease of explanation, regions blurred by defocusing are omitted.
  • FIG. 15 is a flowchart illustrating an example of pupil detection processing according to the present embodiment.
  • the specifying unit 351 cuts out the eye region from the captured image (step S101). If the captured image is already an image of only the eye, the cutting-out process may be omitted.
  • the specifying unit 351 specifies the pupil region from the pixel luminance in the eye region (step S102). As illustrated in FIG. 9, the specifying unit 351 specifies, for example, a set of pixels having brightness that is equal to or less than a predetermined threshold as the pupil region 901.
  • the specifying unit 351 obtains the minimum value x1 and maximum value x2 of the X coordinate and the minimum value y1 and maximum value y2 of the Y coordinate of the pupil region (pupil region 901 in FIG. 9) (steps S103 and S104).
  • the specifying unit 351 specifies the corneal reflection region from the pixel luminance in the eye region (step S105). As illustrated in FIG. 9, the specifying unit 351 specifies, for example, a set of pixels having brightness equal to or higher than a predetermined threshold as the cornea reflection region 902. The specifying unit 351 obtains the minimum value x11 and maximum value x22 of the X coordinate and the minimum value y11 and maximum value y22 of the Y coordinate of the cornea reflection region (corneal reflection region 902 in FIG. 9) (steps S106 and S107).
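  • As a concrete illustration of steps S102 to S107 above, the following is a minimal sketch, assuming an 8-bit grayscale eye image held in a NumPy array; the threshold values are placeholders rather than values from this document, and image coordinates are used (y grows downward, unlike the world coordinates of FIG. 1).

```python
import numpy as np

def region_bounds(mask):
    """(min_x, max_x, min_y, max_y) of the True pixels in a boolean mask."""
    ys, xs = np.nonzero(mask)
    return xs.min(), xs.max(), ys.min(), ys.max()

def specify_regions(eye, pupil_thresh=40, reflection_thresh=220):
    pupil_mask = eye <= pupil_thresh             # low-luminance set -> pupil region 901
    reflection_mask = eye >= reflection_thresh   # high-luminance set -> corneal reflection 902
    x1, x2, y1, y2 = region_bounds(pupil_mask)           # steps S103-S104
    x11, x22, y11, y22 = region_bounds(reflection_mask)  # steps S106-S107
    return pupil_mask, (x1, x2, y1, y2), (x11, x22, y11, y22)
```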
  • the first estimation unit 352 calculates the luminance centroid in the X direction (step S108).
  • An example of a method for obtaining the luminance centroid in the X direction will be described with reference to FIGS. 10 and 11.
  • the first estimation unit 352 separates the set of pixels (x, y) in the pupil region in equation (1) into one or two regions.
  • the first region is a region included in “(x1, y22) ≤ (x, y) ≤ (x2, y2)”, which is the upper portion of the pupil region.
  • This region corresponds to a region (first region or third region) that is surrounded by a tangent line that is in contact with the upper portion of the corneal reflection region and the outer peripheral line of the pupil region and does not include the corneal reflection region.
  • In FIG. 10, this corresponds to the hatched region 1001.
  • the second region is a region included in “(x1, y1) ≤ (x, y) ≤ (x2, y11)”, which is the lower portion of the pupil region.
  • This region corresponds to a region (a first region or a third region) that is surrounded by a tangent line that is in contact with the lower portion of the corneal reflection region and the outer peripheral line of the pupil region and does not include the corneal reflection region.
  • A region satisfying the above condition (no missing portion due to corneal reflection) is used for calculating the luminance centroid.
  • In the example of FIG. 10, the region 1001 is used for calculating the luminance centroid. Since the region 1001 is bilaterally symmetric, the X coordinate of the luminance centroid can be obtained using equation (1).
  • the straight line 1101 in FIG. 11 indicates the luminance centroid in the X direction calculated in this way. It is estimated that the pupil center lies on this straight line 1101.
  • the second estimation unit 353 calculates the luminance centroid in the Y direction (step S109).
  • An example of a method for obtaining the luminance centroid in the Y direction will be described with reference to FIGS. 12 and 13.
  • the second estimation unit 353 separates a set of pixels (x, y) in the pupil region into one or two regions in Equation (2).
  • the first region is a region included in “(x1, y1) ≤ (x, y) ≤ (x11, y2)”, which is the left part of the pupil region.
  • This region corresponds to a region (second region or fourth region) that is surrounded by a tangent line that contacts the left part of the corneal reflection region and the outer peripheral line of the pupil region and does not include the corneal reflection region.
  • In FIG. 12, this corresponds to the hatched region 1201.
  • the second region is a region included in “(x22, y1) ≤ (x, y) ≤ (x2, y2)”, which is the right part of the pupil region.
  • This region corresponds to a region (second region or fourth region) that is surrounded by a tangent line that is in contact with the right part of the corneal reflection region and the outer peripheral line of the pupil region and does not include the corneal reflection region.
  • In FIG. 12, this corresponds to the hatched region 1202.
  • the second estimation unit 353 calculates the luminance centroid using, for example, the region with the larger area of the two.
  • That is, the areas of the regions 1201 and 1202 are compared, and the luminance centroid is calculated using the larger one.
  • In the example of FIG. 12, the region 1202 is used for the calculation. Since the region 1202 is vertically symmetric, the Y coordinate of the luminance centroid can be obtained using equation (2).
  • the straight line 1301 in FIG. 13 indicates the luminance centroid in the Y direction calculated in this way. It is estimated that the pupil center lies on this straight line 1301.
  • Alternatively, the average of the Y coordinate of the luminance centroid obtained from the first region and the Y coordinate of the luminance centroid obtained from the second region may be calculated as the luminance centroid in the Y direction.
  • the point where the straight line 1401 indicating the luminance centroid in the X direction intersects with the straight line 1402 indicating the luminance centroid in the Y direction is the pupil center 1403 (step S110).
  • the pupil position detection unit 358 detects an intersection where the straight line 1401 and the straight line 1402 intersect as the pupil center 1403.
  • the pupil center can be obtained accurately with a small amount of calculation.
  • As described above, according to the present embodiment, the following effects can be obtained, for example. (1) When the corneal reflection diameter is smaller than the pupil diameter, the pupil center coordinates can be obtained more accurately. (2) Since the amount of calculation is small, the pupil can be detected even with a lower-performance CPU.
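  • Putting steps S108 to S110 together, the following sketch continues the one above. The inverted-luminance weighting and the rule of choosing the band with the larger area are assumptions consistent with the description, not quoted from this document; image coordinates are used, so y grows downward.

```python
import numpy as np

def _weighted_centroid(eye, mask, coords):
    w = np.where(mask, 255.0 - eye, 0.0)   # darker pupil pixels get larger weight
    return float((w * coords).sum() / w.sum())

def pupil_center(eye, pupil_mask, refl_box):
    x11, x22, y11, y22 = refl_box
    ys, xs = np.indices(eye.shape)

    # Step S108: X centroid from the band above or below the corneal reflection
    upper = pupil_mask & (ys < y11)        # corresponds to region 1001
    lower = pupil_mask & (ys > y22)
    band_x = upper if upper.sum() >= lower.sum() else lower
    cx = _weighted_centroid(eye, band_x, xs)

    # Step S109: Y centroid from the band left or right of the corneal reflection
    left = pupil_mask & (xs < x11)         # region 1201
    right = pupil_mask & (xs > x22)        # region 1202
    band_y = left if left.sum() >= right.sum() else right
    cy = _weighted_centroid(eye, band_y, ys)

    return cx, cy                          # step S110: intersection of the two lines

# Hypothetical usage with the earlier sketch:
# pupil_mask, pupil_box, refl_box = specify_regions(eye)
# cx, cy = pupil_center(eye, pupil_mask, refl_box)
```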
  • the gaze detection apparatus and the gaze detection method of the second embodiment will be described in detail based on the drawings.
  • The present invention is not limited by this embodiment.
  • In the following, an example is described in which the line-of-sight detection apparatus is used as a diagnosis support apparatus that supports the diagnosis of developmental disabilities using the line-of-sight detection result.
  • Applicable devices are not limited to diagnosis support devices.
  • the line-of-sight detection apparatus (diagnosis support apparatus) of the present embodiment detects the line of sight using an illumination unit installed at one place.
  • the line-of-sight detection device (diagnosis support device) of the present embodiment calculates the corneal curvature center position with high accuracy by using a result obtained by gazing at one point on the subject before the line-of-sight detection.
  • An illumination unit is an element that includes a light source and can irradiate the subject's eyeball with light.
  • the light source is an element that generates light, such as an LED (Light Emitting Diode).
  • the light source may consist of a single LED, or of a plurality of LEDs combined and arranged at one place.
  • Hereinafter, the term “light source” may be used in this way to represent the illumination unit.
  • FIGS. 16 and 17 are diagrams illustrating an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject according to the second embodiment.
  • Components that are the same as in the first embodiment may be given the same reference numerals, and their description may be omitted.
  • the diagnosis support apparatus includes a display unit 210, a stereo camera 2102, and an LED light source 2103.
  • the stereo camera 2102 is disposed below the display unit 210.
  • the LED light source 2103 is arranged at the intermediate position between the two cameras included in the stereo camera 2102.
  • the LED light source 2103 is a light source that irradiates near infrared rays having a wavelength of 850 nm, for example.
  • FIG. 16 shows an example in which an LED light source 2103 (illumination unit) is configured by nine LEDs.
  • the stereo camera 2102 uses a lens that can transmit near-infrared light having a wavelength of 850 nm.
  • the stereo camera 2102 includes a right camera 2202 and a left camera 2203.
  • the LED light source 2103 irradiates near-infrared light toward the eyeball 111 of the subject.
  • When the near-infrared light is irradiated, the pupil 112 appears dark with low luminance, while the corneal reflection 113, which is generated as a virtual image in the eyeball 111, appears bright with high luminance. Accordingly, the positions of the pupil 112 and the corneal reflection 113 on the image can be acquired by each of the two cameras (the right camera 2202 and the left camera 2203).
  • the three-dimensional world coordinate values of the positions of the pupil 112 and the corneal reflection 113 are calculated from the positions of the pupil 112 and the corneal reflection 113 obtained by two cameras.
  • As in the first embodiment, up and down correspond to the Y coordinate (up is +), left and right to the X coordinate (right is +), and depth to the Z coordinate (front is +).
  • FIG. 18 is a diagram illustrating an outline of functions of the diagnosis support apparatus 2100 according to the second embodiment.
  • FIG. 18 shows a part of the configuration shown in FIGS. 16 and 17 and a configuration used for driving the configuration.
  • the diagnosis support apparatus 2100 includes a right camera 2202, a left camera 2203, an LED light source 2103, a speaker 105, a drive/IF (interface) unit 208, a control unit 2300, a storage unit 150, and a display unit 210.
  • In FIG. 18, the display screen 101 is shown to clarify the positional relationship between the right camera 2202 and the left camera 2203; the display screen 101 is a screen displayed on the display unit 210.
  • the drive unit and the IF unit may be integrated or separate.
  • the speaker 105 functions as an audio output unit that outputs audio or the like for alerting the subject during calibration or the like.
  • the drive / IF unit 208 drives each unit included in the stereo camera 2102.
  • the drive / IF unit 208 serves as an interface between each unit included in the stereo camera 2102 and the control unit 2300.
  • the control unit 2300 is, for example, a computer including a control device such as a CPU (Central Processing Unit), storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory), a communication I/F that communicates via a network connection, and a bus connecting these units.
  • the storage unit 150 stores various information such as a control program, a measurement result, and a diagnosis support result.
  • the storage unit 150 stores, for example, an image to be displayed on the display unit 210.
  • the display unit 210 displays various information such as a target image for diagnosis.
  • FIG. 19 is a block diagram illustrating an example of detailed functions of each unit illustrated in FIG. As shown in FIG. 19, a display unit 210 and a drive / IF unit 208 are connected to the control unit 2300.
  • the drive / IF unit 208 includes camera IFs 314 and 315, an LED drive control unit 316, and a speaker drive unit 322.
  • the right camera 2202 and the left camera 2203 are connected to the drive / IF unit 208 via the camera IFs 314 and 315, respectively.
  • the driving / IF unit 208 drives these cameras to image the subject.
  • the speaker driving unit 322 drives the speaker 105.
  • the diagnosis support apparatus 2100 may include an interface (printer IF) for connecting to a printer as a printing unit. Further, the printer may be provided inside the diagnosis support apparatus 2100.
  • the control unit 2300 controls the entire diagnosis support apparatus 2100.
  • the control unit 2300 includes a first calculation unit 2351, a second calculation unit (corneal reflection center calculation unit) 2352, a third calculation unit (corneal curvature center calculation unit) 2353, a line-of-sight detection unit 2354, a viewpoint detection unit 2355, an output control unit 2356, and an evaluation unit 2357.
  • the line-of-sight detection device may include at least the first calculation unit 2351, the second calculation unit 2352, the third calculation unit 2353, and the line-of-sight detection unit 2354.
  • Each element included in the control unit 2300 (the first calculation unit 2351, second calculation unit 2352, third calculation unit 2353, line-of-sight detection unit 2354, viewpoint detection unit 2355, output control unit 2356, and evaluation unit 2357) may be realized by software (a program), by a hardware circuit, or by a combination of software and a hardware circuit.
  • When realized by a program, the program is recorded as a file in an installable or executable format on a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory), a flexible disk (FD), a CD-R (Compact Disc Recordable), or a DVD (Digital Versatile Disc), and is provided as a computer program product.
  • the program may be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network.
  • the program may be provided or distributed via a network such as the Internet.
  • the program may be provided by being incorporated in advance in a ROM or the like.
  • the first calculation unit 2351 calculates the position (first position) of the pupil center indicating the center of the pupil from the eyeball image captured by the stereo camera 2102.
  • the second calculator 2352 calculates the position of the corneal reflection center (second position) indicating the center of corneal reflection from the captured image of the eyeball.
  • the third calculation unit 2353 calculates the corneal curvature center (fourth position) from the straight line (first straight line) connecting the LED light source 2103 and the corneal reflection center. For example, the third calculation unit 2353 calculates a position on the straight line where the distance from the corneal reflection center is a predetermined value as the corneal curvature center. As the predetermined value, a value determined in advance from a general radius of curvature of the cornea or the like can be used.
  • the third calculation unit 2353 may also calculate the corneal curvature center in consideration of individual differences. In this case, the third calculation unit 2353 first calculates the intersection of the straight line (second straight line) connecting the pupil center and a target position (third position) with the first straight line connecting the corneal reflection center and the LED light source 2103, using the pupil center and the corneal reflection center calculated while the subject gazes at the target position. The third calculation unit 2353 then calculates the distance (first distance) between the pupil center and the calculated intersection and stores it in, for example, the storage unit 150.
  • the target position may be any position that is determined in advance and whose three-dimensional world coordinate value can be calculated.
  • the center position of the display screen 101 (the origin of the three-dimensional world coordinates) can be set as the target position.
  • the output control unit 2356 displays, at the target position (center) on the display screen 101, an image (target image) that makes the subject gaze there. The subject can thereby be made to gaze at the target position.
  • the target image may be any image that draws the subject's attention.
  • For example, an image whose display mode, such as luminance or color, changes, or an image whose display mode differs from the other regions, can be used as the target image.
  • the target position is not limited to the center of the display screen 101 and may be an arbitrary position. Setting the center of the display screen 101 as the target position minimizes the distance to any edge of the screen, which makes it possible to reduce the measurement error during gaze detection, for example.
  • the processing up to the calculation of the distance is executed in advance, for example, before actual gaze detection is started.
  • In actual line-of-sight detection, the third calculation unit 2353 calculates, as the corneal curvature center, the position on the straight line connecting the LED light source 2103 and the corneal reflection center whose distance from the pupil center equals the distance calculated in advance.
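  • A sketch of this computation: geometrically, it amounts to intersecting the straight line with a sphere of the pre-calibrated radius centered at the pupil center, which reduces to a quadratic in the line parameter. The function name and the choice of the root farther from the light source (deeper inside the eye) are assumptions for illustration.

```python
import numpy as np

def corneal_curvature_center(light_pos, reflection_center, pupil_center, dist):
    o = np.asarray(reflection_center, dtype=float)
    d = o - np.asarray(light_pos, dtype=float)
    d /= np.linalg.norm(d)                 # unit direction pointing away from the LED
    m = np.asarray(pupil_center, dtype=float) - o
    # Solve |o + t*d - pupil| = dist, i.e. t^2 - 2*t*(d.m) + |m|^2 - dist^2 = 0
    b = float(d @ m)
    disc = b * b - (float(m @ m) - dist * dist)
    if disc < 0.0:
        raise ValueError("no point on the line at the requested distance")
    t = b + np.sqrt(disc)                  # take the root farther from the LED
    return o + t * d
```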
  • the line-of-sight detection unit 2354 detects the line of sight of the subject from the pupil center and the corneal curvature center.
  • the gaze detection unit 2354 detects the direction from the corneal curvature center to the pupil center as the gaze direction of the subject.
  • the viewpoint detection unit 2355 detects the viewpoint of the subject using the detected gaze direction.
  • the viewpoint detection unit 2355 detects, for example, a viewpoint (gaze point) that is a point on the display screen 101 where the subject gazes.
  • the viewpoint detection unit 2355 detects, for example, the intersection of the line-of-sight vector represented in the three-dimensional world coordinate system as shown in FIG. 17 and the XY plane as the gaze point of the subject.
  • the output control unit 2356 controls the output of various information to the display unit 210, the speaker 105, and the like. For example, the output control unit 2356 outputs the target image at the target position on the display unit 210. Further, the output control unit 2356 controls the output to the display unit 210 such as the diagnostic image and the evaluation result by the evaluation unit 2357.
  • the diagnostic image may be an image according to the evaluation process based on the line-of-sight (viewpoint) detection result.
  • a diagnostic image including an image (such as a geometric pattern image) preferred by a subject with a developmental disorder and other images (such as a person image) may be used.
  • Evaluation unit 2357 performs an evaluation process based on the diagnostic image and the gazing point detected by the viewpoint detection unit 2355. For example, in the case of diagnosing a developmental disorder, the evaluation unit 2357 analyzes the diagnostic image and the gazing point, and evaluates whether or not the image preferred by the subject with the developmental disorder has been gazed.
  • the output control unit 2356 may display the same diagnostic image as in the first embodiment, and the evaluation unit 2357 may perform the same evaluation process as the evaluation unit 357 in the first embodiment.
  • The pupil detection processing of the first embodiment (the specifying unit 351, first estimation unit 352, second estimation unit 353, and pupil position detection unit 358) and its gaze detection processing (gaze detection unit 354) may be used as the pupil detection process (first calculation unit 2351) and the gaze detection process (second calculation unit 2352, third calculation unit 2353, and gaze detection unit 2354) of the second embodiment.
  • FIG. 20 is a diagram illustrating an outline of processing executed by the diagnosis support apparatus 2100 of the present embodiment.
  • the elements described in FIGS. 16 to 19 are denoted by the same reference numerals and description thereof is omitted.
  • the pupil center 407 and the corneal reflection center 408 represent the center of the pupil detected when the LED light source 2103 is turned on and the center of the corneal reflection point, respectively.
  • the corneal curvature radius 409 represents the distance from the corneal surface to the corneal curvature center 410.
  • FIG. 21 is an explanatory diagram showing a difference between a method using two light sources (illumination units) (hereinafter referred to as method A) and the present embodiment using one light source (illumination unit).
  • Method A uses two LED light sources 511 and 512 instead of the LED light source 2103.
  • In method A, the intersection of a straight line 515, connecting the corneal reflection center 513 and the LED light source 511 when the LED light source 511 is lit, with a straight line 516, connecting the corneal reflection center 514 and the LED light source 512 when the LED light source 512 is lit, is calculated. This intersection is the corneal curvature center 505.
  • a straight line 523 connecting the cornea reflection center 522 and the LED light source 2103 when the LED light source 2103 is irradiated is considered.
  • a straight line 523 passes through the corneal curvature center 505. It is also known that the radius of curvature of the cornea is almost constant with little influence from individual differences. Thus, the corneal curvature center when the LED light source 2103 is irradiated exists on the straight line 523 and can be calculated by using a general curvature radius value.
  • However, when a general curvature radius value is used, the detected viewpoint position may deviate from the true position due to individual differences in eyeballs, and accurate viewpoint detection may not be possible.
  • FIG. 22 is a diagram for explaining calculation processing for calculating the corneal curvature center position and the distance between the pupil center position and the corneal curvature center position before performing viewpoint detection (line-of-sight detection).
  • the elements described in FIGS. 16 to 19 are denoted by the same reference numerals and description thereof is omitted.
  • the connection between the left and right cameras (the right camera 2202 and the left camera 2203) and the control unit 2300 is not shown and is omitted.
  • the target position 605 is a position for displaying a target image or the like at one point on the display unit 210 and causing the subject to stare.
  • the center position of the display screen 101 is set.
  • a straight line 613 is a straight line connecting the LED light source 2103 and the corneal reflection center 612.
  • a straight line 614 is a straight line connecting the target position 605 (gaze point) that the subject looks at and the pupil center 611.
  • a corneal curvature center 615 is an intersection of the straight line 613 and the straight line 614.
  • the third calculation unit 2353 calculates and stores the distance 616 between the pupil center 611 and the corneal curvature center 615.
  • FIG. 23 is a flowchart illustrating an example of calculation processing according to the present embodiment.
  • the output control unit 2356 reproduces the target image at one point on the display screen 101 (step S201), and causes the subject to gaze at the one point.
  • the control unit 2300 turns on the LED light source 2103 toward the eyes of the subject using the LED drive control unit 316 (step S202).
  • the controller 2300 images the eyes of the subject with the left and right cameras (the right camera 2202 and the left camera 2203) (step S203).
  • Under the irradiation of the LED light source 2103, the pupil part is detected as a dark part (dark pupil), and a corneal reflection virtual image produced by the LED irradiation appears so that the corneal reflection point (corneal reflection center) is detected as a bright part. That is, the first calculation unit 2351 detects the pupil part from the captured image and calculates coordinates indicating the position of the pupil center. For example, the first calculation unit 2351 detects, as the pupil part, a region of luminance at or below a predetermined value that contains the darkest part of a certain region including the eye, and detects, as the corneal reflection, a region of luminance at or above a predetermined value that contains the brightest part.
  • the second calculation unit 2352 detects a corneal reflection portion from the captured image, and calculates coordinates indicating the position of the corneal reflection center.
  • the first calculation unit 2351 and the second calculation unit 2352 calculate each coordinate value for each of two images acquired by the left and right cameras (step S204).
  • In advance, the left and right cameras are calibrated by a stereo calibration method, and conversion parameters for obtaining three-dimensional world coordinates are calculated.
  • As the stereo calibration method, any conventionally used method, such as one using Tsai's camera calibration theory, can be applied.
  • the first calculation unit 2351 and the second calculation unit 2352 use the conversion parameters to convert the coordinates of the left and right cameras into the three-dimensional world coordinates of the pupil center and the corneal reflection center (step S205).
  • the third calculation unit 2353 calculates a straight line connecting the world coordinates of the corneal reflection center and the world coordinates of the center of the LED light source 2103 (step S206).
  • the third calculation unit 2353 calculates a straight line connecting the world coordinates of the center of the target image displayed at one point on the display screen 101 and the world coordinates of the pupil center (step S207).
  • the third calculation unit 2353 obtains the intersection of the straight line calculated in step S206 and the straight line calculated in step S207, and takes this intersection as the corneal curvature center (step S208).
  • the third calculation unit 2353 calculates the distance between the pupil center and the corneal curvature center at this time, and stores it in the storage unit 150 or the like (step S209). The stored distance is used to calculate the corneal curvature center at the time of subsequent detection of the viewpoint (line of sight).
  • the distance between the pupil center and the corneal curvature center obtained while gazing at one point on the display unit 210 in this calculation process remains approximately constant over the range in which viewpoints on the display unit 210 are detected.
  • the distance between the pupil center and the corneal curvature center may be obtained from the average of all the values calculated during reproduction of the target image, or from the average of a subset of those values.
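  • In measured three-dimensional data the straight lines 613 and 614 rarely intersect exactly, so some pseudo-intersection is needed in practice. A common choice, assumed here rather than prescribed by this document, is the midpoint of the shortest segment between the two lines:

```python
import numpy as np

def pseudo_intersection(p1, d1, p2, d2):
    """Closest-approach midpoint of the lines p1 + s*d1 and p2 + t*d2."""
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                  # near zero if the lines are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# line 613: LED light source -> corneal reflection center
# line 614: target position 605 -> pupil center 611
# center_615 = pseudo_intersection(led, refl - led, target, pupil - target)
# distance_616 = np.linalg.norm(pupil - center_615)   # stored for later use
```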
  • FIG. 24 is a diagram showing a method of calculating the corrected position of the corneal curvature center using the distance between the pupil center and the corneal curvature center obtained in advance when performing viewpoint detection.
  • a gazing point 805 represents a gazing point obtained from a corneal curvature center calculated using a general curvature radius value.
  • a gazing point 806 represents a gazing point obtained from a corneal curvature center calculated using a distance obtained in advance.
  • the pupil center 811 and the corneal reflection center 812 indicate the position of the pupil center and the position of the corneal reflection center calculated at the time of viewpoint detection, respectively.
  • a straight line 813 is a straight line connecting the LED light source 2103 and the corneal reflection center 812.
  • the corneal curvature center 814 is the position of the corneal curvature center calculated from a general curvature radius value.
  • the distance 815 is the distance between the pupil center and the corneal curvature center calculated by the prior calculation process.
  • the corneal curvature center 816 is the position of the corneal curvature center calculated using the distance obtained in advance.
  • the corneal curvature center 816 is obtained from the fact that the corneal curvature center exists on the straight line 813 and the distance between the pupil center and the corneal curvature center is the distance 815.
  • the line of sight 817 calculated when a general radius of curvature value is used is corrected to the line of sight 818.
  • the gazing point on the display screen 101 is corrected from the gazing point 805 to the gazing point 806.
  • the connection between the left and right cameras (the right camera 2202 and the left camera 2203) and the control unit 2300 is not shown and is omitted.
  • FIG. 25 is a flowchart illustrating an example of a line-of-sight detection process according to the present embodiment.
  • the line-of-sight detection process of FIG. 25 can be executed as the process of detecting the line of sight in the diagnostic process using the diagnostic image.
  • a process for displaying a diagnostic image, an evaluation process by the evaluation unit 2357 using the detection result of the gazing point, and the like are executed.
  • Step S301 to step S305 are the same as step S202 to step S206 in FIG.
  • the third calculation unit 2353 calculates, as the corneal curvature center, a position that is on the straight line calculated in step S305 and whose distance from the pupil center is equal to the distance obtained by the previous calculation process (step S306).
  • the line-of-sight detection unit 2354 obtains a vector (line-of-sight vector) connecting the pupil center and the corneal curvature center (step S307). This vector indicates the line-of-sight direction viewed by the subject.
  • the viewpoint detection unit 2355 calculates the three-dimensional world coordinate value of the intersection between the line-of-sight direction and the display screen 101 (step S308). This value is a coordinate value representing one point on the display unit 210 that the subject gazes in world coordinates.
  • the viewpoint detection unit 2355 converts the obtained three-dimensional world coordinate value into a coordinate value (x, y) represented in the two-dimensional coordinate system of the display unit 210 (step S309). Thereby, the viewpoint (gaze point) on the display part 210 which a test subject looks at can be calculated.
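  • A sketch of steps S307 to S309 under the coordinate convention of FIG. 17 (world origin at the screen center, Y up, Z toward the subject, so the display surface is the plane Z = 0); the screen dimensions and resolution below are hypothetical placeholders:

```python
import numpy as np

def gaze_point_on_screen(curvature_center, pupil_center,
                         screen_w_mm=340.0, screen_h_mm=190.0,
                         res_x=1920, res_y=1080):
    c = np.asarray(curvature_center, dtype=float)
    p = np.asarray(pupil_center, dtype=float)
    g = p - c                              # gaze vector (step S307)
    t = -c[2] / g[2]                       # ray parameter where Z becomes 0
    hit = c + t * g                        # 3D gaze point on the display (step S308)
    # Step S309: world mm (origin at screen center, Y up) -> pixels (origin top-left)
    px = (hit[0] / screen_w_mm + 0.5) * res_x
    py = (0.5 - hit[1] / screen_h_mm) * res_y
    return px, py
```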
  • the calculation process for calculating the distance between the pupil center position and the corneal curvature center position is not limited to the method described with reference to FIGS. 22 and 23. Another example of the calculation process is described below with reference to FIGS. 26 and 27.
  • FIG. 26 is a diagram for explaining the calculation process of the present modification. The elements described in FIGS. 16 to 19 and FIG. 22 are denoted by the same reference numerals, and description thereof is omitted.
  • the line segment 1101 is a line segment (first line segment) connecting the target position 605 and the LED light source 2103.
  • a line segment 1102 is a line segment (second line segment) that is parallel to the line segment 1101 and connects the pupil center 611 and the straight line 613.
  • the distance 616 between the pupil center 611 and the corneal curvature center 615 is calculated and stored using the line segment 1101 and the line segment 1102 as follows.
  • FIG. 27 is a flowchart showing an example of calculation processing of the present modification.
  • Steps S401 to S407 are the same as steps S201 to S207 in FIG.
  • the third calculation unit 2353 calculates a line segment (line segment 1101 in FIG. 26) connecting the center of the target image displayed at one point on the display screen 101 and the center of the LED light source 2103, and calculates its length (L1101) (step S408).
  • the third calculation unit 2353 calculates a line segment (line segment 1102 in FIG. 26) that passes through the pupil center 611 and is parallel to the line segment calculated in step S408, and calculates its length (L1102) (step S409).
  • the third calculation unit 2353 calculates the distance 616 between the pupil center 611 and the corneal curvature center 615 using the fact that the triangle whose apex is the corneal curvature center 615 and whose base is the line segment calculated in step S408 is similar to the triangle whose apex is the corneal curvature center 615 and whose base is the line segment calculated in step S409 (step S410). For example, the third calculation unit 2353 calculates the distance 616 such that the ratio of the length of the line segment 1102 to the length of the line segment 1101 equals the ratio of the distance 616 to the distance between the target position 605 and the corneal curvature center 615.
  • the distance 616 can be calculated by the following equation (3).
  • L614 is the distance from the target position 605 to the pupil center 611.
  • Distance 616 (L614 ⁇ L1102) / (L1101 ⁇ L1102) (3)
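  For reference, equation (3) follows directly from the similarity relation of step S410. The two similar triangles share the vertex 615; the line segment 1102 lies at the distance 616 from that vertex, and the line segment 1101 lies at the distance L614 + (distance 616). Writing d for the distance 616:

    L1102 / L1101 = d / (L614 + d)
    ⇒ L1102 × (L614 + d) = L1101 × d
    ⇒ d = (L614 × L1102) / (L1101 − L1102)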
  • The third calculation unit 2353 stores the calculated distance 616 in the storage unit 150 or the like (step S411).
  • The stored distance is used to calculate the corneal curvature center when the viewpoint (line of sight) is subsequently detected.
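  As a concrete sketch of steps S408 to S411, the quantities in equation (3) can be computed from the three-dimensional world coordinates of the points in FIG. 26. The following Python sketch is illustrative only: the function and variable names are assumptions, it takes the straight line 613 to be the line connecting the LED light source and the corneal reflection center, it assumes the lines involved are not parallel, and it uses the closest point between the two lines, since measured lines rarely intersect exactly.

    import numpy as np

    def closest_point_on_line2(p1, d1, p2, d2):
        # Point on the line (p2, d2) closest to the line (p1, d1);
        # all inputs are 3-D numpy arrays (point + direction per line).
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        w = p1 - p2
        s = (a * (d2 @ w) - b * (d1 @ w)) / (a * c - b * b)
        return p2 + s * d2

    def distance_616(target, led, pupil_center, reflection_center):
        # Step S408: line segment 1101 connects the target position and the LED.
        L1101 = np.linalg.norm(led - target)
        # Step S409: line segment 1102 passes through the pupil center,
        # parallel to segment 1101, and ends on the straight line 613
        # (LED light source -> corneal reflection center).
        end_1102 = closest_point_on_line2(pupil_center, led - target,
                                          led, reflection_center - led)
        L1102 = np.linalg.norm(end_1102 - pupil_center)
        # Step S410: equation (3), with L614 the target-to-pupil-center distance.
        L614 = np.linalg.norm(pupil_center - target)
        return (L614 * L1102) / (L1101 - L1102)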
  • As described above, according to the second embodiment, for example, the following effects can be obtained. (1) It is not necessary to arrange light sources (illumination units) at two locations; line-of-sight detection becomes possible with a light source arranged at a single location. (2) Since there is only one light source, the apparatus can be made compact and its cost can be reduced.
  • The pupil detection device, line-of-sight detection device, and pupil detection method according to the present invention are suitable for a diagnosis support apparatus and a diagnosis support method that support the diagnosis of developmental disabilities using captured images.
  • 100, 2100 Diagnosis support apparatus
  • 101 Display screen
  • 102, 2102 Stereo camera
  • 105 Speaker
  • 150 Storage unit
  • 202, 2202 Right camera
  • 203, 205 Infrared LED light source
  • 204, 2203 Left camera
  • 208 Drive/IF unit
  • 210 Display unit
  • 300, 2300 Control unit
  • 316 LED drive control unit
  • 322 Speaker drive unit
  • 351 Specifying unit
  • 352 First estimation unit
  • 353 Second estimation unit
  • 354, 2354 Gaze detection unit
  • 355, 2355 Viewpoint detection unit
  • 356, 2356 Output control unit
  • 357, 2357 Evaluation unit
  • 2351 First calculation unit
  • 2352 Second calculation unit (corneal reflection center calculation unit)
  • 2353 Third calculation unit (corneal curvature center calculation unit)

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

This pupil detection device is provided with: a specifying unit (351) which, from an image captured of an eye, specifies a pupil region and a corneal reflection region; a first estimation unit (352) which uses a first region contained in the pupil region to estimate the center position of the pupil; a second estimation unit (353) which uses a second region contained in the pupil region and different from the first region to estimate the center position of the pupil; and a pupil position detection unit (358) which detects the center position of the pupil on the basis of the center position of the pupil estimated by the first estimation unit (352) and the center position of the pupil estimated by the second estimation unit (353).

Description

Patent Document 1: JP 2012-024154 A
However, the method of Patent Document 1 requires that the reference point for the corneal reflection (the center of the corneal reflection image region) be acquired in advance. The center of the pupil therefore cannot be obtained directly from the captured image; that is, the pupil cannot be detected with high accuracy with a small amount of computation.
The present invention has been made in view of the above, and an object thereof is to provide a pupil detection device, a line-of-sight detection device, and a pupil detection method capable of detecting the pupil with higher accuracy and a smaller amount of computation.
In order to solve the above problems and achieve the object, the present invention includes: a specifying unit that specifies a pupil region and a corneal reflection region from an image of an eye; a first estimation unit that estimates the center position of the pupil using a first region included in the pupil region; a second estimation unit that estimates the center position of the pupil using a second region that is included in the pupil region and differs from the first region; and a pupil position detection unit that detects the center position of the pupil based on the center position of the pupil estimated by the first estimation unit and the center position of the pupil estimated by the second estimation unit.
The pupil detection device, line-of-sight detection device, and pupil detection method according to the present invention have the effect that the pupil can be detected with higher accuracy and with a smaller amount of computation.
FIG. 1 is a diagram showing the arrangement of the display unit, the stereo camera, and the light sources used in the first embodiment.
FIG. 2 is a diagram showing an outline of the functions of the diagnosis support apparatus.
FIG. 3 is a block diagram showing the detailed functions of the units shown in FIG. 2.
FIG. 4 is a diagram showing the detection of the eyes and the distance when two cameras are used.
FIG. 5 is a diagram showing a captured image captured by the stereo camera.
FIG. 6 is a diagram showing an example of an eye image cut out from the captured image of FIG. 5.
FIG. 7 is a diagram showing an example of the luminance change of the image near the corneal reflection region.
FIG. 8 is a diagram showing an example of the luminance change of the image near the pupil region.
FIG. 9 is a diagram schematically representing FIG. 6.
FIG. 10 is a diagram schematically representing FIG. 6.
FIG. 11 is a diagram schematically representing FIG. 6.
FIG. 12 is a diagram schematically representing FIG. 6.
FIG. 13 is a diagram schematically representing FIG. 6.
FIG. 14 is a diagram schematically representing FIG. 6.
FIG. 15 is a flowchart showing the pupil detection processing of the first embodiment.
FIG. 16 is a diagram showing an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject in the second embodiment.
FIG. 17 is a diagram showing an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject in the second embodiment.
FIG. 18 is a diagram showing an outline of the functions of the diagnosis support apparatus.
FIG. 19 is a block diagram showing an example of the detailed functions of the units shown in FIG. 18.
FIG. 20 is a diagram explaining an outline of the processing executed by the diagnosis support apparatus of the second embodiment.
FIG. 21 is an explanatory diagram showing the difference between a method using two light sources and the second embodiment, which uses one light source.
FIG. 22 is a diagram for explaining the calculation processing for calculating the distance between the pupil center position and the corneal curvature center position.
FIG. 23 is a flowchart showing an example of the calculation processing of the second embodiment.
FIG. 24 is a diagram showing a method of calculating the position of the corneal curvature center using a distance obtained in advance.
FIG. 25 is a flowchart showing an example of the line-of-sight detection processing of the second embodiment.
FIG. 26 is a diagram for explaining the calculation processing of a modification.
FIG. 27 is a flowchart showing an example of the calculation processing of the modification.
Embodiments of a pupil detection device, a line-of-sight detection device, and a pupil detection method according to the present invention are described in detail below with reference to the drawings. The present invention is not limited to these embodiments.
(First embodiment)
As described above, one known usage scene of a pupil detection device is gaze point detection performed without contact. As a specific method, there is a method of irradiating the eye with a near-infrared point light source (such as an LED) and estimating the line of sight from the image of the light source reflected by the cornea and the position of the pupil. In this case, the gaze point detection performance depends largely on the accuracy with which the center coordinates of the pupil and of the corneal reflection are detected.
The pupil detection device (diagnosis support apparatus) of the first embodiment detects the pupil center position in an eye region image by, for example, setting individual regions for the X direction and the Y direction and obtaining the luminance centroid of each region. This eliminates the influence of the corneal reflection and enables highly accurate detection with a small amount of computation. The pupil detection device of this embodiment can be used in a non-contact gaze detection device or the like; improving the accuracy of the pupil detection device can improve the performance of the gaze detection device as a whole. The pupil detection device of this embodiment can also be used in a diagnosis support apparatus that supports the diagnosis of developmental disabilities and the like using the pupil detection result. An example in which the pupil detection device is used in such a diagnosis support apparatus is described below. Applicable apparatuses are not limited to gaze detection devices and diagnosis support apparatuses.
FIG. 1 is a diagram showing an example of the arrangement of the display unit, the stereo camera, and the light sources used in the first embodiment. As shown in FIG. 1, in this embodiment a set of stereo cameras 102 is arranged below the display screen 101. The stereo camera 102 is an imaging unit capable of stereo imaging with infrared light, and includes a right camera 202 and a left camera 204.
Infrared LED (Light Emitting Diode) light sources 203 and 205 are arranged circumferentially immediately in front of the lenses of the right camera 202 and the left camera 204, respectively. The infrared LED light sources 203 and 205 emit near-infrared light with a wavelength of, for example, 850 nm. The subject's pupils are detected using the infrared LED light sources 203 and 205; details of the pupil detection method are described later.
When detecting the line of sight, positions are specified by expressing the space in coordinates. In this embodiment, the center position of the display screen 101 is the origin, the up-down direction is the Y coordinate (up is +), the left-right direction is the X coordinate (the right side when facing the screen is +), and the depth is the Z coordinate (the near side is +).
FIG. 2 is a diagram showing an outline of the functions of the diagnosis support apparatus 100. FIG. 2 shows part of the configuration shown in FIG. 1 and the configuration used to drive it. As shown in FIG. 2, the diagnosis support apparatus 100 includes the right camera 202, the left camera 204, the infrared LED light sources 203 and 205, a speaker 105, a drive/IF (interface) unit 208, a control unit 300, a storage unit 150, and a display unit 210. In FIG. 2, the display screen 101 is drawn so that its positional relationship with the right camera 202 and the left camera 204 is easy to understand; the display screen 101 is the screen displayed on the display unit 210. The drive unit and the IF unit may be integrated or separate.
The speaker 105 functions as an audio output unit that outputs audio for prompting the subject to pay attention, for example during calibration.
The drive/IF unit 208 drives the units included in the stereo camera 102. The drive/IF unit 208 also serves as an interface between the units included in the stereo camera 102 and the control unit 300.
The control unit 300 can be realized by, for example, a computer including a control device such as a CPU (Central Processing Unit), storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory), a communication I/F that connects to a network and performs communication, and a bus connecting the units.
The storage unit 150 stores various kinds of information such as a control program, measurement results, and diagnosis support results. The storage unit 150 stores, for example, images to be displayed on the display unit 210. The display unit 210 displays various kinds of information such as target images for diagnosis.
FIG. 3 is a block diagram showing an example of the detailed functions of the units shown in FIG. 2. As shown in FIG. 3, the display unit 210 and the drive/IF unit 208 are connected to the control unit 300. The drive/IF unit 208 includes camera IFs 314 and 315, an LED drive control unit 316, and a speaker drive unit 322.
The right camera 202 and the left camera 204 are connected to the drive/IF unit 208 via the camera IFs 314 and 315, respectively. The drive/IF unit 208 drives these cameras to image the subject.
The infrared LED light source 203 and the infrared LED light source 205 emit near-infrared light of, for example, 850 nm; the wavelength of the emitted infrared light is not limited to this.
The speaker drive unit 322 drives the speaker 105. The diagnosis support apparatus 100 may include an interface (printer IF) for connecting to a printer as a printing unit, and the printer may also be provided inside the diagnosis support apparatus 100.
The control unit 300 controls the entire diagnosis support apparatus 100. The control unit 300 includes a specifying unit 351, a first estimation unit 352, a second estimation unit 353, a gaze detection unit 354, a viewpoint detection unit 355, an output control unit 356, an evaluation unit 357, and a pupil position detection unit 358. As a pupil detection device, at least the specifying unit 351, the first estimation unit 352, and the second estimation unit 353 need to be provided.
Each element included in the control unit 300 (the specifying unit 351, the first estimation unit 352, the second estimation unit 353, the gaze detection unit 354, the viewpoint detection unit 355, the output control unit 356, the evaluation unit 357, and the pupil position detection unit 358) may be realized by software (a program), by a hardware circuit, or by a combination of the two.
When realized by a program, the program is recorded on a computer-readable recording medium such as a CD-ROM (Compact Disk Read Only Memory), a flexible disk (FD), a CD-R (Compact Disk Recordable), or a DVD (Digital Versatile Disk) as a file in an installable or executable format, and is provided as a computer program product. The program may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network, may be provided or distributed via a network such as the Internet, or may be provided by being incorporated in advance in a ROM or the like.
The specifying unit 351 specifies a pupil region and a corneal reflection region from the captured image (an image of the eye) captured by the imaging unit (the stereo camera 102). As the specifying method, any conventionally used method can be applied; for example, a low-luminance (dark) region of the image can be specified as the pupil region and a high-luminance (bright) region as the corneal reflection region.
The first estimation unit 352 estimates the center position of the pupil using a region (first region) included in the pupil region. The second estimation unit 353 estimates the center position of the pupil using a region (second region) that is included in the pupil region and differs from the first region. For example, the first estimation unit 352 estimates the center position of the pupil in the direction of a first tangent, using a region (first region) that is surrounded by the first tangent, which touches the corneal reflection region, and the outer peripheral line of the pupil region, and that does not include the corneal reflection region. The second estimation unit 353 estimates the center position of the pupil in the direction of a second tangent, using a region (second region) that is surrounded by the second tangent, which touches the corneal reflection region and is orthogonal to the first tangent, and the outer peripheral line, and that does not include the corneal reflection region. The pupil position detection unit 358 detects the center position of the pupil based on the center position estimated by the first estimation unit 352 and the center position estimated by the second estimation unit 353; for example, it detects the pupil center from the intersection of the straight line that passes through the position estimated by the first estimation unit 352 and is orthogonal to the first tangent with the straight line that passes through the position estimated by the second estimation unit 353 and is orthogonal to the second tangent.
For example, the first tangent can be a straight line extending in the horizontal direction (X direction) and the second tangent a straight line extending in the vertical direction (Y direction). The directions of the first and second tangents are not limited to horizontal and vertical; any directions may be used as long as they are orthogonal to each other. In the following, the case in which the first tangent is horizontal (X direction) and the second tangent is vertical (Y direction) is described as an example.
As a method of estimating the center position of the pupil from the first region and the second region, for example, a method of obtaining the luminance centroid of each region can be applied. The estimation method is not limited to this, and any method can be applied; for example, the center position of the first region in the first tangent direction may be estimated as the center position of the pupil in the first tangent direction, and similarly, the center position of the second region in the second tangent direction may be estimated as the center position of the pupil in the second tangent direction.
For a region surrounded by a tangent touching the corneal reflection region and the outer peripheral line of the pupil region to exist, the corneal reflection diameter must be smaller than the pupil diameter; this condition is usually satisfied.
The pupil position detection unit 358 detects the center position of the pupil from, for example, the intersection of the straight line that passes through the position estimated by the first estimation unit 352 and is orthogonal to the first tangent with the straight line that passes through the position estimated by the second estimation unit 353 and is orthogonal to the second tangent. The gaze detection unit 354 detects the subject's line of sight (gaze direction) using the detected pupil center position. The viewpoint detection unit 355 detects the subject's viewpoint using the detected gaze direction; for example, it detects the viewpoint (gaze point), i.e., the point the subject gazes at in the target image displayed on the display screen 101. Any conventionally used methods can be applied as the gaze detection method of the gaze detection unit 354 and the viewpoint detection method of the viewpoint detection unit 355. In the following, a case in which the subject's gaze direction and gaze point are detected using a stereo camera is described as an example.
The gaze detection unit 354 calculates the position of the subject's pupil (the eye position) in the three-dimensional world coordinate system by a stereo vision technique. The gaze detection unit 354 also calculates the position of the subject's corneal reflection using the images captured by the left and right cameras. The gaze detection unit 354 then calculates a gaze vector representing the subject's gaze direction from the position of the subject's pupil and the position of the corneal reflection.
The method of detecting the subject's line of sight is not limited to this. For example, the subject's line of sight may be detected by analyzing an image captured using visible light instead of infrared light.
The viewpoint detection unit 355 detects, for example, the intersection of the gaze vector expressed in the coordinate system of FIG. 1 with the XY plane as the subject's gaze point. When the gaze directions of both eyes are obtained, the gaze point may be measured by obtaining the intersection of the subject's left and right lines of sight.
FIG. 4 is a diagram showing an example of the detection of the eyes and the distance when two cameras (the right camera 202 and the left camera 204) are used. Camera calibration theory based on a stereo calibration method is applied to the two cameras in advance to obtain the camera parameters. As the stereo calibration method, any conventionally used method can be applied, such as a method using Tsai's camera calibration theory. The three-dimensional coordinates of the eye in the world coordinate system are obtained using the eye position detected from the image captured by the right camera 202, the eye position detected from the image captured by the left camera 204, and the camera parameters. This makes it possible to estimate, for example, the distance between the eye and the stereo camera 102.
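As an illustrative sketch of this stereo step (not code from the patent), the three-dimensional eye position can be triangulated with OpenCV, assuming the 3×4 projection matrices of the two calibrated cameras are available; the names below are assumptions.

    import numpy as np
    import cv2

    def eye_world_position(P_right, P_left, eye_right, eye_left):
        # eye_right / eye_left: 2-D pixel coordinates of the same eye detected
        # in the right- and left-camera images; P_right / P_left: 3x4 projection
        # matrices obtained beforehand by stereo calibration (e.g., Tsai's method).
        pts_r = np.asarray(eye_right, dtype=float).reshape(2, 1)
        pts_l = np.asarray(eye_left, dtype=float).reshape(2, 1)
        homog = cv2.triangulatePoints(P_right, P_left, pts_r, pts_l)  # 4x1
        return (homog[:3] / homog[3]).ravel()  # 3-D world coordinates of the eye

The distance between the eye and the stereo camera 102 can then be estimated, for example, as the norm of the difference between this position and the camera position.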
Returning to FIG. 3, the output control unit 356 controls the output of various kinds of information to the display unit 210, the speaker 105, and the like. For example, the output control unit 356 controls the output to the display unit 210 of the diagnostic image and of the evaluation result produced by the evaluation unit 357.
The diagnostic image may be any image suited to the evaluation processing based on the detection results for the pupil, the line of sight, the viewpoint, and the like. For example, when diagnosing developmental disabilities, a diagnostic image including images preferred by subjects with developmental disabilities (such as geometric pattern videos) and other images (such as person videos) may be used.
The evaluation unit 357 performs evaluation processing based on the diagnostic image and the gaze point detected by the viewpoint detection unit 355. For example, when diagnosing developmental disabilities, the evaluation unit 357 analyzes the diagnostic image and the gaze point and evaluates whether the subject gazed at the images preferred by subjects with developmental disabilities.
In the following, an example of the processing for obtaining the coordinates of the corneal reflection and the pupil center is described with reference to FIGS. 5 to 15.
FIG. 5 is a diagram showing an example of a captured image captured by the stereo camera 102. The captured image of FIG. 5 is an example of an image of the subject's face and includes an eye region 501.
FIG. 6 is a diagram showing an example of an eye image obtained by cutting out the eye region 501 from the captured image of FIG. 5. As shown in FIG. 6, the eye image includes a pupil 601, an iris 602, and a corneal reflection 603. At the outlines of the corneal reflection and the pupil, the image may be defocused under the influence of, for example, the depth of field of the imaging lens. In that case, regions in which the luminance changes gradually exist near the boundary between the corneal reflection 603 and the pupil 601 or near the boundary between the pupil 601 and the iris 602.
FIG. 7 is a diagram showing an example of the luminance change of the image near the corneal reflection region, and FIG. 8 is a diagram showing an example of the luminance change of the image near the pupil region.
As shown in FIG. 7, the corneal reflection 603 has a conical luminance distribution 701 that becomes brighter toward the center. For this reason, obtaining the center coordinates easily and accurately from the luminance centroid is common practice. That is, when a region (the corneal reflection region or the pupil region) contains a total of n pixels (i = 1, 2, 3, ..., n), the X coordinate and the Y coordinate of the luminance centroid of the region can be calculated using the following equations (1) and (2), respectively, where B_i is the luminance of the i-th pixel (x_i, y_i):

  X_G = ( Σ_{i=1}^{n} x_i · B_i ) / ( Σ_{i=1}^{n} B_i ) … (1)
  Y_G = ( Σ_{i=1}^{n} y_i · B_i ) / ( Σ_{i=1}^{n} B_i ) … (2)
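As a concrete illustration of equations (1) and (2), the luminance centroid of a pixel region can be computed as below; this is an illustrative NumPy sketch, not code from the patent, and the names are assumptions.

    import numpy as np

    def luminance_centroid(image, mask):
        # Luminance centroid of the pixels selected by `mask` in the
        # grayscale `image` (both 2-D arrays of the same shape).
        ys, xs = np.nonzero(mask)           # pixel coordinates (x_i, y_i)
        b = image[ys, xs].astype(float)     # luminances B_i
        return (np.sum(xs * b) / np.sum(b),  # equation (1): X_G
                np.sum(ys * b) / np.sum(b))  # equation (2): Y_G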
In the pupil region, on the other hand, as shown in FIG. 8, there is a conical luminance distribution 801 that, conversely to the corneal reflection region, becomes darker toward the center. A method of obtaining the luminance centroid of the pupil by equations (1) and (2), as in the case of the corneal reflection, is therefore conceivable.
In practice, however, part of the pupil may be missing due to the corneal reflection, as in FIG. 6. In such a case the luminance-centroid calculation may produce a large error. In this embodiment, therefore, the pupil region used for calculating the luminance centroid is restricted, separately for the X direction and the Y direction, to ranges with no loss due to the corneal reflection. This avoids the influence of the corneal reflection and allows the pupil to be detected with higher accuracy. In addition, since equations (1) and (2) above can be applied as the formulas for calculating the pupil center, the position of the pupil can be calculated with a small amount of computation.
Next, an example of the pupil detection processing of this embodiment is described with reference to FIGS. 9 to 15. FIGS. 9 to 14 are diagrams schematically representing FIG. 6; for ease of explanation, the ambiguous regions due to defocusing are omitted. FIG. 15 is a flowchart showing an example of the pupil detection processing of this embodiment.
The specifying unit 351 cuts out the eye region from the captured image (step S101). In the case of a captured image in which the eye fills the entire image, the cutting-out processing may be omitted. The specifying unit 351 then specifies the pupil region from the pixel luminances in the eye region (step S102). As shown in FIG. 9, the specifying unit 351 specifies, for example, the set of pixels whose brightness is at or below a predetermined threshold as the pupil region 901. The specifying unit 351 obtains the minimum value x1 and the maximum value x2 of the X coordinate and the minimum value y1 and the maximum value y2 of the Y coordinate of the pupil region (the pupil region 901 in FIG. 9) (steps S103 and S104).
Similarly, the specifying unit 351 specifies the corneal reflection region from the pixel luminances in the eye region (step S105). As shown in FIG. 9, the specifying unit 351 specifies, for example, the set of pixels whose brightness is at or above a predetermined threshold as the corneal reflection region 902. The specifying unit 351 obtains the minimum value x11 and the maximum value x22 of the X coordinate and the minimum value y11 and the maximum value y22 of the Y coordinate of the corneal reflection region (the corneal reflection region 902 in FIG. 9) (steps S106 and S107).
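Steps S102 to S107 can be sketched as follows; this is an illustrative sketch, and the threshold values are assumptions, not values given in this description.

    import numpy as np

    def specify_regions(eye_image, pupil_thresh=50, reflection_thresh=220):
        # Steps S102-S107: threshold the grayscale eye image into a pupil mask
        # and a corneal reflection mask, then take their bounding boxes.
        pupil_mask = eye_image <= pupil_thresh             # dark pixels (step S102)
        reflection_mask = eye_image >= reflection_thresh   # bright pixels (step S105)

        def bounding_box(mask):
            ys, xs = np.nonzero(mask)
            return xs.min(), xs.max(), ys.min(), ys.max()

        x1, x2, y1, y2 = bounding_box(pupil_mask)           # steps S103, S104
        x11, x22, y11, y22 = bounding_box(reflection_mask)  # steps S106, S107
        return pupil_mask, reflection_mask, (x1, x2, y1, y2), (x11, x22, y11, y22)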
Next, the first estimation unit 352 calculates the luminance centroid in the X direction (step S108). An example of a method for obtaining the luminance centroid in the X direction is described with reference to FIGS. 10 and 11.
The first estimation unit 352 separates the set of pupil-region pixels (x, y) in equation (1) into one or two regions. The first is the upper part of the pupil region, the region contained in "(x1, y22) < (x, y) < (x2, y2)". It corresponds to a region (the first region or a third region) that is surrounded by the tangent touching the upper part of the corneal reflection region and the outer peripheral line of the pupil region and that does not include the corneal reflection region; in FIG. 10, it is the hatched region 1001.
The second is the lower part of the pupil region, the region contained in "(x1, y11) < (x, y) < (x2, y1)". It corresponds to a region (the first region or a third region) that is surrounded by the tangent touching the lower part of the corneal reflection region and the outer peripheral line of the pupil region and that does not include the corneal reflection region. In the example of FIG. 10, "y1 < y11" holds, so no region satisfies this condition.
Thus, depending on the positional relationship between the pupil region and the corneal reflection region, the two conditions above may not both be satisfied. In that case, the region that does satisfy its condition is used for calculating the luminance centroid; in the example of FIG. 10, the region 1001 is used. Since the region 1001 is left-right symmetric, the X coordinate of the luminance centroid can be obtained using equation (1). The straight line 1101 in FIG. 11 indicates the luminance centroid in the X direction calculated in this way, and the pupil center is estimated to lie on this straight line 1101.
Next, the second estimation unit 353 calculates the luminance centroid in the Y direction (step S109). An example of a method for obtaining the luminance centroid in the Y direction is described with reference to FIGS. 12 and 13.
The second estimation unit 353 separates the set of pupil-region pixels (x, y) in equation (2) into one or two regions. The first is the left part of the pupil region, the region contained in "(x1, y1) < (x, y) < (x11, y2)". It corresponds to a region (the second region or a fourth region) that is surrounded by the tangent touching the left part of the corneal reflection region and the outer peripheral line of the pupil region and that does not include the corneal reflection region; in FIG. 12, it is the hatched region 1201.
The second is the right part of the pupil region, the region contained in "(x22, y1) < (x, y) < (x2, y2)". It corresponds to a region (the second region or a fourth region) that is surrounded by the tangent touching the right part of the corneal reflection region and the outer peripheral line of the pupil region and that does not include the corneal reflection region; in FIG. 12, it is the hatched region 1202.
When two regions are obtained, as in the example of FIG. 12, the second estimation unit 353 calculates the luminance centroid using, for example, the region with the larger area. In the example of FIG. 12, the areas of the regions 1201 and 1202 are compared and the one with the larger area is used; here, the region 1202 is used. Since the region 1202 is vertically symmetric, the Y coordinate of the luminance centroid can be obtained using equation (2). The straight line 1301 in FIG. 13 indicates the luminance centroid in the Y direction calculated in this way, and the pupil center is estimated to lie on this straight line 1301.
When two regions are obtained, the average of the Y coordinates of the luminance centroids obtained from the first region and from the second region may instead be used as the luminance centroid in the Y direction.
In FIG. 14, the point at which the straight line 1401 indicating the luminance centroid in the X direction intersects the straight line 1402 indicating the luminance centroid in the Y direction is the pupil center 1403 (step S110). The pupil position detection unit 358 thus detects the intersection of the straight line 1401 and the straight line 1402 as the pupil center 1403.
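A sketch of steps S108 to S110, building on the bounding boxes obtained above. This is illustrative only: it works in array (row, column) coordinates, it selects the larger of the two candidate sub-regions on each axis (the embodiment uses the region that exists for the X direction and, for example, the larger region for the Y direction), and all names are assumptions.

    import numpy as np

    def pupil_center(eye_image, pupil_mask, reflection_box):
        x11, x22, y11, y22 = reflection_box
        ys, xs = np.indices(eye_image.shape)

        def centroid(mask, coords):
            b = eye_image[mask].astype(float)
            return np.sum(coords[mask] * b) / np.sum(b)

        # Step S108: X-direction centroid from the pupil pixels strictly
        # outside the reflection's Y range (on its upper or lower side).
        side_a = pupil_mask & (ys < y11)
        side_b = pupil_mask & (ys > y22)
        x_c = centroid(side_a if side_a.sum() >= side_b.sum() else side_b, xs)

        # Step S109: Y-direction centroid from the pupil pixels strictly
        # outside the reflection's X range (on its left or right side).
        side_l = pupil_mask & (xs < x11)
        side_r = pupil_mask & (xs > x22)
        y_c = centroid(side_l if side_l.sum() >= side_r.sum() else side_r, ys)

        # Step S110: the pupil center is the intersection of the line x = x_c
        # and the line y = y_c.
        return x_c, y_c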
In the first embodiment, as described above, the pupil center can thus be obtained accurately with a small amount of computation by calculating the luminance centroid of the X coordinate and the luminance centroid of the Y coordinate from different regions of the pupil.
As described above, according to this embodiment, for example, the following effects can be obtained.
(1) When the corneal reflection diameter is small compared with the pupil diameter, the pupil center coordinates can be obtained more accurately.
(2) Since the amount of computation is small, the pupil can be detected even with a lower-performance CPU.
(Second embodiment)
In the second embodiment, a line-of-sight detection device and a line-of-sight detection method whose apparatus configuration can be simplified even further than in the first embodiment are realized.
The line-of-sight detection device and line-of-sight detection method of the second embodiment are described in detail below with reference to the drawings; the present invention is not limited to this embodiment. In the following, an example is described in which the line-of-sight detection device is used in a diagnosis support apparatus that supports the diagnosis of developmental disabilities and the like using the line-of-sight detection result. Applicable apparatuses are not limited to diagnosis support apparatuses.
The line-of-sight detection device (diagnosis support apparatus) of this embodiment detects the line of sight using an illumination unit installed at a single location. It also calculates the corneal curvature center position with high accuracy using a measurement obtained by having the subject gaze at one point before line-of-sight detection.
The illumination unit is an element that includes a light source and can irradiate the subject's eyeball with light. The light source is an element that generates light, such as an LED (Light Emitting Diode); it may consist of a single LED or of a plurality of LEDs combined and arranged at one location. In the following, the term "light source" may be used in this sense to denote the illumination unit.
FIGS. 16 and 17 are diagrams showing an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject in the second embodiment. Components similar to those of the first embodiment are given the same reference numerals, and their description may be omitted.
As shown in FIG. 16, the diagnosis support apparatus of the second embodiment includes the display unit 210, a stereo camera 2102, and an LED light source 2103. The stereo camera 2102 is arranged below the display unit 210, and the LED light source 2103 is arranged at the center position between the two cameras included in the stereo camera 2102. The LED light source 2103 emits near-infrared light with a wavelength of, for example, 850 nm. FIG. 16 shows an example in which the LED light source 2103 (illumination unit) is configured from nine LEDs. The stereo camera 2102 uses lenses that can transmit near-infrared light with a wavelength of 850 nm.
As shown in FIG. 17, the stereo camera 2102 includes a right camera 2202 and a left camera 2203. The LED light source 2103 irradiates the subject's eyeball 111 with near-infrared light. In the images acquired by the stereo camera 2102, the pupil 112 reflects with low luminance and appears dark, while the corneal reflection 113, which arises as a virtual image within the eyeball 111, reflects with high luminance and appears bright. The positions of the pupil 112 and the corneal reflection 113 on the image can therefore be acquired by each of the two cameras (the right camera 2202 and the left camera 2203).
Furthermore, the three-dimensional world coordinate values of the positions of the pupil 112 and the corneal reflection 113 are calculated from the positions obtained by the two cameras. In this embodiment, as the three-dimensional world coordinates, the center position of the display screen 101 is the origin, the up-down direction is the Y coordinate (up is +), the left-right direction is the X coordinate (the right side when facing the screen is +), and the depth is the Z coordinate (the near side is +).
FIG. 18 is a diagram showing an outline of the functions of the diagnosis support apparatus 2100 of the second embodiment. FIG. 18 shows part of the configuration shown in FIGS. 16 and 17 and the configuration used to drive it. As shown in FIG. 18, the diagnosis support apparatus 2100 includes the right camera 2202, the left camera 2203, the LED light source 2103, the speaker 105, the drive/IF (interface) unit 208, a control unit 2300, the storage unit 150, and the display unit 210. In FIG. 18, the display screen 101 is drawn so that its positional relationship with the right camera 2202 and the left camera 2203 is easy to understand; the display screen 101 is the screen displayed on the display unit 210. The drive unit and the IF unit may be integrated or separate.
The speaker 105 functions as an audio output unit that outputs audio for prompting the subject to pay attention, for example during calibration.
The drive/IF unit 208 drives the units included in the stereo camera 2102 and serves as an interface between the units included in the stereo camera 2102 and the control unit 2300.
The control unit 2300 can be realized by, for example, a computer including a control device such as a CPU (Central Processing Unit), storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory), a communication I/F that connects to a network and performs communication, and a bus connecting the units.
The storage unit 150 stores various kinds of information such as a control program, measurement results, and diagnosis support results; for example, it stores images to be displayed on the display unit 210. The display unit 210 displays various kinds of information such as target images for diagnosis.
FIG. 19 is a block diagram showing an example of the detailed functions of the units shown in FIG. 18. As shown in FIG. 19, the display unit 210 and the drive/IF unit 208 are connected to the control unit 2300. The drive/IF unit 208 includes the camera IFs 314 and 315, the LED drive control unit 316, and the speaker drive unit 322.
The right camera 2202 and the left camera 2203 are connected to the drive/IF unit 208 via the camera IFs 314 and 315, respectively. The drive/IF unit 208 drives these cameras to image the subject.
The speaker drive unit 322 drives the speaker 105. The diagnosis support apparatus 2100 may include an interface (printer IF) for connecting to a printer as a printing unit, and the printer may also be provided inside the diagnosis support apparatus 2100.
The control unit 2300 controls the entire diagnosis support apparatus 2100. The control unit 2300 includes a first calculation unit 2351, a second calculation unit (corneal reflection center calculation unit) 2352, a third calculation unit (corneal curvature center calculation unit) 2353, a gaze detection unit 2354, a viewpoint detection unit 2355, an output control unit 2356, and an evaluation unit 2357. As a line-of-sight detection device, at least the first calculation unit 2351, the second calculation unit 2352, the third calculation unit 2353, and the gaze detection unit 2354 need to be provided.
Each element included in the control unit 2300 (the first calculation unit 2351, the second calculation unit 2352, the third calculation unit 2353, the gaze detection unit 2354, the viewpoint detection unit 2355, the output control unit 2356, and the evaluation unit 2357) may be realized by software (a program), by a hardware circuit, or by a combination of the two.
When realized by a program, the program is recorded on a computer-readable recording medium such as a CD-ROM (Compact Disk Read Only Memory), a flexible disk (FD), a CD-R (Compact Disk Recordable), or a DVD (Digital Versatile Disk) as a file in an installable or executable format, and is provided as a computer program product. The program may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network, may be provided or distributed via a network such as the Internet, or may be provided by being incorporated in advance in a ROM or the like.
 第1算出部2351は、ステレオカメラ2102により撮像された眼球の画像から、瞳孔の中心を示す瞳孔中心の位置(第1位置)を算出する。第2算出部2352は、撮像された眼球の画像から、角膜反射の中心を示す角膜反射中心の位置(第2位置)を算出する。 The first calculation unit 2351 calculates the position (first position) of the pupil center indicating the center of the pupil from the eyeball image captured by the stereo camera 2102. The second calculator 2352 calculates the position of the corneal reflection center (second position) indicating the center of corneal reflection from the captured image of the eyeball.
 第3算出部2353は、LED光源2103と角膜反射中心とを結ぶ直線(第1直線)と、から角膜曲率中心(第4位置)を算出する。例えば、第3算出部2353は、この直線上で、角膜反射中心からの距離が所定値となる位置を、角膜曲率中心として算出する。所定値は、一般的な角膜の曲率半径値などから事前に定められた値を用いることができる。 The third calculation unit 2353 calculates the corneal curvature center (fourth position) from the straight line (first straight line) connecting the LED light source 2103 and the corneal reflection center. For example, the third calculation unit 2353 calculates a position on the straight line where the distance from the corneal reflection center is a predetermined value as the corneal curvature center. As the predetermined value, a value determined in advance from a general radius of curvature of the cornea or the like can be used.
 Since the corneal radius of curvature varies between individuals, calculating the corneal curvature center from a predetermined value may introduce a large error. The third calculation unit 2353 may therefore calculate the corneal curvature center in a way that accounts for individual differences. In that case, the third calculation unit 2353 first uses the pupil center and the corneal reflection center calculated while the subject gazes at a target position (third position) to compute the intersection of the straight line connecting the pupil center and the target position (second straight line) with the first straight line connecting the corneal reflection center and the LED light source 2103. The third calculation unit 2353 then calculates the distance (first distance) between the pupil center and this intersection and stores it, for example, in the storage unit 150.
 The target position may be any predetermined position whose three-dimensional world coordinates can be calculated. For example, the center of the display screen 101 (the origin of the three-dimensional world coordinate system) can be used as the target position. In this case, the output control unit 2356, for example, displays an image to be gazed at (a target image) at the target position (center) on the display screen 101, which draws the subject's gaze to the target position.
 The target image may be any image that attracts the subject's attention, for example an image whose display attributes such as luminance or color change, or an image whose display attributes differ from those of the surrounding region.
 The target position is not limited to the center of the display screen 101 and may be any position. Using the center of the display screen 101 as the target position minimizes the distance to any edge of the screen, which makes it possible, for example, to further reduce the measurement error during gaze detection.
 The processing up to the calculation of the distance is executed in advance, for example before actual gaze detection begins. During actual gaze detection, the third calculation unit 2353 calculates, as the corneal curvature center, the position on the straight line connecting the LED light source 2103 and the corneal reflection center whose distance from the pupil center equals the previously calculated distance.
 The line-of-sight detection unit 2354 detects the subject's line of sight from the pupil center and the corneal curvature center. For example, the line-of-sight detection unit 2354 detects the direction from the corneal curvature center toward the pupil center as the subject's gaze direction.
 The viewpoint detection unit 2355 detects the subject's viewpoint using the detected gaze direction. For example, the viewpoint detection unit 2355 detects the viewpoint (gaze point), that is, the point on the display screen 101 at which the subject is gazing, by computing the intersection of the gaze vector, expressed in the three-dimensional world coordinate system as in FIG. 17, with the XY plane.
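 This ray-plane intersection can be sketched as follows, assuming (as in FIG. 17) that the display screen lies in the z = 0 (XY) plane of the world coordinate system; the function and variable names are illustrative, not the patent's:

```python
import numpy as np

def gaze_point_on_xy_plane(curvature_center, pupil_center):
    """Intersection of the gaze ray (corneal curvature center -> pupil
    center) with the plane z = 0, where the screen is assumed to lie."""
    d = pupil_center - curvature_center            # gaze direction
    if abs(d[2]) < 1e-9:
        return None                                # gaze parallel to screen
    t = -curvature_center[2] / d[2]                # solve c_z + t * d_z = 0
    return curvature_center + t * d
```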
 The output control unit 2356 controls the output of various kinds of information to the display unit 210, the speaker 105, and so on. For example, the output control unit 2356 causes the target image to be output at the target position on the display unit 210, and it controls the output to the display unit 210 of diagnostic images, evaluation results produced by the evaluation unit 2357, and the like.
 The diagnostic image may be any image suited to the evaluation processing based on the gaze (viewpoint) detection result. For diagnosing a developmental disorder, for example, a diagnostic image may be used that contains both an image preferred by subjects with developmental disorders (such as a geometric-pattern video) and another image (such as a video of a person).
 The evaluation unit 2357 performs evaluation processing based on the diagnostic image and the gaze point detected by the viewpoint detection unit 2355. For diagnosing a developmental disorder, for example, the evaluation unit 2357 analyzes the diagnostic image and the gaze point and evaluates whether the subject gazed at the image preferred by subjects with developmental disorders.
 The output control unit 2356 may display the same diagnostic images as in the first embodiment, and the evaluation unit 2357 may perform the same evaluation processing as the evaluation unit 357 of the first embodiment. In other words, the pupil detection processing of the first embodiment (the identification unit 351, the first estimation unit 352, the second estimation unit 353, and the pupil position detection unit 358) and its gaze detection processing (the gaze detection unit 354) may be replaced by the pupil detection processing of the second embodiment (the first calculation unit 2351) and its gaze detection processing (the second calculation unit 2352, the third calculation unit 2353, and the gaze detection unit 2354). This makes it possible to obtain the effects of the second embodiment (such as a simplified apparatus configuration) in addition to the effects of the first embodiment.
 FIG. 20 illustrates an overview of the processing executed by the diagnosis support apparatus 2100 of the present embodiment. Elements already described with reference to FIGS. 16 to 19 are given the same reference numerals, and their description is omitted.
 The pupil center 407 and the corneal reflection center 408 represent, respectively, the center of the pupil and the center of the corneal reflection point detected when the LED light source 2103 is lit. The corneal radius of curvature 409 is the distance from the corneal surface to the corneal curvature center 410.
 FIG. 21 illustrates the difference between a method that uses two light sources (illumination units), hereinafter referred to as method A, and the present embodiment, which uses one light source (illumination unit). Elements already described with reference to FIGS. 16 to 19 are given the same reference numerals, and their description is omitted.
 Method A uses two LED light sources 511 and 512 instead of the LED light source 2103. In method A, the intersection is calculated between the straight line 515, which connects the corneal reflection center 513 obtained under illumination by the LED light source 511 with the LED light source 511, and the straight line 516, which connects the corneal reflection center 514 obtained under illumination by the LED light source 512 with the LED light source 512. This intersection is the corneal curvature center 505.
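 In practice, two measured 3D lines such as 515 and 516 rarely intersect exactly, so an implementation would use the point closest to both lines. The sketch below is ours, not the patent's; the same construction also fits the intersection of lines 613 and 614 in the calibration of FIG. 22:

```python
import numpy as np

def near_intersection(p1, d1, p2, d2):
    """Point closest to both 3D lines p1 + s*d1 and p2 + t*d2. Measured
    lines rarely meet exactly, so the midpoint of the shortest segment
    between them serves as the 'intersection'."""
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return None                                # lines are parallel
    s = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((p1 + s * d1) + (p2 + t * d2)) / 2.0
```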
 In contrast, the present embodiment considers the straight line 523 connecting the corneal reflection center 522, obtained under illumination by the LED light source 2103, with the LED light source 2103. The straight line 523 passes through the corneal curvature center 505. It is also known that the corneal radius of curvature is nearly constant, with little variation between individuals. Accordingly, the corneal curvature center under illumination by the LED light source 2103 lies on the straight line 523 and can be calculated using a typical radius-of-curvature value.
 However, when the viewpoint is calculated using a corneal curvature center obtained from a typical radius-of-curvature value, individual differences between eyeballs can shift the computed viewpoint from its true position, and accurate viewpoint detection may not be possible.
 FIG. 22 illustrates the calculation processing, performed before viewpoint (gaze) detection, that calculates the corneal curvature center position and the distance between the pupil center position and the corneal curvature center position. Elements already described with reference to FIGS. 16 to 19 are given the same reference numerals, and their description is omitted. The connections between the left and right cameras (the right camera 2202 and the left camera 2203) and the control unit 2300 are not shown.
 The target position 605 is a position at which a target image or the like is displayed as a single point on the display unit 210 for the subject to stare at; in this embodiment, it is the center of the display screen 101. The straight line 613 connects the LED light source 2103 and the corneal reflection center 612. The straight line 614 connects the target position 605 (gaze point) at which the subject is staring and the pupil center 611. The corneal curvature center 615 is the intersection of the straight line 613 and the straight line 614. The third calculation unit 2353 calculates and stores the distance 616 between the pupil center 611 and the corneal curvature center 615.
 FIG. 23 is a flowchart showing an example of the calculation processing of the present embodiment.
 First, the output control unit 2356 plays back the target image at one point on the display screen 101 (step S201) and has the subject gaze at that point. Next, the control unit 2300 uses the LED drive control unit 316 to light the LED light source 2103 toward the subject's eyes (step S202). The control unit 2300 then images the subject's eyes with the left and right cameras (the right camera 2202 and the left camera 2203) (step S203).
 Under illumination by the LED light source 2103, the pupil appears as a dark region (dark pupil), while the reflection of the LED light produces a virtual image, so the corneal reflection point (corneal reflection center) is detected as a bright region. The first calculation unit 2351 thus detects the pupil region in the captured image and calculates the coordinates of the pupil center; for example, it detects as the pupil the region at or below a predetermined brightness that contains the darkest pixels within a fixed region around the eye, and detects as the corneal reflection the region at or above a predetermined brightness that contains the brightest pixels. The second calculation unit 2352 likewise detects the corneal reflection region in the captured image and calculates the coordinates of the corneal reflection center. The first calculation unit 2351 and the second calculation unit 2352 calculate these coordinate values for each of the two images acquired by the left and right cameras (step S204).
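 A minimal sketch of this dark-pupil/bright-glint detection on a grayscale eye patch follows. The threshold values and the use of whole-mask centroids rather than connected-component analysis are simplifying assumptions of ours, not details from the patent:

```python
import numpy as np

def detect_centers(eye_gray, pupil_thresh=40, glint_thresh=220):
    """Rough pupil and corneal-reflection centers in a grayscale eye patch:
    pixels at or below pupil_thresh form the pupil candidate, pixels at or
    above glint_thresh form the glint candidate; each candidate is
    summarized by the centroid of its mask."""
    def centroid(mask):
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return None
        return float(xs.mean()), float(ys.mean())

    return (centroid(eye_gray <= pupil_thresh),
            centroid(eye_gray >= glint_thresh))
```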
 To obtain three-dimensional world coordinates, the left and right cameras are calibrated in advance by a stereo calibration method, and the corresponding transformation parameters are calculated. Any conventional stereo calibration method can be applied, such as one based on Tsai's camera calibration theory.
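 The patent does not specify how the stereo conversion is implemented. Assuming calibration has produced a 3x4 projection matrix for each camera, the conversion of step S205 could be sketched with OpenCV as follows (function and variable names are illustrative):

```python
import numpy as np
import cv2

def to_world(P_left, P_right, pt_left, pt_right):
    """Triangulate one left/right image-point correspondence into 3D world
    coordinates, given the 3x4 projection matrices obtained from a prior
    stereo calibration."""
    pts4d = cv2.triangulatePoints(
        P_left, P_right,
        np.asarray(pt_left, dtype=np.float64).reshape(2, 1),
        np.asarray(pt_right, dtype=np.float64).reshape(2, 1))
    return (pts4d[:3] / pts4d[3]).ravel()          # homogeneous -> Euclidean
```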
 Using these transformation parameters, the first calculation unit 2351 and the second calculation unit 2352 convert the left- and right-camera coordinates into the three-dimensional world coordinates of the pupil center and the corneal reflection center (step S205). The third calculation unit 2353 obtains the straight line connecting the world coordinates of the corneal reflection center and the world coordinates of the center position of the LED light source 2103 (step S206). Next, the third calculation unit 2353 calculates the straight line connecting the world coordinates of the center of the target image displayed at one point on the display screen 101 and the world coordinates of the pupil center (step S207). The third calculation unit 2353 obtains the intersection of the straight line calculated in step S206 and the straight line calculated in step S207 and takes this intersection as the corneal curvature center (step S208). The third calculation unit 2353 then calculates the distance between the pupil center and the corneal curvature center at this time and stores it in the storage unit 150 or the like (step S209). The stored distance is used to calculate the corneal curvature center in subsequent viewpoint (gaze) detection.
 The distance between the pupil center and the corneal curvature center obtained while the subject stares at the single point on the display unit 210 during the calculation processing remains constant over the entire range in which viewpoints on the display unit 210 are detected. The distance between the pupil center and the corneal curvature center may be obtained as the average of all the values calculated during playback of the target image, or as the average of some subset of the values calculated during playback.
 FIG. 24 illustrates how, during viewpoint detection, the corrected position of the corneal curvature center is calculated using the previously obtained distance between the pupil center and the corneal curvature center. The gaze point 805 is the gaze point obtained from a corneal curvature center calculated with a typical radius-of-curvature value, and the gaze point 806 is the gaze point obtained from a corneal curvature center calculated with the previously obtained distance.
 The pupil center 811 and the corneal reflection center 812 indicate, respectively, the pupil center position and the corneal reflection center position calculated at the time of viewpoint detection. The straight line 813 connects the LED light source 2103 and the corneal reflection center 812. The corneal curvature center 814 is the position calculated from a typical radius-of-curvature value, and the distance 815 is the pupil-center-to-curvature-center distance obtained by the prior calculation processing. The corneal curvature center 816 is the position calculated using that previously obtained distance; it is determined from the facts that the corneal curvature center lies on the straight line 813 and that the distance between the pupil center and the corneal curvature center equals the distance 815. The line of sight 817, which would be obtained with a typical radius-of-curvature value, is thereby corrected to the line of sight 818, and the gaze point on the display screen 101 is corrected from the gaze point 805 to the gaze point 806. The connections between the left and right cameras (the right camera 2202 and the left camera 2203) and the control unit 2300 are not shown.
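 Locating the corneal curvature center 816 amounts to a line-sphere intersection: the point on line 813 at the calibrated distance 815 from the pupil center 811. A minimal sketch follows; choosing the root farther from the light source (a center behind the cornea) is our assumption, as the patent does not discuss root selection:

```python
import numpy as np

def corrected_curvature_center(light, reflection, pupil, r815):
    """Corneal curvature center 816: the point on line 813 (through `light`
    and `reflection`) whose distance from `pupil` equals the calibrated
    distance `r815`. Solves |light + t*d - pupil| = r815 for t."""
    d = reflection - light
    d = d / np.linalg.norm(d)
    m = light - pupil
    b = m @ d
    c = m @ m - r815 * r815
    disc = b * b - c
    if disc < 0:
        return None                    # no point on the line at that distance
    t = -b + np.sqrt(disc)             # larger root: farther from the light
    return light + t * d
```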
 FIG. 25 is a flowchart showing an example of the gaze detection processing of the present embodiment. The gaze detection processing of FIG. 25 can be executed, for example, as the gaze detection step within diagnosis processing that uses diagnostic images. In the diagnosis processing, besides the steps in FIG. 25, processing for displaying diagnostic images, evaluation processing by the evaluation unit 2357 using the detected gaze points, and the like are executed.
 Steps S301 to S305 are the same as steps S202 to S206 in FIG. 23, so their description is omitted.
 The third calculation unit 2353 calculates, as the corneal curvature center, the position on the straight line calculated in step S305 whose distance from the pupil center equals the distance obtained by the prior calculation processing (step S306).
 The line-of-sight detection unit 2354 obtains the vector (gaze vector) connecting the pupil center and the corneal curvature center (step S307); this vector indicates the direction in which the subject is looking. The viewpoint detection unit 2355 calculates the three-dimensional world coordinates of the intersection between this gaze direction and the display screen 101 (step S308); this value expresses, in world coordinates, the single point on the display unit 210 at which the subject is gazing. The viewpoint detection unit 2355 then converts the obtained three-dimensional world coordinates into coordinates (x, y) in the two-dimensional coordinate system of the display unit 210 (step S309). The viewpoint (gaze point) on the display unit 210 at which the subject is staring can thereby be calculated.
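 The patent leaves the world-to-screen conversion of step S309 unspecified. A minimal sketch under the assumption that the screen's origin, in-plane unit axes, and pixel pitch are known in world coordinates from the system's geometric setup (all names and parameters are ours):

```python
import numpy as np

def world_to_screen(p_world, origin, x_axis, y_axis, px_per_mm):
    """Convert a world-coordinate point lying on the screen plane into the
    display's 2D coordinate system, given the screen origin, the in-plane
    unit axes in world coordinates, and the pixel pitch."""
    v = p_world - origin
    return np.array([v @ x_axis, v @ y_axis]) * px_per_mm
```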
(Modification)
 The calculation processing that obtains the distance between the pupil center position and the corneal curvature center position is not limited to the method described with reference to FIGS. 22 and 23. Another example of the calculation processing is described below with reference to FIGS. 26 and 27.
 FIG. 26 illustrates the calculation processing of this modification. Elements already described with reference to FIGS. 16 to 19 and FIG. 22 are given the same reference numerals, and their description is omitted.
 The line segment 1101 (first line segment) connects the target position 605 and the LED light source 2103. The line segment 1102 (second line segment) is parallel to the line segment 1101 and connects the pupil center 611 to the straight line 613. In this modification, the distance 616 between the pupil center 611 and the corneal curvature center 615 is calculated using the line segments 1101 and 1102 and stored, as follows.
 FIG. 27 is a flowchart showing an example of the calculation processing of this modification.
 Steps S401 to S407 are the same as steps S201 to S207 in FIG. 23, so their description is omitted.
 The third calculation unit 2353 calculates the line segment connecting the center of the target image displayed at one point on the display screen 101 and the center of the LED light source 2103 (the line segment 1101 in FIG. 26), and calculates the length of this line segment (denoted L1101) (step S408).
 The third calculation unit 2353 calculates the line segment passing through the pupil center 611 and parallel to the line segment calculated in step S408 (the line segment 1102 in FIG. 26), and calculates the length of this line segment (denoted L1102) (step S409).
 Based on the similarity between the triangle whose apex is the corneal curvature center 615 and whose base is the line segment calculated in step S408, and the triangle whose apex is the corneal curvature center 615 and whose base is the line segment calculated in step S409, the third calculation unit 2353 calculates the distance 616 between the pupil center 611 and the corneal curvature center 615 (step S410). For example, the third calculation unit 2353 calculates the distance 616 so that the ratio of the length of the line segment 1102 to the length of the line segment 1101 equals the ratio of the distance 616 to the distance between the target position 605 and the corneal curvature center 615.
 The distance 616 can be calculated by the following equation (3), where L614 is the distance from the target position 605 to the pupil center 611:

 Distance 616 = (L614 × L1102) / (L1101 − L1102)   (3)

 Equation (3) follows directly from the similar triangles: writing D for the distance 616 and H for the distance from the target position 605 to the corneal curvature center 615, the collinearity of the target position 605, the pupil center 611, and the corneal curvature center 615 along the straight line 614 gives H = D + L614, while the similarity gives L1102 / L1101 = D / H. Solving D / (D + L614) = L1102 / L1101 for D yields equation (3).
 The third calculation unit 2353 stores the calculated distance 616 in the storage unit 150 or the like (step S411). The stored distance is used to calculate the corneal curvature center in subsequent viewpoint (gaze) detection.
 As described above, the present embodiment provides, for example, the following effects.
(1) There is no need to place light sources (illumination units) at two locations; gaze detection can be performed with a light source placed at a single location.
(2) Because only one light-source location is needed, the apparatus can be made more compact, and costs can be reduced.
 The constituent elements of the first embodiment and the second embodiment may be combined as appropriate.
 As described above, the pupil detection device, line-of-sight detection device, and pupil detection method according to the present invention are suitable for, among other applications, diagnosis support apparatuses and diagnosis support methods for developmental disorders that use captured images.
 100, 2100 Diagnosis support apparatus
 101 Display screen
 102, 2102 Stereo camera
 105 Speaker
 150 Storage unit
 202, 2202 Right camera
 203, 205 Infrared LED light source
 204, 2203 Left camera
 208 Drive/IF unit
 210 Display unit
 300, 2300 Control unit
 316 LED drive control unit
 322 Speaker drive unit
 351 Identification unit
 352 First estimation unit
 353 Second estimation unit
 354, 2354 Line-of-sight detection unit
 355, 2355 Viewpoint detection unit
 356, 2356 Output control unit
 357, 2357 Evaluation unit
 2351 First calculation unit
 2352 Second calculation unit (corneal reflection center calculation unit)
 2353 Third calculation unit (corneal curvature center calculation unit)

Claims (9)

  1.  A pupil detection device comprising:
     an identification unit that identifies a pupil region and a corneal reflection region from an image of an eye;
     a first estimation unit that estimates a center position of the pupil using a first region included in the pupil region;
     a second estimation unit that estimates a center position of the pupil using a second region that is included in the pupil region and differs from the first region; and
     a pupil position detection unit that detects the center position of the pupil based on the center position estimated by the first estimation unit and the center position estimated by the second estimation unit.
  2.  The pupil detection device according to claim 1, wherein
     the first estimation unit estimates the center position of the pupil in a first tangential direction using the first region, the first region being enclosed by a first tangent to the corneal reflection region and the outer periphery of the pupil region and not including the corneal reflection region, and
     the second estimation unit estimates the center position of the pupil in a second tangential direction using the second region, the second region being enclosed by a second tangent to the corneal reflection region, orthogonal to the first tangent, and the outer periphery, and not including the corneal reflection region.
  3.  The pupil detection device according to claim 2, wherein
     the first estimation unit estimates the luminance centroid of the first region as the center position of the pupil in the first tangential direction, and
     the second estimation unit estimates the luminance centroid of the second region as the center position of the pupil in the second tangential direction.
  4.  The pupil detection device according to claim 2, wherein
     when there exists a third region that is enclosed by a third tangent to the corneal reflection region, parallel to the first tangent, and the outer periphery, and that does not include the corneal reflection region, the first estimation unit estimates the center position of the pupil in the first tangential direction using the first region and the third region, and
     when there exists a fourth region that is enclosed by a fourth tangent to the corneal reflection region, parallel to the second tangent, and the outer periphery, and that does not include the corneal reflection region, the second estimation unit estimates the center position of the pupil in the second tangential direction using the second region and the fourth region.
  5.  The pupil detection device according to claim 4, wherein
     the first estimation unit estimates the center position of the pupil in the first tangential direction using the larger of the first region and the third region, and
     the second estimation unit estimates the center position of the pupil in the second tangential direction using the larger of the second region and the fourth region.
  6.  The pupil detection device according to claim 4, wherein
     the first estimation unit estimates, as the center position of the pupil in the first tangential direction, the average of the center position estimated using the first region and the center position estimated using the third region, and
     the second estimation unit estimates, as the center position of the pupil in the second tangential direction, the average of the center position estimated using the second region and the center position estimated using the fourth region.
  7.  The pupil detection device according to any one of claims 2 to 6, wherein
     the first tangent is a horizontal line of the image, and
     the second tangent is a vertical line of the image.
  8.  A line-of-sight detection device comprising:
     the pupil detection device according to any one of claims 1 to 7;
     an illumination unit including a light source that emits light;
     a corneal reflection center calculation unit that calculates a position indicating the center of a corneal reflection from an image of an eye captured while illuminated by the illumination unit;
     a corneal curvature center calculation unit that calculates a position indicating the corneal curvature center based on a straight line connecting the light source and the position indicating the center of the corneal reflection; and
     a line-of-sight detection unit that detects the line of sight of a subject based on the center position of the pupil detected by the pupil position detection unit and the position indicating the corneal curvature center.
  9.  A pupil detection method comprising:
     an identification step of identifying a pupil region and a corneal reflection region from an image of an eye;
     a first estimation step of estimating a center position of the pupil using a first region included in the pupil region;
     a second estimation step of estimating a center position of the pupil using a second region that is included in the pupil region and differs from the first region; and
     a pupil position detection step of detecting the center position of the pupil based on the center position estimated in the first estimation step and the center position estimated in the second estimation step.
PCT/JP2014/080682 2013-11-29 2014-11-19 Pupil detection device, line-of-sight detection device, and pupil detection method WO2015080003A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013247956 2013-11-29
JP2013-247956 2013-11-29
JP2014043098A JP6269177B2 (en) 2013-11-29 2014-03-05 Pupil detection device, gaze detection device, and pupil detection method
JP2014-043098 2014-03-05

Publications (1)

Publication Number Publication Date
WO2015080003A1 true WO2015080003A1 (en) 2015-06-04

Family

ID=53198947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/080682 WO2015080003A1 (en) 2013-11-29 2014-11-19 Pupil detection device, line-of-sight detection device, and pupil detection method

Country Status (2)

Country Link
JP (1) JP6269177B2 (en)
WO (1) WO2015080003A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3011894A1 (en) * 2014-10-24 2016-04-27 JVC KENWOOD Corporation Eye gaze detection apparatus and eye gaze detection method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08327887A (en) * 1995-05-31 1996-12-13 Matsushita Electric Ind Co Ltd Line of sight detecting device
JP2003144388A (en) * 2001-11-16 2003-05-20 Canon Inc Visual line detector
JP2008264341A (en) * 2007-04-24 2008-11-06 Chube Univ Eye movement measurement method and eye movement measuring instrument

Also Published As

Publication number Publication date
JP6269177B2 (en) 2018-01-31
JP2015126850A (en) 2015-07-09

Similar Documents

Publication Publication Date Title
US10722113B2 (en) Gaze detection apparatus and gaze detection method
EP3075304B1 (en) Line-of-sight detection assistance device and line-of-sight detection assistance method
US10896324B2 (en) Line-of-sight detection device and method for detecting line of sight
EP3123943B1 (en) Detection device and detection method
US11023039B2 (en) Visual line detection apparatus and visual line detection method
JP6201956B2 (en) Gaze detection device and gaze detection method
JP6245093B2 (en) Diagnosis support apparatus and diagnosis support method
JP2016028669A (en) Pupil detection device and pupil detection method
EP3028644B1 (en) Diagnosis assistance device and diagnosis assistance method
JP2020038734A (en) Visual line detection device and visual line detection method
JP6269177B2 (en) Pupil detection device, gaze detection device, and pupil detection method
US20130321608A1 (en) Eye direction detecting apparatus and eye direction detecting method
JP6187347B2 (en) Detection apparatus and detection method
JP6471533B2 (en) Gaze detection device and gaze detection method
JP2017131446A (en) Pupil detection apparatus and pupil detection method
JP2020119583A (en) Visual line detection device and visual line detection method
JP2016157326A (en) Line of sight detection device and line of sight detection method
JP2015181797A (en) Detection apparatus and detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14866490; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14866490; Country of ref document: EP; Kind code of ref document: A1)