WO2017154356A1 - Sight line detection device and sight line detection method - Google Patents

Sight line detection device and sight line detection method

Info

Publication number
WO2017154356A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pupil
center
coordinates
corneal reflection
Prior art date
Application number
PCT/JP2017/001476
Other languages
French (fr)
Japanese (ja)
Inventor
河内 隆宏
Original Assignee
アルプス電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by アルプス電気株式会社 (Alps Electric Co., Ltd.)
Publication of WO2017154356A1 publication Critical patent/WO2017154356A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/60: Analysis of geometric attributes

Definitions

  • The present invention relates to a line-of-sight detection device and a line-of-sight detection method capable of detecting the line-of-sight direction of a driver of a vehicle or another subject.
  • The gaze detection device described in Patent Literature 1 acquires a bright pupil image, a dark pupil image, and a corneal reflection image, and binarizes the difference image between the bright pupil image and the dark pupil image using the midpoint between its maximum and minimum values as a threshold. This device then detects the outer periphery of the pupil image by applying ellipse approximation or similar processing to the region near the pupil image found in the binarized image, and takes the center of the detected ellipse as the center position of the pupil image.
  • However, the gaze detection device described in Patent Literature 1 has the following problem. The light sources used to acquire the bright pupil image and the dark pupil image do not match in conditions such as wavelength, position, angle of incidence and reflection with respect to the eye, and emission timing, so the positions of the corneal reflection images in the bright pupil image and the dark pupil image may not coincide exactly. When the difference between such a bright pupil image and dark pupil image is taken, the corneal reflection image is not removed, because its coordinates, brightness, and size differ between the two images, and part of it remains in the difference image.
  • If this residual corneal reflection image lies near the pupil image, it is also included in the ellipse approximation used to detect the outer periphery of the pupil image, so the outer periphery and center position of the pupil image can no longer be detected accurately.
  • An object of the present invention is therefore to provide a gaze detection device and a gaze detection method that can suppress the influence of a residual corneal reflection image in the difference image and thereby accurately calculate the center of the pupil.
  • The gaze detection device of the present invention includes: a pupil image extraction unit that acquires a bright pupil image and a dark pupil image and creates a difference image between them; an image processing unit that creates a binary image by binarizing the difference image and selects a pupil candidate from one or more bright spots in the binary image; a corneal reflection light center detection unit that calculates, as reflected light center coordinates, the coordinates of the corneal reflection light center in the dark pupil image; and a determination unit that determines, based on the reflected light center coordinates, whether the corneal reflection light center is within a first range of the binary image that includes the pupil candidate.
  • When the determination unit determines that the corneal reflection light center is within the first range, the image processing unit creates a pupil image in which the luminance of the pixels within a second range including the corneal reflection light center is reduced. The device further includes a pupil center calculation unit that calculates the coordinates of the pupil center based on the pupil image, and a gaze direction calculation unit that calculates the gaze direction based on the reflected light center coordinates and the pupil center coordinates.
  • The gaze detection method of the present invention includes: a step of acquiring a bright pupil image and a dark pupil image; a step of creating a difference image between the bright pupil image and the dark pupil image and binarizing it to create a binary image; a pupil candidate selection step of selecting a pupil candidate from one or more bright spots in the binary image; a corneal reflection light center calculation step of calculating, as reflected light center coordinates, the coordinates of the corneal reflection light center in the dark pupil image; a determination step of determining, based on the reflected light center coordinates, whether the corneal reflection light center is within a first range of the binary image that includes the pupil candidate; a pupil image creation step of creating, when the determination step determines that the corneal reflection light center is within the first range, a pupil image in which the luminance of the pixels within a second range including the corneal reflection light center is reduced; a pupil center calculation step of calculating the coordinates of the pupil center based on the pupil image; and a step of calculating the gaze direction based on the reflected light center coordinates and the pupil center coordinates.
  • According to the line-of-sight detection device and line-of-sight detection method of the present invention, it is determined whether the corneal reflection light center is within the first range, and if it is, the luminance of the second range including the corneal reflection light center is reduced. Therefore, even if part of the corneal reflection image remains in the difference image, its influence can be suppressed, the pupil center can be calculated accurately, and the accuracy of line-of-sight detection can be further increased.
  • In the line-of-sight detection device, when the determination unit determines that the corneal reflection light center is within the first range, the image processing unit preferably calculates an estimated value of the coordinates of the corneal reflection light center in the binary image and creates a pupil image in which the luminance of the pixels within the second range including those coordinates is reduced.
  • Likewise, in the line-of-sight detection method, when the determination step determines that the corneal reflection light center is within the first range, the pupil image creation step preferably calculates an estimated value of the coordinates of the corneal reflection light center in the binary image and creates a pupil image in which the luminance of the pixels within the second range including the coordinates corresponding to that estimate is reduced.
  • In this way the image of the corneal reflection light remaining in the binary image can be removed accurately, so the calculation accuracy of the pupil center can be further increased.
  • When the determination unit determines that the corneal reflection light center is within the first range, the image processing unit preferably sets the luminance of the pixels within the second range to zero. This completely removes the image of the corneal reflection light from the dark pupil image that remains in the binary image, so the pupil center can be calculated with higher accuracy.
  • The device preferably includes a first camera and a second camera that are arranged apart from each other and each acquire an image of a region including at least the eyes, a first light source arranged close to the first camera, and a second light source arranged close to the second camera, and the pupil image extraction unit preferably acquires the bright pupil image and the dark pupil image from the images acquired by the respective cameras.
  • The first range is preferably an elliptical range calculated based on the pupil candidate selected in the pupil candidate selection step.
  • This allows an appropriate range to be set to suit the eye shapes of various subjects, which contributes to accurate calculation of the pupil center.
  • Alternatively, the first range may be the range of pixels of the pupil candidate selected in the pupil candidate selection step.
  • In that case the first range can be set precisely based on the pupil candidate, which also contributes to accurate calculation of the pupil center.
  • The pupil center calculation step preferably calculates the center of the elliptical shape as the pupil center.
  • This keeps the accuracy of setting the pupil center constant, which contributes to highly accurate line-of-sight detection.
  • As a result, even if part of the corneal reflection image remains in the difference image, the influence of this residual image can be suppressed, and the center of the pupil can be calculated accurately.
  • FIG. 3 is an enlarged view of FIG. 2(C). FIG. 4 is a flowchart showing the process flow of the gaze detection method according to the first embodiment of the present invention. FIG. 5 is a flowchart showing the process flow of the gaze detection method according to the second embodiment of the present invention.
  • FIG. 1 is a block diagram showing the configuration of the line-of-sight detection device 10 according to the first embodiment.
  • FIG. 2(A) is an example of a bright pupil image acquired by the line-of-sight detection device 10, FIG. 2(B) is an example of a dark pupil image, and FIG. 2(C) shows the difference image between the bright pupil image of FIG. 2(A) and the dark pupil image of FIG. 2(B).
  • FIG. 3 is an enlarged view of FIG. 2(C).
  • As shown in FIG. 1, the line-of-sight detection device 10 includes two light sources 11 and 12, two cameras 21 and 22, a control unit 30, two light source control units 31 and 32, two image acquisition units 41 and 42, a pupil image extraction unit 51, a pupil center calculation unit 54, a corneal reflection light center detection unit 55, a gaze direction calculation unit 56, an image processing unit 57, and a determination unit 58.
  • The line-of-sight detection device 10 is installed in the vehicle cabin, for example on the instrument panel or at the top of the windshield, so as to face the face of the driver as the subject.
  • The cameras 21 and 22 are arranged so that their optical axes are separated from each other by a predetermined distance.
  • The cameras 21 and 22 have, for example, a CMOS (complementary metal oxide semiconductor) image sensor. This image sensor acquires an image of the face including the driver's eyes, and light is detected by a plurality of pixels arranged in the horizontal and vertical directions.
  • Each of the first light source 11 and the second light source 12 includes, for example, a plurality of LED (light emitting diode) light sources.
  • The first light source 11 is arranged, for example, outside the lens of the first camera 21 so as to surround the lens, and the second light source 12 is arranged, for example, outside the lens of the second camera 22 so as to surround the lens.
  • The cameras 21 and 22 are preferably provided with band-pass filters matched to the wavelengths of the detection light emitted from the two light sources 11 and 12.
  • This allows the pupil image extraction unit 51 to extract the pupil image and the gaze direction calculation unit 56 to calculate the gaze direction with high accuracy.
  • The LED light sources of the first light source 11 and of the second light source 12 both emit infrared light (near-infrared light) of 800 nm to 1000 nm and are arranged so that this detection light can be directed at the driver's eyes.
  • In particular, 850 nm is a wavelength with a low light absorption rate inside the human eyeball, and light of this wavelength is easily reflected by the retina at the back of the eyeball.
  • The distance between the optical axis of the first camera 21 and the optical axes of the LED light sources of the first light source 11 is made sufficiently short relative to the distance between the optical axes of the first camera 21 and the second camera 22, taking into account the distance between the line-of-sight detection device 10 and the driver. The first light source 11 can therefore be regarded as substantially coaxial with the first camera 21. Similarly, the distance between the optical axis of the second camera 22 and the optical axes of the LED light sources of the second light source 12 is sufficiently short relative to the distance between the optical axes of the first camera 21 and the second camera 22, so the second light source 12 can be regarded as substantially coaxial with the second camera 22.
  • In contrast, because the distance between the optical axes of the first camera 21 and the second camera 22 is made sufficiently long, the optical axes of the first light source 11 and the first camera 21 are not coaxial with those of the second light source 12 and the second camera 22.
  • In the following description, this arrangement is expressed by saying that two members are substantially coaxial, or that two members are non-coaxial.
  • The light source control units 31 and 32, the image acquisition units 41 and 42, the pupil image extraction unit 51, the pupil center calculation unit 54, the corneal reflection light center detection unit 55, the gaze direction calculation unit 56, the image processing unit 57, and the determination unit 58 are implemented by the CPU and memory of a computer, and the processing of each unit is performed by executing software installed in advance.
  • The light source control unit 31 and the light source control unit 32 control the lighting and non-lighting of the first light source 11 and the second light source 12, respectively, in accordance with instruction signals from the control unit 30.
  • The cameras 21 and 22 capture images in accordance with instruction signals from the control unit 30, and the images captured by the cameras 21 and 22 are acquired frame by frame by the image acquisition units 41 and 42, respectively.
  • The images acquired by the image acquisition units 41 and 42 are read into the pupil image extraction unit 51 frame by frame.
  • The pupil image extraction unit 51 includes a bright pupil image detection unit 52 and a dark pupil image detection unit 53.
  • The bright pupil image detection unit 52 detects the eye image obtained when the combination of light source and camera satisfies one of the following bright pupil shooting conditions (a), and the dark pupil image detection unit 53 detects the eye image obtained when the combination of light source and camera satisfies one of the following dark pupil shooting conditions (b).
  • (a) Bright pupil shooting condition: (a-1) an image is acquired by the first camera 21, which is substantially coaxial with the first light source 11, during the lighting period of the first light source 11.
  • The infrared light emitted from the first light source 11 has a low absorption rate inside the eyeball and reaches the retina of the driver's eye, so it is easily reflected by the retina. Therefore, when the first light source 11 is lit, the infrared light reflected by the retina is detected through the pupil in the image acquired by the first camera 21, which is substantially coaxial with the first light source 11, and the pupil appears bright. This image is extracted as a bright pupil image by the bright pupil image detection unit 52. The same applies to an image acquired by the second camera 22, which is substantially coaxial with the second light source 12, while the second light source 12 is lit.
  • If the wavelength of the infrared light emitted from the first light source 11 and the second light source 12 is set to 800 nm or more and 900 nm or less (particularly 850 nm), the light absorption rate inside the human eyeball is low, so a clearer bright pupil image can be acquired.
  • (b) Dark pupil shooting condition: an image is acquired by a camera that is non-coaxial with the lit light source. In this case the infrared light reflected by the retina hardly enters that camera (for example, the second camera 22 while the first light source 11 is lit), so the pupil appears dark. This image is therefore extracted as a dark pupil image by the dark pupil image detection unit 53.
  • If the wavelength of the infrared light emitted from the first light source 11 and the second light source 12 is set to greater than 900 nm and 1000 nm or less (particularly 950 nm), the light absorption rate inside the human eyeball is high, so a clearer dark pupil image can be acquired.
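  • The alternating lighting scheme described above can be summarized in code. The sketch below is illustrative only and is not taken from the patent: the camera and LED driver objects (camera1.grab(), light1.on(), and so on) are hypothetical placeholders standing in for the control unit 30, the light source control units 31 and 32, and the image acquisition units 41 and 42.

```python
def acquire_pupil_image_pairs(camera1, camera2, light1, light2):
    """Grab one lighting cycle and tag each frame as a bright or dark pupil image.

    A camera that is substantially coaxial with the lit light source sees a bright
    pupil; the non-coaxial camera sees a dark pupil.
    """
    # First light source lit.
    light1.on()
    light2.off()
    frame1_a = camera1.grab()   # coaxial with the lit source -> bright pupil
    frame2_a = camera2.grab()   # non-coaxial -> dark pupil

    # Second light source lit.
    light1.off()
    light2.on()
    frame1_b = camera1.grab()   # non-coaxial -> dark pupil
    frame2_b = camera2.grab()   # coaxial -> bright pupil
    light2.off()

    return {
        "camera1": {"bright": frame1_a, "dark": frame1_b},
        "camera2": {"bright": frame2_b, "dark": frame2_a},
    }
```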
  • In the pupil image extraction unit 51, the dark pupil image detected by the dark pupil image detection unit 53 is subtracted from the bright pupil image detected by the bright pupil image detection unit 52 to create a difference image. In other words, the difference image is an image in which the image regions that are not common to the bright pupil image and the dark pupil image remain after the subtraction. A range of a predetermined size including both eyes is then cut out as the pupil detection range.
  • The cut-out image is given to the image processing unit 57 and binarized by image processing to create a binary image.
  • The created binary image is stored in a memory in the image processing unit 57.
  • Labeling image processing is then performed, and a label is attached to each of the one or more bright spots in the binary image.
  • The image processing unit 57 selects, from the one or more bright spots in the binary image, a bright spot as the pupil candidate of the pupil image, based on the brightness, shape, and other properties of each bright spot.
  • The pupil candidate is selected, for example, so that the bright spot closest to a circle is taken as the image of the pupil.
  • The binary image, the selected pupil candidate, and the label attached to the pupil candidate are output to the pupil center calculation unit 54 and the determination unit 58.
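  • The difference-image, binarization, labeling, and candidate-selection steps just described could be prototyped as follows. This is a minimal sketch, not the patent's implementation: it assumes grayscale uint8 images and the OpenCV 4 Python API, and the midpoint threshold, minimum blob area, and circularity score are illustrative assumptions.

```python
import cv2
import numpy as np

def select_pupil_candidate(bright, dark):
    """Difference image -> binary image -> labeling -> pick the most circular bright spot."""
    diff = cv2.subtract(bright, dark)                      # bright pupil minus dark pupil
    midpoint = (int(diff.max()) + int(diff.min())) // 2    # midpoint threshold (assumption)
    _, binary = cv2.threshold(diff, midpoint, 255, cv2.THRESH_BINARY)

    # Label the connected bright spots and score how circular each one is.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    best_label, best_score = None, 0.0
    for lbl in range(1, num):                              # label 0 is the background
        area = stats[lbl, cv2.CC_STAT_AREA]
        if area < 10:                                      # ignore tiny specks (assumed limit)
            continue
        mask = (labels == lbl).astype(np.uint8)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        perimeter = cv2.arcLength(contours[0], True)
        circularity = 4 * np.pi * area / (perimeter ** 2) if perimeter > 0 else 0.0
        if circularity > best_score:
            best_label, best_score = lbl, circularity
    return binary, labels, best_label                      # best_label is the pupil candidate
```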
  • The pupil center calculation unit 54 calculates an area image corresponding to the shape and area of the pupil for the bright spot selected as the pupil candidate of the pupil image. It then extracts an ellipse containing this area image and calculates the center of the elliptical shape, that is, the intersection of its major and minor axes, as the center position of the pupil. Various methods can be used for extracting the ellipse (ellipse fitting); for example, an ellipse in which the area image is inscribed is calculated and selected.
  • The pupil center calculation unit 54 sends the calculated coordinate data of the pupil center position to the line-of-sight direction calculation unit 56 and outputs a signal indicating that the coordinate data has been calculated to the determination unit 58.
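  • Continuing the sketch above (same assumed imports), the ellipse-fitting step could look like the following; cv2.fitEllipse is used here as one possible ellipse-extraction method, and taking the fitted ellipse's center as the pupil center mirrors the description, while the details remain assumptions.

```python
def pupil_center_from_candidate(labels, candidate_label):
    """Fit an ellipse around the selected pupil candidate and return its center (XP, YP)."""
    if candidate_label is None:
        return None, None
    mask = (labels == candidate_label).astype(np.uint8)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None, None
    points = max(contours, key=cv2.contourArea)            # outline of the area image
    if len(points) < 5:                                    # cv2.fitEllipse needs >= 5 points
        return None, None
    ellipse = cv2.fitEllipse(points)                       # ((cx, cy), (axis1, axis2), angle)
    center = ellipse[0]                                    # intersection of major and minor axes
    return center, ellipse
```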
  • The dark pupil image signal detected by the dark pupil image detection unit 53 is given to the corneal reflection light center detection unit 55.
  • The dark pupil image signal contains a luminance signal based on the light reflected from the reflection point on the cornea.
  • The reflected light from the reflection point on the cornea forms a Purkinje image and is captured by the camera's image sensor as a spot image with a very small area.
  • The corneal reflection light center detection unit 55 performs image processing on this spot image and obtains, as the reflected light center coordinates, the coordinates of the center of the reflected light (corneal reflection light) from the reflection point on the cornea in the dark pupil image.
  • The corneal reflection light center detection unit 55 also calculates the area of the corneal reflection image in the dark pupil image.
  • The reflected light center coordinates and the area of the corneal reflection image calculated in this way are output to the determination unit 58.
  • The corneal reflection light center detection unit 55 calculates the reflected light center coordinates when a dark pupil image is input from the dark pupil image detection unit 53, or when a signal instructing detection of the corneal reflection light center is received from the determination unit 58.
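  • A simple way to prototype the spot (Purkinje image) detection is sketched below; it assumes the input is already the cut-out grayscale eye region of the dark pupil image, and the brightness tolerance used to segment the spot is an arbitrary assumption rather than the patent's method.

```python
def corneal_reflection_center(dark_eye_region):
    """Locate the small bright corneal reflection spot and return its center (XG, YG) and area."""
    _, max_val, _, _ = cv2.minMaxLoc(dark_eye_region)            # the glint is the brightest region
    spot = (dark_eye_region >= max_val - 10).astype(np.uint8)    # 10-level tolerance (assumption)
    m = cv2.moments(spot, binaryImage=True)
    if m["m00"] == 0:
        return None, 0
    xg = m["m10"] / m["m00"]                                     # centroid x
    yg = m["m01"] / m["m00"]                                     # centroid y
    area = int(m["m00"])                                         # spot area in pixels
    return (xg, yg), area
```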
  • The determination unit 58 refers the reflected light center coordinates calculated by the corneal reflection light center detection unit 55 to the binary image created by the image processing unit 57, and determines whether the reflected light center coordinates lie within the first range of the binary image.
  • The first range is, for example, the elliptical range extracted by the pupil center calculation unit 54 for calculating the pupil center position, or the range of a circle or ellipse with a radius of a predetermined number of pixels centered on the pupil center position.
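  • When the first range is the fitted ellipse, this determination reduces to a point-in-ellipse check. The sketch below works with the ((center), (axes), angle) tuple returned by cv2.fitEllipse; the optional scale factor for enlarging the range is an assumption.

```python
def reflection_center_in_first_range(ellipse, point, scale=1.0):
    """Return True if the reflected light center lies inside the (optionally scaled) ellipse."""
    (cx, cy), (axis1, axis2), angle_deg = ellipse
    a, b = scale * axis1 / 2.0, scale * axis2 / 2.0        # semi-axes of the first range
    if a <= 0 or b <= 0:
        return False
    theta = np.deg2rad(angle_deg)
    dx, dy = point[0] - cx, point[1] - cy
    # Rotate the point into the ellipse's axis-aligned frame, then apply the ellipse equation.
    u = dx * np.cos(theta) + dy * np.sin(theta)
    v = -dx * np.sin(theta) + dy * np.cos(theta)
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0
```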
  • The result of the determination by the determination unit 58 is output to the image processing unit 57.
  • When the determination unit 58 receives from the pupil center calculation unit 54 the signal indicating that the coordinates of the pupil center position have been calculated, but no detection result for the corneal reflection light center has been input from the corneal reflection light center detection unit 55, it sends a signal instructing execution of corneal reflection light center detection to the corneal reflection light center detection unit 55.
  • When it is determined that the reflected light center coordinates are within the first range, the image processing unit 57 calculates an estimated value (estimated coordinates) of the coordinates of the corneal reflection light center in the binary image, based on the shape and luminance information of the image at the reflected light center coordinates of the binary image.
  • For example, when the residual image of the corneal reflection light in the binary image has the shape of part of a circle, the coordinates obtained by completing that shape into a full circle are taken as the estimated coordinates of the corneal reflection light center.
  • The image processing unit 57 then creates an image in which the luminance of the pixels within the second range including the estimated coordinates is reduced in the binary image, and updates the binary image with this image.
  • The luminance is reduced at a fixed rate; for example, it may be reduced to zero or to one half.
  • The second range is, for example, a range corresponding to the corneal reflection image in the dark pupil image.
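  • Reducing the luminance of the second range can be done with a single filled-circle draw. In this sketch the second range is approximated as a circle of a caller-supplied radius around the estimated coordinates and the pixels are set to zero, corresponding to the "reduced to zero" case above; the radius value itself is an assumption.

```python
def suppress_corneal_reflection(binary, estimated_center, radius):
    """Set the pixels of the second range around the estimated glint center to zero."""
    updated = binary.copy()                                 # keep the original binary image intact
    cx = int(round(estimated_center[0]))
    cy = int(round(estimated_center[1]))
    cv2.circle(updated, (cx, cy), int(radius), 0, -1)       # filled circle, luminance 0
    return updated                                          # this image replaces the binary image
```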
  • As shown in FIG. 3, a pupil image 72 is obtained in the difference image 63 by subtracting the pupil image 71 in the dark pupil image 62 from the pupil image 71 in the bright pupil image 61.
  • However, a crescent-shaped corneal reflection image 73 remains in the difference image 63 because of the positional shift between the corneal reflection images in the bright pupil image 61 and the dark pupil image 62. On the difference image 63, the pupil image 72 and the corneal reflection image 73 therefore both appear as pupil candidates (bright spots) of the pupil image.
  • For example, an ellipse 82 or an ellipse 83 is set as the first range, an elliptical range centered on the pupil image 72.
  • When the ellipse 82 is the first range, the corneal reflection image 73 does not lie within this range, but when the ellipse 83 is the first range, the corneal reflection image 73 lies within it.
  • In the latter case, the circle range 84, centered on the center 84c of the circle obtained by completing the crescent shape of the corneal reflection image 73, is set as the second range, and the luminance of the pixels within it is set to zero.
  • The pupil center calculation unit 54 can then calculate the center of the pupil image more accurately by recalculating the pupil center position based on the updated binary image.
  • The pupil center value calculated by the pupil center calculation unit 54 and the corneal reflection light center value calculated by the corneal reflection light center detection unit 55 are given to the gaze direction calculation unit 56.
  • The gaze direction calculation unit 56 detects the direction of the line of sight from the calculated pupil center and the calculated corneal reflection light center.
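  • The patent text does not spell out the gaze formula itself, so the snippet below only illustrates the common pupil/corneal-reflection vector idea: the offset between the pupil center and the glint center is mapped to gaze angles through calibration gains. The linear mapping and the gain values are assumptions, not the device's actual calculation.

```python
def gaze_direction(pupil_center, reflection_center, gain_x=1.0, gain_y=1.0):
    """Rough pupil-minus-glint vector mapping to (yaw, pitch); gains come from calibration."""
    dx = pupil_center[0] - reflection_center[0]
    dy = pupil_center[1] - reflection_center[1]
    yaw = gain_x * dx        # horizontal gaze component
    pitch = gain_y * dy      # vertical gaze component
    return yaw, pitch
```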
  • FIG. 4 is a flowchart showing the process flow of the line-of-sight detection method according to the first embodiment.
  • First, the operation relating to the first camera 21 will be described.
  • The first light source 11 is turned on to obtain a pupil image, and a bright pupil image is obtained by acquiring an image in step with this lighting (step S1).
  • Next, the second light source 12 is turned on, and a dark pupil image is obtained by acquiring an image in step with this lighting (step S2).
  • Next, the operation relating to the second camera 22 will be described.
  • When the first light source 11 is turned on for pupil image acquisition under the control of the control unit 30 and an image is acquired in step with this lighting, the second camera 22 obtains a dark pupil image (step S1).
  • When the second light source 12 is turned on and an image is acquired in step with this lighting, the second camera 22 obtains a bright pupil image (step S2).
  • In step S1, the image captured by the first camera 21 is read from the image acquisition unit 41 into the bright pupil image detection unit 52 of the pupil image extraction unit 51, and the bright pupil image detection unit 52 detects the bright pupil image.
  • The image captured by the second camera 22 is read from the image acquisition unit 42 into the dark pupil image detection unit 53 of the pupil image extraction unit 51, and the dark pupil image detection unit 53 detects the dark pupil image.
  • In step S2, the image captured by the first camera 21 is read from the image acquisition unit 41 into the dark pupil image detection unit 53 of the pupil image extraction unit 51, and the dark pupil image detection unit 53 detects the dark pupil image.
  • The image captured by the second camera 22 is read from the image acquisition unit 42 into the bright pupil image detection unit 52 of the pupil image extraction unit 51, and the bright pupil image detection unit 52 detects the bright pupil image.
  • In the pupil image extraction unit 51, the dark pupil image detected by the dark pupil image detection unit 53 is subtracted from the bright pupil image detected by the bright pupil image detection unit 52 to create a difference image (step S3). The pupil image extraction unit 51 also cuts out an image of a predetermined size including both eyes of the subject as the detection range (step S4).
  • The image cut out in step S4 is sent to the image processing unit 57, and a binary image is created by image processing (step S5).
  • The created binary image is stored in a memory in the image processing unit 57.
  • The image processing unit 57 executes labeling processing on the binary image and labels each of the one or more bright spots contained in it (step S6). The image processing unit 57 then selects a pupil candidate (candidate label) of the pupil image based on the shape of the bright spots and other properties (step S7).
  • The selected pupil candidate data is output to the pupil center calculation unit 54.
  • The pupil center calculation unit 54 calculates an area image corresponding to the shape and area of the pupil for the bright spot selected by the image processing unit 57. It then extracts an elliptical shape containing the calculated area image (ellipse fitting) and calculates the coordinates (XP, YP) of the center position of the ellipse (step S8). The calculated coordinate values are output to the determination unit 58.
  • Next, the determination unit 58 determines whether the corneal reflection light center position has already been detected (step S9).
  • If it has not (N in step S9), the determination unit 58 instructs the pupil image extraction unit 51 to calculate the coordinates of the corneal reflection light center.
  • The pupil image extraction unit 51 then cuts out from the dark pupil image, as the image of the range for detecting the coordinates of the corneal reflection light center, an image of a predetermined size including both eyes of the subject (step S10).
  • The corneal reflection light center detection unit 55 calculates the coordinates (XG, YG) of the corneal reflection light center from the spot image in the dark pupil image, based on the cut-out detection range (step S11).
  • The result of the calculation of the corneal reflection light center coordinates in step S11 is output to the determination unit 58.
  • The determination unit 58 determines whether the coordinates (XG, YG) of the corneal reflection light center are within the elliptical range containing the area image calculated in step S8 (step S12).
  • The determination result is output to the image processing unit 57.
  • If the coordinates of the corneal reflection light center are within the elliptical range (Y in step S12), the image processing unit 57 estimates the coordinates (estimated coordinates) of the corneal reflection light center in the binary image (step S13).
  • The image processing unit 57 then creates an image in which the luminance of the pixels in the second range including the estimated coordinates in the binary image, that is, of the pixels around the pupil within the elliptical range, is set to zero, and updates the binary image in the memory with this image (step S14).
  • The updated binary image is also output to the pupil image extraction unit 51, and the processing from step S8 onward is executed again based on this binary image.
  • When it is determined in step S9 that the corneal reflection light center has already been detected (Y in step S9), or when it is determined in step S12 that the coordinates of the corneal reflection light center are not within the elliptical range of the pupil (N in step S12), the coordinates of the pupil and the coordinates of the corneal reflection light center are finalized (step S15), and the line-of-sight direction calculation unit 56 calculates the line-of-sight direction of the subject based on these coordinates.
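  • Wiring the sketches above together gives a rough single pass over this flow. It is only an illustration: the fixed second-range radius, reusing the detected glint center as the estimated coordinates, and re-selecting the largest remaining blob after the update are all assumptions made for brevity.

```python
def detect_gaze_once(bright, dark, second_range_radius=5):
    """One illustrative pass over the first-embodiment flow (roughly steps S3 to S15)."""
    binary, labels, candidate = select_pupil_candidate(bright, dark)         # steps S3-S7
    pupil_center, ellipse = pupil_center_from_candidate(labels, candidate)   # step S8
    glint, _ = corneal_reflection_center(dark)                               # steps S10-S11

    if ellipse is not None and glint is not None and \
            reflection_center_in_first_range(ellipse, glint):                # step S12
        # Steps S13-S14: zero the second range around the (estimated) glint coordinates
        # and refit the pupil ellipse on the updated binary image.
        binary = suppress_corneal_reflection(binary, glint, second_range_radius)
        num, labels = cv2.connectedComponents(binary)
        if num > 1:
            areas = [np.count_nonzero(labels == lbl) for lbl in range(1, num)]
            candidate = 1 + int(np.argmax(areas))                            # largest blob (assumption)
        else:
            candidate = None
        pupil_center, ellipse = pupil_center_from_candidate(labels, candidate)

    if pupil_center is None or glint is None:
        return None                                                          # nothing reliable found
    return gaze_direction(pupil_center, glint)                               # step S15 + gaze calculation
```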
  • As described above, in the present embodiment it is determined whether the corneal reflection light center is within the first range of the binary image that includes the pupil candidate of the pupil, and when it is determined to be within the first range, the luminance of the pixels within the second range including the corneal reflection light center is reduced.
  • The line-of-sight direction is then calculated based on the pupil center coordinates calculated from the resulting image and the reflected light center coordinates. Accordingly, even if part of the corneal reflection image remains in the difference image, its influence can be suppressed, the pupil center can be calculated accurately, and the accuracy of line-of-sight detection can be further increased.
  • When it is determined that the corneal reflection light center is within the first range, the image processing unit calculates an estimated value of the coordinates of the corneal reflection light center in the binary image, creates an image in which the luminance of the pixels within the second range including the coordinates corresponding to that estimate is reduced, and updates the binary image with this image. Because the image of the corneal reflection light remaining in the binary image is thereby removed accurately, the calculation accuracy of the pupil center can be further increased.
  • When the determination unit determines that the corneal reflection light center is within the first range, the image processing unit preferably sets the luminance of the pixels within the second range to zero. This completely removes the image of the corneal reflection light remaining in the binary image, so the pupil center can be calculated with higher accuracy.
  • The line-of-sight detection device 10 includes the first light source 11 and the first camera 21 arranged substantially coaxially, and the second light source 12 and the second camera 22 arranged substantially coaxially.
  • A bright pupil image is acquired with the camera that is substantially coaxial with the lit light source, and a dark pupil image is acquired with the non-coaxial camera.
  • Even if the coordinates of the corneal reflection light center differ between these images, the line-of-sight detection device 10 can suppress the influence of the residual image of the corneal reflection light remaining in their difference image.
  • In the above description, the light emitted from the two light sources 11 and 12 both has a wavelength with a low absorption rate inside the eyeball. However, the light emitted from one of the light sources may have a wavelength with a high absorption rate that does not easily reach the retina (for example, greater than 900 nm and 1000 nm or less, particularly 950 nm), and the two light sources may then be used as a light source for bright pupil imaging and a light source for dark pupil imaging, respectively. Furthermore, the number of light sources and cameras is not limited to two.
  • The first range is preferably an elliptical range calculated based on the selected pupil candidate. Furthermore, it is preferable to calculate the center of the elliptical shape as the pupil center.
  • Alternatively, the first range may be the range of pixels of the pupil candidate selected in the pupil candidate selection step. In this case, the first range may be the same as the range of the candidate pupil pixels, or it may be extended outward from that range by a predetermined number of pixels. As a result, an appropriate range can be set according to the shape of the eyes of various subjects, which contributes to accurate pupil center calculation.
  • FIG. 5 is a flowchart showing the process flow of the line-of-sight detection method according to the second embodiment.
  • In the second embodiment, acquisition of a bright pupil image (step S21), acquisition of a dark pupil image (step S22), creation of a difference image (step S23), cutting out of the pupil detection range (step S24), creation of a binary image (step S25), labeling (step S26), and selection of a pupil candidate label (step S27) are the same as steps S1 to S7 of the first embodiment.
  • When the pupil candidate label has been selected, a notification signal is transmitted from the image processing unit 57 to the determination unit 58.
  • Next, the determination unit 58 determines whether the corneal reflection light center position has already been detected (step S28).
  • If it has not (N in step S28), the determination unit 58 instructs the pupil image extraction unit 51 to calculate the coordinates of the corneal reflection light center.
  • The pupil image extraction unit 51 then cuts out from the dark pupil image, as the image of the range for detecting the coordinates of the corneal reflection light center, an image of a predetermined size including both eyes of the subject (step S29).
  • The corneal reflection light center detection unit 55 calculates the coordinates (XG, YG) of the corneal reflection light center from the spot image, based on the cut-out detection range (step S30).
  • The result of the calculation of the corneal reflection light center coordinates in step S30 is output to the determination unit 58.
  • The determination unit 58 determines whether the coordinates (XG, YG) of the corneal reflection light center are within the range of the pupil candidate (step S31). This determination is also performed when it is determined in step S28 that the corneal reflection light center has already been detected (Y in step S28).
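  • In this variant the first range is the pupil candidate's own pixel range rather than a fitted ellipse, so the check can be done directly on the label image. The sketch below reuses the label matrix from the earlier sketches; the optional outward margin corresponds to extending the range by a predetermined number of pixels, and its value is an assumption.

```python
def reflection_center_in_candidate_pixels(labels, candidate_label, point, margin=0):
    """Return True if the glint center falls on the candidate's pixels (optionally dilated)."""
    mask = (labels == candidate_label).astype(np.uint8)
    if margin > 0:
        kernel = np.ones((2 * margin + 1, 2 * margin + 1), np.uint8)
        mask = cv2.dilate(mask, kernel)              # extend the range outward by `margin` pixels
    x, y = int(round(point[0])), int(round(point[1]))
    h, w = mask.shape
    return 0 <= x < w and 0 <= y < h and mask[y, x] > 0
```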
  • If the coordinates of the corneal reflection light center are within the range of the pupil candidate (Y in step S31), the image processing unit 57 estimates the coordinates (estimated coordinates) of the corneal reflection light center in the binary image (step S32). The image processing unit 57 then creates an image in which the luminance of the pixels in the second range including the estimated coordinates in the binary image, that is, of the pixels around the pupil within the range of the pupil candidate, is set to zero, and updates the binary image in the memory with this image (step S33).
  • If it is determined in step S31 that the coordinates of the corneal reflection light center are not within the range of the pupil candidate (N in step S31), or after the binary image has been updated in step S33, the pupil center calculation unit 54 calculates an area image corresponding to the shape and area of the pupil for the bright spot selected by the image processing unit 57.
  • The pupil center calculation unit 54 then extracts an elliptical shape containing the calculated area image (ellipse fitting) and calculates the coordinates (XP, YP) of the center position of the ellipse (step S34).
  • The coordinates of the pupil and the coordinates of the corneal reflection light center are then finalized (step S35), and the line-of-sight direction calculation unit 56 calculates the line-of-sight direction of the subject based on these coordinates.
  • Other operations and effects are the same as those in the first embodiment.
  • The line-of-sight detection device and line-of-sight detection method according to the present invention are useful in that, even when the position of the corneal reflection light center differs between the bright pupil image and the dark pupil image because of differences in the incident angle of the detection light on the subject's eyes, the shooting angle of the camera, or the shooting timing during acquisition of the two images, and a corneal reflection image therefore remains in their difference image, its influence can be suppressed.
  • Reference signs: 10 line-of-sight detection device; 11 first light source; 12 second light source; 21 first camera; 22 second camera; 30 control unit; 31, 32 light source control units; 41, 42 image acquisition units; 51 pupil image extraction unit; 52 bright pupil image detection unit; 53 dark pupil image detection unit; 54 pupil center calculation unit; 55 corneal reflection light center detection unit; 56 gaze direction calculation unit; 57 image processing unit; 58 determination unit; 61 bright pupil image; 62 dark pupil image; 63 difference image; 71 pupil image; 72 pupil image; 73 corneal reflection image; 81 detection range; 82, 83 ellipses (first range); 84 circle range (second range).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biophysics (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

[Problem] To provide a sight line detection device and a sight line detection method that, even when one portion of a corneal reflection image remains in a difference image for a bright pupil image and a dark pupil image, can suppress the effect of the remnant image and thereby accurately calculate the center of a pupil. [Solution] A sight line detection device that has an image processing unit that, when a determination unit has determined that the center of corneal reflection light is within a first area, creates a pupil image in which the luminance of pixels in a second area that includes the center of the corneal reflection light has been reduced. The sight line detection device comprises: a pupil center calculation unit that calculates the coordinates of a pupil center on the basis of the pupil image; and a sight line direction calculation unit that calculates a sight line direction on the basis of reflection light center coordinates and the coordinates of the pupil center.

Description

Gaze detection device and gaze detection method
 The present invention relates to a line-of-sight detection device and a line-of-sight detection method capable of detecting the line-of-sight direction of a driver of a vehicle or another subject.
 The gaze detection device described in Patent Literature 1 acquires a bright pupil image, a dark pupil image, and a corneal reflection image, and binarizes the difference image between the bright pupil image and the dark pupil image using the midpoint between its maximum and minimum values as a threshold. This device then detects the outer periphery of the pupil image by applying ellipse approximation or similar processing to the region near the pupil image found in the binarized image, and takes the center of the detected ellipse as the center position of the pupil image.
JP 2014-94186 A
 However, the gaze detection device described in Patent Literature 1 has the following problem. The light sources used to acquire the bright pupil image and the dark pupil image do not match in conditions such as wavelength, position, angle of incidence and reflection with respect to the eye, and emission timing, so the positions of the corneal reflection images in the bright pupil image and the dark pupil image may not coincide exactly. When the difference between such a bright pupil image and dark pupil image is taken, the corneal reflection image is not removed, because its coordinates, brightness, and size differ between the two images, and part of it remains in the difference image. If this residual corneal reflection image lies near the pupil image, it is also included in the ellipse approximation used to detect the outer periphery of the pupil image, so the outer periphery and center position of the pupil image can no longer be detected accurately.
 Accordingly, an object of the present invention is to provide a line-of-sight detection device and a line-of-sight detection method that can suppress the influence of a corneal reflection image even when part of it remains in the difference image between the bright pupil image and the dark pupil image, and can thereby accurately calculate the center of the pupil.
 To solve the above problem, the line-of-sight detection device of the present invention includes: a pupil image extraction unit that acquires a bright pupil image and a dark pupil image and creates a difference image between them; an image processing unit that creates a binary image by binarizing the difference image and selects a pupil candidate from one or more bright spots in the binary image; a corneal reflection light center detection unit that calculates, as reflected light center coordinates, the coordinates of the corneal reflection light center in the dark pupil image; and a determination unit that determines, based on the reflected light center coordinates, whether the corneal reflection light center is within a first range of the binary image that includes the pupil candidate. When the determination unit determines that the corneal reflection light center is within the first range, the image processing unit creates a pupil image in which the luminance of the pixels within a second range including the corneal reflection light center is reduced. The line-of-sight detection device further includes a pupil center calculation unit that calculates the coordinates of the pupil center based on the pupil image, and a gaze direction calculation unit that calculates the gaze direction based on the reflected light center coordinates and the pupil center coordinates.
 The line-of-sight detection method of the present invention includes: a step of acquiring a bright pupil image and a dark pupil image; a step of creating a difference image between the bright pupil image and the dark pupil image and binarizing it to create a binary image; a pupil candidate selection step of selecting a pupil candidate from one or more bright spots in the binary image; a corneal reflection light center calculation step of calculating, as reflected light center coordinates, the coordinates of the corneal reflection light center in the dark pupil image; a determination step of determining, based on the reflected light center coordinates calculated in the corneal reflection light center calculation step, whether the corneal reflection light center is within a first range of the binary image that includes the pupil candidate; a pupil image creation step of creating, when the determination step determines that the corneal reflection light center is within the first range, a pupil image in which the luminance of the pixels within a second range including the corneal reflection light center is reduced; a pupil center calculation step of calculating the coordinates of the pupil center based on the pupil image; and a step of calculating the gaze direction based on the reflected light center coordinates and the pupil center coordinates.
 According to the line-of-sight detection device and line-of-sight detection method of the present invention, it is determined whether the corneal reflection light center is within the first range, and if it is, the luminance of the second range including the corneal reflection light center is reduced. Therefore, even if part of the corneal reflection image remains in the difference image, its influence can be suppressed, the pupil center can be calculated accurately, and the accuracy of line-of-sight detection can be further increased.
 In the line-of-sight detection device of the present invention, when the determination unit determines that the corneal reflection light center is within the first range, the image processing unit preferably calculates an estimated value of the coordinates of the corneal reflection light center in the binary image and creates a pupil image in which the luminance of the pixels within the second range including those coordinates is reduced.
 In the line-of-sight detection method of the present invention, when the determination step determines that the corneal reflection light center is within the first range, the pupil image creation step preferably calculates an estimated value of the coordinates of the corneal reflection light center in the binary image and creates a pupil image in which the luminance of the pixels within the second range including the coordinates corresponding to that estimate is reduced.
 By calculating the estimated coordinates of the corneal reflection light center in the binary image and reducing the luminance of the pixels in the second range including those coordinates in this way, the image of the corneal reflection light from the dark pupil image that remains in the binary image can be removed accurately, so the calculation accuracy of the pupil center can be further increased.
 In the line-of-sight detection device of the present invention, when the determination unit determines that the corneal reflection light center is within the first range, the image processing unit preferably sets the luminance of the pixels within the second range to zero.
 This completely removes the image of the corneal reflection light from the dark pupil image that remains in the binary image, so the pupil center can be calculated with higher accuracy.
 The line-of-sight detection device of the present invention preferably includes a first camera and a second camera that are arranged apart from each other and each acquire an image of a region including at least the eyes, a first light source arranged close to the first camera, and a second light source arranged close to the second camera, and the pupil image extraction unit preferably acquires the bright pupil image and the dark pupil image from the images acquired by the respective cameras.
 In a line-of-sight detection device that acquires the bright pupil image with the camera substantially coaxial with the lit light source and the dark pupil image with the non-coaxial camera, even if the coordinates of the corneal reflection light center differ between the two images, the present invention can suppress the influence of the residual corneal reflection light image in their difference image.
 In the line-of-sight detection method of the present invention, the first range is preferably an elliptical range calculated based on the pupil candidate selected in the pupil candidate selection step.
 This allows an appropriate range to be set to suit the eye shapes of various subjects, which contributes to accurate calculation of the pupil center.
 In the line-of-sight detection method of the present invention, the first range is preferably the range of pixels of the pupil candidate selected in the pupil candidate selection step.
 This allows the first range to be set precisely based on the pupil candidate, which contributes to accurate calculation of the pupil center.
 In the line-of-sight detection method of the present invention, the pupil center calculation step preferably calculates the center of the elliptical shape as the pupil center.
 This keeps the accuracy of setting the pupil center constant, which contributes to highly accurate line-of-sight detection.
 According to the present invention, even if part of the corneal reflection image remains in the difference image between the bright pupil image and the dark pupil image, the influence of this residual image can be suppressed, and the center of the pupil can therefore be calculated accurately.
FIG. 1 is a block diagram showing the configuration of a line-of-sight detection device according to a first embodiment of the present invention. FIG. 2(A) is an example of a bright pupil image, FIG. 2(B) is an example of a dark pupil image, and FIG. 2(C) shows the difference image between the bright pupil image of (A) and the dark pupil image of (B). FIG. 3 is an enlarged view of FIG. 2(C). FIG. 4 is a flowchart showing the process flow of the line-of-sight detection method according to the first embodiment of the present invention. FIG. 5 is a flowchart showing the process flow of the line-of-sight detection method according to the second embodiment of the present invention.
 Hereinafter, a line-of-sight detection device and a line-of-sight detection method according to embodiments of the present invention will be described in detail with reference to the drawings.
 <First Embodiment>
 The configuration of the line-of-sight detection device 10 according to the first embodiment will be described with reference to FIGS. 1 to 3. FIG. 1 is a block diagram showing the configuration of the line-of-sight detection device 10 according to the first embodiment. FIG. 2(A) is an example of a bright pupil image acquired by the line-of-sight detection device 10, FIG. 2(B) is an example of a dark pupil image, and FIG. 2(C) shows the difference image between the bright pupil image of FIG. 2(A) and the dark pupil image of FIG. 2(B). FIG. 3 is an enlarged view of FIG. 2(C).
 As shown in FIG. 1, the line-of-sight detection device 10 includes two light sources 11 and 12, two cameras 21 and 22, a control unit 30, two light source control units 31 and 32, two image acquisition units 41 and 42, a pupil image extraction unit 51, a pupil center calculation unit 54, a corneal reflection light center detection unit 55, a gaze direction calculation unit 56, an image processing unit 57, and a determination unit 58. The line-of-sight detection device 10 is installed in the vehicle cabin, for example on the instrument panel or at the top of the windshield, so as to face the face of the driver as the subject.
 カメラ21、22は、光軸が所定距離だけ互いに離間するように配置されている。カメラ21、22は、撮像素子として、例えばCMOS(相補型金属酸化膜半導体)を有している。この撮像素子は運転者の眼を含む顔の画像を取得し、水平方向および垂直方向に配列された複数の画素で光が検出される。 The cameras 21 and 22 are arranged such that the optical axes are separated from each other by a predetermined distance. The cameras 21 and 22 have, for example, CMOS (complementary metal oxide semiconductor) as an image sensor. This image sensor acquires an image of a face including the driver's eyes, and light is detected by a plurality of pixels arranged in the horizontal direction and the vertical direction.
 第1光源11と第2光源12は、それぞれ、例えば複数個のLED(発光ダイオード)光源からなる。第1光源11は、例えば、第1カメラ21のレンズの外側において、レンズを囲むように配置され、第2光源12は、例えば、第2カメラ22のレンズの外側において、レンズを囲むように配置されている。これらのカメラ21、22においては、2つの光源11、12から出射される検出光の波長に合わせたバンドパスフィルタを配置していることが好ましい。これにより、瞳孔画像抽出部51における瞳孔画像の抽出や、視線方向算出部56における視線方向の算出を精度良く行うことができる。 Each of the first light source 11 and the second light source 12 includes, for example, a plurality of LED (light emitting diode) light sources. For example, the first light source 11 is disposed so as to surround the lens outside the lens of the first camera 21, and the second light source 12 is disposed so as to surround the lens outside the lens of the second camera 22, for example. Has been. In these cameras 21 and 22, it is preferable to arrange band pass filters in accordance with the wavelengths of the detection lights emitted from the two light sources 11 and 12. Thereby, the pupil image extraction unit 51 can extract the pupil image and the gaze direction calculation unit 56 can calculate the gaze direction with high accuracy.
The LED light sources of the first light source 11 and of the second light source 12 both emit infrared light (near-infrared light) of 800 nm or more and 1000 nm or less, and are arranged so that this detection light can be directed to the driver's eyes. In particular, 850 nm is a wavelength at which the light absorption rate within the eyeball of the human eye is low, and this light is easily reflected by the retina at the back of the eyeball.
The distance between the optical axes of the first camera 21 and the LED light source of the first light source 11 is made sufficiently short relative to the distance between the optical axes of the first camera 21 and the second camera 22, taking into account the distance between the line-of-sight detection device 10 and the driver. Therefore, the first light source 11 can be regarded as substantially coaxial with the first camera 21. Similarly, since the distance between the optical axes of the second camera 22 and the LED light source of the second light source 12 is sufficiently short relative to the distance between the optical axes of the first camera 21 and the second camera 22, the second light source 12 can be regarded as substantially coaxial with the second camera 22.
In contrast, since the distance between the optical axes of the first camera 21 and the second camera 22 is made sufficiently long, the optical axes of the first light source 11 and the first camera 21 are not coaxial with the optical axes of the second light source 12 and the second camera 22. In the following description, these arrangements may be expressed by saying that two members are substantially coaxial, or that two members are non-coaxial.
The two light source control units 31 and 32, the two image acquisition units 41 and 42, the pupil image extraction unit 51, the pupil center calculation unit 54, the corneal reflection light center detection unit 55, the gaze direction calculation unit 56, the image processing unit 57, and the determination unit 58 are implemented by the CPU and memory of a computer, and the processing of each unit is performed by executing software installed in advance.
The light source control unit 31 and the light source control unit 32 control the turning on and off of the first light source 11 and the second light source 12, respectively, in accordance with instruction signals from the control unit 30. The cameras 21 and 22 acquire images in accordance with instruction signals from the control unit 30, and the images captured by the cameras 21 and 22 are acquired frame by frame by the image acquisition units 41 and 42, respectively.
The images acquired by the image acquisition units 41 and 42 are read into the pupil image extraction unit 51 frame by frame. The pupil image extraction unit 51 includes a bright pupil image detection unit 52 and a dark pupil image detection unit 53. The bright pupil image detection unit 52 detects the eye image obtained with a light source/camera combination satisfying one of the following bright pupil imaging conditions (a), and the dark pupil image detection unit 53 detects the eye image obtained with a light source/camera combination satisfying one of the following dark pupil imaging conditions (b).
(a) Bright pupil imaging conditions
 (a-1) An image is acquired by the first camera 21, which is substantially coaxial with the first light source 11, during the lighting period of the first light source 11.
 (a-2) An image is acquired by the second camera 22, which is substantially coaxial with the second light source 12, during the lighting period of the second light source 12.
(b) Dark pupil imaging conditions
 (b-1) An image is acquired by the second camera 22, which is non-coaxial with the first light source 11, during the lighting period of the first light source 11.
 (b-2) An image is acquired by the first camera 21, which is non-coaxial with the second light source 12, during the lighting period of the second light source 12.
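As a rough illustration of how these conditions pair the light sources and cameras, the following Python sketch (not part of the patent; the names "frame_schedule" and "classify_frame" are hypothetical) classifies each acquisition as a bright or dark pupil frame simply by whether the lit source is coaxial with the capturing camera.

```python
# Illustrative sketch: the four light-source/camera combinations of
# conditions (a) and (b), classified as bright- or dark-pupil frames.

frame_schedule = [
    {"lit_source": 1, "camera": 1},  # (a-1) coaxial     -> bright pupil image
    {"lit_source": 2, "camera": 2},  # (a-2) coaxial     -> bright pupil image
    {"lit_source": 1, "camera": 2},  # (b-1) non-coaxial -> dark pupil image
    {"lit_source": 2, "camera": 1},  # (b-2) non-coaxial -> dark pupil image
]

def classify_frame(lit_source: int, camera: int) -> str:
    """A frame is a bright pupil image when the lit source is (substantially)
    coaxial with the camera that captured it, otherwise a dark pupil image."""
    return "bright" if lit_source == camera else "dark"

for combo in frame_schedule:
    print(combo, classify_frame(combo["lit_source"], combo["camera"]))
```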
The infrared light emitted from the first light source 11 has a low absorption rate inside the eyeball on its way to the retina of the driver's eye, and is therefore easily reflected by the retina. Accordingly, when the first light source 11 is lit, in the image acquired by the first camera 21 that is substantially coaxial with it, the infrared light reflected by the retina is detected through the pupil, and the pupil appears bright. This image is extracted as a bright pupil image by the bright pupil image detection unit 52. The same applies to the image acquired by the second camera 22, which is substantially coaxial with the second light source 12, when the second light source 12 is lit. If the wavelength of the infrared light emitted from the first light source 11 and the second light source 12 is set to 800 nm or more and 900 nm or less (in particular, 850 nm), the light absorption rate inside the eyeball of the human eye is low, and a clearer bright pupil image can be acquired.
In contrast, when the first light source 11 is lit and an image is acquired by the second camera 22, which is non-coaxial with the first light source 11, the infrared light reflected by the retina hardly enters the second camera 22, so the pupil appears dark. This image is therefore extracted as a dark pupil image by the dark pupil image detection unit 53. The same applies to the image acquired by the non-coaxial first camera 21 when the second light source 12 is lit. If the wavelength of the infrared light emitted from the first light source 11 and the second light source 12 is set to more than 900 nm and 1000 nm or less (in particular, 950 nm), the light absorption rate inside the eyeball of the human eye is high, and a clearer dark pupil image can be acquired.
In the pupil image extraction unit 51, the dark pupil image detected by the dark pupil image detection unit 53 is subtracted from the bright pupil image detected by the bright pupil image detection unit 52 to create a difference image. In other words, the difference image is an image in which the image regions that are not common to the bright pupil image and the dark pupil image remain after subtracting the two images. Furthermore, a range of a predetermined size including both eyes is cut out as the pupil detection range. The cut-out image is passed to the image processing unit 57 and binarized by image processing to create a binary image. The created binary image is stored in a memory in the image processing unit 57. The image processing unit 57 applies labeling image processing, and a label is attached to each of the one or more bright spots in the binary image. Furthermore, the image processing unit 57 selects, based on the brightness, shape, and so on of the bright spots in the binary image, a bright spot to serve as a pupil candidate of the pupil image from among the one or more bright spots. The pupil candidate is selected, for example, so that the bright spot closest to a circle is chosen as the pupil image. The binary image, the selected pupil candidate, and the label attached to the pupil candidate are output to the pupil center calculation unit 54 and the determination unit 58.
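As a loose illustration of this stage, the following Python/OpenCV sketch (my own example, not code from the patent; the threshold value and the circularity score are assumptions) builds the difference image, binarizes it, labels the bright spots, and picks the most circle-like one as the pupil candidate.

```python
import cv2
import numpy as np

def select_pupil_candidate(bright: np.ndarray, dark: np.ndarray, thresh: int = 30):
    """Subtract the dark pupil image from the bright pupil image, binarize,
    label bright spots, and return (binary_image, mask_of_best_candidate)."""
    diff = cv2.subtract(bright, dark)                        # difference image
    _, binary = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)

    # Labeling: each connected bright spot gets its own label.
    n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary)

    best_label, best_score = None, 0.0
    for lab in range(1, n_labels):                           # label 0 is background
        area = stats[lab, cv2.CC_STAT_AREA]
        w, h = stats[lab, cv2.CC_STAT_WIDTH], stats[lab, cv2.CC_STAT_HEIGHT]
        if area < 10:                                        # ignore tiny specks
            continue
        # Circularity proxy: how well the spot fills its bounding box and how
        # square that box is; the most circle-like spot is kept as the pupil.
        fill = area / float(w * h)
        aspect = min(w, h) / float(max(w, h))
        score = fill * aspect
        if score > best_score:
            best_label, best_score = lab, score

    if best_label is None:
        return binary, np.zeros_like(binary)
    candidate_mask = (labels == best_label).astype(np.uint8) * 255
    return binary, candidate_mask
```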
The pupil center calculation unit 54 calculates, for the bright spot selected as the pupil candidate of the pupil image, an area image corresponding to the shape and area of the pupil. Furthermore, an ellipse containing this area image is extracted, and the center of this elliptical shape, that is, the intersection of the major axis and the minor axis of the ellipse, is calculated as the center position of the pupil. Various methods can be used for this ellipse extraction (ellipse fitting); for example, an ellipse in which the area image is inscribed is calculated and selected. The pupil center calculation unit 54 sends the calculated coordinate data of the pupil center position to the gaze direction calculation unit 56, and outputs a signal indicating that this coordinate data has been calculated to the determination unit 58.
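One conventional way to realize the ellipse fitting described here is OpenCV's cv2.fitEllipse applied to the contour of the candidate area; the sketch below is a minimal example under that assumption, not the patent's specific fitting procedure.

```python
import cv2
import numpy as np

def pupil_center_from_mask(candidate_mask: np.ndarray):
    """Fit an ellipse to the pupil candidate area and return its center (XP, YP).
    The center of the fitted ellipse is the intersection of its major and minor axes."""
    contours, _ = cv2.findContours(candidate_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)
    if len(contour) < 5:                  # fitEllipse needs at least 5 points
        return None
    (xp, yp), (major, minor), angle = cv2.fitEllipse(contour)
    return (xp, yp), (major, minor), angle
```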
Meanwhile, the dark pupil image signal detected by the dark pupil image detection unit 53 is supplied to the corneal reflection light center detection unit 55. The dark pupil image signal contains a luminance signal due to the light reflected from the reflection point on the cornea. The light reflected from the corneal reflection point forms a Purkinje image and is captured by the camera's image sensor as a spot image of very small area. The corneal reflection light center detection unit 55 processes this spot image and obtains, as the reflected light center coordinates, the coordinates of the center of the light reflected from the corneal reflection point (corneal reflection light) in the dark pupil image. The corneal reflection light center detection unit 55 also calculates the area of the corneal reflection image in the dark pupil image. The reflected light center coordinates and the area of the corneal reflection image calculated in this way are output to the determination unit 58. The calculation of the reflected light center coordinates in the corneal reflection light center detection unit 55 is performed when a dark pupil image is input from the dark pupil image detection unit 53, or when a signal instructing execution of corneal reflection light center detection is received from the determination unit 58.
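A simple way to approximate this spot detection is to threshold the dark pupil image near its maximum brightness and take the centroid and area of the resulting blob; the sketch below assumes that approach and is not the patent's exact procedure.

```python
import cv2
import numpy as np

def corneal_reflection_center(dark: np.ndarray, margin: int = 20):
    """Return the reflected light center coordinates (XG, YG) and the area of
    the corneal reflection (glint) spot found in the dark pupil image."""
    max_val = int(dark.max())
    _, spot = cv2.threshold(dark, max_val - margin, 255, cv2.THRESH_BINARY)

    m = cv2.moments(spot, binaryImage=True)
    if m["m00"] == 0:
        return None, 0.0
    xg, yg = m["m10"] / m["m00"], m["m01"] / m["m00"]   # centroid of the spot
    area = cv2.countNonZero(spot)                       # area of the glint image
    return (xg, yg), float(area)
```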
The determination unit 58 refers to the reflected light center coordinates calculated by the corneal reflection light center detection unit 55 on the binary image created by the image processing unit 57, and determines whether or not the reflected light center coordinates lie within a first range of the binary image. Here, the first range is, for example, the elliptical range extracted by the pupil center calculation unit 54 for calculating the pupil center position, or a circular or elliptical range whose radius is a predetermined number of pixels centered on the pupil center position. The result of the determination by the determination unit 58 is output to the image processing unit 57.
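The membership test itself reduces to checking whether a point lies inside the fitted ellipse; a minimal sketch of that check, assuming the ellipse parameters returned by the fitting step above, is:

```python
import math

def point_in_ellipse(x, y, center, axes, angle_deg) -> bool:
    """True if (x, y) lies inside the ellipse given by center (cx, cy),
    full axis lengths (major, minor), and rotation angle in degrees."""
    cx, cy = center
    a, b = axes[0] / 2.0, axes[1] / 2.0           # semi-axes
    theta = math.radians(angle_deg)
    dx, dy = x - cx, y - cy
    # Rotate the point into the ellipse's own axis-aligned frame.
    u = dx * math.cos(theta) + dy * math.sin(theta)
    v = -dx * math.sin(theta) + dy * math.cos(theta)
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0
```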
Furthermore, when the determination unit 58 receives from the pupil center calculation unit 54 the signal indicating that the coordinates of the pupil center position have been calculated, it checks whether the detection result for the corneal reflection light center has been input from the corneal reflection light center detection unit 55; if it has not been input, the determination unit 58 sends to the corneal reflection light center detection unit 55 a signal instructing it to execute detection of the corneal reflection light center.
When the determination unit 58 determines that the corneal reflection light center is within the first range, the image processing unit 57 calculates an estimated value (estimated coordinates) of the coordinates of the corneal reflection light center in the binary image, based on the shape and luminance information of the image at the reflected light center coordinates in the binary image. For example, when the residual image of the corneal reflection light in the binary image has the shape of part of a circle, the center obtained by completing this shape into a full circle is taken as the estimated coordinates of the corneal reflection light center.
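One way to realize this completion of a partial (crescent-shaped) residual glint into a circle is to take the minimum enclosing circle of the residual pixels near the detected glint; the sketch below assumes that simplification and a hypothetical search window size.

```python
import cv2
import numpy as np

def estimate_glint_center(binary: np.ndarray, xg: float, yg: float, win: int = 15):
    """Estimate the corneal reflection center in the binary image by completing
    the residual crescent around (xg, yg) into a circle and taking its center."""
    x0, y0 = max(int(xg) - win, 0), max(int(yg) - win, 0)
    roi = binary[y0:int(yg) + win, x0:int(xg) + win]
    pts = cv2.findNonZero(roi)                  # residual glint pixels near (XG, YG)
    if pts is None:
        return (xg, yg), 0.0
    (cx, cy), radius = cv2.minEnclosingCircle(pts)
    # Shift back from ROI coordinates to full-image coordinates.
    return (cx + x0, cy + y0), radius
```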
Furthermore, the image processing unit 57 creates an image in which the luminance of the pixels within a second range including the above estimated coordinates is reduced in the binary image, and updates the binary image with this image. Here, the luminance is reduced by a fixed proportion; this proportion may, for example, reduce the luminance to zero or reduce it by half. The second range is, for example, a range corresponding to the corneal reflection image in the dark pupil image.
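A minimal sketch of this suppression step, assuming the second range is taken as the circle recovered above and that the luminance inside it is simply scaled by a factor (zero by default), follows; it is an illustration, not the patent's implementation.

```python
import cv2
import numpy as np

def suppress_glint(binary: np.ndarray, center, radius: float, factor: float = 0.0):
    """Return an updated binary image in which pixels inside the second range
    (a circle around the estimated glint center) have their luminance reduced.
    factor = 0.0 sets them to zero; factor = 0.5 would halve them."""
    updated = binary.copy()
    mask = np.zeros_like(binary)
    cv2.circle(mask, (int(round(center[0])), int(round(center[1]))),
               int(round(radius)), 255, thickness=-1)       # filled circle mask
    updated[mask > 0] = (updated[mask > 0] * factor).astype(binary.dtype)
    return updated
```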
Here, consider the case where the bright pupil image 61 shown in FIG. 2(A) and the dark pupil image 62 shown in FIG. 2(B) have been obtained. In the difference image 63 between the bright pupil image 61 and the dark pupil image 62, a pupil image 72 is obtained by subtracting the pupil image 71 in the dark pupil image 62 from the pupil image 71 in the bright pupil image 61. On the other hand, because of the positional shift of the corneal reflection images between the bright pupil image 61 and the dark pupil image 62, a crescent-shaped corneal reflection image 73 remains in the difference image 63. Therefore, the pupil image 72 and the corneal reflection image 73 appear on the difference image 63 as pupil candidates (bright spots) for the pupil image. When the detection range 81 including both eyes of the subject is cut out from such a difference image 63, the calculation of the pupil center coordinates may be affected if the corneal reflection image 73 is located in the vicinity of the pupil image 72.
To address this, as shown in FIG. 3, an ellipse 82 or an ellipse 83 is set as the first range, an elliptical range centered on the pupil image 72. When the ellipse 82 is used as the first range, the corneal reflection image 73 does not exist within this range; when the ellipse 83 is used as the first range, the corneal reflection image 73 does exist within this range. When the ellipse 83 is used as the first range, the circular range 84 centered on the center 84c of the circle obtained by completing the crescent shape of the corneal reflection image 73 is set as the second range, and the luminance of the pixels within this range is set to zero.
According to the above processing, even if part of the corneal reflection image remains in the binary image, a new binary image in which the influence of this residual image is reduced can be obtained. Therefore, by calculating the pupil center position based on the updated binary image, the pupil center calculation unit 54 can calculate the center of the pupil image more accurately.
The pupil center calculation value calculated by the pupil center calculation unit 54 and the corneal reflection light center calculation value calculated by the corneal reflection light center detection unit 55 are supplied to the gaze direction calculation unit 56. The gaze direction calculation unit 56 detects the direction of the line of sight from the pupil center calculation value and the corneal reflection light center calculation value.
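The patent does not spell out the mapping from these two points to a gaze direction. A common simplification in pupil-center/corneal-reflection trackers is to scale the pupil-center-to-glint vector by per-user calibration gains, as in this purely illustrative sketch; the gains kx and ky are hypothetical calibration constants, not values from the patent.

```python
import math

def gaze_direction(pupil_center, glint_center, kx=0.08, ky=0.08):
    """Illustrative pupil-center / corneal-reflection model: the offset between
    the pupil center and the glint center is scaled by calibration gains to give
    horizontal and vertical gaze angles in degrees."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    yaw_deg = math.degrees(math.atan(kx * dx))
    pitch_deg = math.degrees(math.atan(ky * dy))
    return yaw_deg, pitch_deg
```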
Next, the flow of processing in the line-of-sight detection method according to the first embodiment will be described with reference to FIG. 4. FIG. 4 is a flowchart showing the flow of processing of the line-of-sight detection method according to the first embodiment.
First, the operation of the first camera 21 will be described. Under the control of the control unit 30, the first light source 11 is lit for pupil image acquisition, and a bright pupil image is obtained by acquiring an image in synchronization with this lighting (step S1). Next, the second light source 12 is lit, and a dark pupil image is obtained by acquiring an image in synchronization with this lighting (step S2).
Next, the operation of the second camera 22 will be described. As with the first camera, under the control of the control unit 30, the first light source 11 is lit for pupil image acquisition, and when an image is acquired in synchronization with this lighting, the second camera 22 obtains a dark pupil image (step S1). When the second light source 12 is lit and an image is acquired in synchronization with this lighting, the second camera 22 obtains a bright pupil image (step S2).
In step S1, the image captured by the first camera 21 is read from the image acquisition unit 41 into the bright pupil image detection unit 52 of the pupil image extraction unit 51, and the bright pupil image detection unit 52 detects the bright pupil image. The image captured by the second camera 22 is read from the image acquisition unit 42 into the dark pupil image detection unit 53 of the pupil image extraction unit 51, and the dark pupil image detection unit 53 detects the dark pupil image.
In step S2, the image captured by the first camera 21 is read from the image acquisition unit 41 into the dark pupil image detection unit 53 of the pupil image extraction unit 51, and the dark pupil image detection unit 53 detects the dark pupil image. The image captured by the second camera 22 is read from the image acquisition unit 42 into the bright pupil image detection unit 52 of the pupil image extraction unit 51, and the bright pupil image detection unit 52 detects the bright pupil image.
In the pupil image extraction unit 51, the dark pupil image detected by the dark pupil image detection unit 53 is subtracted from the bright pupil image detected by the bright pupil image detection unit 52 to create a difference image (step S3). Furthermore, the pupil image extraction unit 51 cuts out, as the detection range, an image of a predetermined size including both eyes of the subject (step S4).
The image cut out in step S4 is sent to the image processing unit 57, and a binary image is created by image processing (step S5). The created binary image is stored in a memory in the image processing unit 57. The image processing unit 57 then performs labeling processing on this binary image, and a label is attached to each of the one or more bright spots contained in the binary image (step S6). Furthermore, the image processing unit 57 selects a pupil candidate (candidate label) for the pupil image based on the shape of the bright spots and the like (step S7). The data of the selected pupil candidate is output to the pupil center calculation unit 54.
Subsequently, the pupil center calculation unit 54 calculates an area image corresponding to the shape and area of the pupil for the bright spot selected by the image processing unit 57. Furthermore, the pupil center calculation unit 54 extracts an elliptical shape containing the calculated area image (ellipse fitting) and calculates the coordinates (XP, YP) of the center position of the ellipse (step S8). The calculated coordinate values are output to the determination unit 58.
Here, the determination unit 58 determines whether or not the corneal reflection light center position has been detected (step S9). If it determines that the corneal reflection light center has not yet been detected (N in step S9), the determination unit 58 instructs the pupil image extraction unit 51 to calculate the coordinates of the corneal reflection light center. In response, the pupil image extraction unit 51 cuts out, from the dark pupil image, an image of a predetermined size including both eyes of the subject as the range image for detecting the coordinates of the corneal reflection light center (step S10). The corneal reflection light center detection unit 55 calculates the coordinates (XG, YG) of the corneal reflection light center from the spot image in the dark pupil image based on the cut-out detection range (step S11).
The result of calculating the coordinates of the corneal reflection light center in step S11 is output to the determination unit 58. Upon receiving this calculation result, the determination unit 58 determines whether or not the coordinates (XG, YG) of the corneal reflection light center are within the elliptical range containing the area image calculated in step S8 (step S12). This determination result is output to the image processing unit 57. When it is determined that the coordinates of the corneal reflection light center are within the elliptical range of the pupil (Y in step S12), the image processing unit 57 estimates the coordinates (estimated coordinates) of the corneal reflection light center in the binary image (step S13). Furthermore, the image processing unit 57 creates an image in which the luminance of the pixels within the second range including the estimated coordinates in the binary image, that is, the pixels around the pupil within the elliptical range, is set to zero, stores this image in the memory, and updates the binary image (step S14). The updated binary image is also output to the pupil image extraction unit 51, and the processing from step S8 onward is executed again based on this binary image.
On the other hand, when it is determined in step S9 that the corneal reflection light center has already been detected (Y in step S9), or when it is determined in step S12 that the coordinates of the corneal reflection light center are not within the elliptical range of the pupil (N in step S12), the coordinates of the pupil and the coordinates of the corneal reflection light center are finalized (step S15), and the gaze direction calculation unit 56 calculates the gaze direction of the subject based on these coordinates.
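Putting steps S8 to S15 together, the overall control flow of the first embodiment can be sketched roughly as follows, reusing the helper functions illustrated in the earlier sketches (all of them, and the single-pass loop structure, are my own simplifications rather than the patent's implementation; error handling is omitted).

```python
def first_embodiment_pass(binary, candidate_mask, dark):
    """Rough sketch of steps S8-S15: fit the pupil ellipse, check whether the
    glint center falls inside it, suppress the residual glint if so, and refit.
    Uses pupil_center_from_mask, corneal_reflection_center, point_in_ellipse,
    estimate_glint_center, and suppress_glint from the sketches above."""
    (xp, yp), axes, angle = pupil_center_from_mask(candidate_mask)      # step S8
    (xg, yg), _area = corneal_reflection_center(dark)                   # steps S10-S11

    if point_in_ellipse(xg, yg, (xp, yp), axes, angle):                 # step S12
        est_center, radius = estimate_glint_center(binary, xg, yg)      # step S13
        binary = suppress_glint(binary, est_center, radius)             # step S14
        candidate_mask = binary & candidate_mask                        # use updated image
        (xp, yp), axes, angle = pupil_center_from_mask(candidate_mask)  # step S8 again

    return (xp, yp), (xg, yg)                                           # step S15
```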
With the configuration described above, the above embodiment provides the following effects.
(1) Whether or not the corneal reflection light center lies within the first range of the binary image containing the pupil candidate is determined; when it is determined that the corneal reflection light center lies within the first range, an image is created in which the luminance of the pixels within a second range containing the corneal reflection light center inside the first range is reduced, the binary image is updated with this image, and the gaze direction is calculated based on the coordinates of the pupil center calculated from this binary image and the reflected light center coordinates. Accordingly, even if part of the corneal reflection image remains in the difference image, its influence can be suppressed, so that the pupil center can be calculated accurately and the accuracy of line-of-sight detection can be further improved.
(2) When it is determined that the corneal reflection light center lies within the first range, an estimated value of the coordinates of the corneal reflection light center in the binary image is calculated, an image is created in which the luminance of the pixels within the second range containing coordinates corresponding to these coordinates is reduced, and the binary image is updated with this image. Since the corneal reflection light image remaining in the binary image is thereby removed accurately, the calculation accuracy of the pupil center can be made even higher. Here, when the determination unit determines that the corneal reflection light center lies within the first range, the image processing unit preferably sets the luminance of the pixels within the second range to zero. This allows the corneal reflection light image remaining in the binary image to be removed completely, so that the pupil center can be calculated with higher accuracy.
(3) The line-of-sight detection device 10 of the first embodiment includes the first light source 11 and the first camera 21 arranged substantially coaxially and the second light source 12 and the second camera 22 arranged substantially coaxially, and acquires a bright pupil image with the camera that is substantially coaxial with the lit light source while acquiring a dark pupil image with the non-coaxial camera. With such a configuration, even if a shift occurs in the coordinates of the corneal reflection light center between the images acquired by the two cameras 21 and 22, the line-of-sight detection device 10 can suppress the influence of the residual corneal reflection light image remaining in the difference image of these images. In the first embodiment, the wavelengths of the light emitted from the two light sources 11 and 12 were both set to wavelengths with a low light absorption rate inside the eyeball; however, the wavelength of the light emitted from one of the light sources may instead be set to a wavelength with a high absorption rate that does not easily reach the retina (for example, more than 900 nm and 1000 nm or less, in particular 950 nm), making one a light source for bright pupil imaging and the other a light source for dark pupil imaging. The numbers of light sources and cameras are also not limited to two each.
(4) The first range is preferably an elliptical range calculated based on the selected pupil candidate. Furthermore, the center of this elliptical shape is preferably calculated as the pupil center.
The first range may also preferably be the range of pixels of the pupil candidate selected in the pupil candidate selection step. In this case, the first range may be the same as the range of the pupil candidate pixels, or it may be extended outward from that range by a predetermined number of pixels.
This makes it possible to set an appropriate range in accordance with the eye shapes of various subjects, which contributes to accurate pupil center calculation.
<Second Embodiment>
Next, a second embodiment of the present invention will be described. The second embodiment differs from the first embodiment in that the ellipse fitting process for the area image corresponding to the pupil is performed after the estimation of the coordinates of the corneal reflection light center in the bright pupil image and after the process of setting to zero the luminance of the pixels within the second range containing the coordinates obtained by this estimation. In the following description, detailed explanations of configurations, operations, and effects that are the same as in the first embodiment are omitted.
FIG. 5 is a flowchart showing the flow of processing of the line-of-sight detection method according to the second embodiment.
In the second embodiment, the acquisition of the bright pupil image (step S21), the acquisition of the dark pupil image (step S22), the creation of the difference image (step S23), the cutting out of the pupil detection range (step S24), the creation of the binary image (step S25), the labeling (step S26), and the selection of the label of the pupil candidate (step S27) are the same as steps S1 to S7 of the first embodiment, respectively.
When the label of the pupil candidate is selected in step S27, a notification signal is transmitted from the image processing unit 57 to the determination unit 58. Upon receiving this notification signal, the determination unit 58 determines whether or not the corneal reflection light center position has been detected (step S28). If it determines that the corneal reflection light center has not yet been detected (N in step S28), the determination unit 58 instructs the pupil image extraction unit 51 to calculate the coordinates of the corneal reflection light center. In response, the pupil image extraction unit 51 cuts out, from the dark pupil image, an image of a predetermined size including both eyes of the subject as the range image for detecting the coordinates of the corneal reflection light center (step S29). The corneal reflection light center detection unit 55 calculates the coordinates (XG, YG) of the corneal reflection light center from the spot image based on the cut-out detection range (step S30).
The result of calculating the coordinates of the corneal reflection light center in step S30 is output to the determination unit 58. Upon receiving this calculation result, the determination unit 58 determines whether or not the coordinates (XG, YG) of the corneal reflection light center are within the range of the pupil candidate (step S31). This determination is also performed when it is determined in step S28 that the corneal reflection light center has already been detected (Y in step S28).
When it is determined that the coordinates of the corneal reflection light center are within the range of the pupil candidate (Y in step S31), the image processing unit 57 estimates the coordinates (estimated coordinates) of the corneal reflection light center in the binary image (step S32). Furthermore, the image processing unit 57 creates an image in which the luminance of the pixels within the second range including the estimated coordinates in the binary image, that is, the pixels around the pupil within the range of the pupil candidate, is set to zero, stores this image in the memory, and updates the binary image (step S33).
When it is determined in step S31 that the coordinates of the corneal reflection light center are not within the range of the pupil candidate (N in step S31), or when the binary image has been updated in step S33, the pupil center calculation unit 54 calculates an area image corresponding to the shape and area of the pupil for the bright spot selected by the image processing unit 57. The pupil center calculation unit 54 extracts an elliptical shape containing the calculated area image (ellipse fitting), and the coordinates (XP, YP) of the center position of the ellipse are calculated (step S34). Furthermore, the coordinates of the pupil and the coordinates of the corneal reflection light center are finalized (step S35), and the gaze direction calculation unit 56 calculates the gaze direction of the subject based on these coordinates.
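In code terms, the second embodiment simply reorders the first-embodiment sketch shown earlier: the residual glint is suppressed first and the ellipse fitting is performed only once afterwards. The sketch below is again a simplification of mine, reusing the helper functions from the earlier sketches, and the pixel-wise candidate-range check is an assumption.

```python
def second_embodiment_pass(binary, candidate_mask, dark):
    """Rough sketch of steps S28-S35: suppress the residual glint inside the
    pupil candidate first, then fit the pupil ellipse a single time.
    Uses corneal_reflection_center, estimate_glint_center, suppress_glint,
    and pupil_center_from_mask from the sketches above."""
    (xg, yg), _area = corneal_reflection_center(dark)                   # steps S29-S30
    if candidate_mask[int(yg), int(xg)] > 0:                            # step S31 (candidate range)
        est_center, radius = estimate_glint_center(binary, xg, yg)      # step S32
        binary = suppress_glint(binary, est_center, radius)             # step S33
        candidate_mask = binary & candidate_mask
    (xp, yp), axes, angle = pupil_center_from_mask(candidate_mask)      # step S34
    return (xp, yp), (xg, yg)                                           # step S35
```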
The other operations and effects are the same as in the first embodiment.
Although the present invention has been described with reference to the above embodiments, the present invention is not limited to the above embodiments, and improvements or modifications are possible within the scope of the purpose of improvement and the spirit of the present invention.
As described above, the line-of-sight detection device and line-of-sight detection method according to the present invention are useful in that, even when the positions of the corneal reflection light center in the bright pupil image and the dark pupil image are shifted because of differences in the irradiation angle of the detection light onto the subject's eyes, the shooting angle of the cameras, the shooting timing of the cameras, and the like during acquisition of the bright pupil image and the dark pupil image, and a corneal reflection image therefore remains in the difference image of these images, the influence of this residual image can be suppressed.
DESCRIPTION OF SYMBOLS
10 Line-of-sight detection device
11 First light source
12 Second light source
21 First camera
22 Second camera
30 Control unit
31, 32 Light source control unit
41, 42 Image acquisition unit
51 Pupil image extraction unit
52 Bright pupil image detection unit
53 Dark pupil image detection unit
54 Pupil center calculation unit
55 Corneal reflection light center detection unit
56 Gaze direction calculation unit
57 Image processing unit
58 Determination unit
61 Bright pupil image
62 Dark pupil image
63 Difference image
71 Image of the pupil
72 Pupil image
73 Corneal reflection image
81 Detection range
82, 83 Ellipse (first range)
84 Circle range (second range)

Claims (9)

  1.  A line-of-sight detection device comprising:
     a pupil image extraction unit that acquires a bright pupil image and a dark pupil image and creates a difference image between the bright pupil image and the dark pupil image;
     an image processing unit that creates a binary image by binarizing the difference image and selects a pupil candidate from one or more bright spots in the binary image;
     a corneal reflection light center detection unit that calculates, as reflected light center coordinates, coordinates of a corneal reflection light center in the dark pupil image; and
     a determination unit that determines, based on the reflected light center coordinates, whether or not the corneal reflection light center is within a first range of the binary image that includes the pupil candidate,
     wherein, when the determination unit determines that the corneal reflection light center is within the first range, the image processing unit creates a pupil image that is an image in which luminance of pixels within a second range including the corneal reflection light center is reduced, and
     the line-of-sight detection device further comprises:
     a pupil center calculation unit that calculates coordinates of a pupil center based on the pupil image; and
     a gaze direction calculation unit that calculates a gaze direction based on the reflected light center coordinates and the coordinates of the pupil center.
  2.  The line-of-sight detection device according to claim 1, wherein, when the determination unit determines that the corneal reflection light center is within the first range, the image processing unit calculates an estimated value of the coordinates of the corneal reflection light center in the binary image and creates a pupil image in which luminance of pixels within the second range including these coordinates is reduced.
  3.  The line-of-sight detection device according to claim 1 or 2, wherein, when the determination unit determines that the corneal reflection light center is within the first range, the image processing unit sets the luminance of the pixels within the second range to zero.
  4.  The line-of-sight detection device according to any one of claims 1 to 3, further comprising:
     a first camera and a second camera that are arranged apart from each other and each acquire an image of a region including at least an eye;
     a first light source arranged close to the first camera; and
     a second light source arranged close to the second camera,
     wherein the pupil image extraction unit acquires the bright pupil image and the dark pupil image from the images acquired by the respective cameras.
  5.  A line-of-sight detection method comprising:
     a step of acquiring a bright pupil image and a dark pupil image;
     a step of creating a difference image between the bright pupil image and the dark pupil image and binarizing the difference image to create a binary image;
     a pupil candidate selection step of selecting a pupil candidate from one or more bright spots in the binary image;
     a corneal reflection light center calculation step of calculating, as reflected light center coordinates, coordinates of a corneal reflection light center in the dark pupil image;
     a determination step of determining, based on the reflected light center coordinates calculated in the corneal reflection light center calculation step, whether or not the corneal reflection light center is within a first range of the binary image that includes the pupil candidate;
     a pupil image creation step of creating, when it is determined in the determination step that the corneal reflection light center is within the first range, a pupil image in which luminance of pixels within a second range including the corneal reflection light center is reduced;
     a pupil center calculation step of calculating coordinates of a pupil center based on the pupil image; and
     a step of calculating a gaze direction based on the reflected light center coordinates and the coordinates of the pupil center.
  6.  The line-of-sight detection method according to claim 5, wherein, in the pupil image creation step, when it is determined in the determination step that the corneal reflection light center is within the first range, an estimated value of the coordinates of the corneal reflection light center in the binary image is calculated, and a pupil image in which luminance of pixels within the second range including coordinates corresponding to these coordinates is reduced is created.
  7.  The line-of-sight detection method according to claim 5 or 6, wherein the first range is an elliptical range calculated based on the pupil candidate selected in the pupil candidate selection step.
  8.  The line-of-sight detection method according to claim 5 or 6, wherein the first range is a range of pixels of the pupil candidate selected in the pupil candidate selection step.
  9.  The line-of-sight detection method according to claim 7, wherein, in the pupil center calculation step, the center of the elliptical shape is calculated as the pupil center.
PCT/JP2017/001476 2016-03-09 2017-01-18 Sight line detection device and sight line detection method WO2017154356A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016045717 2016-03-09
JP2016-045717 2016-03-09

Publications (1)

Publication Number Publication Date
WO2017154356A1 true WO2017154356A1 (en) 2017-09-14

Family

ID=59789228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/001476 WO2017154356A1 (en) 2016-03-09 2017-01-18 Sight line detection device and sight line detection method

Country Status (1)

Country Link
WO (1) WO2017154356A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111723773A (en) * 2020-06-30 2020-09-29 创新奇智(合肥)科技有限公司 Remnant detection method, device, electronic equipment and readable storage medium
CN114067420A (en) * 2022-01-07 2022-02-18 深圳佑驾创新科技有限公司 Sight line measuring method and device based on monocular camera

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08308801A (en) * 1995-05-16 1996-11-26 Olympus Optical Co Ltd Sight line detector
JP2011115460A (en) * 2009-12-04 2011-06-16 Saga Univ Visual axis controlling device, method of controlling visual axis, and program of the same
JP2016024616A (en) * 2014-07-18 2016-02-08 国立大学法人静岡大学 Eyeball measurement system, visual line detection system, eyeball measurement method, eyeball measurement program, visual line detection method, and visual line detection program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08308801A (en) * 1995-05-16 1996-11-26 Olympus Optical Co Ltd Sight line detector
JP2011115460A (en) * 2009-12-04 2011-06-16 Saga Univ Visual axis controlling device, method of controlling visual axis, and program of the same
JP2016024616A (en) * 2014-07-18 2016-02-08 国立大学法人静岡大学 Eyeball measurement system, visual line detection system, eyeball measurement method, eyeball measurement program, visual line detection method, and visual line detection program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111723773A (en) * 2020-06-30 2020-09-29 创新奇智(合肥)科技有限公司 Remnant detection method, device, electronic equipment and readable storage medium
CN111723773B (en) * 2020-06-30 2024-03-29 创新奇智(合肥)科技有限公司 Method and device for detecting carryover, electronic equipment and readable storage medium
CN114067420A (en) * 2022-01-07 2022-02-18 深圳佑驾创新科技有限公司 Sight line measuring method and device based on monocular camera

Similar Documents

Publication Publication Date Title
CN107533362B (en) Eye tracking device and method for operating an eye tracking device
US10722113B2 (en) Gaze detection apparatus and gaze detection method
US10698482B2 (en) Gaze tracking using non-circular lights
EP2881891B1 (en) Image processing device and image processing method
JP6346525B2 (en) Gaze detection device
JP6201956B2 (en) Gaze detection device and gaze detection method
US11023039B2 (en) Visual line detection apparatus and visual line detection method
JP2009254525A (en) Pupil detecting method and apparatus
JP5626039B2 (en) Gaze detection apparatus and gaze detection method
US20170116736A1 (en) Line of sight detection system and method
WO2017203769A1 (en) Sight line detection method
JP5601179B2 (en) Gaze detection apparatus and gaze detection method
US20210012105A1 (en) Method and system for 3d cornea position estimation
WO2018164104A1 (en) Eye image processing device
US10748301B2 (en) Corneal reflex position detection device, eye-gaze tracking device, and corneal reflex position detection method
WO2017154356A1 (en) Sight line detection device and sight line detection method
JP2016028669A (en) Pupil detection device and pupil detection method
JP2005296382A (en) Visual line detector
JP6011173B2 (en) Pupil detection device and pupil detection method
JP4451195B2 (en) Gaze detection device
US11796802B2 (en) Device tracking gaze and method therefor
WO2017154370A1 (en) Sight line detection device and sight line detection method
JP2017162233A (en) Visual line detection device and visual line detection method
KR20160055621A (en) Gaze tracking apparatus and method for detecting glint thereof
JP6693149B2 (en) Pupil detection device and pupil detection method

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17762702

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17762702

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP