WO2012008345A1 - Eyelid detection device and program - Google Patents
Eyelid detection device and program
- Publication number
- WO2012008345A1 (PCT/JP2011/065509)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eyelid
- pixel
- feature amount
- image
- edge image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
- G06V10/449—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
- G06V10/451—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to an eyelid detection device and program, and more particularly, to an eyelid detection device and program for detecting a boundary point of the eyelid from an image of a region including eyes.
- there is known an eyelid detection device that detects a boundary point between the eyelid and the eyeball by using, as the eyelid feature amount, a portion where the value changes greatly, instead of the local maximum of the first-order differential value (edge value) (Japanese Patent No. 4309927).
- however, when the maximum of the first-order differential value (edge value) is taken as the eyelid position, a place away from the true boundary may be detected as the boundary point, for example when the eyelid is covered with makeup. As a result, there is a problem that the correct boundary point cannot be detected.
- the present invention has been made to solve the above-described problem, and its purpose is to provide an eyelid detection device and program capable of correctly detecting the boundary between the eyelid and the eyeball even when the eyelid is covered with makeup.
- a first aspect of the present invention includes: a generation unit that, based on an image of a region including an eye, generates a primary edge image representing the magnitude of the pixel value change in a predetermined direction for each pixel in the region, and a secondary edge image representing the rate of change, in the predetermined direction, of that magnitude for each pixel in the region; a feature amount calculation unit that shifts either the primary edge image or the secondary edge image in the predetermined direction according to the frequency period of the pixel value change at the eyelid boundary in the image, and calculates a feature amount for each pixel based on the pixel value of that pixel in the primary edge image and in the secondary edge image; and an eyelid detection unit that detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on the calculated feature amount of each pixel.
- another aspect of the present invention is a program that causes a computer to function as: a generation unit that, based on an image of a region including an eye, generates a primary edge image representing the magnitude of the pixel value change in a predetermined direction for each pixel in the region, and a secondary edge image representing the rate of change, in the predetermined direction, of that magnitude for each pixel; a feature amount calculation unit that shifts either the primary edge image or the secondary edge image in the predetermined direction according to the frequency period of the pixel value change at the eyelid boundary in the image, and calculates a feature amount for each pixel based on the pixel value of that pixel in the primary edge image and in the secondary edge image; and an eyelid detection unit that detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball.
- according to the present invention, the generation unit generates the primary edge image representing the magnitude of the pixel value change in the predetermined direction for each pixel in the region, and the secondary edge image representing the rate of change, in the predetermined direction, of that magnitude for each pixel in the region.
- the feature amount calculation unit shifts either the primary edge image or the secondary edge image in the predetermined direction according to the frequency period of the pixel value change at the eyelid boundary in the image, and, for each pixel, calculates a feature amount based on the pixel value of that pixel in the primary edge image and in the secondary edge image.
- the eyelid detection unit detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on the calculated feature amount of each pixel.
- in this way, either the primary edge image or the secondary edge image is shifted in the predetermined direction in accordance with the frequency period of the pixel value change at the eyelid boundary, a feature amount is calculated for each pixel based on the primary edge image and the secondary edge image, and the boundary between the upper eyelid and the eyeball or the boundary between the lower eyelid and the eyeball is detected. Thus, even when the eyelid is covered with makeup, the boundary between the eyelid and the eyeball can be detected correctly.
- the third aspect of the present invention further includes an eye detection unit that detects the size of the eye from the image; the feature amount calculation unit can shift either the primary edge image or the secondary edge image in the predetermined direction according to the frequency period, determined in advance for the detected eye size, of the pixel value change at the eyelid boundary, and calculate the feature amount for each pixel.
- the fourth aspect of the present invention further includes a pixel value change extraction unit that extracts the pixel value change at the eyelid boundary from the image, and a frequency detection unit that detects the frequency of the extracted pixel value change; the feature amount calculation unit can calculate the feature amount for each pixel by shifting either the primary edge image or the secondary edge image in the predetermined direction according to the period of the frequency detected by the frequency detection unit.
- the feature amount calculation unit can calculate the feature amount for each pixel by shifting either the primary edge image or the secondary edge image in the predetermined direction by 1/4 of the period of the frequency of the pixel value change at the eyelid boundary in the image.
- the phase shift of the pixel value change at the eyelid boundary between the primary edge image and the secondary edge image can be matched.
- the predetermined direction can be a blinking direction.
- the feature amount calculation unit can shift the primary edge image downward in the blinking direction and calculate the feature amount for each pixel, and the eyelid detection unit can detect the boundary between the upper eyelid and the eyeball based on the calculated feature amount. This makes it possible to correctly detect the boundary between the upper eyelid and the eyeball, because the feature amount is calculated with the phase shift of the pixel value change at the upper eyelid boundary between the primary edge image and the secondary edge image matched.
- alternatively, the feature amount calculation unit can calculate the feature amount for each pixel by shifting the secondary edge image upward in the blinking direction, and the eyelid detection unit can detect the boundary between the upper eyelid and the eyeball based on the calculated feature amount. This likewise makes it possible to correctly detect the boundary between the upper eyelid and the eyeball, because the feature amount is calculated with the phase shift of the pixel value change at the upper eyelid boundary between the two edge images matched.
- the feature amount calculation unit can calculate the feature amount by performing, for each pixel, weighted addition or multiplication of the pixel value of that pixel in the primary edge image and in the secondary edge image.
- the eyelid detection unit can detect at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on the peak points, in the predetermined direction, of the calculated feature amounts.
- as explained above, according to the present invention, either the primary edge image or the secondary edge image is shifted in the predetermined direction in accordance with the frequency period of the pixel value change at the eyelid boundary, a feature amount is calculated based on the primary edge image and the secondary edge image, and the boundary between the upper eyelid and the eyeball or the boundary between the lower eyelid and the eyeball is detected.
- the eyelid detection device 10 includes an image capturing unit 12, including a CCD camera or the like, that captures an image including the face of the person to be detected; an illumination unit 14, composed of an infrared strobe, infrared LEDs, or the like, that illuminates the subject in synchronization with the shutter operation of the image capturing unit 12; a computer 16 that performs image processing; and a display device 18 composed of a CRT or the like.
- the computer 16 includes a CPU, a ROM that stores the program for the image processing routine described later, a RAM that stores data, and a bus connecting them. Described in terms of functional blocks, divided according to the function realizing means determined by the hardware and software, the computer 16 includes, as shown in FIG. 1: an image input unit 20 that inputs the grayscale face image output from the image capturing unit 12; an eye image extraction unit 22 that extracts, from the face image output by the image input unit 20, an image of a small region including an eye (an eye image); an edge image generation unit 24 that computes the vertical first-order differential of the extracted eye image to generate a primary edge image and the vertical second-order differential to generate a secondary edge image; a feature amount calculation unit 26 that calculates the eyelid feature amount for each pixel; and an eyelid position detection unit 28 that detects the eyelid position based on the calculated eyelid feature amount and displays the eyelid opening on the display device 18.
- the image input unit 20 includes, for example, an A / D converter, an image memory that stores image data of one screen, and the like.
- the eye image extraction unit 22 searches the eye area from the face image, specifies the extraction position, and extracts an image of a small area including the eyes as the eye image based on the specified extraction position.
- the edge image generation unit 24 applies a Prewitt filter, shown in FIG. 3A, as the vertical first-order differential filter to the eye image shown in FIG. 2, and generates a primary edge image whose pixel values are the first-order differential values representing the magnitude of the shading change for each pixel in the blinking direction, from top to bottom.
- the edge image generation unit 24 may instead generate the primary edge image using a Sobel filter, as shown in FIG. 3B, or a simple difference filter, as shown in FIG. 3C, as the vertical first-order differential filter.
- the edge image generation unit 24 also applies a vertical second-order differential filter, shown in FIG. 4A, to the eye image shown in FIG. 2, and generates a secondary edge image whose pixel values are the second-order differential values representing the rate of change of the magnitude of the shading change for each pixel in the direction from top to bottom.
- the edge image generation unit 24 may generate the secondary edge image using a filter as shown in FIGS. 4B to 4F as the vertical second-order differential filter.
- the eyeball part is photographed dark because its reflectance is lower than that of the eyelid, which is skin. Therefore, in an image of an eye as shown in FIG. 2, scanning from top to bottom, a change from "bright (skin)" to "dark (boundary between skin and eyeball)" is detected as the upper eyelid boundary, and a change from "dark" to "bright" is detected as the lower eyelid boundary.
- the edge image generation unit 24 calculates a first derivative value (edge value) of the image. Since the value of the primary differential value increases at the shade change portion, the boundary of the eyelid can be detected by detecting the portion where the primary differential value is large.
- the edge image generation unit 24 further differentiates the primary differential value to calculate the secondary differential value.
- the eyelid feature amount is calculated by combining the primary differential value and the secondary differential value.
- the peak positions of the primary differential value and the secondary differential value are shifted by 1/4 period.
- therefore, the secondary differential value is shifted by 1/4 period, and the eyelid feature amount is calculated by combining the primary differential value and the shifted secondary differential value.
- the feature amount calculation unit 26 shifts the secondary edge image upward (for the upper eyelid) and downward (for the lower eyelid) by 1/4 of the period of the frequency of the vertical shading change at the eyelid boundary, which is obtained in advance.
- the frequency period of the vertical shading change at the eyelid boundary is obtained in advance as follows.
- an eye image is extracted from an image captured by the image capturing unit 12, and a vertical change in shade at the eyelid boundary is extracted from the eye image.
- the frequency of the density change is detected using a sine wave as shown in FIG.
- the section of the sine wave used for the frequency detection is between the cross points or between the maximum point and the minimum point.
- the frequency and amplitude are varied, and fitting is performed so as to match the shading change over the range corresponding to the eyelid position, as shown in FIG. Then, as shown in FIG. 7, the frequency of the shading change at the eyelid boundary is estimated, its period is obtained, and the period is stored in a memory (not shown).
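The fitting described above can be sketched as follows. This is an illustrative reconstruction, not the patent's exact procedure: it fits a sine wave to a 1-D vertical intensity profile with `scipy.optimize.curve_fit`, varying the amplitude, frequency, and phase, and returns the period used for the 1/4-period shift. The FFT-based initial guess is an added assumption.

```python
import numpy as np
from scipy.optimize import curve_fit

def estimate_period(profile):
    """Estimate the period (in pixels) of the shading change in a 1-D
    vertical intensity profile around the eyelid boundary by fitting a
    sine wave, varying its frequency and amplitude."""
    profile = np.asarray(profile, dtype=float)
    n = len(profile)
    mean = profile.mean()
    y = np.arange(n)

    def model(y, amp, freq, phase):
        return mean + amp * np.sin(2.0 * np.pi * freq * y + phase)

    # Initial guesses: amplitude from the value range, frequency from
    # the dominant FFT bin of the mean-removed profile (an assumption).
    amp0 = (profile.max() - profile.min()) / 2.0
    spectrum = np.abs(np.fft.rfft(profile - mean))
    k = max(int(np.argmax(spectrum)), 1)  # skip the DC bin
    (amp, freq, phase), _ = curve_fit(model, y, profile,
                                      p0=[amp0, k / n, 0.0])
    return 1.0 / abs(freq)
```

In practice the fit would be restricted to the sine-wave section described above (between zero crossings, or between the maximum and minimum points).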
- for the upper eyelid, the feature amount calculation unit 26 shifts the secondary edge image upward in the image, along the blinking direction, by 1/4 of the period of the frequency of the vertical shading change at the eyelid boundary. Based on the primary edge image and the upward-shifted secondary edge image, the feature amount calculation unit 26 calculates the upper eyelid feature amount e_upper(x, y) of each pixel from the pixel values, as in the following equation (1):
- e_upper(x, y) = α × e1(x, y) + (1 − α) × e2(x, y + t) … (1)
- here, e1(x, y) is the primary differential value at coordinate position (x, y) of the primary edge image, t is the phase shift amount, and e2(x, y + t) is the secondary differential value at coordinate position (x, y + t) of the secondary edge image before the shift, corresponding to coordinate position (x, y). Further, 0 ≤ α ≤ 1.
- similarly, the feature amount calculation unit 26 shifts the secondary edge image downward in the image by 1/4 of the period of the frequency of the vertical shading change at the eyelid boundary. Based on the primary edge image and the downward-shifted secondary edge image, the feature amount calculation unit 26 calculates the lower eyelid feature amount e_lower(x, y) of each pixel from the pixel values, as in the following equation (2):
- e_lower(x, y) = α × e1(x, y) + (1 − α) × e2(x, y − t) … (2)
- likewise, e1(x, y) is the primary differential value at coordinate position (x, y) of the primary edge image, t is the phase shift amount, and e2(x, y − t) is the secondary differential value at coordinate position (x, y − t) of the secondary edge image before the shift, corresponding to coordinate position (x, y). e1(x, y) is weighted by α, e2(x, y − t) is weighted by (1 − α), and the two are added.
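The weighted combination described above can be sketched as follows; `alpha = 0.5` and the use of `np.roll` for the row shift are illustrative assumptions, since the text only requires 0 ≤ α ≤ 1.

```python
import numpy as np

def eyelid_features(e1, e2, t, alpha=0.5):
    """Combine the primary edge image e1 and secondary edge image e2:
    e2 is shifted by the phase shift amount t (1/4 of the shading-change
    period), upward for the upper eyelid and downward for the lower
    eyelid, then weighted-added to e1.  Arrays are indexed [y, x]."""
    t = int(round(t))
    e2_up = np.roll(e2, -t, axis=0)  # e2(x, y + t): sample t rows below
    e2_dn = np.roll(e2, t, axis=0)   # e2(x, y - t): sample t rows above
    e_upper = alpha * e1 + (1.0 - alpha) * e2_up
    e_lower = alpha * e1 + (1.0 - alpha) * e2_dn
    return e_upper, e_lower
```

A real implementation would pad rather than wrap at the image border; `np.roll` is used here only to keep the sketch short.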
- the eyelid position detection unit 28 sets the vertical peak point of the upper eyelid feature amount as the first boundary point indicating the boundary between the upper eyelid and the eyeball.
- the eyelid position detection unit 28 fits an upper eyelid shape model (which may be two-dimensional or three-dimensional) including the positions of the inner and outer eye corners to the set of vertical peak points of the upper eyelid feature amount calculated for each pixel, varying the model parameters in various ways.
- the eyelid position detection unit 28 detects the upper eyelid shape and position where the evaluation value is maximized. Further, the eyelid position detection unit 28 sets the vertical peak point of the lower eyelid feature value as the second boundary point indicating the boundary between the lower eyelid and the eyeball.
- similarly, the eyelid position detection unit 28 fits a lower eyelid shape model (which may be two-dimensional or three-dimensional) including the positions of the inner and outer eye corners to the set of vertical peak points of the lower eyelid feature amount calculated for each pixel, varying the model parameters in various ways.
- the eyelid position detector 28 detects the lower eyelid shape and position where the evaluation value is maximized.
- the calculated upper eyelid feature amount or lower eyelid feature amount may be used as the fitting evaluation value.
- the eyelid position detection unit 28 measures the eyelid opening degree from the detected upper eyelid shape and lower eyelid shape, and outputs the eyelid opening degree to the display device 18.
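The per-column peak selection described above can be sketched as follows. This is a simplification: the eyelid shape-model fitting performed by the eyelid position detection unit 28 is omitted, and the opening definition used here is an assumption for illustration.

```python
import numpy as np

def eyelid_boundary_points(e_upper, e_lower):
    """Take the vertical (per-column) peak of each feature map as the
    candidate boundary point; shape-model fitting is omitted."""
    upper_y = np.argmax(e_upper, axis=0)  # row of peak in each column
    lower_y = np.argmax(e_lower, axis=0)
    return upper_y, lower_y

def eyelid_opening(upper_y, lower_y):
    """Eyelid opening measured as the largest vertical distance between
    the detected upper and lower boundaries (an assumed definition)."""
    return int(np.max(lower_y - upper_y))
```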
- the image capturing unit 12 captures a face image of the subject.
- at this time, the illumination unit 14, formed of an infrared strobe, emits light in synchronization with the photographing by the image capturing unit 12 to illuminate the face of the subject. If the illumination unit 14 emits continuous light, synchronization with the image capturing unit 12 is unnecessary and the configuration is simplified.
- in step 100, the computer 16 captures the face image photographed by the image capturing unit 12 as a video signal.
- in step 102, the computer 16 A/D converts the video signal to generate a two-dimensional digital image.
- the subsequent processing is performed digitally on this digital image; hereinafter, "image" simply means this digital image.
- in step 104, the computer 16 searches for the eye region in the face image and sets a region including the eye as the extraction region.
- in step 106, the computer 16 extracts a small region including one eye as the eye image.
- the eye region may be searched from the face image by image processing using a template matching method.
- the operator may instruct the eye area by pointing the eyes on the face image with an eye area instruction means such as a keyboard, mouse, electronic pen, or light pen.
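As one possible (hypothetical) implementation of the template-matching search mentioned above, a minimal sum-of-squared-differences matcher might look like this; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def find_eye_region(face, template):
    """Exhaustive sum-of-squared-differences template match over a 2-D
    grayscale face image.  Returns the top-left (x, y) of the best
    match for the eye template."""
    face = face.astype(float)
    template = template.astype(float)
    H, W = face.shape
    h, w = template.shape
    best_ssd, best_xy = np.inf, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            ssd = np.sum((face[y:y + h, x:x + w] - template) ** 2)
            if ssd < best_ssd:
                best_ssd, best_xy = ssd, (x, y)
    return best_xy
```

A production system would use a normalized correlation measure and a coarse-to-fine search instead of this brute-force scan.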
- in step 108, the computer 16 performs edge processing on the eye image extracted in step 106 using the Prewitt filter shown in FIG. 3A, and generates a primary edge image whose pixel values are the first-order differential values representing the magnitude of the shading change (the pixel value change) for each pixel in the direction from top to bottom. For example, if the current image coordinates are (x, y) and the pixel value at (x, y) is A(x, y), the primary differential value E(x, y) at (x, y) in the primary edge image is obtained by the following equation:
- E(x, y) = A(x − 1, y − 1) + A(x, y − 1) + A(x + 1, y − 1) − A(x − 1, y + 1) − A(x, y + 1) − A(x + 1, y + 1)
- next, the computer 16 performs edge processing on the eye image extracted in step 106 using the vertical second-order differential filter shown in FIG. 4A.
- the computer 16 generates a secondary edge image whose pixel values are the second-order differential values representing the rate of change of the magnitude of the shading change (the pixel value change) for each pixel in the direction from top to bottom. For example, if the current image coordinates are (x, y) and the pixel value at (x, y) is A(x, y), the secondary differential value E′(x, y) at (x, y) in the secondary edge image is obtained by the following equation:
- E′(x, y) = A(x − 1, y − 1) + A(x, y − 1) + A(x + 1, y − 1) − 2A(x − 1, y) − 2A(x, y) − 2A(x + 1, y) + A(x − 1, y + 1) + A(x, y + 1) + A(x + 1, y + 1)
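The two neighborhood formulas above can be implemented directly in NumPy. In this sketch, arrays are indexed [row, column] = [y, x], and the one-pixel border (where the 3×3 neighborhood is incomplete) is left at zero, which is an assumed border policy.

```python
import numpy as np

def primary_edge(a):
    """Primary edge image: E(x,y) = sum of the row above (y-1) minus
    the sum of the row below (y+1), over x-1..x+1."""
    a = np.asarray(a, dtype=float)
    e = np.zeros_like(a)
    e[1:-1, 1:-1] = (a[:-2, :-2] + a[:-2, 1:-1] + a[:-2, 2:]
                     - a[2:, :-2] - a[2:, 1:-1] - a[2:, 2:])
    return e

def secondary_edge(a):
    """Secondary edge image: E'(x,y) = row above + row below
    - 2 * (center row), each summed over x-1..x+1."""
    a = np.asarray(a, dtype=float)
    e = np.zeros_like(a)
    e[1:-1, 1:-1] = (a[:-2, :-2] + a[:-2, 1:-1] + a[:-2, 2:]
                     - 2 * (a[1:-1, :-2] + a[1:-1, 1:-1] + a[1:-1, 2:])
                     + a[2:, :-2] + a[2:, 1:-1] + a[2:, 2:])
    return e
```

On a bright-above/dark-below step (the upper eyelid case), the primary edge responds positively at the boundary while the secondary edge changes sign across it, which is exactly the quarter-period offset the feature amount calculation compensates for.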
- in step 112, the computer 16 shifts the secondary edge image upward in the image by 1/4 of the frequency period, obtained in advance, of the vertical shading change at the eyelid boundary, for calculating the upper eyelid feature amount, and also shifts the secondary edge image downward for calculating the lower eyelid feature amount.
- in step 114, the computer 16 calculates the upper eyelid feature amount for each pixel based on the primary edge image generated in step 108 and the secondary edge image shifted upward in step 112. The computer 16 also calculates the lower eyelid feature amount for each pixel based on the primary edge image generated in step 108 and the secondary edge image shifted downward in step 112.
- in step 116, the computer 16 detects the vertical peak points of the upper eyelid feature amount calculated in step 114, and likewise detects the vertical peak points of the lower eyelid feature amount.
- in step 118, the computer 16 detects the shape and position of the upper eyelid from the set of vertical peak points of the upper eyelid feature amount detected in step 116, and detects the shape and position of the lower eyelid from the set of vertical peak points of the lower eyelid feature amount detected in step 116.
- in step 120, the computer 16 causes the display device 18 to display the eyelid opening determined from the upper eyelid shape and position and the lower eyelid shape and position detected in step 118, and ends the image processing routine.
- as described above, the eyelid detection apparatus of the first embodiment shifts the secondary edge image in the vertical direction by 1/4 of the period of the frequency of the shading change at the eyelid boundary, thereby matching the phase shift of the pixel value change at the eyelid boundary between the primary edge image and the secondary edge image.
- the eyelid detection device calculates the upper eyelid feature amount and the lower eyelid feature amount based on the primary edge image and the secondary edge image for each pixel, and detects the eyelid position. Thereby, the eyelid detection device can correctly detect the position of the eyelid even when the eyelid is covered with makeup.
- in order to capture the portion where the primary differential value changes, the eyelid detection device calculates the amount of change of the primary differential value, that is, the secondary differential value, and uses the value obtained by shifting the phase of the secondary differential value, so that the eyelid feature amount is calculated to be large at places close to the actual eyelid.
- the eyelid detection device can increase the eyelid feature amount near the eyelid position and can detect the eyelid position with high accuracy even when the eyelid is covered with eye shadow or the like.
- the eyelid detection device can measure the eyelid opening with high accuracy by accurately detecting the eyelid position.
- the eyelid detection device of the second embodiment differs from the first embodiment in that it detects the frequency from the shading change obtained from the eye image every time the eyelid position is detected.
- the computer 216 of the eyelid detection apparatus 210 includes an image input unit 20, an eye image extraction unit 22, an edge image generation unit 24, a shading change extraction unit 230 that extracts the vertical shading change at the eyelid boundary from the input eye image, a frequency detection unit 232 that detects the frequency of the extracted shading change, a feature amount calculation unit 26, and an eyelid position detection unit 28.
- the shading change extracting unit 230 extracts the shading change in the vertical direction in the region including the eyelid boundary, which is obtained in advance, from the eye image extracted by the eye image extracting unit 22.
- the frequency detection unit 232 fits a predetermined section of a sine wave to the extracted vertical shading change, varying the frequency and amplitude so as to match the shading change over the range corresponding to the eyelid position, and detects the frequency of the shading change at the eyelid boundary.
- the feature amount calculation unit 26 shifts the secondary edge image upward by 1/4 of the detected frequency period and calculates the upper eyelid feature amount e_upper(x, y) of each pixel according to equation (1) above.
- the feature amount calculation unit 26 also shifts the secondary edge image downward by 1/4 of the detected frequency period and calculates the lower eyelid feature amount e_lower(x, y) of each pixel according to equation (2) above.
- in step 100, the computer 216 captures the face image photographed by the image capturing unit 12 as a video signal.
- in step 102, the computer 216 A/D converts the video signal to generate a two-dimensional digital image.
- in step 104, the computer 216 searches for the eye region in the face image and sets a region including the eye as the extraction region.
- in step 106, the computer 216 extracts a small region including one eye as the eye image.
- in step 108, the computer 216 performs edge processing on the eye image extracted in step 106 to generate a primary edge image.
- next, the computer 216 performs edge processing using the vertical second-order differential filter on the eye image extracted in step 106 to generate a secondary edge image.
- in step 200, the computer 216 extracts the vertical shading change from the predetermined region including the eyelid boundary of the eye image extracted in step 106.
- in step 202, the computer 216 detects the frequency of the vertical shading change extracted in step 200.
- in step 204, the computer 216 shifts the secondary edge image upward in the image by 1/4 of the period of the frequency detected in step 202 for calculating the upper eyelid feature amount, and shifts the secondary edge image downward for calculating the lower eyelid feature amount.
- in step 114, the computer 216 calculates the upper eyelid feature amount and the lower eyelid feature amount for each pixel.
- in step 116, the computer 216 detects the vertical peak points of the upper eyelid feature amount calculated in step 114, and likewise detects the vertical peak points of the lower eyelid feature amount.
- in step 118, the computer 216 detects the upper eyelid shape and position from the set of vertical peak points of the upper eyelid feature amount detected in step 116, and detects the lower eyelid shape and position from the set of vertical peak points of the lower eyelid feature amount detected in step 116.
- in step 120, the computer 216 causes the display device 18 to display the eyelid opening determined from the upper eyelid shape and position and the lower eyelid shape and position detected in step 118, and ends the image processing routine.
- as described above, the eyelid detection device of the second embodiment detects the frequency of the pixel value change at the eyelid boundary and shifts the secondary edge image in the vertical direction by 1/4 of the period of the detected frequency.
- thereby, the eyelid detection device can match the phase shift of the pixel value change at the eyelid boundary between the primary edge image and the secondary edge image, and can calculate the eyelid feature amount so that it becomes large at places close to the actual eyelid.
- the eyelid detection device of the third embodiment differs from the first embodiment in that it detects the eye size from the eye image and determines the frequency of the shading change at the eyelid boundary from the eye size.
- the computer 316 of the eyelid detection apparatus 310 includes an image input unit 20, an eye image extraction unit 22, an edge image generation unit 24, an eye size detection unit 330 that acquires the eye image and detects the size of the eye, a frequency determination unit 332 that determines the frequency of the shading change at the eyelid boundary according to the detected eye size, a feature amount calculation unit 26, and an eyelid position detection unit 28.
- the eye size detection unit 330 acquires the eye image extracted by the eye image extraction unit 22 and detects the size of the eye (for example, the distance between the inner and outer corners of the eye) from the eye image.
- the frequency determining unit 332 stores in advance the correspondence between the size of the eyes and the frequency of the vertical shade change at the eyelid boundary. Based on the correspondence relationship, the frequency determination unit 332 determines the frequency of the shading change at the boundary of the eyelid corresponding to the detected eye size.
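The stored correspondence can be sketched as a lookup table with interpolation. The calibration values below are hypothetical placeholders; real values would be measured in advance as the text describes.

```python
import numpy as np

# Hypothetical correspondence between eye size (inner-to-outer corner
# distance, px) and the period of the vertical shading change at the
# eyelid boundary (px).
EYE_SIZES_PX = np.array([20.0, 40.0, 60.0, 80.0])
PERIODS_PX = np.array([6.0, 12.0, 18.0, 24.0])

def period_for_eye_size(size_px):
    """Look up (with linear interpolation) the stored size-to-period
    correspondence, as the frequency determination unit 332 does."""
    return float(np.interp(size_px, EYE_SIZES_PX, PERIODS_PX))
```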
- the feature amount calculation unit 26 shifts the secondary edge image upward by 1/4 of the determined frequency period and calculates the upper eyelid feature amount e_upper(x, y) of each pixel according to equation (1) above.
- the feature amount calculation unit 26 also shifts the secondary edge image downward by 1/4 of the determined frequency period and calculates the lower eyelid feature amount e_lower(x, y) of each pixel according to equation (2) above.
- the computer 316 reads the face image captured by the image capturing unit 12 as a video signal, and A/D converts the video signal to generate a two-dimensional digital image.
- the computer 316 searches for an eye region from the face image, sets a region including eyes as an extraction region, and extracts a small region including one eye as an eye image.
- the computer 316 detects the eye size from the extracted eye image, determines the frequency of the shade change at the eyelid boundary corresponding to the detected eye size, and stores it in a memory (not shown).
- an image processing routine is executed in the same manner as in the first embodiment described above.
- the eyelid detection device detects the size of the eye from the eye image, determines the frequency of the shade change at the eyelid boundary according to the eye size, and shifts the secondary edge image in the vertical direction by 1/4 of the period of the determined frequency. The eyelid detection device can thereby align the phase of the pixel value change at the eyelid boundary between the primary edge image and the secondary edge image, so that the eyelid feature amount can be calculated to be large at locations close to the actual eyelid.
- the eyelid detection device generates an edge image representing the magnitude of the shade change from the grayscale image input by the image input unit.
- the image input by the image input unit may instead be a color image.
- in that case, the eyelid detection device may generate an edge image representing the magnitude of the change in the density values of the color image.
- the eyelid detection device calculates second-order differential values by performing edge processing using a vertical second-order differential filter on the eye image;
- however, the present invention is not limited to this.
- the eyelid detection device may instead apply edge processing using the vertical first-order differential filter once more to the primary edge image to calculate the second-order differential values.
- the eyelid detection device may shift the primary edge image in the blinking direction.
- the eyelid detection device shifts the primary edge image downward in the blinking direction by a quarter of the period of the frequency of the vertical shade change at the eyelid boundary, and may calculate the upper eyelid feature amount of each pixel based on the downward-shifted primary edge image and the secondary edge image.
- the eyelid detection device shifts the primary edge image upward in the blinking direction by a quarter of the period of the frequency of the vertical shade change at the eyelid boundary, and may calculate the lower eyelid feature amount of each pixel based on the upward-shifted primary edge image and the secondary edge image.
- the eyelid detection device calculates the eyelid feature amount by weighted addition of the pixel value of the primary edge image and the pixel value of the secondary edge image;
- however, the present invention is not limited to this.
- the eyelid detection device may instead calculate the eyelid feature amount of each pixel by multiplying the pixel value of the primary edge image by the pixel value of the secondary edge image.
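The two combination rules mentioned here (weighted addition and multiplication) can be sketched side by side; the function name, the default weight, and the absence of any normalization are assumptions:

```python
import numpy as np

def combine_edges(primary, secondary, mode="weighted", w=0.5):
    """Combine per-pixel primary and secondary edge responses into an
    eyelid feature amount, using either rule mentioned in the text.
    """
    if mode == "weighted":
        # weighted addition of the two edge responses
        return w * primary + (1 - w) * secondary
    if mode == "product":
        # multiplication emphasizes pixels where both responses are large
        return primary * secondary
    raise ValueError("unknown mode: %s" % mode)
```

The multiplicative variant suppresses locations where only one of the two edge responses fires, which is one plausible reason to prefer it near cluttered regions such as eyelashes.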
- the eyelid detection device detects the eyelid position by fitting the eyelid shape to the set of peak points of the eyelid feature amount
- the present invention is not limited to this.
- the eyelid detection device may detect the eyelid position by other methods.
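The fitting of an eyelid shape to the set of peak points of the eyelid feature amount can be sketched as follows; the parabolic shape model and the least-squares fit are assumptions, since this excerpt does not specify the shape model:

```python
import numpy as np

def detect_eyelid_curve(feature):
    """Find, for every column, the row where the eyelid feature amount
    peaks, then fit a parabola y = a*x**2 + b*x + c through those peak
    points. The parabola stands in for the patent's 'eyelid shape'.
    """
    h, wid = feature.shape
    xs = np.arange(wid)
    ys = feature.argmax(axis=0)      # peak row in each column
    a, b, c = np.polyfit(xs, ys, 2)  # least-squares quadratic fit
    return a, b, c, ys
```

A robust variant would discard columns whose peak value falls below a threshold before fitting, so that columns with no clear eyelid response do not distort the curve.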
- the computer-readable medium stores a program for causing a computer to function as: a generation unit that, based on an image of a region including an eye, generates a primary edge image representing the magnitude of the pixel value change in a predetermined direction for each pixel in the region, and a secondary edge image representing the rate of change, in the predetermined direction, of the magnitude of the pixel value change in the predetermined direction for each pixel in the region; a feature amount calculation unit that shifts either the primary edge image or the secondary edge image in the predetermined direction according to the period of the frequency of the pixel value change at the eyelid boundary in the image, and calculates, for each pixel, a feature amount based on the pixel value of the primary edge image and the pixel value of the secondary edge image; and an eyelid detection unit that detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on the calculated feature amount of each pixel.
Abstract
Description
A(x-1, y-1) + A(x, y-1) + A(x+1, y-1)
- A(x-1, y+1) - A(x, y+1) - A(x+1, y+1)

A(x-1, y-1) + A(x, y-1) + A(x+1, y-1)
- 2A(x-1, y) - 2A(x, y) - 2A(x+1, y)
+ A(x-1, y+1) + A(x, y+1) + A(x+1, y+1)
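The two expressions above are the per-pixel sums of a 3×3 vertical first-order differential filter and a 3×3 vertical second-order differential filter applied to pixel values A(x, y) (y increasing downward). A minimal sketch, in which the zero padding and the helper name `apply_filter` are assumptions:

```python
import numpy as np

# 3x3 vertical first-order filter from the first expression:
# (row above) minus (row below)
K1 = np.array([[ 1,  1,  1],
               [ 0,  0,  0],
               [-1, -1, -1]])

# 3x3 vertical second-order filter from the second expression:
# (row above) - 2*(center row) + (row below)
K2 = np.array([[ 1,  1,  1],
               [-2, -2, -2],
               [ 1,  1,  1]])

def apply_filter(A, K):
    """Correlate image A with a 3x3 kernel K (no flipping, zero
    padding), matching the per-pixel sums written out above."""
    h, w = A.shape
    P = np.pad(A.astype(float), 1)  # zero-pad by one pixel on all sides
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += K[dy, dx] * P[dy:dy + h, dx:dx + w]
    return out
```

Applied to an eye image, `apply_filter(A, K1)` yields the primary edge image and `apply_filter(A, K2)` the secondary edge image used by the feature amount calculation.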
Stores a program for causing a computer to function as an eyelid detection unit that detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball, based on the calculated feature amount of each pixel.
Claims (10)
- An eyelid detection device comprising: a generation unit that, based on an image of a region including an eye, generates a primary edge image representing the magnitude of the pixel value change in a predetermined direction for each pixel in the region, and a secondary edge image representing the rate of change, in the predetermined direction, of the magnitude of the pixel value change in the predetermined direction for each pixel in the region; a feature amount calculation unit that shifts either the primary edge image or the secondary edge image in the predetermined direction according to the period of the frequency of the pixel value change at the eyelid boundary in the image, and calculates, for each pixel, a feature amount based on the pixel value of that pixel in the primary edge image and the pixel value of that pixel in the secondary edge image; and an eyelid detection unit that detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on the calculated feature amount of each pixel.
- The eyelid detection device according to claim 1, further comprising an eye detection unit that detects the size of the eye from the image, wherein the feature amount calculation unit shifts either the primary edge image or the secondary edge image in the predetermined direction according to the period of the frequency of the pixel value change at the eyelid boundary predetermined in correspondence with the detected eye size, and calculates the feature amount for each pixel.
- The eyelid detection device according to claim 1, further comprising: a pixel value change extraction unit that extracts the pixel value change at the eyelid boundary from the image; and a frequency detection unit that detects the frequency of the extracted pixel value change, wherein the feature amount calculation unit shifts either the primary edge image or the secondary edge image in the predetermined direction according to the period of the frequency of the pixel value change detected by the frequency detection unit, and calculates the feature amount for each pixel.
- The eyelid detection device according to any one of claims 1 to 3, wherein the feature amount calculation unit shifts either the primary edge image or the secondary edge image in the predetermined direction by 1/4 of the period of the frequency of the pixel value change at the eyelid boundary in the image, and calculates the feature amount for each pixel.
- The eyelid detection device according to any one of claims 1 to 4, wherein the predetermined direction is the blinking direction.
- The eyelid detection device according to claim 5, wherein the feature amount calculation unit shifts the primary edge image downward in the blinking direction and calculates the feature amount for each pixel, and the eyelid detection unit detects the boundary between the upper eyelid and the eyeball based on the calculated feature amount.
- The eyelid detection device according to claim 5, wherein the feature amount calculation unit shifts the secondary edge image upward in the blinking direction and calculates the feature amount for each pixel, and the eyelid detection unit detects the boundary between the upper eyelid and the eyeball based on the calculated feature amount.
- The eyelid detection device according to any one of claims 1 to 7, wherein the feature amount calculation unit calculates the feature amount for each pixel by weighted addition or multiplication of the pixel value of that pixel in the primary edge image and the pixel value of that pixel in the secondary edge image.
- The eyelid detection device according to any one of claims 1 to 8, wherein the eyelid detection unit detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on peak points, in the predetermined direction, of the calculated feature amount.
- A program for causing a computer to function as: a generation unit that, based on an image of a region including an eye, generates a primary edge image representing the magnitude of the pixel value change in a predetermined direction for each pixel in the region, and a secondary edge image representing the rate of change, in the predetermined direction, of the magnitude of the pixel value change in the predetermined direction for each pixel in the region; a feature amount calculation unit that shifts either the primary edge image or the secondary edge image in the predetermined direction according to the period of the frequency of the pixel value change at the eyelid boundary in the image, and calculates, for each pixel, a feature amount based on the pixel value of that pixel in the primary edge image and the pixel value of that pixel in the secondary edge image; and an eyelid detection unit that detects at least one of the boundary between the upper eyelid and the eyeball and the boundary between the lower eyelid and the eyeball based on the calculated feature amount of each pixel.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112011102346T DE112011102346T5 (de) | 2010-07-14 | 2011-07-06 | Augenliderfassungsvorrichtung und Programm |
US13/805,597 US8693784B2 (en) | 2010-07-14 | 2011-07-06 | Eyelid detection device and program |
CN201180033982.9A CN102985948B (zh) | 2010-07-14 | 2011-07-06 | 眼睑检测装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010160053A JP5180997B2 (ja) | 2010-07-14 | 2010-07-14 | まぶた検出装置及びプログラム |
JP2010-160053 | 2010-07-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012008345A1 true WO2012008345A1 (ja) | 2012-01-19 |
Family
ID=45469348
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/065509 WO2012008345A1 (ja) | 2010-07-14 | 2011-07-06 | まぶた検出装置及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8693784B2 (ja) |
JP (1) | JP5180997B2 (ja) |
CN (1) | CN102985948B (ja) |
DE (1) | DE112011102346T5 (ja) |
WO (1) | WO2012008345A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9020199B2 (en) * | 2011-04-19 | 2015-04-28 | Aisin Seiki Kabushiki Kaisha | Eyelid detection device, eyelid detection method, and recording medium |
JP6227996B2 (ja) * | 2013-12-18 | 2017-11-08 | 浜松ホトニクス株式会社 | 計測装置及び計測方法 |
KR102198852B1 (ko) * | 2014-03-24 | 2021-01-05 | 삼성전자 주식회사 | 홍채 인식 장치 및 이를 포함하는 모바일 장치 |
JP6535223B2 (ja) * | 2015-05-28 | 2019-06-26 | 浜松ホトニクス株式会社 | 瞬目計測方法、瞬目計測装置、及び瞬目計測プログラム |
JP2019082743A (ja) * | 2016-03-18 | 2019-05-30 | 三菱電機株式会社 | 情報処理装置及び情報処理方法 |
US10846517B1 (en) * | 2016-12-30 | 2020-11-24 | Amazon Technologies, Inc. | Content modification via emotion detection |
US11430264B2 (en) * | 2018-07-16 | 2022-08-30 | Honor Device Co., Ltd. | Eye open or closed state detection method and electronic device |
JP7137746B2 (ja) * | 2019-01-30 | 2022-09-15 | 株式会社Jvcケンウッド | 映像処理装置、映像処理方法および映像処理プログラム |
CN113785258A (zh) * | 2019-03-22 | 2021-12-10 | 惠普发展公司,有限责任合伙企业 | 检测眼睛测量 |
CN113240657B (zh) * | 2021-05-25 | 2023-12-22 | 郑州新益华信息科技有限公司 | 一种基于医疗大数据的眼睑炎图像处理及预警*** |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0678901A (ja) * | 1992-06-12 | 1994-03-22 | Nec Corp | 上まぶた領域、目頭・目尻・上まぶた領域及び目の構造の検出方法及び装置 |
JP2008226031A (ja) * | 2007-03-14 | 2008-09-25 | Toyota Central R&D Labs Inc | まぶた検出装置及びプログラム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04309927A (ja) | 1991-04-09 | 1992-11-02 | Hitachi Ltd | アクティブマトリクス基板の製造方法とこれを用いた液晶表示素子 |
JP3214057B2 (ja) * | 1992-04-14 | 2001-10-02 | キヤノン株式会社 | 瞳孔中心検出方法、瞳孔中心検出装置、瞳孔輪部検出方法および瞳孔輪部検出装置 |
JP3143819B2 (ja) | 1994-05-20 | 2001-03-07 | 株式会社豊田中央研究所 | まぶたの開度検出装置 |
RU2330607C2 (ru) * | 2001-06-13 | 2008-08-10 | Компьюмедикс Лимитед | Способ и устройство для мониторинга сознания |
US20090252382A1 (en) * | 2007-12-06 | 2009-10-08 | University Of Notre Dame Du Lac | Segmentation of iris images using active contour processing |
JP2010160053A (ja) | 2009-01-08 | 2010-07-22 | Shimadzu Corp | ガスクロマトグラフ及びガスクロマトグラフ分析方法 |
- 2010
  - 2010-07-14 JP JP2010160053A patent/JP5180997B2/ja not_active Expired - Fee Related
- 2011
  - 2011-07-06 US US13/805,597 patent/US8693784B2/en not_active Expired - Fee Related
  - 2011-07-06 WO PCT/JP2011/065509 patent/WO2012008345A1/ja active Application Filing
  - 2011-07-06 CN CN201180033982.9A patent/CN102985948B/zh not_active Expired - Fee Related
  - 2011-07-06 DE DE112011102346T patent/DE112011102346T5/de not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
CN102985948B (zh) | 2016-02-10 |
US8693784B2 (en) | 2014-04-08 |
DE112011102346T5 (de) | 2013-04-18 |
JP5180997B2 (ja) | 2013-04-10 |
CN102985948A (zh) | 2013-03-20 |
JP2012022531A (ja) | 2012-02-02 |
US20130101225A1 (en) | 2013-04-25 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| WWE | Wipo information: entry into national phase | Ref document number: 201180033982.9; Country of ref document: CN
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11806675; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 13805597; Country of ref document: US
| WWE | Wipo information: entry into national phase | Ref document number: 1120111023464; Country of ref document: DE; Ref document number: 112011102346; Country of ref document: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 11806675; Country of ref document: EP; Kind code of ref document: A1