WO2020261423A1 - Authentication system, authentication method, control device, computer program, and recording medium - Google Patents
Authentication system, authentication method, control device, computer program, and recording medium
- Publication number
- WO2020261423A1 (PCT/JP2019/025342)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- image
- iris
- target
- authentication
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/16—Image acquisition using multiple overlapping images; Image stitching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
Definitions
- the present invention relates to the technical fields of an authentication system, an authentication method, a control device, a computer program, and a recording medium capable of authenticating a target person.
- Patent Document 1 describes an iris recognition system that includes a face imaging camera and an iris imaging camera, calculates the eye position of a subject from an image captured by the face imaging camera, points the iris imaging camera at the calculated eye position, and then captures an iris image of the subject with the iris imaging camera.
- a telephoto camera is often used as a camera for iris imaging.
- the imaging range (in other words, the angle of view) of the iris imaging camera is usually narrower than the imaging range of the face imaging camera and often at least partially overlaps the imaging range of the face imaging camera.
- the face imaging camera may capture, in addition to or instead of a person who should be selected as the target of iris recognition, a person who should not be selected as the target of iris recognition.
- when the face imaging camera captures a person who should not be selected as the target of iris recognition, that person may nevertheless be selected as the target of iris recognition.
- An object of the present invention is to provide an authentication system, an authentication method, a control device, a computer program, and a recording medium capable of solving the above-mentioned technical problems.
- One aspect of the authentication system includes: a first imaging device that acquires a first image used for authentication by imaging a first imaging range; a second imaging device that acquires a second image by imaging a second imaging range wider than the first imaging range; and a control device that selects the person to be authenticated based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- One aspect of the authentication method includes: an acquisition step of acquiring a second image from a second imaging device that images a second imaging range wider than the first imaging range of a first imaging device that acquires a first image used for authentication; and a selection step of selecting the person to be authenticated based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- One aspect of the control device includes: an acquisition means for acquiring a second image from a second imaging device that images a second imaging range wider than the first imaging range of a first imaging device that acquires a first image used for authentication; and a selection means for selecting the person to be authenticated based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- One aspect of the computer program causes a computer to execute one aspect of the authentication method described above.
- One aspect of the recording medium is a recording medium on which one aspect of the computer program described above is recorded.
- the person to be authenticated can be appropriately selected.
- FIG. 1 is a block diagram showing an overall configuration of the iris authentication system of the present embodiment.
- FIG. 2 is a schematic view showing the positional relationship between the entire camera and the iris camera and a person.
- FIG. 3 is a plan view showing the relationship between the entire image and the plurality of iris images.
- FIG. 4 is a block diagram showing a hardware configuration of the iris recognition device of the present embodiment.
- FIG. 5 is a block diagram showing a functional block realized in the CPU included in the iris authentication device of the present embodiment.
- FIG. 6 is a flowchart showing the flow of the operation of the iris authentication system of the present embodiment (that is, the iris authentication operation).
- FIG. 7A is a plan view showing an entire image in which three people appear, and FIG. 7B is a plan view showing the positional relationship between the face areas and the iris image area shown in FIG. 7A.
- FIG. 8A is a plan view showing an entire image in which three people appear, and FIG. 8B is a plan view showing the positional relationship between the face areas and the iris image area shown in FIG. 8A.
- FIG. 9 is a plan view showing an iris image.
- FIG. 10A is a plan view showing an entire image in which two people appear, FIG. 10B is a plan view showing the positional relationship between the face areas and the iris image area shown in FIG. 10A, and FIG. 10C is a plan view showing that positional relationship together with the degree of closeness between the two.
- FIG. 11A is a plan view showing an entire image in which two people appear, FIG. 11B is a plan view showing the positional relationship between the face areas and the iris image area shown in FIG. 11A, and FIG. 11C is a plan view showing that positional relationship together with the degree of closeness between the two.
- FIG. 12 is a plan view showing a display example of the entire image to which information regarding the iris authentication operation is added.
- FIG. 13 is a plan view showing a display example of the entire image to which information regarding the iris authentication operation is added.
- Each of FIGS. 14(a) to 14(c) is a plan view showing an entire image in which two people appear.
- FIG. 15 is a block diagram showing a modified example of the functional block realized in the CPU included in the iris recognition device.
- the present embodiment describes an iris authentication system 1 that executes an iris authentication operation for authenticating a person T based on the iris pattern of the person T.
- an iris recognition system 1 may be adopted as a part of, for example, a system for automating immigration procedures at an airport (so-called ABC (Automated Border Control)).
- the iris recognition system 1 may be a walk-through type iris recognition system that authenticates the moving person T.
- the description will proceed with reference to an example in which the iris recognition system 1 is a walk-through type iris recognition system.
- the iris recognition system 1 is not limited to the iris recognition system illustrated in this paragraph.
- the iris recognition system 1 may be any iris recognition system capable of authenticating the person T (for example, an iris recognition system that authenticates a person T who is not moving, that is, is stationary).
- the iris recognition system 1 is a specific example of the "authentication system" in the appendix described later.
- FIG. 1 is a block diagram showing an overall configuration of the iris authentication system 1 of the present embodiment.
- the iris recognition system 1 includes an overall imaging device 2, which is a specific example of the "second imaging device" described later, an iris imaging device 3, which is a specific example of the "first imaging device" described later, and an iris recognition device 6, which is a specific example of the "control device" described later.
- the iris recognition system 1 may further include a motion sensor 4 and a motion sensor 5.
- the overall imaging device 2 includes a single overall camera 21 (or, in some cases, a plurality of overall cameras 21).
- the iris imaging device 3 includes a plurality of iris cameras 31.
- FIG. 1 shows an example in which the iris imaging device 3 includes n (where n is an integer of 2 or more) iris cameras 31.
- n iris cameras 31 will be referred to as iris camera 31-1, iris camera 31-2, ..., And iris camera 31-n, respectively, as necessary.
- the number of iris cameras 31 may be set appropriately according to the characteristics of each iris camera 31 (for example, at least one of the imaging range and the resolution of each iris camera 31).
- Each of the whole camera 21 and the plurality of iris cameras 31 is an imaging device capable of capturing a person T.
- the overall camera 21 and the plurality of iris cameras 31 will be described in more detail with reference to FIG.
- FIG. 2 is a schematic view showing the positional relationship between the entire camera 21, the plurality of iris cameras 31, and the person T.
- the overall camera 21 images the person T in an imaging range wider than the imaging range of each iris camera 31. That is, the imaging range of the overall camera 21 is wider than the imaging range of each iris camera 31.
- the imaging range of the overall camera 21 is set to an appropriate range so that the overall camera 21 can image the person T regardless of the height of the person T. That is, the imaging range of the overall camera 21 is set to an appropriate range so that the overall camera 21 can image both a relatively tall person T and a relatively short person T.
- in particular, the imaging range of the overall camera 21 is set to an appropriate range so that, regardless of the height of the person T, the overall camera 21 can image the target portion TP of the person T used for authentication (in the present embodiment, the eye including the iris).
- the "camera imaging range” in the present embodiment means a range including a scene that can be captured by the camera, and may be referred to as a field of view range, an imaging area, a field of view area, or an angle of view area.
- the size of such an imaging range typically increases as the angle of view (in other words, the viewing angle) of the camera increases. Therefore, typically, the optical system (for example, a lens) of the overall camera 21 has a wider angle than the optical system of each iris camera 31. That is, the angle of view of the overall camera 21 is wider than the angle of view of each iris camera 31. Further, the angle of view of a camera typically increases as the focal length of the optical system (for example, a lens) of the camera becomes shorter. Therefore, typically, the focal length of the optical system of the overall camera 21 is shorter than the focal length of the optical system of each iris camera 31.
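- the angle-of-view/focal-length relationship above can be checked with the thin-lens approximation; the sensor width and focal lengths below are illustrative assumptions, not values from this disclosure:

```python
import math

def horizontal_angle_of_view(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Full horizontal angle of view, in degrees, of a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical lenses on the same 7.2 mm wide sensor: a short focal length
# for the overall camera 21 and a telephoto for an iris camera 31.
overall_aov = horizontal_angle_of_view(7.2, 6.0)   # wide angle
iris_aov = horizontal_angle_of_view(7.2, 50.0)     # narrow angle

# shorter focal length -> wider angle of view, as stated in the text
assert overall_aov > iris_aov
```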
- the overall camera 21 images the person T located at the trigger point P1. That is, the imaging range of the overall camera 21 is set to an appropriate imaging range so that the overall camera 21 can image the person T located at the trigger point P1.
- the trigger point P1 is a point located on the movement path of the person T. Further, the trigger point P1 is located before the reference point P0 as seen from the person T moving toward the trigger point P1. That is, the trigger point P1 is located on the near side (that is, the rear side) of the reference point P0 in the moving direction of the person T. Further, the trigger point P1 is separated from the reference point P0 by a distance D1 along the moving direction of the person T.
- the reference point P0 may be, for example, a point where each iris camera 31 is installed.
- the reference point P0 may be, for example, the destination of the moving person T.
- the destination may be, for example, a point through which the person T passes after authentication (for example, a point where an airport gate is installed).
- in the example shown in FIG. 2, the person T is moving from the left side to the right side of the drawing. Therefore, the trigger point P1 is a point separated from the reference point P0 by the distance D1 toward the left side of the drawing.
- the person T may move along a linear path in which the moving direction is always the same, or along a path in which the moving direction changes partway (for example, a curved path or a bent path).
- the overall camera 21 is arranged so that the focusing position of the overall camera 21 is located at the trigger point P1.
- the "in-focus position" in the present embodiment means a certain area extending before and after the best focus position (for example, an area that can be regarded as being in focus, corresponding to the depth of field).
- in other words, it is preferable that the overall camera 21 be arranged so that the in-focus position of the overall camera 21 is an area including the trigger point P1 (that is, so that the trigger point P1 is located in the area corresponding to the in-focus position).
- the trigger point P1 is set at the in-focus position of the entire camera 21.
- the overall camera 21 has a resolution sufficient to identify the face of the person T located at the trigger point P1 from the overall image 200, which is an image captured by the overall camera 21.
- in particular, the overall camera 21 has a resolution sufficient to identify where in the overall image 200 the target portion TP (that is, the eyes) of the person T located at the trigger point P1 appears.
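- this resolution requirement can be made concrete with a rough pixels-on-target estimate; the camera parameters below (1920-pixel image width, 60 degree angle of view, 3 m distance) are hypothetical, not values from this disclosure:

```python
import math

def pixels_across(object_width_m, distance_m, aov_deg, image_width_px):
    """Approximate number of image pixels spanned by an object of the given
    width at the given distance, for a camera with the given horizontal
    angle of view and horizontal pixel count."""
    scene_width_m = 2 * distance_m * math.tan(math.radians(aov_deg) / 2)
    return object_width_m / scene_width_m * image_width_px

# A ~15 cm face at 3 m spans ~83 px: enough to locate the eyes in the
# overall image 200. A ~12 mm iris spans only ~7 px: far too few to read
# the iris pattern, which is why the telephoto iris cameras 31 are needed.
face_px = pixels_across(0.15, 3.0, 60.0, 1920)
iris_px = pixels_across(0.012, 3.0, 60.0, 1920)
```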
- each iris camera 31 captures the person T located at the focus point P2. That is, the imaging range of each iris camera 31 is set to an appropriate imaging range so that each iris camera 31 can image the person T located at the focus point P2.
- the focus point P2 is, like the trigger point P1, a point located on the movement path of the person T. Further, like the trigger point P1, the focus point P2 is located before the reference point P0 as seen from the person T moving toward the focus point P2. That is, the focus point P2 is located on the near side (that is, the rear side) of the reference point P0 in the moving direction of the person T.
- the focus point P2 is a point separated from the reference point P0 by a distance D2 along the moving direction of the person T.
- in the example shown in FIG. 2, the person T is moving from the left side to the right side of the drawing. Therefore, the focus point P2 is a point separated from the reference point P0 toward the left side of the drawing by the distance D2.
- the distance D2 between the focus point P2 and the reference point P0 may be the same as the distance D1 between the trigger point P1 and the reference point P0.
- in this case, the same point is used as both the trigger point P1 and the focus point P2. That is, the trigger point P1 and the focus point P2 are the same point.
- in this case, the iris recognition system 1 can be used as, for example, an iris recognition system that authenticates a person T who is not moving (that is, is stationary).
- the distance D2 between the focus point P2 and the reference point P0 may be shorter than the distance D1 between the trigger point P1 and the reference point P0.
- in this case, the focus point P2 is located on the far side (that is, the forward side) of the trigger point P1 in the moving direction of the person T.
- conversely, the trigger point P1 is located on the near side (that is, the rear side) of the focus point P2 in the moving direction of the person T. Therefore, the moving person T passes through the trigger point P1 before passing through the focus point P2.
- the distances D1 and D2 may be set to any value as long as the relationship that the distance D2 is shorter than the distance D1 is satisfied. As an example, the distances D1 and D2 may be set to 3 m and 2 m, respectively.
- the iris authentication system 1 can be used as, for example, a walk-through type iris authentication system that authenticates a moving person T.
- Each iris camera 31 is arranged so that the in-focus position of each iris camera 31 is located at the focus point P2. Specifically, each iris camera 31 is arranged so that its in-focus position is an area including the focus point P2 (that is, so that the focus point P2 is located in the area corresponding to the in-focus position). Conversely, the focus point P2 is set at the in-focus position of each iris camera 31.
- since the angle of view of the overall camera 21 is wider than the angle of view of each iris camera 31 (that is, since the focal length of the optical system of the overall camera 21 is shorter than the focal length of the optical system of each iris camera 31), the area corresponding to the in-focus position of the overall camera 21 is wider than the area corresponding to the in-focus position of each iris camera 31.
- the plurality of iris cameras 31 are arranged so that the imaging ranges of the plurality of iris cameras 31 partially overlap in the vertical direction (or a desired direction different from the vertical direction) at the focus point P2.
- specifically, the plurality of iris cameras 31 are arranged so that, at the focus point P2, the lower end of the imaging range of the iris camera 31-k (where k is an integer satisfying 1 ≤ k < n) overlaps the upper end of the imaging range of the iris camera 31-m (where m = k + 1).
- as a result, the same scene is partially reflected in the two images captured by two iris cameras 31 whose imaging ranges partially overlap.
- specifically, the same scene is reflected at the lower end of the image captured by the iris camera 31-k and at the upper end of the image captured by the iris camera 31-m.
- further, the plurality of iris cameras 31 are arranged so that the combined imaging range obtained by combining the imaging ranges of the plurality of iris cameras 31 has a predetermined horizontal length in the horizontal direction and a predetermined vertical length in the vertical direction.
- the predetermined horizontal length may be a length (for example, 0.2 m) that allows the target portion TP of the person T located at the focus point P2 to be included in the combined imaging range.
- the predetermined vertical length may be a length (for example, 0.4 m) that allows the target portion TP of the person T located at the focus point P2 to be included in the combined imaging range regardless of the height of the person T.
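- as a sketch of this sizing logic, the check below tests whether an eye height falls within a 0.4 m tall combined imaging range (the value from the example above); the 1.3 m bottom edge is a hypothetical mounting choice, not a value from this disclosure:

```python
def eye_in_composite_range(eye_height_m: float,
                           range_bottom_m: float = 1.3,
                           range_height_m: float = 0.4) -> bool:
    """True if the eye height falls inside the stacked iris-camera range.

    The 0.4 m vertical extent matches the example in the text; the 1.3 m
    bottom edge is a hypothetical mounting height.
    """
    return range_bottom_m <= eye_height_m <= range_bottom_m + range_height_m

# Eye level is roughly 12 cm below stature.
assert eye_in_composite_range(1.75 - 0.12)       # ~1.63 m: covered
assert not eye_in_composite_range(1.95 - 0.12)   # ~1.83 m: above the 1.7 m top
```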
- Each iris camera 31 has a resolution sufficient to identify the target portion TP of the person T located at the focus point P2 from the iris image 300 which is an image captured by each iris camera 31.
- each iris camera 31 has a resolution sufficient to identify the iris pattern of the person T located at the focus point P2 from the iris image 300, which is an image captured by each iris camera 31.
- the overall camera 21 and the plurality of iris cameras 31 are arranged so that the combined imaging range of the plurality of iris cameras 31 at least partially overlaps the imaging range of the overall camera 21.
- specifically, the overall camera 21 and the plurality of iris cameras 31 are arranged so that the combined imaging range of the plurality of iris cameras 31 is included in the imaging range of the overall camera 21 at at least one of the trigger point P1 and the focus point P2 (or at another point).
- as shown in FIG. 3, which is a plan view showing the relationship between the overall image 200 and the plurality of iris images 300, the scenes captured by the plurality of iris cameras 31 are included in a part of the overall image 200 captured by the overall camera 21.
- however, the overall camera 21 and the plurality of iris cameras 31 need not be arranged so that the combined imaging range of the plurality of iris cameras 31 at least partially overlaps the imaging range of the overall camera 21. That is, the overall camera 21 and the plurality of iris cameras 31 may be arranged so that the combined imaging range of the plurality of iris cameras 31 does not overlap the imaging range of the overall camera 21. For example, they may be arranged so that the combined imaging range of the plurality of iris cameras 31 does not overlap the imaging range of the overall camera 21 at either the trigger point P1 or the focus point P2.
- the entire image 200 includes the iris image region RA.
- a part of the entire image 200 is the iris image area RA.
- the iris image area RA is an area corresponding to the combined imaging range of the plurality of iris cameras 31.
- the iris image area RA is an area in which the same scene as the scene captured in the plurality of iris images 300 is captured. As described above, the imaging ranges of the plurality of iris cameras 31 partially overlap along the vertical direction.
- in the iris image area RA, the area corresponding to the imaging range of the iris camera 31-1, the area corresponding to the imaging range of the iris camera 31-2, the area corresponding to the imaging range of the iris camera 31-3, ..., and the area corresponding to the imaging range of the iris camera 31-n are arranged along the vertical direction.
- that is, in the iris image area RA, the area showing the same scene as the iris image 300-1 captured by the iris camera 31-1, the area showing the same scene as the iris image 300-2 captured by the iris camera 31-2, the area showing the same scene as the iris image 300-3 captured by the iris camera 31-3, ..., and the area showing the same scene as the iris image 300-n captured by the iris camera 31-n are arranged along the vertical direction.
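- the vertical stacking described above implies a simple mapping from an eye position in the iris image area RA to one iris camera 31; the sketch below assumes a hypothetical pixel layout (camera 31-1 at the top, fixed region height and overlap), which the disclosure does not fix:

```python
def select_iris_camera(eye_y: float, region_height: float, overlap: float, n: int) -> int:
    """Return the 1-based index of the iris camera whose region contains eye_y.

    Regions are stacked vertically with partial overlap, as in the text;
    region_height and overlap are in whole-image pixels (hypothetical
    layout, camera 31-1 at the top). In an overlap zone, the upper
    (lower-index) camera is returned.
    """
    stride = region_height - overlap
    for k in range(n):
        top = k * stride
        if top <= eye_y < top + region_height:
            return k + 1
    raise ValueError("eye position outside the combined imaging range")

# n=4 cameras, 300 px regions overlapping by 50 px:
# camera 3 covers y in [500, 800), so an eye at y=600 maps to camera 3.
assert select_iris_camera(eye_y=600, region_height=300, overlap=50, n=4) == 3
```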
- the target portion TP of the person T is shown in the iris image 300-3.
- the target portion TP is captured in the region corresponding to the imaging range of the iris camera 31-3 in the entire image 200 (particularly, the iris image region RA).
- the motion sensor 4 is a detection device for detecting whether or not the person T is located at the trigger point P1.
- the detection result of the motion sensor 4 is output to the iris authentication device 6.
- the detection result of the motion sensor 4 is used as a condition for determining whether or not the overall camera 21 should image the person T located at the trigger point P1.
- the motion sensor 5 is a detection device for detecting whether or not the person T is located at the focus point P2.
- the detection result of the motion sensor 5 is output to the iris authentication device 6.
- the detection result of the motion sensor 5 is used as a condition for determining whether or not an iris camera 31 should image the person T located at the focus point P2.
- the iris authentication device 6 controls the overall operation of the iris authentication system 1.
- the iris recognition device 6 executes the iris recognition operation.
- the iris recognition operation includes, for example, a process of selecting, based on the overall image 200 captured by the overall camera 21, one iris camera 31 from the plurality of iris cameras 31 to image the person T located at the focus point P2, and a process of authenticating the person T based on the iris image 300 captured by the selected iris camera 31.
- the configuration of the iris authentication device 6 that executes such an iris authentication operation will be described in more detail.
- FIG. 4 is a block diagram showing a hardware configuration of the iris recognition device 6 of the present embodiment.
- the iris recognition device 6 includes a CPU (Central Processing Unit) 61, a RAM (Random Access Memory) 62, a ROM (Read Only Memory) 63, a storage device 64, an input device 65, and an output device 66.
- the CPU 61, the RAM 62, the ROM 63, the storage device 64, the input device 65, and the output device 66 are connected via the data bus 67.
- the iris recognition device 6 does not have to include at least one of a RAM 62, a ROM 63, a storage device 64, an input device 65, an output device 66, and a data bus 67.
- the CPU 61 reads a computer program.
- the CPU 61 may read a computer program stored in at least one of the RAM 62, the ROM 63, and the storage device 64.
- the CPU 61 may read a computer program stored in a computer-readable recording medium using a recording medium reading device (not shown).
- the CPU 61 may acquire (that is, may read) a computer program from a device (not shown) arranged outside the iris recognition device 6 via a network interface.
- the CPU 61 controls the RAM 62, the storage device 64, the input device 65, and the output device 66 by executing the read computer program.
- a logical functional block for executing the iris authentication operation is realized in the CPU 61. That is, the CPU 61 can function as a controller for realizing a logical functional block for executing the iris authentication operation.
- FIG. 5 shows an example of a logical functional block realized in the CPU 61 to perform the iris authentication operation.
- in the CPU 61, an image acquisition unit 611, which is a specific example of the "acquisition means" in the appendix described later, a target selection unit 612, which is a specific example of the "selection means" in the appendix described later, a camera setting unit 613, an imaging control unit 614, and an authentication unit 615 are realized.
- the operations of the image acquisition unit 611, the target selection unit 612, the camera setting unit 613, the imaging control unit 614, and the authentication unit 615 will be described in detail later with reference to FIG. 6, so a detailed explanation is omitted here.
- the RAM 62 temporarily stores the computer program executed by the CPU 61.
- the RAM 62 temporarily stores data temporarily used by the CPU 61 when the CPU 61 is executing a computer program.
- the RAM 62 may be, for example, a D-RAM (Dynamic RAM).
- the ROM 63 stores a computer program executed by the CPU 61.
- the ROM 63 may also store fixed data.
- the ROM 63 may be, for example, a P-ROM (Programmable ROM).
- the storage device 64 stores data that the iris recognition device 6 retains for a long period.
- the storage device 64 may operate as a temporary storage device of the CPU 61.
- the storage device 64 may include, for example, at least one of a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), and a disk array device.
- the input device 65 is a device that receives an input instruction from the user of the iris authentication device 6.
- the input device 65 may include, for example, at least one of a keyboard, a mouse and a touch panel.
- the output device 66 is a device that outputs information about the iris authentication device 6 to the outside.
- the output device 66 may be a display device capable of displaying information about the iris recognition device 6.
- FIG. 6 is a flowchart showing the flow of the operation (that is, the iris authentication operation) of the iris authentication system 1 of the present embodiment.
- the imaging control unit 614 determines whether or not the person T is located at the trigger point P1 based on the detection result of the motion sensor 4 (step S11). If it is determined in step S11 that the person T is not located at the trigger point P1 (step S11: No), the process of step S11 is repeated. On the other hand, if it is determined in step S11 that the person T is located at the trigger point P1 (step S11: Yes), the imaging control unit 614 controls the overall camera 21 so as to image the person T located at the trigger point P1 (step S12).
- as a result, the overall camera 21 images the person T located at the trigger point P1, and the overall image 200 captured by the overall camera 21 is acquired by the image acquisition unit 611 (step S12).
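- steps S11 and S12 amount to a poll-then-capture loop; the sketch below uses hypothetical `person_present()` and `capture()` interfaces standing in for the motion sensor 4 and the overall camera 21 of FIG. 1:

```python
import time

def wait_and_capture(motion_sensor, overall_camera, poll_s: float = 0.05):
    """Steps S11-S12: poll the trigger-point sensor, then capture.

    motion_sensor.person_present() and overall_camera.capture() are
    hypothetical interfaces, not names from this disclosure.
    """
    # S11: repeat until a person is detected at the trigger point P1
    while not motion_sensor.person_present():
        time.sleep(poll_s)
    # S12: have the overall camera acquire the whole image 200
    return overall_camera.capture()
```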
- after that, the target selection unit 612 selects, based on the overall image 200 acquired in step S12, at least one of the persons T appearing in the overall image 200 as the authentication target person Ta on whom iris recognition is to be performed (steps S21 to S27).
- in order to select the authentication target person Ta, the target selection unit 612 first detects the target area TA including the target portion TP in the overall image 200 acquired in step S12 (step S21). As described above, in the present embodiment, the eyes of the person T are used as the target portion TP. Therefore, the target selection unit 612 detects the target area TA including the eyes of the person T in the overall image 200. In the following, for convenience of explanation, the target selection unit 612 detects a face area in which the face of the person T appears as the target area TA including the eyes of the person T. Therefore, in the following description, the target area TA is appropriately referred to as the "face area TA".
- the face area TA may be an area in which only the face is shown. That is, the face region TA may be a region that is distinguished from the surroundings by the contour of the face (in other words, can be cut out by the contour of the face). Alternatively, the face region TA may be a region having a predetermined shape (for example, a rectangular shape) in which a scene around the face is captured in addition to the face.
- to detect the face area TA, the target selection unit 612 may use an existing method for detecting a region in which a desired object appears in an image. Therefore, a detailed description of the method for detecting the face area TA is omitted.
- the target selection unit 612 determines whether or not the face region TA is actually detected in step S21 (step S22).
- if it is determined in step S22 that the face area TA was not detected in step S21 (step S22: No), it is presumed that no person T appeared in the overall image 200 acquired in step S12. In this case, the iris recognition device 6 repeats the processes from step S11 onward without selecting the authentication target person Ta.
- On the other hand, if, as a result of the determination in step S22, it is determined that the face region TA was detected in step S21 (step S22: Yes), the target selection unit 612 selects, as the authentication target person Ta, the person T who appears in the entire image 200 acquired in step S12 and who corresponds to a face region TA satisfying a predetermined selection condition (step S23). That is, among the persons T appearing in the entire image 200, the target selection unit 612 does not select, as the authentication target person Ta, a person T corresponding to a face region TA that does not satisfy the predetermined selection condition.
- the predetermined selection condition may include a condition based on the face area TA and the iris image area RA. In this case, it can be said that the target selection unit 612 selects the authentication target person Ta based on the face area TA and the iris image area RA.
- the predetermined selection condition may include a condition based on the positional relationship between the face region TA and the iris image region RA. In this case, it can be said that the target selection unit 612 selects the authentication target person Ta based on the positional relationship between the face area TA and the iris image area RA.
- the target selection unit 612 specifies the iris image area RA in order to select the person T corresponding to the face area TA satisfying the selection condition as the authentication target person Ta.
- The iris image region RA corresponds to the region obtained by projecting the imaging range of the iris camera 31 at the focus point P2 onto a virtual plane located at the trigger point P1 (substantially, onto the entire image 200, which captures the imaging range of the entire camera 21 at the trigger point P1). Therefore, the relationship between the iris image 300 and the entire image 200 (in particular, the relationship between the iris image 300 and the iris image region RA) is determined by the imaging ranges (that is, the angles of view) of the iris camera 31 and the entire camera 21, and can be expressed by a transformation matrix H, which is typically a homography matrix.
- Arbitrary coordinates p_if in the iris image 300 are equivalent to arbitrary coordinates in the imaging range of the iris camera 31 at the focus point P2.
- Coordinates in the entire image 200 are coordinates in the imaging range of the entire camera 21 at the trigger point P1 and are typically equivalent to coordinates in the iris image area RA.
- As the transformation matrix indicating the correspondence between the imaging range of the iris camera 31 at the focus point P2 and the imaging range of the entire camera 21 at the focus point P2, a transformation matrix H_iwf may be used.
- As the transformation matrix H, a matrix (that is, H_iwf × H_wft) obtained by multiplying the above-described transformation matrix H_iwf by a transformation matrix H_wft, which indicates the correspondence between the imaging range of the entire camera 21 at the focus point P2 and the imaging range of the entire camera 21 at the trigger point P1, may be used.
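The coordinate correspondence above can be illustrated with plain 3×3 matrix arithmetic. The matrices below are hypothetical stand-ins for H_iwf and H_wft (real values would come from camera calibration, not from this document), and the multiplication order shown simply follows the H_iwf × H_wft notation in the text; the effective order depends on the point-mapping convention adopted.

```python
def matmul3(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply_homography(h, p):
    """Map a 2-D point p = (x, y) through a 3x3 homography matrix h."""
    x = h[0][0] * p[0] + h[0][1] * p[1] + h[0][2]
    y = h[1][0] * p[0] + h[1][1] * p[1] + h[1][2]
    w = h[2][0] * p[0] + h[2][1] * p[1] + h[2][2]
    return (x / w, y / w)

# Hypothetical scale-and-shift homographies standing in for:
#   H_iwf: iris camera 31 -> entire camera 21, both at focus point P2
#   H_wft: entire camera 21 at focus point P2 -> entire camera 21 at trigger point P1
H_iwf = [[0.25, 0.0, 400.0],
         [0.0, 0.25, 100.0],
         [0.0,  0.0,   1.0]]
H_wft = [[1.0, 0.0, 10.0],
         [0.0, 1.0, -5.0],
         [0.0, 0.0,  1.0]]

# Composite transformation matrix H = H_iwf x H_wft, as written in the text.
H = matmul3(H_iwf, H_wft)

# Project a corner of the iris image 300 into the entire image 200.
corner = apply_homography(H, (0.0, 0.0))
print(corner)  # (402.5, 98.75)
```

With such a composite H, any coordinate p_if in the iris image 300 can be mapped to the corresponding coordinate in the iris image region RA of the entire image 200.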
- the target selection unit 612 selects the authentication target person Ta based on the degree of overlap between the face area TA and the iris image area RA.
- the target selection unit 612 may select the person T corresponding to the face area TA having a relatively high degree of overlap with the iris image area RA as the authentication target person Ta.
- the target selection unit 612 does not have to select the person T corresponding to the face area TA having a relatively low degree of overlap with the iris image area RA as the authentication target person Ta.
- The overlapping condition may be, for example, a condition based on the overlapping region OA, which is the portion of the face region TA that overlaps the iris image region RA. More specifically, the overlapping condition may be a condition based on the ratio of the overlapping region OA to the face region TA. In this case, the overlapping condition may be the condition that the ratio of the overlapping region OA to the face region TA is larger than a predetermined first threshold value.
- the person T corresponding to the face area TA in which the ratio of the overlapping area OA is larger than the first threshold value is selected as the authentication target person Ta.
- the person T corresponding to the face area TA in which the ratio of the overlapping area OA does not become larger than the first threshold value is not selected as the authentication target person Ta.
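The overlapping condition above can be sketched with axis-aligned rectangles. All coordinates and the 60% first threshold value below are hypothetical illustration values chosen to mirror the figure, not values specified by the embodiment.

```python
def overlap_ratio(face, ra):
    """Ratio of the overlapping area OA to the face region TA.

    Both regions are axis-aligned rectangles given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(face[0], ra[0]), max(face[1], ra[1])
    ix2, iy2 = min(face[2], ra[2]), min(face[3], ra[3])
    overlap = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    return overlap / ((face[2] - face[0]) * (face[3] - face[1]))

FIRST_THRESHOLD = 0.6  # hypothetical first threshold value (60%)

# Hypothetical iris image region RA and face regions TA.
RA = (100, 100, 500, 300)
faces = {"T#11": (150, 150, 250, 250),   # fully inside RA  -> ratio 1.0
         "T#12": (450, 150, 550, 250),   # partially inside -> ratio 0.5
         "T#13": (600, 150, 700, 250)}   # fully outside    -> ratio 0.0

selected = [t for t, ta in faces.items()
            if overlap_ratio(ta, RA) > FIRST_THRESHOLD]
print(selected)  # only T#11 satisfies the overlapping condition
```

Here only the person whose face region lies sufficiently inside the iris image region RA passes the condition, matching the selection outcome described for FIGS. 7(a) and 7(b).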
- With reference to FIGS. 7(a) and 7(b), specific examples of an authentication target person Ta selected based on the overlapping condition and of a person T not selected based on the overlapping condition will be further described. Specifically, the explanation will proceed using an example in which, as shown in FIG. 7(a), three persons T (specifically, person T#11, person T#12, and person T#13) appear in the entire image 200.
- the person T # 11 is shown in the whole image 200 so that the whole face is included in the iris image area RA.
- The person T#12 appears in the entire image 200 such that a part of the face is included in the iris image area RA while the remaining part of the face is not included in the iris image area RA (that is, it protrudes from the iris image area RA).
- the person T # 13 is shown in the whole image 200 so that the whole face is not included in the iris image area RA.
- As shown in FIG. 7(b), which is a plan view showing the positional relationship between the face regions TA and the iris image region RA, the entire face region TA#11 of the person T#11 is included in the iris image region RA. Therefore, the overlapping area OA#11 of the person T#11 coincides with the face area TA#11, and the ratio of the overlapping region OA#11 to the face region TA#11 is 100%.
- the ratio of the overlapping region OA # 12 to the face region TA # 12 is smaller than 100% and larger than 0%. In the example shown in FIG. 7B, the ratio of the overlapping region OA # 12 to the face region TA # 12 is 45%.
- the entire face area TA # 13 of the person T # 13 is not included in the iris image area RA. Therefore, the overlapping area OA # 13 of the person T # 13 does not exist. Therefore, the ratio of the overlapping region OA # 13 to the face region TA # 13 is 0%.
- As a result, when the first threshold value is, for example, 60%, the person T#11 corresponding to the face region TA#11, in which the ratio of the overlapping area OA is larger than 60%, is selected as the authentication target person Ta.
- the persons T # 12 and T # 13 corresponding to the face areas TA # 12 and TA # 13 in which the ratio of the overlapping area OA does not become larger than 60% are not selected as the authentication target person Ta.
- In this way, it can be said that the overlapping condition is a condition for selecting, as the authentication target person Ta, a person T whose face (in particular, the target site TP) can be appropriately imaged by at least one of the plurality of iris cameras 31. That is, by using the overlapping condition, the target selection unit 612 can appropriately select, as the authentication target person Ta, a person T whose face (in particular, the target site TP) can be appropriately imaged by at least one of the plurality of iris cameras 31.
- The first threshold value used in the overlapping condition is set to an appropriate value such that a person T who is desirable to be selected as the authentication target person Ta and a person T who is not desirable to be selected as the authentication target person Ta can be appropriately distinguished from the ratio of the overlapping area OA.
- Such a first threshold value may be set based on, for example, an experiment or a simulation reflecting the results of an actually performed iris authentication operation (in particular, the operation of selecting the authentication target person Ta).
- Alternatively, the target selection unit 612 may select the authentication target person Ta based on the degree of proximity between the face region TA and the iris image region RA.
- the target selection unit 612 may select the person T corresponding to the face area TA having a relatively high degree of proximity to the iris image area RA as the authentication target person Ta.
- the target selection unit 612 does not have to select the person T corresponding to the face region TA, which has a relatively low degree of proximity to the iris image region RA, as the authentication target person Ta.
- the proximity condition may be, for example, a condition based on the degree of proximity between the feature point of the face region TA and the feature point of the iris image region RA. That is, the proximity condition may be a condition based on the distance Dis between the feature point of the face region TA and the feature point of the iris image region RA. In this case, the proximity condition may be a condition that the distance Dis is smaller than a predetermined second threshold value.
- The distance Dis may mean a distance in the direction in which the imaging ranges of the plurality of iris cameras 31 are arranged (for example, the vertical direction).
- Alternatively, the distance Dis may mean a distance in a direction (for example, the horizontal direction) orthogonal to the direction (for example, the vertical direction) in which the imaging ranges of the plurality of iris cameras 31 are arranged.
- the feature point is, for example, the center of gravity (that is, the geometric center). However, the feature point may be a point different from the center of gravity. For example, when the face region TA and the iris image region RA have a point-symmetrical shape, the feature point may be the center of symmetry.
- With reference to FIGS. 8(a) and 8(b), specific examples of an authentication target person Ta selected based on the proximity condition and of a person T not selected based on the proximity condition will be further described. Specifically, the explanation will proceed using an example in which, as shown in FIG. 8(a), three persons T (specifically, persons T#11 to T#13 described with reference to FIG. 7(a)) appear in the entire image 200.
- As shown in FIG. 8(b), which is a plan view showing the positional relationship between the centers of gravity TC of the face regions TA and the center of gravity RC of the iris image region RA, the distance Dis#11 between the center of gravity TC#11 of the face region TA#11 and the center of gravity RC is smaller than the distance Dis#12 between the center of gravity TC#12 of the face region TA#12 and the center of gravity RC, and smaller than the distance Dis#13 between the center of gravity TC#13 of the face region TA#13 and the center of gravity RC.
- This is because the entire face of the person T#11 is included in the iris image region RA, so that the center of gravity TC#11 should be closer to the center of gravity RC than the centers of gravity TC#12 and TC#13 are. Further, the distance Dis#12 between the center of gravity TC#12 and the center of gravity RC is smaller than the distance Dis#13 between the center of gravity TC#13 and the center of gravity RC.
- This is because a part of the face of the person T#12 is included in the iris image region RA, so that the center of gravity TC#12 should be closer to the center of gravity RC than the center of gravity TC#13 of the face region TA#13, which is not included in the iris image region RA at all.
- As a result, the person T#11 corresponding to the face region TA#11, which has the center of gravity TC#11 whose distance from the center of gravity RC is smaller than the second threshold value, is selected as the authentication target person Ta.
- On the other hand, the persons T#12 and T#13 corresponding to the face regions TA#12 and TA#13, which have the centers of gravity TC#12 and TC#13 whose distances from the center of gravity RC are not smaller than the second threshold value, are not selected as the authentication target person Ta.
- In this way, like the overlapping condition, the proximity condition can be said to be a condition for selecting, as the authentication target person Ta, a person T whose face (in particular, the target site TP) can be appropriately imaged by at least one of the plurality of iris cameras 31. That is, by using the proximity condition, the target selection unit 612 can appropriately select, as the authentication target person Ta, a person T whose face (in particular, the target site TP) can be appropriately imaged by at least one of the plurality of iris cameras 31.
- The second threshold value used in the proximity condition is set to an appropriate value such that a person T who is desirable to be selected as the authentication target person Ta and a person T who is not desirable to be selected as the authentication target person Ta can be appropriately distinguished from the distance Dis. Such a second threshold value may be set based on, for example, an experiment or a simulation reflecting the results of an actually performed iris authentication operation (in particular, the operation of selecting the authentication target person Ta). Such a second threshold value may also be set based on the length RA_len of the iris image region RA; specifically, based on the length RA_len of the iris image region RA along the direction of the distance Dis.
- For example, the second threshold value may be set based on the length RA_len of the iris image region RA along the direction (for example, the horizontal direction) that intersects the direction (for example, the vertical direction) in which the imaging ranges of the plurality of iris cameras 31 are arranged. In the example shown in FIG. 8(b), the length RA_len is the length of the iris image region RA in the left-right (that is, horizontal) direction of the drawing. As an example, the second threshold value may be set to a value obtained by multiplying the length RA_len of the iris image region RA by a coefficient less than 1 (for example, 0.5).
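The proximity condition, with the second threshold value derived from RA_len, can be sketched as follows. The rectangle coordinates are hypothetical, and the 0.5 coefficient is the example value mentioned in the text.

```python
import math

def centroid(rect):
    """Center of gravity of an axis-aligned rectangle (x1, y1, x2, y2)."""
    return ((rect[0] + rect[2]) / 2.0, (rect[1] + rect[3]) / 2.0)

def distance_dis(face, ra):
    """Distance Dis between the centers of gravity TC and RC."""
    (tx, ty), (rx, ry) = centroid(face), centroid(ra)
    return math.hypot(tx - rx, ty - ry)

RA = (100, 100, 500, 300)        # hypothetical iris image region RA
RA_len = RA[2] - RA[0]           # length RA_len along the horizontal direction
SECOND_THRESHOLD = 0.5 * RA_len  # coefficient less than 1 (here 0.5)

faces = {"T#11": (250, 150, 350, 250),   # centered on RA
         "T#12": (450, 150, 550, 250),   # partly outside RA
         "T#13": (650, 150, 750, 250)}   # fully outside RA

selected = [t for t, ta in faces.items()
            if distance_dis(ta, RA) < SECOND_THRESHOLD]
print(selected)  # only T#11 satisfies the proximity condition
```

As in the figure example, only the face region whose center of gravity lies close enough to the center of gravity RC passes the condition.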
- the target selection unit 612 determines whether or not the number of authentication target Tas selected in step S23 is 1 (step S24). That is, the target selection unit 612 determines whether or not the number of face region TAs satisfying the selection condition is one (step S24).
- If, as a result of the determination in step S24, it is determined that the number of authentication target persons Ta selected in step S23 is not one (that is, the number of face regions TA satisfying the selection condition is not one) (step S24: No), the target selection unit 612 determines whether or not the number of authentication target persons Ta selected in step S23 is zero (step S25). That is, the target selection unit 612 determines whether or not the number of face regions TA satisfying the selection condition is zero (step S25).
- If, as a result of the determination in step S25, it is determined that the number of authentication target persons Ta selected in step S23 is zero (that is, the number of face regions TA satisfying the selection condition is zero) (step S25: Yes), it is estimated that none of the persons T appearing in the entire image 200 was selected as the authentication target person Ta. In this case, since no authentication target person Ta is selected, the iris recognition system 1 cannot perform iris authentication. Therefore, the iris recognition device 6 repeats the processes from step S11 onward without selecting the authentication target person Ta.
- On the other hand, if, as a result of the determination in step S25, it is determined that the number of authentication target persons Ta selected in step S23 is not zero (that is, the number of face regions TA satisfying the selection condition is not zero) (step S25: No), it is estimated that two or more persons T were each selected as the authentication target person Ta in step S23. That is, it is estimated that the number of face regions TA satisfying the selection condition was two or more.
- the target selection unit 612 selects any one of the two or more persons T selected as the authentication target person Ta in step S23 as the actual authentication target person Ta (step S26). That is, the target selection unit 612 selects the person T corresponding to any one of the two or more face region TAs satisfying the selection condition as the actual authentication target person Ta (step S26).
- For example, the target selection unit 612 may select, as the actual authentication target person Ta, the one person T located in the foreground. Specifically, the target selection unit 612 may identify, for example, the one person with the largest distance between the eyes as the person T located in the foreground, and select the identified person T as the actual authentication target person Ta. Alternatively, the target selection unit 612 may select, as the actual authentication target person Ta, the one person T corresponding to the face region TA having the highest degree of overlap with the iris image region RA; for example, the one person T corresponding to the face region TA having the largest ratio of the overlapping area OA.
- Alternatively, the target selection unit 612 may select, as the actual authentication target person Ta, the one person T corresponding to the face region TA having the highest degree of proximity to the iris image region RA; for example, the one person T corresponding to the face region TA whose feature point has the smallest distance Dis from the feature point of the iris image region RA.
- In other words, as the overlapping condition in this case, the condition that the ratio of the overlapping area OA to the face region TA is the largest may be used instead of the condition that the ratio is larger than the predetermined first threshold value.
- Similarly, as the proximity condition in this case, the condition that the distance Dis is the smallest may be used instead of the condition that the distance Dis is smaller than the predetermined second threshold value.
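The narrowing-down in step S26 using the largest-overlap variant of the condition could be sketched as follows, with hypothetical rectangle coordinates for two candidates that both satisfy the overlapping condition:

```python
def overlap_ratio(face, ra):
    """Ratio of the overlapping area OA to the face region TA
    (same computation as used for the overlapping condition)."""
    ix1, iy1 = max(face[0], ra[0]), max(face[1], ra[1])
    ix2, iy2 = min(face[2], ra[2]), min(face[3], ra[3])
    overlap = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    return overlap / ((face[2] - face[0]) * (face[3] - face[1]))

RA = (100, 100, 500, 300)  # hypothetical iris image region RA

# Two candidates whose face regions both satisfy the overlapping condition.
candidates = {"T#21": (150, 150, 250, 250),   # ratio 1.0
              "T#22": (420, 150, 520, 250)}   # ratio 0.8

# Step S26: pick the one person T whose face region TA has the largest
# ratio of the overlapping area OA.
target = max(candidates, key=lambda t: overlap_ratio(candidates[t], RA))
print(target)  # T#21
```

The same pattern applies to the proximity variant by replacing `max` over the overlap ratio with `min` over the distance Dis.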
- If, as a result of the determination in step S24, it is determined that the number of authentication target persons Ta selected in step S23 is one (that is, the number of face regions TA satisfying the selection condition is one) (step S24: Yes), the processes of steps S25 to S26 need not be performed.
- the camera setting unit 613 selects one iris camera 31 for photographing the authentication target person Ta located at the focus point P2 from the plurality of iris cameras 31 (step S31).
- Specifically, based on the entire image 200, the camera setting unit 613 specifies, among the plurality of imaging ranges corresponding to the plurality of iris cameras 31 (that is, the plurality of imaging ranges within the iris image area RA), the one imaging range that includes the target site TP of the authentication target person Ta. The camera setting unit 613 then selects the iris camera 31 corresponding to the specified imaging range as the one iris camera 31 for imaging the authentication target person Ta.
- For example, when the target site TP of the authentication target person Ta is included in the imaging range of the iris camera 31-3, the camera setting unit 613 selects the iris camera 31-3 as the one iris camera 31 for imaging the authentication target person Ta.
- After that, the camera setting unit 613 sets a gaze area (ROI: Region of Interest) IA that defines the image portion actually acquired (that is, read out) for iris authentication from the iris image 300 captured by the one iris camera 31 selected in step S31 (step S32). Specifically, as shown in FIG. 9, which is a plan view showing the iris image 300 captured by the selected iris camera 31, the camera setting unit 613 sets, as the gaze area IA, a rectangular (or other-shaped) region of the iris image 300 in which the target site TP is expected to appear.
- When the gaze area IA is set, the iris recognition system 1 operates in the gaze area mode.
- In the gaze area mode, instead of acquiring the entire iris image 300 captured by the iris camera 31, the image acquisition unit 611 acquires only the image portion of the iris image 300 within the gaze area IA (that is, a part of the image data of the iris image 300). In other words, the image acquisition unit 611 need not acquire the image portion of the iris image 300 outside the gaze area IA (that is, the remaining part of the image data of the iris image 300).
- the frame rate at which the image acquisition unit 611 acquires the iris image 300 from the iris camera 31 is substantially improved as compared with the case where the entire iris image 300 is acquired.
- As an example, the frame rate may be doubled as compared with the case of acquiring the entire iris image 300. Therefore, even if the frame rate of the iris camera 31 itself is less than the frame rate required for iris authentication, the image acquisition unit 611 can acquire the iris image 300 at the frame rate required for iris authentication.
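The frame-rate benefit of the gaze area IA can be sketched as follows, under the assumption (not stated in the text) that sensor readout time scales with the number of rows read; all figures are hypothetical.

```python
def effective_frame_rate(base_fps, full_rows, roi_rows):
    """Achievable acquisition rate when only roi_rows of full_rows sensor
    rows are read out, assuming readout time scales with rows read."""
    return base_fps * full_rows / roi_rows

def crop_gaze_area(frame, gaze):
    """Extract the gaze area IA (x1, y1, x2, y2) from a frame stored as a
    list of pixel rows."""
    x1, y1, x2, y2 = gaze
    return [row[x1:x2] for row in frame[y1:y2]]

# Hypothetical figures: a 60 fps iris camera whose gaze area IA covers half
# of the sensor rows yields roughly double the acquisition frame rate.
fps = effective_frame_rate(60, 2160, 1080)
print(fps)  # 120.0

# Acquire only the gaze-area portion of a tiny synthetic iris image 300.
frame = [[c for c in range(8)] for _ in range(6)]
roi = crop_gaze_area(frame, (2, 1, 6, 4))
print(len(roi), len(roi[0]))  # 3 4
```

In practice, such partial readout is typically configured on the camera itself (as step S32 does), so that only the gaze-area pixels are ever transferred.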
- the image pickup control unit 614 determines whether or not the authentication target person Ta (that is, the person T selected as the authentication target person Ta) is located at the focus point P2 based on the detection result of the motion sensor 5. (Step S41). As a result of the determination in step S41, if it is determined that the authentication target person Ta is not located at the focus point P2 (step S41: No), the process of step S41 is repeated. On the other hand, when it is determined that the authentication target person Ta is located at the focus point P2 as a result of the determination in step S41 (step S41: Yes), the image pickup control unit 614 is located at the focus point P2. The one iris camera 31 selected in step S31 is controlled so as to image the authentication target person Ta (step S42).
- the selected one iris camera 31 images the authentication target person Ta located at the focus point P2 (step S42).
- the iris image 300 (particularly, the image portion in the gaze area IA of the iris image 300) captured by the selected iris camera 31 is acquired by the image acquisition unit 611 (step S42).
- After that, the authentication unit 615 performs iris authentication using the iris image 300 acquired in step S42 (step S51). For example, the authentication unit 615 identifies the iris pattern of the authentication target person Ta based on the iris image 300 acquired in step S42. After that, the authentication unit 615 determines whether or not the identified pattern matches a pattern registered in the database stored in the storage device 64 or the like. When the identified pattern matches a registered pattern, the authentication unit 615 determines that the authentication target person Ta is a legitimate person. When the identified pattern does not match any registered pattern, the authentication unit 615 determines that the authentication target person Ta is not a legitimate person.
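The text does not specify how pattern matching is performed; a common approach for iris codes is a fractional Hamming distance compared against a decision threshold, sketched below with purely hypothetical codes, names, and threshold value.

```python
def hamming_fraction(code_a, code_b):
    """Fraction of differing bits between two equal-length binary iris codes."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

MATCH_THRESHOLD = 0.32  # hypothetical decision threshold

# Hypothetical enrolled patterns (stored in the storage device 64 or the like).
database = {"person_A": [1, 0, 1, 1, 0, 0, 1, 0],
            "person_B": [0, 1, 0, 0, 1, 1, 0, 1]}

# Code extracted from the iris image 300 of the authentication target person Ta.
probe = [1, 0, 1, 1, 0, 1, 1, 0]

matches = [name for name, enrolled in database.items()
           if hamming_fraction(probe, enrolled) < MATCH_THRESHOLD]

# A non-empty match list corresponds to "the identified pattern matches a
# registered pattern", i.e. the target person is judged legitimate.
print(matches)  # ['person_A']
```

Real iris codes are of course far longer than eight bits and typically compared under rotation shifts and occlusion masks; the sketch only shows the match/no-match decision structure.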
- the iris recognition system 1 can appropriately select (in other words, specify or determine) the authentication target person Ta.
- That is, in a situation where a plurality of persons T appear in the entire image 200 (in particular, in the iris image area RA), the iris authentication system 1 can appropriately select one of the plurality of persons T as the authentication target person Ta.
- In other words, in a situation where a plurality of face regions TA are detected in the entire image 200 (in particular, in the iris image region RA), the iris authentication system 1 can appropriately select, as the authentication target person Ta, the one person T corresponding to a face region TA that satisfies the selection condition.
- the iris recognition system 1 can select the person T who is desirable to be selected as the authentication target person Ta as the authentication target person Ta. In other words, the iris recognition system 1 makes it difficult to select a person T who is not desirable to be selected as the authentication target person Ta as the authentication target person Ta.
- Consider, for example, a situation in which a person T who is not desirable to be photographed enters the imaging range of the iris camera 31 from the side, in front of a person T who is desirable to be selected as the authentication target person Ta.
- In the following, for convenience of explanation, the person T who is desirable to be selected as the authentication target person Ta is referred to as the "person T_desire", and the person T who is not desirable to be selected as the authentication target person Ta is referred to as the "person T_undesire".
- Such a situation may occur, for example, when the person T_undesire looks into the iris camera 31 from the side in front of the person T_desire.
- As shown in FIG. 10(a), since the person T_undesire looks into the iris camera 31 from the side, at least a part of the face region TA_undesire of the person T_undesire is highly likely to be outside the iris image area RA, as compared with the face region TA_desire of the person T_desire.
- Therefore, as shown in FIG. 10(b), which is a plan view showing the positional relationship between the face regions TA and the iris image region RA together with the degree of overlap between the two, the ratio of the overlapping region OA_undesire of the person T_undesire is likely to be smaller than the ratio of the overlapping region OA_desire of the person T_desire.
- Therefore, by using the above-described overlapping condition, the target selection unit 612 can appropriately select the person T_desire, instead of the person T_undesire, as the authentication target person Ta.
- Similarly, as shown in FIG. 10(c), which is a plan view showing the positional relationship between the face regions TA and the iris image region RA together with the degree of proximity between the two, the distance Dis_undesire between the center of gravity TC_undesire of the face region TA_undesire and the center of gravity RC of the iris image region RA is likely to be larger than the distance Dis_desire between the center of gravity TC_desire of the face region TA_desire and the center of gravity RC of the iris image region RA.
- Therefore, by using the above-described proximity condition, the target selection unit 612 can appropriately select the person T_desire, instead of the person T_undesire, as the authentication target person Ta.
- In contrast, if the selection condition were not used (for example, if the person T located foremost were simply selected), the person T_undesire located in front of the person T_desire might be selected as the authentication target person Ta, because the person T_undesire is the foremost person T. Therefore, the iris recognition system 1 of the present embodiment can appropriately select the authentication target person Ta especially in a situation where a person T_undesire looking into the iris camera 31 is present in front of the person T_desire. It goes without saying that the iris recognition system 1 can also appropriately select the authentication target person Ta by using the selection condition even in situations different from the one in which a person T_undesire looking into the iris camera 31 is present in front of the person T_desire.
- FIG. 11 is a block diagram showing the overall configuration of the iris authentication system 1a of the first modification.
- the same components as the components included in the above-mentioned iris authentication system 1 are designated by the same reference numerals, and detailed description thereof will be omitted.
- the iris recognition system 1a is different from the iris recognition system 1 in that it further includes a display 7a.
- Other features of the iris recognition system 1a may be the same as the other features of the iris recognition system 1.
- When the iris recognition device 6 includes a display as an example of the output device 66 (see FIG. 4), the display of the iris recognition device 6 may be used as the display 7a.
- the iris recognition device 6 may control the display 7a so that the entire image 200 is displayed on the display 7a. Further, the iris recognition device 6 may control the display 7a so that the information regarding the iris recognition operation is displayed on the display 7a together with the entire image 200. That is, the iris authentication device 6 may control the display 7a so that the entire image 200 to which the information regarding the iris authentication operation is added is displayed on the display 7a.
- a display example of the entire image 200 to which information regarding the iris authentication operation is added will be described with reference to FIGS. 12 and 13. It should be noted that FIGS. 12 and 13 are plan views showing the entire image 200 including the two persons T, respectively.
- For example, the iris recognition device 6 may control the display 7a so that the target area TA is displayed together with the entire image 200 in a display mode in which the target area TA can be identified on the entire image 200. That is, the iris authentication device 6 may control the display 7a so as to display the entire image 200 to which information for identifying the target area TA (for example, a decorative display that distinguishes the target area TA from other image portions) is added as information related to the iris authentication operation.
- For example, the iris recognition device 6 may control the display 7a so that the iris image region RA is displayed together with the entire image 200 in a display mode in which the iris image region RA can be identified on the entire image 200. That is, the iris recognition device 6 may control the display 7a so as to display the entire image 200 to which information for identifying the iris image area RA (for example, a decorative display that distinguishes the iris image area RA from other image portions) is added as information related to the iris recognition operation.
- For example, the iris recognition device 6 may control the display 7a so that the overlapping area OA is displayed together with the entire image 200 in a display mode in which the overlapping area OA can be identified on the entire image 200. That is, the iris recognition device 6 may control the display 7a so as to display the entire image 200 to which information for identifying the overlapping area OA (for example, a decorative display that distinguishes the overlapping area OA from other image portions) is added as information related to the iris recognition operation.
- For example, the iris recognition device 6 may control the display 7a so as to display, together with the entire image 200, the person T selected as the authentication target person Ta in a display mode in which that person T can be identified on the entire image 200. That is, the iris recognition device 6 may control the display 7a so as to display the entire image 200 to which information for identifying the person T selected as the authentication target person Ta (for example, a decorative display that distinguishes that person T from other image portions) is added as information related to the iris authentication operation.
- For example, the iris recognition device 6 may control the display 7a so as to display, together with the entire image 200, the person T not selected as the authentication target person Ta in a display mode in which that person T can be identified on the entire image 200. That is, the iris recognition device 6 may control the display 7a so as to display the entire image 200 to which information for identifying the person T not selected as the authentication target person Ta (for example, a decorative display that distinguishes that person T from other image portions) is added as information related to the iris authentication operation.
- the iris recognition device 6 may control the display 7a so as to display the entire image 200 to which the information regarding the abnormal event is added as the information regarding the iris recognition operation.
- FIG. 13 shows a display example of an alert indicating that two or more persons T were selected as the authentication target person Ta in step S23 of FIG. 6 described above (that is, that the number of face regions TA satisfying the selection condition was two or more).
- Since the authentication target person Ta is in principle narrowed down to one person by the process of step S26 in FIG. 6, such an alert is useful in that the user of the iris authentication system 1a can confirm whether the one authentication target person Ta narrowed down in this way is appropriate (that is, whether the narrowing down of the authentication target person Ta was performed properly).
- the iris authentication system 1a of the first modification can obtain the same effects as those obtainable by the iris authentication system 1 described above. Further, in the iris authentication system 1a of the first modification, the user of the iris authentication system 1a can appropriately grasp the status of the iris authentication operation by checking the display contents of the display 7a. As a result, the user can take an appropriate response (for example, a response to an abnormal event) as needed.
- the iris recognition device 6 may use a selection condition including a condition based on a time-dependent change in the positional relationship between the face region TA and the iris image region RA.
- for example, the target selection unit 612 may select, as the authentication target person Ta, a person T corresponding to a face area TA whose amount of change in positional relationship with the iris image area RA per unit time is less than a predetermined first permissible amount.
- for example, the target selection unit 612 need not select, as the authentication target person Ta, a person T corresponding to a face area TA whose amount of change in positional relationship with the iris image area RA per unit time is larger than the first permissible amount.
- similarly, the iris authentication device 6 may use an overlap condition including a condition based on the change over time in the degree of overlap between the face region TA and the iris image region RA (typically, the change over time in the ratio of the overlapping region OA to the face region TA).
- for example, the target selection unit 612 may select, as the authentication target person Ta, a person T corresponding to a face area TA whose amount of change in the ratio of the overlapping area OA per unit time is less than a predetermined second permissible amount.
- for example, the target selection unit 612 may select, as the authentication target person Ta, a person T corresponding to a face region TA in which the ratio of the overlapping region OA is larger than the above-mentioned first threshold value and the amount of change in that ratio per unit time is less than the second permissible amount.
- for example, the target selection unit 612 need not select, as the authentication target person Ta, a person T corresponding to a face area TA whose amount of change in the ratio of the overlapping area OA per unit time is larger than the second permissible amount.
- for example, the target selection unit 612 need not select, as the authentication target person Ta, a person T corresponding to a face region TA in which the ratio of the overlapping region OA is larger than the first threshold value but the amount of change in that ratio per unit time is larger than the second permissible amount.
- similarly, the iris authentication device 6 may use a proximity condition including a condition based on the change over time in the degree of proximity between the face region TA and the iris image region RA (typically, the distance Dis between a feature point of the face region TA and a feature point of the iris image region RA).
- for example, the target selection unit 612 may select, as the authentication target person Ta, a person T corresponding to a face region TA whose amount of change in the distance Dis per unit time is less than a predetermined third permissible amount.
- for example, the target selection unit 612 may select, as the authentication target person Ta, a person T corresponding to a face region TA in which the distance Dis is smaller than the above-mentioned second threshold value and the amount of change in the distance Dis per unit time is less than the third permissible amount.
- for example, the target selection unit 612 need not select, as the authentication target person Ta, a person T corresponding to a face region TA whose amount of change in the distance Dis per unit time is larger than the third permissible amount.
- for example, the target selection unit 612 need not select, as the authentication target person Ta, a person T corresponding to a face region TA in which the distance Dis is smaller than the second threshold value but the amount of change in the distance Dis per unit time is larger than the third permissible amount.
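The overlap and proximity conditions above, including their change-over-time variants, can be sketched as follows. This is an illustrative Python sketch only; the function name, parameter names, and all numeric values are assumptions for illustration, not part of the embodiment.

```python
def satisfies_selection_condition(
    overlap_ratio,          # ratio of overlapping region OA to face region TA (0.0-1.0)
    prev_overlap_ratio,     # the same ratio one unit time earlier
    distance,               # distance Dis between feature points of TA and RA
    prev_distance,          # distance Dis one unit time earlier
    first_threshold=0.5,    # assumed value of the "first threshold" for the ratio
    second_threshold=50.0,  # assumed value of the "second threshold" for Dis
    second_allowance=0.2,   # assumed permissible change in the ratio per unit time
    third_allowance=10.0,   # assumed permissible change in Dis per unit time
):
    """Return True if a face region TA satisfies both the overlap condition
    and the proximity condition, including their change-over-time variants."""
    overlap_ok = (
        overlap_ratio > first_threshold
        and abs(overlap_ratio - prev_overlap_ratio) < second_allowance
    )
    proximity_ok = (
        distance < second_threshold
        and abs(distance - prev_distance) < third_allowance
    )
    return overlap_ok and proximity_ok

# A person steadily inside the iris image region RA qualifies...
print(satisfies_selection_condition(1.0, 1.0, 5.0, 6.0))  # True
# ...while a person whose overlap ratio jumps within one unit time does not.
print(satisfies_selection_condition(0.9, 0.2, 5.0, 6.0))  # False
```

Comparing the current frame with one frame of history is the minimal form of the "per unit time" check; an implementation could equally average over several frames.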
- the iris authentication system 1 can more appropriately select the authentication target person Ta.
- the reason will be described with reference to FIGS. 14 (a) to 14 (c).
- FIG. 14(a) is a plan view showing the entire image 200 including the person T_desire and the person T_undesire at time t1.
- FIG. 14(b) is a plan view showing the entire image 200 including the person T_desire and the person T_undesire at time t2, when a certain time has elapsed from time t1.
- FIG. 14(c) is a plan view showing the entire image 200 including the person T_desire and the person T_undesire at time t3, when a certain time has elapsed from time t2.
- in FIGS. 14(a) to 14(c), it is assumed that the person T_desire is moving toward the iris camera 31, while the person T_undesire is moving so as to look into the iris camera 31 from the side.
- in this case, the person T_undesire is relatively likely to appear in the entire image 200 so as to move from the outside to the inside of the iris image region RA. For this reason, the ratio of the overlapping region OA of the person T_undesire is relatively likely to increase from 0% toward 100%.
- on the other hand, the person T_desire is relatively likely to appear in the entire image 200 so as to be always located inside the iris image region RA. Therefore, the ratio of the overlapping region OA of the person T_desire is relatively likely to be maintained at 100%. Consequently, the ratio of the overlapping region OA of the person T_undesire is likely to change significantly per unit time as compared with the ratio of the overlapping region OA of the person T_desire.
- as a result, the target selection unit 612 can distinguish the person T_desire from the person T_undesire based on the change over time in the ratio of the overlapping region OA and/or the change over time in the distance Dis. That is, the target selection unit 612 can appropriately select the person T_desire, rather than the person T_undesire, as the authentication target person Ta.
- in the above description, when a plurality of persons T appear in the entire image 200 (that is, when a plurality of face areas TA are detected), the target selection unit 612 selects one of the plurality of persons T as the authentication target person Ta. However, the target selection unit 612 may select two or more of the plurality of persons T as the authentication target person Ta. For example, the target selection unit 612 may select two or more persons T who satisfy the selection condition as the authentication target person Ta. For example, when there are two or more persons T satisfying the selection condition (step S25: No in FIG. 6), the target selection unit 612 need not perform the process of selecting the frontmost person T as the authentication target person Ta.
- the distance Dis used in the proximity condition may be the distance in a direction (for example, the horizontal direction) intersecting the direction (for example, the vertical direction) in which the imaging ranges of the plurality of iris cameras 31 are arranged.
- alternatively, the distance Dis may be the distance in any direction.
- for example, the distance Dis may be the shortest distance (a so-called Euclidean distance) between the feature point of the face region TA and the feature point of the iris image region RA.
- for example, the distance Dis may be the distance between the feature point of the face region TA and the feature point of the iris image region RA in the direction (for example, the vertical direction) in which the imaging ranges of the plurality of iris cameras 31 are arranged.
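The three variants of the distance Dis above can be sketched as follows. This is illustrative Python only; the function name, the coordinate convention, and the mode labels are assumptions, with the iris cameras assumed to be arranged along the vertical (y) axis.

```python
import math

def distance_dis(ta_point, ra_point, mode="horizontal"):
    """Distance Dis between a feature point of the face region TA and a
    feature point of the iris image region RA, under the variants above.
    Points are (x, y) pixel coordinates."""
    dx = ta_point[0] - ra_point[0]
    dy = ta_point[1] - ra_point[1]
    if mode == "horizontal":   # direction intersecting the camera arrangement
        return abs(dx)
    if mode == "vertical":     # direction along the camera arrangement
        return abs(dy)
    return math.hypot(dx, dy)  # shortest (Euclidean) distance

print(distance_dis((3, 4), (0, 0), "horizontal"))  # 3
print(distance_dis((3, 4), (0, 0), "vertical"))    # 4
print(distance_dis((3, 4), (0, 0), "euclidean"))   # 5.0
```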
- in the above description, when the target site TP is the eyes of the person T, the area in which the face of the person T appears (that is, the face area) is used as the target area TA.
- however, even when the target site TP is the eyes of the person T, a region different from the face region may be used as the target region TA.
- for example, a region in which the eyes of the person T appear (that is, a region in which only the eyes appear, or in which the scene around the eyes (for example, a part of the face) appears in addition to the eyes) may be used as the target area TA.
- for example, a region in which the body of the person T (for example, at least one of the neck, torso, arms, hands and legs) appears in addition to at least one of the eyes and the face of the person T may be used as the target area TA.
- the iris imaging device 3 may include a single iris camera 31.
- in this case, the imaging range of the iris camera 31 may be set to an appropriate range so that the iris camera 31 can image the target portion TP of the person T located at the focus point P2 regardless of the height of the person T.
- in this case, the camera setting unit 613 need not perform the process of step S31 in FIG. 6 (that is, the process of selecting one iris camera 31 that images the authentication target person Ta).
- further, the camera setting unit 613 need not perform the process of setting the gaze area IA (the process corresponding to step S32 in FIG. 6). In this case, the iris authentication system 1 need not operate in the gaze area mode.
- the image acquisition unit 611 may acquire the entire iris image 300 captured by the iris camera 31.
- the authentication unit 615 may perform iris authentication using the entire iris image 300 captured by the iris camera 31.
- the iris recognition device 6 includes an image acquisition unit 611, a target selection unit 612, a camera setting unit 613, an image pickup control unit 614, and an authentication unit 615.
- as shown in FIG. 15, which is a block diagram showing a modified example of the iris authentication device 6, the iris authentication device 6 need not include at least one of the camera setting unit 613, the imaging control unit 614, and the authentication unit 615.
- in this case, an external device of the iris authentication device 6 may execute the processes otherwise performed by the camera setting unit 613, the imaging control unit 614, and the authentication unit 615, respectively.
- the iris authentication system 1 includes a motion sensor 4.
- the iris recognition system 1 does not have to include the motion sensor 4.
- the overall camera 21 may continue to image the scene within its imaging range at a predetermined frame rate (that is, imaging rate) regardless of whether or not the person T is located at the trigger point P1.
- the overall camera 21 may continue to capture the scene within the imaging range at a predetermined frame rate, at least during the period when the person T passes the trigger point P1.
- the overall camera 21 can take an image of the person T at the timing when the person T reaches the trigger point P1. That is, even when the iris recognition system 1 does not include the motion sensor 4, the image acquisition unit 611 can acquire the entire image 200 showing the person T located at the trigger point P1.
- in this case, the iris authentication device 6 (for example, the target selection unit 612) may determine, by performing image analysis on the entire image 200, whether or not an entire image 200 showing the person T located at the trigger point P1 has been acquired. That is, the iris authentication device 6 may determine whether or not the person T is located at the trigger point P1 by performing image analysis on the entire image 200. When it is determined that the person T is located at the trigger point P1 (that is, an entire image 200 showing the person T located at the trigger point P1 has been acquired), the iris authentication device 6 performs, based on the entire image 200, a series of processes (specifically, from step S21 to step S32 in FIG. 6) for selecting one iris camera 31 that images the person T (in particular, the authentication target person Ta) located at the focus point P2.
- on the other hand, when it is determined that the person T is not located at the trigger point P1, the iris authentication device 6 does not start the series of processes from step S21 to step S32 in FIG. 6.
- the iris authentication device 6 may adopt an existing method as the image analysis method for determining whether or not the person T is located at the trigger point P1. For example, the iris authentication device 6 may determine whether or not the person T is located at the trigger point P1 by estimating depth from the entire image 200.
- for example, the iris authentication device 6 may determine whether or not the person T is located at the trigger point P1 by detecting the feet of the person T shown in the entire image 200 and determining whether or not the detected feet are located at the trigger point P1. For example, the iris authentication device 6 may determine whether or not the person T is located at the trigger point P1 by determining whether or not the distance between the eyes of the person T in the entire image 200 has reached a predetermined value.
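The inter-eye-distance heuristic above can be sketched as follows. This is illustrative Python only; the function name and the pixel threshold are assumptions, uncalibrated to any real camera geometry. The idea is that the apparent distance between the eyes grows as the person approaches the camera, so its reaching a predetermined value indicates arrival at the trigger point P1.

```python
def person_at_trigger_point(eye_left, eye_right, min_interocular_px=60):
    """Judge from an entire image 200 whether the person T has reached the
    trigger point P1, based on the apparent inter-eye distance.
    Eye positions are (x, y) pixel coordinates."""
    interocular = abs(eye_right[0] - eye_left[0])
    return interocular >= min_interocular_px

print(person_at_trigger_point((100, 200), (170, 200)))  # True  (70 px apart)
print(person_at_trigger_point((100, 200), (140, 200)))  # False (40 px apart)
```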
- the iris authentication system 1 includes a motion sensor 5. However, the iris recognition system 1 does not have to include the motion sensor 5.
- similarly, the iris camera 31 may continue to image the scene within its imaging range at a predetermined frame rate (that is, imaging rate) regardless of whether or not the person T is located at the focus point P2.
- alternatively, the iris camera 31 may continue to image the scene within its imaging range at a predetermined frame rate at least during the period in which the authentication target person Ta passes through the focus point P2.
- in either case, the iris camera 31 can image the authentication target person Ta at the timing when the authentication target person Ta reaches the focus point P2 from the trigger point P1. That is, even when the iris authentication system 1 does not include the motion sensor 5, the image acquisition unit 611 can acquire an iris image 300 showing the authentication target person Ta located at the focus point P2.
- the iris recognition system 1 may authenticate the authentication target person Ta using any part of the authentication target person Ta.
- any authentication system that authenticates the authentication target person Ta using an arbitrary part of the authentication target person Ta may have the same configuration as the iris authentication system 1 and perform the same operation.
- for example, the iris authentication system 1 may be provided with, instead of the iris imaging device 3, at least one face camera capable of imaging the face of the authentication target person Ta.
- the face camera may be the same as the iris camera 31 described above, except that it can image the face of the authentication target person Ta. Further, the iris authentication system 1 may authenticate the authentication target person Ta based on the image captured by the face camera.
- Appendix 1 The authentication system according to Appendix 1 is an authentication system including: a first imaging device that acquires a first image used for authentication by imaging a first imaging range; a second imaging device that acquires a second image by imaging a second imaging range wider than the first imaging range; and a control device that selects a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- Appendix 2 The authentication system according to Appendix 2 is the imaging system according to Appendix 1, wherein the control device selects the target person based on the positional relationship between the target area and the reference area.
- Appendix 3 The authentication system according to Appendix 3 is the authentication system according to Appendix 1 or 2, wherein the control device selects the target person based on an overlapping area that overlaps with the reference area in the target area.
- Appendix 4 The authentication system according to Appendix 4 is the authentication system according to Appendix 3, wherein the control device selects the target person based on the ratio of the overlapping area to the target area.
- Appendix 5 The authentication system according to Appendix 5 is the authentication system according to Appendix 4, wherein the control device selects, as the target person, a person corresponding to the target area including the overlapping area whose ratio is larger than a predetermined first threshold value.
- Appendix 6 The authentication system according to Appendix 6 is the authentication system according to any one of Appendix 1 to 5, wherein the control device selects the target person based on the distance between the center of gravity of the target area and the center of gravity of the reference area in the second image.
- Appendix 7 The authentication system according to Appendix 7 is the authentication system according to Appendix 6, wherein the control device determines a person corresponding to the target area whose distance is smaller than the second threshold value as the target person.
- Appendix 8 The authentication system according to Appendix 8 is the authentication system according to Appendix 6 or 7, wherein the feature point includes the center of gravity.
- Appendix 9 The authentication system according to Appendix 9 is the authentication system according to Appendix 8, wherein the first imaging device includes a plurality of imaging means each capable of acquiring a plurality of unit images each usable as the first image, the plurality of imaging means are arranged so that scenes appearing in the plurality of unit images are connected along a first direction, the center of gravity of the target area is the center of gravity of the target area in a second direction intersecting the first direction, and the center of gravity of the reference area is the center of gravity of the reference area in the second direction.
- Appendix 10 The authentication system according to Appendix 10 is the authentication system according to any one of Appendix 1 to 9, further comprising a display device for displaying the second image.
- Appendix 11 The authentication system according to Appendix 11 is the authentication system according to Appendix 10, wherein the display device displays, together with the second image, at least one of the target area, the reference area, an overlapping area of the target area that overlaps with the reference area, a person selected as the target person, and a person not selected as the target person, in a display mode identifiable on the second image.
- Appendix 12 The authentication system according to Appendix 12 is the authentication system according to Appendix 10 or 11, wherein the display device displays an abnormal event related to the authentication.
- Appendix 13 The authentication system according to Appendix 13 is the authentication system according to any one of Appendix 1 to 12, wherein the control device selects the target person based on a change over time in the positional relationship between the target area and the reference area.
- Appendix 14 The authentication system according to Appendix 14 is the authentication system according to Appendix 13, wherein the control device does not select, as the target person, a person corresponding to the target area when the amount of change per unit time in the positional relationship between the target area and the reference area exceeds a permissible amount.
- Appendix 15 The authentication system according to Appendix 15 is the authentication system according to Appendix 13 or 14, wherein the control device does not select, as the target person, a person corresponding to the target area when the amount of change per unit time in the ratio, to the target area, of an overlapping area of the target area that overlaps with the reference area exceeds a permissible amount.
- Appendix 16 The authentication system according to Appendix 16 is the authentication system according to any one of Appendix 13 to 15, wherein the control device does not select, as the target person, a person corresponding to the target area when the amount of change per unit time in the distance between a feature point of the target area and a feature point of the reference area exceeds a permissible amount.
- Appendix 17 The authentication system according to Appendix 17 is the authentication system according to any one of Appendix 1 to 16, wherein the second imaging device includes a plurality of imaging means, and the control device selects, from the plurality of imaging means, one imaging means for imaging the selected target person, and controls the selected imaging means so as to image the selected target person and acquire the first image.
- Appendix 18 The authentication system according to Appendix 18 is the authentication system according to any one of Appendix 1 to 17, wherein the predetermined portion includes at least one of an eye and a face.
- Appendix 19 The authentication system according to Appendix 19 is the authentication system according to any one of Appendix 1 to 18, wherein the authentication includes at least one of iris authentication and face authentication, and the authentication system is at least one of an iris authentication system and a face authentication system.
- Appendix 20 The authentication method described in Appendix 20 is an authentication method including: an acquisition step of acquiring a second image from a second imaging device that acquires the second image by imaging a second imaging range wider than the first imaging range of a first imaging device that acquires a first image used for authentication; and a selection step of selecting a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- Appendix 21 The control device described in Appendix 21 is a control device including: an acquisition means for acquiring a second image from a second imaging device that acquires the second image by imaging a second imaging range wider than the first imaging range of a first imaging device that acquires a first image used for authentication; and a selection means for selecting a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- Appendix 22 The computer program described in Appendix 22 is a computer program that causes a computer to execute the imaging method described in Appendix 20.
- Appendix 23 The recording medium described in Appendix 23 is a recording medium on which the computer program described in Appendix 22 is recorded.
- the present invention can be appropriately modified within the scope of the claims and within a scope not contrary to the gist or idea of the invention readable from the entire specification, and an authentication system, authentication method, control device, computer program and recording medium accompanied by such modification are also included in the technical idea of the present invention.
- 1 Iris authentication system
- 2 Whole imaging device
- 21 Whole camera
- 200 Whole image
- 3 Iris imaging device
- 31 Iris camera
- 300 Iris image
- 4, 5 Human detection sensor
- 6 Iris authentication device
- 61 CPU
- 611 Image acquisition unit
- 612 Target selection unit
- 613 Camera setting unit
- 614 Imaging control unit
- 615 Authentication unit
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computer Security & Cryptography (AREA)
- Ophthalmology & Optometry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Image Input (AREA)
- Collating Specific Patterns (AREA)
- Studio Devices (AREA)
Abstract
Description
(1-1) Overall Configuration of Iris Authentication System 1
First, the overall configuration of the iris authentication system 1 of the present embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the overall configuration of the iris authentication system 1 of the present embodiment.
Next, the configuration of the iris authentication device 6 of the present embodiment will be described with reference to FIG. 4. FIG. 4 is a block diagram showing the hardware configuration of the iris authentication device 6 of the present embodiment.
Next, the flow of the operation of the iris authentication system 1 of the present embodiment (that is, the iris authentication operation) will be described with reference to FIG. 6. FIG. 6 is a flowchart showing the flow of the operation (that is, the iris authentication operation) of the iris authentication system 1 of the present embodiment.
As described above, the iris authentication system 1 can appropriately select (in other words, designate or determine) the authentication target person Ta. In particular, in a situation where a plurality of persons T appear in the entire image 200 (in particular, in the iris image region RA thereof), the iris authentication system 1 can appropriately select one of the plurality of persons T as the authentication target person Ta. In other words, in a situation where a plurality of face areas TA are detected in the entire image 200 (in particular, in the iris image region RA thereof), the iris authentication system 1 can appropriately select, as the authentication target person Ta, the one person T corresponding to the one face area TA that satisfies the selection condition among the plurality of face areas TA. That is, the iris authentication system 1 can select, as the authentication target person Ta, a person T who is desirably selected as the authentication target person Ta. In other words, the iris authentication system 1 is less likely to select, as the authentication target person Ta, a person T whose selection as the authentication target person Ta is undesirable.
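As a minimal illustrative sketch of this selection, the following Python fragment picks one person whose face area satisfies an overlap-ratio selection condition, falling back to the frontmost person when two or more remain. The data layout, the threshold value, and the use of the largest face area as a proxy for "frontmost" are assumptions for illustration, not part of the embodiment.

```python
def select_authentication_target(face_areas, first_threshold=0.5):
    """From the face areas TA detected in the entire image 200, keep those
    whose ratio of the overlapping area OA (overlap with the iris image
    region RA) exceeds the first threshold; when two or more remain,
    fall back to the frontmost person.
    Each entry is assumed to be (person_id, overlap_ratio, face_area_px)."""
    candidates = [fa for fa in face_areas if fa[1] > first_threshold]
    if not candidates:
        return None
    # the frontmost person is assumed to show the largest face area in pixels
    return max(candidates, key=lambda fa: fa[2])[0]

people = [("T1", 1.0, 9000), ("T2", 0.8, 12000), ("T3", 0.1, 15000)]
print(select_authentication_target(people))  # T2
```

Here T3, despite being closest (largest face area), is excluded because its face area barely overlaps the iris image region RA.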
(4-1) First Modification
The iris authentication system 1a of the first modification will be described with reference to FIG. 11. FIG. 11 is a block diagram showing the overall configuration of the iris authentication system 1a of the first modification. In the following description, components identical to those of the iris authentication system 1 described above are denoted by the same reference signs, and detailed description thereof is omitted.
In the second modification, the iris authentication device 6 may use a selection condition including a condition based on the change over time in the positional relationship between the face area TA and the iris image region RA. For example, the target selection unit 612 may select, as the authentication target person Ta, a person T corresponding to a face area TA whose amount of change in positional relationship with the iris image region RA per unit time is less than a predetermined first permissible amount. For example, the target selection unit 612 need not select, as the authentication target person Ta, a person T corresponding to a face area TA whose amount of change in positional relationship with the iris image region RA per unit time is larger than the first permissible amount.
In the above description, when a plurality of persons T appear in the entire image 200 (in particular, in the iris image region RA thereof) (that is, when a plurality of face areas TA are detected), the target selection unit 612 selects one of the plurality of persons T as the authentication target person Ta. However, the target selection unit 612 may select two or more of the plurality of persons T as the authentication target person Ta. For example, the target selection unit 612 may select two or more persons T satisfying the selection condition as the authentication target person Ta. For example, when there are two or more persons T satisfying the selection condition (step S25: No in FIG. 6), the target selection unit 612 need not perform the process of selecting the frontmost person T as the authentication target person Ta.
The following supplementary notes are further disclosed with respect to the embodiment described above.
The authentication system described in Appendix 1 is an authentication system comprising: a first imaging device that acquires a first image used for authentication by imaging a first imaging range; a second imaging device that acquires a second image by imaging a second imaging range wider than the first imaging range; and a control device that selects a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
The authentication system described in Appendix 2 is the imaging system described in Appendix 1, wherein the control device selects the target person based on the positional relationship between the target area and the reference area.
The authentication system described in Appendix 3 is the authentication system described in Appendix 1 or 2, wherein the control device selects the target person based on an overlapping area of the target area that overlaps with the reference area.
The authentication system described in Appendix 4 is the authentication system described in Appendix 3, wherein the control device selects the target person based on the ratio of the overlapping area to the target area.
The authentication system described in Appendix 5 is the authentication system described in Appendix 4, wherein the control device selects, as the target person, a person corresponding to the target area including the overlapping area whose ratio is larger than a predetermined first threshold value.
The authentication system described in Appendix 6 is the authentication system described in any one of Appendices 1 to 5, wherein the control device selects the target person based on the distance between the center of gravity of the target area and the center of gravity of the reference area in the second image.
The authentication system described in Appendix 7 is the authentication system described in Appendix 6, wherein the control device determines, as the target person, a person corresponding to the target area whose distance is smaller than a second threshold value.
The authentication system described in Appendix 8 is the authentication system described in Appendix 6 or 7, wherein the feature point includes a center of gravity.
The authentication system described in Appendix 9 is the authentication system described in Appendix 8, wherein the first imaging device includes a plurality of imaging means each capable of acquiring a plurality of unit images each usable as the first image, the plurality of imaging means are arranged so that scenes appearing in the plurality of unit images are connected along a first direction, the center of gravity of the target area is the center of gravity of the target area in a second direction intersecting the first direction, and the center of gravity of the reference area is the center of gravity of the reference area in the second direction.
The authentication system described in Appendix 10 is the authentication system described in any one of Appendices 1 to 9, further comprising a display device that displays the second image.
The authentication system described in Appendix 11 is the authentication system described in Appendix 10, wherein the display device displays, together with the second image, at least one of the target area, the reference area, an overlapping area of the target area that overlaps with the reference area, a person selected as the target person, and a person not selected as the target person, in a display mode identifiable on the second image.
The authentication system described in Appendix 12 is the authentication system described in Appendix 10 or 11, wherein the display device displays an abnormal event related to the authentication.
The authentication system described in Appendix 13 is the authentication system described in any one of Appendices 1 to 12, wherein the control device selects the target person based on a change over time in the positional relationship between the target area and the reference area.
The authentication system described in Appendix 14 is the authentication system described in Appendix 13, wherein the control device does not select, as the target person, a person corresponding to the target area when the amount of change per unit time in the positional relationship between the target area and the reference area exceeds a permissible amount.
The authentication system described in Appendix 15 is the authentication system described in Appendix 13 or 14, wherein the control device does not select, as the target person, a person corresponding to the target area when the amount of change per unit time in the ratio, to the target area, of an overlapping area of the target area that overlaps with the reference area exceeds a permissible amount.
The authentication system described in Appendix 16 is the authentication system described in any one of Appendices 13 to 15, wherein the control device does not select, as the target person, a person corresponding to the target area when the amount of change per unit time in the distance between a feature point of the target area and a feature point of the reference area exceeds a permissible amount.
The authentication system described in Appendix 17 is the authentication system described in any one of Appendices 1 to 16, wherein the second imaging device includes a plurality of imaging means, and the control device selects, from the plurality of imaging means, one imaging means for imaging the selected target person, and controls the selected imaging means so as to image the selected target person and acquire the first image.
The authentication system described in Appendix 18 is the authentication system described in any one of Appendices 1 to 17, wherein the predetermined part includes at least one of an eye and a face.
The authentication system described in Appendix 19 is the authentication system described in any one of Appendices 1 to 18, wherein the authentication includes at least one of iris authentication and face authentication, and the authentication system is at least one of an iris authentication system and a face authentication system.
The authentication method described in Appendix 20 is an authentication method comprising: an acquisition step of acquiring a second image from a second imaging device that acquires the second image by imaging a second imaging range wider than the first imaging range of a first imaging device that acquires a first image used for authentication; and a selection step of selecting a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
The control device described in Appendix 21 is a control device comprising: an acquisition means for acquiring a second image from a second imaging device that acquires the second image by imaging a second imaging range wider than the first imaging range of a first imaging device that acquires a first image used for authentication; and a selection means for selecting a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
The computer program described in Appendix 22 is a computer program that causes a computer to execute the imaging method described in Appendix 20.
The recording medium described in Appendix 23 is a recording medium on which the computer program described in Appendix 22 is recorded.
2 Whole imaging device
21 Whole camera
200 Whole image
3 Iris imaging device
31 Iris camera
300 Iris image
4, 5 Human detection sensor
6 Iris authentication device
61 CPU
611 Image acquisition unit
612 Target selection unit
613 Camera setting unit
614 Imaging control unit
615 Authentication unit
Claims (23)
- An authentication system comprising: a first imaging device that acquires a first image used for authentication by imaging a first imaging range; a second imaging device that acquires a second image by imaging a second imaging range wider than the first imaging range; and a control device that selects a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- The authentication system according to claim 1, wherein the control device selects the target person based on a positional relationship between the target area and the reference area.
- The authentication system according to claim 1 or 2, wherein the control device selects the target person based on an overlapping area of the target area that overlaps with the reference area.
- The authentication system according to claim 3, wherein the control device selects the target person based on a ratio of the overlapping area to the target area.
- The authentication system according to claim 4, wherein the control device selects, as the target person, a person corresponding to the target area including the overlapping area whose ratio is larger than a predetermined first threshold value.
- The authentication system according to any one of claims 1 to 5, wherein the control device selects the target person based on a distance between a feature point of the target area and a feature point of the reference area in the second image.
- The authentication system according to claim 6, wherein the control device determines, as the target person, a person corresponding to the target area whose distance is smaller than a second threshold value.
- The authentication system according to claim 6 or 7, wherein the feature point includes a center of gravity.
- The authentication system according to claim 8, wherein the first imaging device includes a plurality of imaging means each capable of acquiring a plurality of unit images each usable as the first image; the plurality of imaging means are arranged so that scenes appearing in the plurality of unit images are connected along a first direction; the center of gravity of the target area is the center of gravity of the target area in a second direction intersecting the first direction; and the center of gravity of the reference area is the center of gravity of the reference area in the second direction.
- The authentication system according to any one of claims 1 to 9, further comprising a display device that displays the second image.
- The authentication system according to claim 10, wherein the display device displays, together with the second image, at least one of the target area, the reference area, an overlapping area of the target area that overlaps with the reference area, a person selected as the target person, and a person not selected as the target person, in a display mode identifiable on the second image.
- The authentication system according to claim 10 or 11, wherein the display device displays an abnormal event related to the authentication.
- The authentication system according to any one of claims 1 to 12, wherein the control device selects the target person based on a change over time in the positional relationship between the target area and the reference area.
- The authentication system according to claim 13, wherein the control device does not select, as the target person, a person corresponding to the target area when an amount of change per unit time in the positional relationship between the target area and the reference area exceeds a permissible amount.
- The authentication system according to claim 13 or 14, wherein the control device does not select, as the target person, a person corresponding to the target area when an amount of change per unit time in the ratio, to the target area, of an overlapping area of the target area that overlaps with the reference area exceeds a permissible amount.
- The authentication system according to any one of claims 13 to 15, wherein the control device does not select, as the target person, a person corresponding to the target area when an amount of change per unit time in the distance between a feature point of the target area and a feature point of the reference area exceeds a permissible amount.
- The authentication system according to any one of claims 1 to 16, wherein the second imaging device includes a plurality of imaging means, and the control device selects, from the plurality of imaging means, one imaging means for imaging the selected target person, and controls the selected imaging means so as to image the selected target person and acquire the first image.
- The authentication system according to any one of claims 1 to 17, wherein the predetermined part includes at least one of an eye and a face.
- The authentication system according to any one of claims 1 to 18, wherein the authentication includes at least one of iris authentication and face authentication, and the authentication system is at least one of an iris authentication system and a face authentication system.
- An authentication method comprising: an acquisition step of acquiring a second image from a second imaging device that acquires the second image by imaging a second imaging range wider than a first imaging range of a first imaging device that acquires a first image used for authentication; and a selection step of selecting a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- A control device comprising: an acquisition means for acquiring a second image from a second imaging device that acquires the second image by imaging a second imaging range wider than a first imaging range of a first imaging device that acquires a first image used for authentication; and a selection means for selecting a target person of the authentication based on a target area including a predetermined part of a person in the second image and a reference area corresponding to the first imaging range in the second image.
- A computer program that causes a computer to execute the authentication method according to claim 20.
- A recording medium on which the computer program according to claim 22 is recorded.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19934710.5A EP3992905A4 (en) | 2019-06-26 | 2019-06-26 | AUTHENTICATION SYSTEM, AUTHENTICATION METHOD, CONTROL DEVICE, COMPUTER PROGRAM AND RECORDING MEDIA |
CN201980097902.2A CN114041164A (zh) | 2019-06-26 | 2019-06-26 | 认证***、认证方法、控制装置、计算机程序和记录介质 |
JP2021528737A JP7211511B2 (ja) | 2019-06-26 | 2019-06-26 | 認証システム、認証方法、制御装置、コンピュータプログラム及び記録媒体 |
BR112021025977A BR112021025977A2 (pt) | 2019-06-26 | 2019-06-26 | Sistema e método de autenticação, dispositivos de controle, de produto de computador e de mídia de gravação |
PCT/JP2019/025342 WO2020261423A1 (ja) | 2019-06-26 | 2019-06-26 | 認証システム、認証方法、制御装置、コンピュータプログラム及び記録媒体 |
US17/620,551 US20220245228A1 (en) | 2019-06-26 | 2019-06-26 | Authentication system, authentication method, control apparatus, computer program and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/025342 WO2020261423A1 (ja) | 2019-06-26 | 2019-06-26 | 認証システム、認証方法、制御装置、コンピュータプログラム及び記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020261423A1 true WO2020261423A1 (ja) | 2020-12-30 |
Family
ID=74060811
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/025342 WO2020261423A1 (ja) | 2019-06-26 | 2019-06-26 | 認証システム、認証方法、制御装置、コンピュータプログラム及び記録媒体 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220245228A1 (ja) |
EP (1) | EP3992905A4 (ja) |
JP (1) | JP7211511B2 (ja) |
CN (1) | CN114041164A (ja) |
BR (1) | BR112021025977A2 (ja) |
WO (1) | WO2020261423A1 (ja) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004297518A (ja) | 2003-03-27 | 2004-10-21 | Matsushita Electric Ind Co Ltd | Imaging device for authentication target image, and method thereof |
JP2004343317A (ja) | 2003-05-14 | 2004-12-02 | Sony Corp | Imaging device |
JP2006126899A (ja) * | 2004-10-26 | 2006-05-18 | Matsushita Electric Ind Co Ltd | Living-body discrimination device, living-body discrimination method, and authentication system using the same |
JP2006319550A (ja) | 2005-05-11 | 2006-11-24 | Omron Corp | Imaging device and portable terminal |
JP2016066241A (ja) * | 2014-09-25 | 2016-04-28 | Canon Inc | Information processing apparatus, control method of information processing apparatus, and program |
JP2017045485A (ja) * | 2013-11-25 | 2017-03-02 | Canon Marketing Japan Inc | Information processing apparatus, information processing system, control method, and program |
JP2017142772A (ja) | 2016-02-05 | 2017-08-17 | Fujitsu Ltd | Iris authentication device and iris authentication program |
JP2017530476A (ja) | 2014-09-24 | 2017-10-12 | Princeton Identity Inc | Control of wireless communication device functions on a mobile device using a biometric key |
JP2019040642A (ja) * | 2015-09-09 | 2019-03-14 | NEC Corp | Face authentication device, face authentication method, and program |
JP2019057036A (ja) * | 2017-09-20 | 2019-04-11 | Canon Inc | Information processing apparatus, control method therefor, and program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007504562A (ja) * | 2003-09-04 | 2007-03-01 | Sarnoff Corp | Method and apparatus for performing iris authentication from a single image |
2019
- 2019-06-26 BR BR112021025977A patent/BR112021025977A2/pt unknown
- 2019-06-26 WO PCT/JP2019/025342 patent/WO2020261423A1/ja unknown
- 2019-06-26 EP EP19934710.5A patent/EP3992905A4/en active Pending
- 2019-06-26 CN CN201980097902.2A patent/CN114041164A/zh active Pending
- 2019-06-26 US US17/620,551 patent/US20220245228A1/en active Pending
- 2019-06-26 JP JP2021528737A patent/JP7211511B2/ja active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3992905A4 |
Also Published As
Publication number | Publication date |
---|---|
CN114041164A (zh) | 2022-02-11 |
EP3992905A1 (en) | 2022-05-04 |
JPWO2020261423A1 (ja) | 2020-12-30 |
US20220245228A1 (en) | 2022-08-04 |
JP7211511B2 (ja) | 2023-01-24 |
BR112021025977A2 (pt) | 2022-02-08 |
EP3992905A4 (en) | 2022-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6273685B2 (ja) | Tracking processing device, tracking processing system including the same, and tracking processing method | |
US10182720B2 (en) | System and method for interacting with and analyzing media on a display using eye gaze tracking | |
JP5203281B2 (ja) | Person detection device, person detection method, and person detection program | |
EP2434372A2 (en) | Controlled access to functionality of a wireless device | |
JP5001930B2 (ja) | Motion recognition device and method | |
JP2010086336A (ja) | Image control device, image control program, and image control method | |
JP2009245338A (ja) | Face image collation device | |
JP2023029941A (ja) | Imaging system, imaging method, computer program, and recording medium | |
JP7453137B2 (ja) | POS terminal device and image processing method | |
JP2012238293A (ja) | Input device | |
TW201429239A (zh) | Viewing-zone display system, viewing-zone display method, and computer-readable recording medium storing a viewing-zone display program | |
US9773143B2 (en) | Image processing apparatus, image processing method, and image processing system | |
KR101648786B1 (ko) | Object recognition method | |
KR101919138B1 (ko) | Method and apparatus for multi-biometric recognition at a distance | |
US11210528B2 (en) | Information processing apparatus, information processing method, system, and storage medium to determine staying time of a person in predetermined region | |
JP5448952B2 (ja) | Same-person determination device, same-person determination method, and same-person determination program | |
WO2020261423A1 (ja) | Authentication system, authentication method, control device, computer program and recording medium | |
JP7371764B2 (ja) | Information registration device, information registration system, information registration method, and computer program | |
JP7364965B2 (ja) | Authentication method, authentication program, and information processing device | |
JP6885127B2 (ja) | Display control device, display control method, and program | |
US20230014562A1 (en) | Image processing apparatus, image processing method, and image processing program | |
US20230215015A1 (en) | Tracking device, tracking method, and recording medium | |
KR102596487B1 (ko) | Display control system, method, and computer-readable recording medium | |
CN115086619A (zh) | Video data transmission method and apparatus, camera, and storage medium | |
JPWO2022158239A5 (ja) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19934710 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2021528737 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112021025977 Country of ref document: BR |
ENP | Entry into the national phase |
Ref document number: 2019934710 Country of ref document: EP Effective date: 20220126 |
ENP | Entry into the national phase |
Ref document number: 112021025977 Country of ref document: BR Kind code of ref document: A2 Effective date: 20211221 |