CN111461016A - Remnant determining method and device and electronic equipment - Google Patents



Publication number
CN111461016A
CN111461016A (application number CN202010250328.4A)
Authority
CN
China
Prior art keywords
image
pixel points
pixel
sample
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010250328.4A
Other languages
Chinese (zh)
Inventor
钟玲
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority claimed from application CN202010250328.4A
Publication of CN111461016A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1335 Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G06V40/1347 Preprocessing; Feature extraction
    • G06V40/1365 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a remnant determining method and apparatus and an electronic device. In the method, at least two first image samples to be analyzed are obtained, the at least two first image samples being images of the same sample object; according to the position distribution of the pixel points, information superposition is performed on the pixel points at the same position in the at least two first image samples to obtain a superposed second image sample; and if the pixel points belonging to the first feature category in the second image sample satisfy a condition, it is determined that the at least two first image samples belong to a remnant image. The scheme of the application is applicable to detecting remnants of different types, and thus enables more comprehensive remnant detection.

Description

Remnant determining method and device and electronic equipment
Technical Field
The present application relates to the field of information processing technologies, and in particular, to a method and an apparatus for determining a remnant, and an electronic device.
Background
With the development of large-screen and full-screen electronic devices, electronic devices adopting under-screen fingerprint identification are becoming increasingly common.
Currently, under-screen fingerprint recognition uses optical technology to capture the user's fingerprint image. After a user presses a finger on the screen for fingerprint entry or identification, a fingerprint remnant may be left on the screen, and this remnant can easily be reused for fingerprint identification and verification; this vulnerability is known as "remnant reuse". When a fingerprint remnant is used repeatedly for fingerprint identification, the remnant may be learned into the fingerprint database, so that anyone can then use it to pass fingerprint verification.
To reduce the security risk caused by "remnant reuse", fingerprint remnant detection currently needs to be performed on fingerprint information, to detect whether a fingerprint template formed from a fingerprint remnant exists in the fingerprint library. However, current fingerprint remnant detection can only detect certain known, specific remnants; it cannot detect more comprehensively whether fingerprint remnants exist in the fingerprint database, so it is significantly limited.
Disclosure of Invention
In view of the above, the present application provides a remnant determining method and apparatus, and an electronic device.
The method for determining the remnant comprises the following steps:
obtaining at least two first image samples to be analyzed, wherein the at least two first image samples are images of the same sample object;
according to the position distribution of the pixel points, respectively performing information superposition on the pixel points at the same position in the at least two first image samples to obtain a superposed second image sample, wherein if the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category;
and determining that the at least two first image samples belong to the remnant image when pixel points belonging to the first feature category in the second image sample meet a condition.
Preferably, the method further comprises the following steps:
determining the feature type to which each pixel point in the first image sample belongs;
according to the position distribution of the pixel points, information superposition is respectively carried out on the pixel points which are located at the same position in the at least two first image samples, and a superposed second image sample is obtained, wherein the information superposition comprises the following steps:
according to the position distribution of the pixel points and the feature types to which the pixel points belong in the first image samples, respectively overlapping the feature types of the pixel points at the same position in the at least two first image samples to obtain a second image sample after overlapping;
wherein, the information superposition is the type superposition of the characteristic types of the pixel points;
and the pixel point information of the pixel point at any position in the second image sample is the information of the overlapped characteristic category.
Preferably, the feature categories to which the pixel points belong include a first feature category and a second feature category;
if the feature categories of the pixel points at the same position in the at least two first image samples do not all belong to the first feature category, the feature category of the pixel point at that position in the second image sample is the second feature category;
pixel points in the second image sample belonging to the first feature category satisfy a condition, including:
and pixel points belonging to the first characteristic category and pixel points belonging to the second characteristic category in the second image sample meet the condition.
Preferably, the pixel points belonging to the first feature category and the pixel points belonging to the second feature category in the second image sample satisfy a condition, including:
and the ratio of the pixel point belonging to the first characteristic category to the pixel point belonging to the second characteristic category in the second image sample is greater than a set ratio.
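As an illustration only (this is a hypothetical sketch, not code from the patent), the claimed ratio test can be expressed in numpy, assuming first-category pixels are stored as the value 0 and second-category pixels as the value 255:

```python
import numpy as np

def remnant_by_ratio(second_sample: np.ndarray, set_ratio: float) -> bool:
    """Hypothetical check: the first image samples are judged a remnant image
    when the count of first-category pixels (value 0) divided by the count of
    second-category pixels (value 255) exceeds the set ratio."""
    first = int((second_sample == 0).sum())
    second = int((second_sample == 255).sum())
    if second == 0:
        # every pixel carries sample information; treat as exceeding any ratio
        return True
    return first / second > set_ratio
```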
Preferably, before the information superposition is performed on the pixel points located at the same position in the at least two first image samples respectively according to the position distribution of the pixel points, and the superposed second image sample is obtained, the method further includes:
carrying out grayscale conversion and binarization processing on the first image sample, wherein, in the processed first image sample, the feature category corresponding to a pixel point with the value 0 is the first feature category, and the feature category corresponding to a pixel point with the value 255 is the second feature category.
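A minimal sketch of this preprocessing, using numpy only; the luminance weights and the cutoff of 130 are illustrative assumptions (the cutoff value appears in a worked example later in the description, and the claim fixes only the 0/255 output convention):

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Illustrative luminance conversion (ITU-R BT.601 weights)."""
    return (rgb @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)

def binarize(gray: np.ndarray, cutoff: int = 130) -> np.ndarray:
    """Pixels below the cutoff become 0 (first feature category);
    all others become 255 (second feature category)."""
    return np.where(gray < cutoff, 0, 255).astype(np.uint8)
```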
Preferably, the information superposition is performed on the pixel points located at the same position in the at least two first image samples respectively according to the position distribution of the pixel points, so as to obtain a superposed second image sample, including:
according to the position distribution of the pixel points, respectively superposing the values of the pixel points at the same position in the at least two first image samples to obtain a superposed second image sample, wherein if the values of the pixel points at the same position in the at least two first image samples are all 0, the value of the pixel point at that position in the second image sample is 0; and if the values of the pixel points at the same position in the at least two first image samples are not all 0, the value of the pixel point at that position in the second image sample is 255.
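Under the 0/255 convention of the preceding paragraph, this superposition rule can be sketched as follows (a hypothetical numpy implementation, not the patent's own code):

```python
import numpy as np

def superimpose(samples: list) -> np.ndarray:
    """A pixel of the second image sample is 0 only when the pixel at the
    same position is 0 in every first image sample; otherwise it is 255."""
    stacked = np.stack(samples)            # shape: (n_samples, rows, cols)
    all_zero = (stacked == 0).all(axis=0)  # True where every sample has 0
    return np.where(all_zero, 0, 255).astype(np.uint8)
```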
Preferably, the determining that the at least two first image samples belong to the remnant image includes:
determining the sum of the values of all pixel points in the second image sample, and determining the product of the number of rows of pixel points and the number of columns of pixel points in the second image sample;
and if the ratio of the sum to the product is greater than a set threshold, determining that the at least two first image samples belong to the remnant image.
Preferably, the at least two first image samples are at least two first fingerprint images, and the at least two first fingerprint images are fingerprint images of the same finger of the same user.
In yet another aspect, the present application provides a remnant determining apparatus, including:
an image acquisition unit, configured to acquire at least two first image samples to be analyzed, wherein the at least two first image samples are images of the same sample object;
a pixel superposition unit, configured to perform information superposition on the pixel points at the same position in the at least two first image samples respectively according to the position distribution of the pixel points, to obtain a superposed second image sample, wherein if the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the feature category of the pixel point at that position in the second image sample is the first feature category;
and a remnant determining unit, configured to determine that the at least two first image samples belong to a remnant image when the pixel points belonging to the first feature category in the second image sample satisfy a condition.
In yet another aspect, the present application provides an electronic device comprising:
a processor and a memory;
the processor is configured to: obtain at least two first image samples to be analyzed, wherein the at least two first image samples are images of the same sample object; perform information superposition on the pixel points at the same position in the at least two first image samples respectively according to the position distribution of the pixel points, to obtain a superposed second image sample, wherein if the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category; and if the pixel points belonging to the first feature category in the second image sample satisfy a condition, determine that the at least two first image samples belong to a remnant image;
the memory is used for storing programs needed by the processor to execute the operations.
According to the above scheme, after the at least two first image samples to be analyzed are obtained, information superposition can be performed on the pixel points at the same position in the at least two first image samples according to the position distribution of the pixel points, to obtain a superposed second image sample. Since the feature category of a pixel point in the second image sample is the first feature category only when the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the pixel points belonging to the first feature category in the second image sample reflect how the plurality of first image samples of the sample object change. Further, because the distribution of pixel points in remnant images of the same sample object remains essentially unchanged over a period of time, whether the at least two first image samples belong to a remnant image can be analyzed according to the pixel points belonging to the first feature category in the second image sample, thereby realizing remnant detection.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
Fig. 1 is a schematic flowchart of an embodiment of a method for determining a remnant, according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a method for determining a remnant, according to yet another embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for determining a remnant, according to yet another embodiment of the present application;
fig. 4 is a schematic flowchart of a method for determining a remnant, according to yet another embodiment of the present application;
fig. 5 is a schematic structural diagram illustrating an embodiment of a remnant determination apparatus according to an embodiment of the present application;
fig. 6 is a schematic diagram of a composition architecture of an electronic device according to an embodiment of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The remnant determining method provided by the application is applicable not only to detecting whether print images, such as fingerprint images or palm print images, belong to remnant images, but also to detecting whether images of the same object belong to remnant images, for example detecting whether image samples of a certain object belong to remnant images of partial traces of that object.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without inventive step, are within the scope of the present disclosure.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a method for determining a remnant, which is provided by the present application.
The method of the embodiment may include:
s101, at least two first image samples to be analyzed are obtained, and the at least two first image samples are images of the same sample object.
The image content of the first image samples differs according to the sample object.
For example, when the sample object is the same finger of the same user, the at least two first image samples are at least two fingerprint images of the same finger of the user.
For another example, in the case that the sample object is the palm of the same user, the at least two first image samples are at least two palm print images of the user's palm.
For another example, when the sample object is an article, the at least two first image samples are at least two images of the article.
Optionally, considering that a remnant image is relatively fixed and invariant over a certain period of time, so that remnants can be analyzed more accurately, the at least two first image samples to be analyzed are at least two image samples generated from the same sample object within a continuous time period.
S102, according to the position distribution of the pixel points, information superposition is respectively carried out on the pixel points which are located at the same position in at least two first image samples, and a superposed second image sample is obtained.
If the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature categories, the feature category to which the pixel points at the position in the second image sample belong is the first feature category.
The feature category to which a pixel point belongs refers to the feature category to which the pixel point information of that pixel point belongs. The pixel point information may be represented by the value of the pixel point, or in other ways, which is not limited here.
A pixel point in an image sample belonging to the first feature category indicates that the pixel point carries data information reflecting the sample object. Taking the image sample being a print image of the sample object as an example, if the feature category of the pixel point information of a pixel point belongs to the first feature category, it is determined that the pixel point contains print information.
The feature categories of pixel points may include a first feature category and a second feature category, so the feature category of a pixel point in an image sample is either the first feature category or the second feature category. If the feature category of a pixel point in an image sample belongs to the second feature category, the pixel point does not carry data information reflecting the sample object. Still taking the image sample being a print image of the sample object as an example, if the feature category of a pixel point belongs to the second feature category, the pixel point information of that pixel point does not contain print information.
The information superposition mode in the present application can be various.
For example, in a possible implementation manner, for a pixel at the same position, the pixel information of the pixel at the position in at least two first image samples may be superimposed. In this case, the pixel point information of the pixel point at each position in the superimposed second image sample is the pixel point information obtained by superimposing the pixel point information of the pixel point at the position in the at least two first image samples. If the pixel information of the pixel is a value, the values of the pixels at the position in the at least two first image samples can be superimposed for the pixel at any position, so that the value of the pixel at the position in the second image sample is superimposed.
Superimposing pixel point information does not mean directly merging or adding the pixel point information together; for example, the values of the pixel points at the same position in different first image samples are not simply summed. Because pixel point information can be expressed in different ways in an image, its specific form can vary; but regardless of the form, the superposition of pixel point information must satisfy the superposition rule. That is, it need only be ensured that, when the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category.
For example, if the feature categories of the pixel points at the same position in the at least two first image samples are all the first feature category, then after the pixel point information of the pixel points at that position is superimposed, the pixel point information of the pixel point at that position in the superposed second image sample may be the average of the pixel point information of the pixel points at that position in the at least two first image samples; or it may be set first pixel point information, where the feature category of the first pixel point information belongs to the first feature category.
Similarly, if the feature categories of the pixel points at the same position in the at least two first image samples are not all the first feature category, then after the pixel point information of the pixel points at that position is superimposed, the pixel point information of the pixel point at that position in the superposed second image sample may be preset pixel point information, where the feature category corresponding to the preset pixel point information does not belong to the first feature category.
For example, suppose that a pixel point whose value is less than 130 carries content information representing the sample object, that is, a pixel point with a value less than 130 belongs to the first feature category. Suppose further that, in the first image sample A, the pixel point at position m1 has the value 90, so its feature category is the first feature category, and the pixel point at position m2 has the value 200, so its feature category is the second feature category; and that, in the first image sample B, the pixel point at position m1 has the value 80, so its feature category is also the first feature category, and the pixel point at position m2 has the value 30, so its feature category is the first feature category.
On the basis of the above assumptions, when the pixel point information of the pixel points at position m1 in the first image sample A and the first image sample B is superimposed, the value of the pixel point at position m1 in the superposed second image sample may be (80+90)/2 = 85, and the pixel point at position m1 in the second image sample still belongs to the first feature category.
Correspondingly, when the pixel point information of the pixel points at position m2 in the first image sample A and the first image sample B is superimposed, since the pixel points at that position in the two first image samples do not all belong to the first feature category, the value of the pixel point at position m2 in the superposed second image sample may be the set value 200, so that the pixel point at position m2 in the second image sample still does not belong to the first feature category.
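The averaging behaviour walked through above (values 80 and 90 averaged to 85 at position m1; the preset value 200 written at position m2) can be sketched as follows; the cutoff 130 and the preset value 200 are taken from the example, not from the claims:

```python
import numpy as np

CUTOFF = 130        # value < 130 means first feature category (from the example)
PRESET_VALUE = 200  # written when the pixels are not all first category

def superimpose_values(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Average pixel values where both samples are first category;
    otherwise write the preset non-first-category value."""
    both_first = (a < CUTOFF) & (b < CUTOFF)
    averaged = (a.astype(np.int32) + b.astype(np.int32)) // 2
    return np.where(both_first, averaged, PRESET_VALUE).astype(np.uint8)
```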
It can be understood that the above description is only given by taking several possible ways as an example, and the application will be further described by taking a more convenient way of superimposing pixel point information as an example in the following. Of course, there may be other possibilities for the manner of superimposing the pixel point information, as long as the above-mentioned superimposition rule is satisfied, and no limitation is imposed on this.
The above treats superimposing pixel point information as one possible form of information superposition; a further possible form is described below.
In yet another possible case, the information superposition of the pixel points may be a category superposition of feature categories of the pixel points. Specifically, the feature type to which each pixel point belongs in the first image sample may be determined first, and then, according to the position distribution of the pixel points and the feature type to which each pixel point belongs in the first image sample, the feature types of the pixel points at the same position in the at least two first image samples are superimposed respectively, so as to obtain a superimposed second image sample. In this case, the pixel point information of the pixel point at any position in the second image sample is the information of the feature category after the superposition.
The superposition of feature categories still satisfies the superposition rule: if the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category. Unlike the superposition of pixel point information, however, category superposition does not consider the pixel point information itself; superposition is performed only according to the feature categories of the pixel points, so the pixel point information represented by the pixel point at any position in the superposed second image sample is the category result obtained by superimposing the feature categories corresponding to the pixel points at that position in the at least two first image samples.
For example, suppose the first feature category is represented by 0 and the non-first (second) feature category by 1. If the feature categories of the pixel points at a certain position in the at least two first image samples all belong to the first feature category, the superposed feature category is still the first feature category; correspondingly, the pixel point information of the pixel point at that position in the superposed second image sample may be 0, where 0 indicates that the feature category of the pixel point in the second image sample is the first feature category. Similarly, if the feature categories of the pixel points at a certain position in the at least two first image samples do not all belong to the first feature category, the pixel point information of the pixel point at that position in the superposed second image sample is 1, which indicates that it does not belong to the first feature category.
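With 0 for the first feature category and 1 otherwise, this category superposition reduces to an elementwise maximum over the samples (an illustrative sketch, not the patent's own code):

```python
import numpy as np

def superimpose_categories(samples: list) -> np.ndarray:
    """Inputs hold 0 (first category) or 1 (not first category).
    The maximum over the stack is 0 only where every sample is 0,
    which is exactly the superposition rule."""
    return np.stack(samples).max(axis=0)
```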
Of course, in practical applications, there may be other possibilities for information superposition, as long as the superposition rule is satisfied, and no limitation is imposed on this.
It should be noted that, when the number of the at least two first image samples exceeds two, all the first image samples may be superimposed together in one pass to obtain the second image sample; alternatively, two first image samples may first be superimposed according to pixel point positions, the result may then be superimposed with a first image sample that has not yet been superimposed, and so on iteratively until all the first image samples have been superimposed together.
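The second, iterative strategy described above can be sketched as a pairwise fold; because the pairwise 0/255 rule is associative, folding pair by pair gives the same result as superimposing all samples at once (a hypothetical sketch):

```python
from functools import reduce

import numpy as np

def superimpose_pair(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # a pixel stays 0 only if it is 0 in both inputs; otherwise it becomes 255
    return np.where((a == 0) & (b == 0), 0, 255).astype(np.uint8)

def superimpose_all(samples: list) -> np.ndarray:
    """Fold the first image samples pairwise until one second image sample remains."""
    return reduce(superimpose_pair, samples)
```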
S103, if the pixel points belonging to the first characteristic category in the second image sample meet the condition, at least two first image samples are determined to belong to the remnant image.
The inventor of the present application found through research that: for the same sample object, the remnant images generated within a period of time are basically similar or even the same, for example, in a scene that needs to be verified by using the image sample of the sample object, if the remnant image of the sample object is maliciously acquired by another person, the same remnant image may be used for verification within a period of time. On this basis, the distribution conditions of the pixel points belonging to the first feature category in the different remnant images of the sample object are basically consistent, so that most of the pixel points belonging to the first feature category in the remnant images can be reserved after the at least two remnant images are superposed according to the positions of the pixel points in the manner mentioned above.
On the contrary, for the same sample object, if all the image samples of the sample object generated within a period of time are not the remnant images, since the non-remnant images (image samples not belonging to the remnant) have a large variability, after at least two image samples are superimposed according to the positions of the pixel points as above, the pixel points belonging to the first feature category in different image samples will have a situation of mutual cancellation, so that the number of the pixel points belonging to the first feature category in the superimposed image samples is relatively small.
Based on the research findings, it can be known that the pixel points belonging to the first feature category in the second image sample can be used as a basis for judging whether the at least two first image samples belong to the remnant image.
In this embodiment, the condition that the pixel points belonging to the first feature category in the second image sample must satisfy can take several forms; based on the above findings, it can be set as needed.
For example, under the condition that the number of pixels in the image sample is known, it can be set that the number of pixels belonging to the first feature type exceeds a set value, and then the pixels belonging to the first feature type are considered to satisfy the condition.
As another example, the condition may be that the proportion of pixel points belonging to the first feature category in the second image sample exceeds a set target value.
As another example, if the feature categories of the pixel points are divided into a first feature category and a second feature category, the condition may jointly involve the pixel points belonging to the first feature category and those belonging to the second feature category in the second image sample; this case is described in detail later.
As yet another example, considering that the pixel information of each pixel point in the second image sample represents the feature category to which the pixel point belongs, and that the value or value range of pixel points belonging to the first feature category is fixed, the values of the pixel points belonging to the first feature category in the second image sample can be summed; if the summed result is greater than a set threshold, the condition is determined to be satisfied.
Of course, there are other possibilities for how the pixel points belonging to the first feature category in the second image sample may satisfy the condition. A few cases are described in detail below as examples, but other possible cases consistent with the above research findings are also applicable to the present embodiment.
Therefore, after the at least two first image samples to be analyzed are obtained, information superposition can be performed on the pixel points at the same position in the at least two first image samples according to the position distribution of the pixel points, obtaining a superimposed second image sample. Since a pixel point in the second image sample belongs to the first feature category only when the pixel points at the same position in all of the at least two first image samples belong to the first feature category, the pixel points of the first feature category in the second image sample reflect how the plurality of first image samples of the sample object change. Given that the distribution characteristics of the pixel points in the remnant images of the same sample object remain basically unchanged within a period of time, whether the at least two first image samples belong to the remnant image can be analyzed according to the pixel points belonging to the first feature category in the second image sample, thereby realizing remnant detection.
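As a minimal sketch of the superposition logic just described — assuming each first image sample has already been reduced to a boolean mask in which True marks the first feature category; the function and variable names are illustrative, not from the patent — the AND-style combination can be written with NumPy:

```python
import numpy as np

def superimpose_masks(masks):
    """A pixel stays in the first feature category only if it is
    first-category in every input mask (logical AND across samples)."""
    return np.logical_and.reduce([np.asarray(m) for m in masks])

# Two similar, remnant-like masks retain most first-category pixels.
a = np.array([[True, True], [False, True]])
b = np.array([[True, True], [False, False]])
combined = superimpose_masks([a, b])
print(combined.sum())  # count of surviving first-category pixels
```

On remnant-like inputs the surviving count stays close to the per-image counts; on dissimilar inputs it collapses, which is the signal the later conditions test.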
The following describes the situation that the feature categories to which the pixel points in the image sample belong are respectively the first feature category and the second feature category.
As shown in fig. 2, which shows a schematic flowchart of another embodiment of the method for determining a remnant, the method of this embodiment may include:
S201, at least two first image samples to be analyzed are obtained, and the at least two first image samples are images of the same sample object.
S202, determining the feature type to which each pixel point in the first image sample belongs.
The feature categories to which the pixel point information belongs are a first feature category and a second feature category; each pixel point in the first image sample belongs to one of the two.
A pixel point belonging to the first feature category means that its pixel information represents feature information of the sample object; for example, if the image sample is an image of a textured sample object, the pixel points of the first feature category are those carrying texture data. Correspondingly, a pixel point belonging to the second feature category contains no information embodying the features of the sample object, for example a pixel point of the white background.
The feature category to which each pixel point in the first image sample belongs may be determined in various ways. For example, considering that the pixel point information of a pixel point in an image can be represented by different values, a boundary value for dividing the first feature category from the second feature category can be set: if the value of a pixel point is smaller than the boundary value, the pixel point is considered to belong to the first feature category; if it is not smaller than the boundary value, the pixel point is considered to belong to the second feature category.
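A hedged sketch of this boundary-value rule in NumPy; the boundary of 70 is only an illustrative choice (it echoes the σ example given later in the text), not a value the method fixes:

```python
import numpy as np

BOUNDARY = 70  # illustrative boundary value between the two categories

def classify_pixels(gray):
    """Return a boolean mask over a grayscale image:
    True  -> value below the boundary, first feature category
    False -> value not below the boundary, second feature category"""
    return np.asarray(gray) < BOUNDARY

gray = np.array([[10, 200],
                 [69, 70]])
print(classify_pixels(gray))
```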
As another example, when the first image sample is not a grayscale image, the pixel points in the first image sample may be classified into the first feature category or the second feature category by performing graying and binarization processing on the first image sample.
S203, according to the position distribution of the pixel points, respectively performing information superposition on the pixel points at the same position in the at least two first image samples to obtain a superimposed second image sample.
If the pixel points at the same position in the at least two first image samples all belong to the first feature category, the pixel point at that position in the second image sample belongs to the first feature category. If the pixel points at the same position in the at least two first image samples do not all belong to the first feature category, the pixel point at that position in the second image sample belongs to the second feature category.
If, among the pixel points at the same position in the at least two first image samples, some belong to the first feature category and some to the second feature category, the pixel point at that position in the superimposed second image sample belongs to the second feature category. For example, the pixel point information of the pixel point at that position in the second image sample may be an identification value corresponding to the second feature category, or the feature category to which its pixel point information belongs may be the second feature category.
For the second image sample to be superimposed in this embodiment, reference may be made to the related description of the foregoing embodiments, and details are not repeated here.
S204, if the pixel points belonging to the first feature category and the pixel points belonging to the second feature category in the second image sample satisfy the condition, determining that the at least two first image samples belong to the vestige image.
For example, in one possible case, a first number of pixels in the second image sample belonging to the first feature class and a second number of pixels in the second image sample belonging to the second feature class are determined, and if the first number and the second number satisfy a condition, it is determined that the at least two image samples belong to the remnant image.
The first number and the second number may satisfy the condition when the ratio of the first number to the second number is greater than a set ratio. As the foregoing research found, if the number of pixel points belonging to the first feature category in the second image sample is far greater than the number belonging to the second feature category, the distributions of the pixel points belonging to the first feature category in the at least two first image samples are very similar. This matches the characteristic that remnant images generated for the same sample object within the same time period are nearly identical, so it can be concluded that the at least two image samples belong to the remnant image.
Similarly, the first number and the second number may satisfy the condition that a difference between the first number and the second number is greater than a set difference.
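The count-ratio variant of this condition can be sketched as follows; SET_RATIO is a hypothetical value chosen for illustration, since the patent only requires that the ratio exceed some set ratio:

```python
import numpy as np

SET_RATIO = 3.0  # hypothetical set ratio

def is_remnant_by_count_ratio(second_sample_mask):
    """second_sample_mask: boolean array, True = first feature category.
    Returns True when first-category pixels outnumber second-category
    pixels by more than SET_RATIO."""
    m = np.asarray(second_sample_mask)
    first = int(m.sum())        # first number
    second = m.size - first     # second number
    if second == 0:
        return True  # every pixel is first-category
    return first / second > SET_RATIO

mask = np.array([[True] * 9 + [False]]).reshape(2, 5)
print(is_remnant_by_count_ratio(mask))  # 9 / 1 > 3.0
```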
In another possible case, when the value of each pixel point in the second image sample directly represents the feature category of the pixel point, the sum of the values of all pixel points in the second image sample reflects the proportions of first-feature-category and second-feature-category pixel points. Whether the condition is satisfied can therefore be judged from the specific values representing the first and second feature categories together with the relationship between the sum and a set value. For example, if the value representing the first feature category is the smaller value, the condition may be that the sum is smaller than the set value.
Similarly, when the value of each pixel point in the second image sample directly represents its feature category, the sum of the values of all pixel points in the second image sample can be determined, and the condition may be that this sum and the number of rows and columns of pixel points in the second image sample conform to a set functional relationship. For example, if the ratio between the sum and the product of the row and column numbers is greater than a set threshold, the condition is determined to be satisfied, and it follows that the at least two first image samples belong to the remnant image.
In this embodiment, the pixel points in the first image sample are divided into the first feature category and the second feature category. After the second image sample is obtained by superimposing the pixel points of the at least two first image samples, the feature category of each pixel point in the second image sample is considered comprehensively, and whether the at least two first image samples belong to the remnant image is analyzed according to both the pixel points belonging to the first feature category and those belonging to the second feature category, so that remnant images can be identified more accurately and reliably.
The method for determining the remnant of the present application will be described below by taking a case of determining the feature type of the first image sample as an example.
As shown in fig. 3, which shows a schematic flowchart of another embodiment of the method for determining a remnant, the method of this embodiment may include:
S301, at least two first image samples to be analyzed are obtained, and the at least two first image samples are images of the same sample object.
S302, performing graying and binarization processing on the first image sample.
In the processed first image sample, the feature category corresponding to the pixel point with the value of 0 is a first feature category; the feature class corresponding to the pixel point with the value of 255 is the second feature class.
It is understood that graying an image is the process of converting a color image with brightness and color into a grayscale image. After graying, each pixel in the first image sample needs only one byte to store its gray value (also called intensity value or brightness value), which ranges from 0 to 255. The first image sample may be grayed in various ways, and the present application does not limit the method.
The image binarization processing refers to a process of setting the gray value of a pixel point on an image to be 0 or 255, namely, enabling the whole image to show an obvious black-and-white effect.
For example, an image is generally represented by an image matrix; the image matrix W of the grayed first image sample can be expressed as follows:
W = \begin{bmatrix} w_{11} & w_{12} & \cdots & w_{1m} \\ w_{21} & w_{22} & \cdots & w_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{nm} \end{bmatrix}
where w_{ij} represents any pixel point in the image matrix of the first image sample and takes a value from 0 to 255; i is a natural number from 1 to n, j is a natural number from 1 to m, n is the number of rows of pixel points in the image matrix of the first image sample, and m is the number of columns of pixel points in the image matrix of the first image sample.
Accordingly, the binarization of the grayed first image sample can be expressed by the following formula one:
w''_{ij} = \begin{cases} 0, & w_{ij} < \sigma \\ 255, & w_{ij} \geq \sigma \end{cases}
where σ is a set binarization threshold whose value can be chosen as needed; for example, σ may be 70.
According to formula one, for any pixel point w_{ij} in the first image sample, its binarized counterpart is denoted w''_{ij}, whose value is 0 or 255: when w_{ij} is less than the set binarization threshold σ, w''_{ij} is 0; otherwise, w''_{ij} is 255.
In the binarized first image sample, a pixel value of 0 means the pixel point is black, i.e., it carries data information of the object sample, such as the texture of the object sample; a pixel value of 255 means the pixel point is white, i.e., it carries no such data information and belongs to the background outside the object sample. Therefore, through the graying and binarization of the first image sample, each pixel point can be assigned a feature category, and the value of a pixel point in the processed first image sample directly represents the feature category to which it belongs: a value of 0 represents the first feature category, and a value of 255 represents the second feature category.
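The graying and binarization steps can be sketched as follows. The luminance weights are one common graying convention — the text leaves the graying method open — and σ = 70 follows the example above:

```python
import numpy as np

SIGMA = 70  # binarization threshold, per the sigma example in the text

def to_gray(rgb):
    """One common luminance-weighted graying; the patent does not
    prescribe a particular graying method."""
    return np.asarray(rgb, dtype=float) @ np.array([0.299, 0.587, 0.114])

def binarize(gray, sigma=SIGMA):
    """Formula one: 0 (first feature category) below sigma, else 255."""
    return np.where(np.asarray(gray) < sigma, 0, 255)

rgb = np.array([[[0, 0, 0], [255, 255, 255]]])  # one black, one white pixel
print(binarize(to_gray(rgb)))
```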
S303, according to the position distribution of the pixel points, respectively superposing the values of the pixel points at the same position in the at least two first image samples to obtain a superimposed second image sample.
Because the values of the pixel points in the grayed and binarized first image samples also identify the feature categories of those pixel points, superimposing the values of the pixel points at the same position in the at least two first image samples can be regarded both as superposition of pixel point information and as superposition of the feature categories of the pixel points.
Accordingly, combining the superposition rules of the aforementioned information superposition: if the values of the pixel points at the same position in the at least two first image samples are all 0, the value of the pixel point at that position in the second image sample is 0; if they are not all 0, the value of the pixel point at that position in the second image sample is 255.
It can be seen that the superimposed second image sample is also a binarized image in practice.
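Because the samples are binarized to 0/255, the superposition rule above — 0 only where every sample is 0, otherwise 255 — reduces to an element-wise maximum, for example:

```python
import numpy as np

def superimpose_binary(samples):
    """Superpose binarized (0/255) samples: a position stays 0 only if
    it is 0 in every sample, which is an element-wise maximum."""
    return np.maximum.reduce([np.asarray(s) for s in samples])

s1 = np.array([[0, 0], [255, 0]])
s2 = np.array([[0, 255], [255, 0]])
print(superimpose_binary([s1, s2]))
```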
S304, determining the added value obtained by adding the values of all the pixel points in the second image sample, and determining the product between the row number of the pixel points in the second image sample and the column number of the pixel points in the second image sample.
If the second image sample has 100 pixels, the sum is the sum of the values of the 100 pixels.
Since the number of rows and columns of pixels in the second image sample is the same as the number of rows and columns of pixels in the first image sample, if the image matrix of the first image sample is n rows and m columns, then the second image sample has n rows and m columns of pixels, and the product is n × m.
S305, if the ratio between the summation value and the product is larger than a set threshold value, determining that at least two first image samples belong to the vestige image.
The set threshold may be chosen as required. If the ratio between the sum and the product is greater than the set threshold, the pixel points of the first feature category in the second image sample satisfy the set proportion, indicating that the distribution characteristics of the pixel points belonging to the first feature category in the at least two first image samples are similar. This conforms to the characteristic that a remnant image is almost unchanged within a period of time, so it may be determined that the at least two first image samples belong to the remnant image.
For example, the condition that the ratio between the sum and the product is greater than the set threshold can be expressed by the following formula two:
\frac{\sum_{i=1}^{n}\sum_{j=1}^{m} w''_{ij}}{n \times m} > \theta

where w''_{ij} represents any pixel point in the second image sample, n is the number of rows of pixel points in the second image sample, m is the number of columns of pixel points in the second image sample, and θ is a preset value.
Of course, formula two is only intended to aid understanding of steps S304 and S305; in practical applications there may be other cases, and the application is not limited in this respect.
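Formula two as stated can be sketched numerically like this; THETA is a hypothetical placeholder, since the patent leaves the preset value open:

```python
import numpy as np

THETA = 200.0  # hypothetical preset value for formula two

def formula_two_holds(second_sample, theta=THETA):
    """Compare the sum of all pixel values in the (binarized) second
    image sample, divided by rows * columns, against theta."""
    s = np.asarray(second_sample, dtype=float)
    n, m = s.shape
    return s.sum() / (n * m) > theta

print(formula_two_holds(np.full((4, 4), 255)))  # mean value 255, exceeds 200
print(formula_two_holds(np.zeros((4, 4))))      # mean value 0, does not
```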
It should be noted that steps S304 and S305 take one example of the condition that the pixel points belonging to the first feature category and those belonging to the second feature category in the second image sample need to satisfy; the other cases mentioned above also apply to the present embodiment.
In order to facilitate a clearer understanding of the scheme of the present application, the following describes the scheme of the present application by taking a case of determining a fingerprint remnant as an example.
It can be understood that if a fingerprint remnant of a user remains on the screen of an electronic device, another person who acquires that remnant can still use it to input the fingerprint into the electronic device and pass identity authentication. Therefore, fingerprint remnant detection can be performed on fingerprint images continuously input within a short time. Meanwhile, considering that the electronic device continuously updates its fingerprint library according to input fingerprints, the fingerprint templates stored in the fingerprint library over a period of time may all be fingerprint remnants, so remnant detection can also be performed on the fingerprint images in the fingerprint library.
Fig. 4 is a schematic flowchart of a method for determining a fingerprint remnant according to another embodiment of the present application, described by taking fingerprint remnant identification as an example. The method of this embodiment may include:
S401, at least two first fingerprint images to be analyzed are obtained, and the at least two first fingerprint images are fingerprint images of the same finger of the same user.
For example, in one possible case, at least two first fingerprint images to be analyzed may be acquired from the fingerprint library; in this case, they are at least two fingerprint images of the same finger of the user to be analyzed, generated within a continuous time period.
For another example, the at least two first fingerprint images to be analyzed may also be at least two cached fingerprint images input by the same finger of the same user in a continuous time period of a set duration, and so on.
S402, performing graying and binarization processing on each first fingerprint image.
The feature category corresponding to the pixel point with the value of 0 in the processed first fingerprint image is a first feature category, and the feature category corresponding to the pixel point with the value of 255 is a second feature category.
S403, according to the position distribution of the pixel points, respectively superposing the values of the pixel points at the same position in the at least two first fingerprint images to obtain a superimposed second fingerprint image.
If the values of the pixel points at the same position in the at least two first fingerprint images are all 0, the value of the pixel point at that position in the second fingerprint image is 0; if they are not all 0, the value of the pixel point at that position in the second fingerprint image is 255.
S404, determining an addition value obtained by adding values of all pixel points in the second fingerprint image, and determining a product between the line number of the pixel points in the second fingerprint image and the column number of the pixel points in the second fingerprint image.
S405, if the ratio of the summation value to the product is larger than a set threshold, determining that at least two first fingerprint images belong to fingerprint remnant images.
In this embodiment, the above steps S402 to S405 can refer to the related description of the embodiment in fig. 3, and the same parts are not described again.
To facilitate understanding of the benefits of the embodiment of fig. 4, consider the scenario of detecting fingerprint remnants in the fingerprint library. If a fingerprint remnant of a user's finger is obtained by another person, that person can only use the remnant to simulate the user's fingerprint verification behavior. Meanwhile, the fingerprint library of the electronic device continuously corrects its fingerprint images according to the fingerprint images input by users, so the library may also store fingerprint image templates generated from fingerprint remnants, which makes it easier for others to use the remnant to impersonate the user in fingerprint verification on the electronic device. Since a fingerprint remnant acquired by another person does not change, if verification is repeatedly performed with the same remnant within a short time, the fingerprint image templates generated in the fingerprint library are all based on that same remnant. Therefore, the fingerprint images in the library that derive from a fingerprint remnant are invariant over a short time.
Based on this, the scheme of the present application can acquire, from the fingerprint library, the fingerprint images generated by the same finger of the same user within a continuous time period, and superimpose the at least two acquired fingerprint images. If the superimposed second fingerprint image contains many pixel points with value 0, the fingerprint ridge similarity of the at least two fingerprint images is very high, and they belong to fingerprint images stored on the basis of a fingerprint remnant.
Conversely, if the at least two fingerprint images acquired from the fingerprint library are not all fingerprint remnants, the input angle and contact area of each fingerprint will differ. In that case, after at least two fingerprint images generated by the same finger of the same user within a continuous time period are obtained from the fingerprint library, the ridge patterns vary considerably across the images; when superimposing by pixel position, positions where all values are 0 are relatively rare, so the superimposed second fingerprint image contains few pixel points with value 0. On this basis, it can be judged that the at least two fingerprint images are unlikely to all be fingerprint remnants and belong to normally stored fingerprint images.
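Under the same illustrative assumptions (grayscale inputs, hypothetical SIGMA and THETA values), steps S402 to S405 can be chained into one sketch:

```python
import numpy as np

SIGMA, THETA = 70, 200.0  # illustrative thresholds, not fixed by the patent

def is_fingerprint_remnant(gray_images):
    """Apply steps S402-S405 to grayscale fingerprint images of one finger."""
    binarized = [np.where(np.asarray(g) < SIGMA, 0, 255)
                 for g in gray_images]                  # S402: binarize
    second = np.maximum.reduce(binarized)               # S403: superpose
    n, m = second.shape                                 # S404: rows, columns
    return second.sum() / (n * m) > THETA               # S405: ratio test

imgs = [np.array([[10, 240], [15, 250]]),
        np.array([[12, 235], [20, 245]])]
print(is_fingerprint_remnant(imgs))
```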
It is to be understood that, in any of the above embodiments of the present application, if the at least two acquired first image samples belong to a sample library, they may be deleted from the sample library after being determined to be remnant images. For example, after the obtained at least two fingerprint images are determined to be fingerprint remnants, they may be deleted from the fingerprint library.
The application also provides a vestige determining device corresponding to the vestige determining method. Fig. 5 is a schematic diagram illustrating a composition structure of an embodiment of the vestige determination apparatus according to the present application. The apparatus may include:
an image obtaining unit 501, configured to obtain at least two first image samples to be analyzed, where the at least two first image samples are images of a same sample object;
the pixel overlapping unit 502 is configured to overlap information of pixel points at the same position in the at least two first image samples according to the position distribution of the pixel points, to obtain a second image sample after overlapping, where the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, and then the feature category to which the pixel points at the same position in the second image sample belong is the first feature category;
and a vestige determination unit 503, configured to determine that the at least two first image samples belong to a remnant image if the pixel points belonging to the first feature category in the second image sample satisfy a condition.
In one possible case, the apparatus may further include:
the category determining unit is used for determining the feature category to which each pixel point in the first image sample belongs;
the pixel superposition unit is specifically configured to superimpose the feature categories of the pixel points at the same position in the at least two first image samples according to the position distribution of the pixel points and the feature categories to which the pixel points in the first image samples belong, to obtain the superimposed second image sample; the information superposition is category superposition of the feature categories of the pixel points, and the pixel point information of the pixel point at any position in the second image sample is information of the superimposed feature category.
As an optional mode, the feature categories to which the pixel points belong include a first feature category and a second feature category;
in the process of superposing the second image sample, if all the pixel points at the same position in at least two first image samples do not belong to the first feature category, the feature category of the pixel point at the position in the second image sample is the second feature category;
the vestige determination unit determines that pixel points belonging to the first feature category in the second image sample satisfy a condition, specifically: and pixel points belonging to the first characteristic category and pixel points belonging to the second characteristic category in the second image sample meet the condition.
Optionally, the determining, by the remnant determining unit, that the pixel point belonging to the first feature type and the pixel point belonging to the second feature type in the second image sample satisfy the condition specifically is:
the ratio of the number of pixel points belonging to the first feature category to the number of pixel points belonging to the second feature category in the second image sample is greater than a set ratio.
In one possible case, the apparatus further comprises:
the image processing unit is used for carrying out gray scale and binarization processing on the first image sample, wherein in the processed first image sample, the characteristic category corresponding to the pixel point with the value of 0 is a first characteristic category; the feature class corresponding to the pixel point with the value of 255 is the second feature class.
Optionally, the pixel overlapping unit includes:
the pixel superposition subunit is configured to superpose, according to the position distribution of the pixel points, the values of the pixel points at the same position in the at least two first image samples, respectively, to obtain a superposed second image sample, where the values of the pixel points at the same position in the at least two first image samples are both 0, and then the value of the pixel point at the position in the second image sample is 0; and if the values of the pixel points at the same position in the at least two first image samples are not all 0, the value of the pixel point at the position in the second image sample is 255.
Optionally, the trace determining unit includes:
the first calculating subunit is configured to determine an added value obtained by adding values of all the pixel points in the second image sample, and determine a product between a row number of the pixel points in the second image sample and a column number of the pixel points in the second image sample;
and the remnant determining subunit is used for determining that the at least two first image samples belong to the remnant image if the ratio of the sum value to the product is greater than a set threshold.
In yet another aspect, the present application further provides an electronic device, which may be any computer device with image processing or data processing capabilities. For example, referring to fig. 6, a schematic diagram of a component structure of an electronic device of the present application is shown. The electronic device may include:
a processor 601 and a memory 602;
the processor 601 is configured to obtain at least two first image samples to be analyzed, where the at least two first image samples are images of a same sample object; perform information superposition on the pixel points at the same position in the at least two first image samples according to the position distribution of the pixel points, to obtain a superimposed second image sample, where, if the feature categories to which the pixel points at the same position in the at least two first image samples belong all belong to the first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category; and if pixel points belonging to the first feature category in the second image sample satisfy a condition, determine that the at least two first image samples belong to a remnant image;
the memory 602 is used for storing programs required by the processor to execute the above operations.
The specific operations executed by the processor may refer to the related descriptions in the foregoing method for determining a trace, and are not described herein again.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A remnant determination method, comprising:
obtaining at least two first image samples to be analyzed, wherein the at least two first image samples are images of the same sample object;
performing, according to the position distribution of pixel points, information superposition on the pixel points located at a same position in the at least two first image samples, to obtain a superposed second image sample, wherein, if the feature categories to which the pixel points at a same position in the at least two first image samples belong are all a first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category; and
determining that the at least two first image samples belong to a remnant image when the pixel points belonging to the first feature category in the second image sample satisfy a condition.
2. The method of claim 1, further comprising:
determining the feature category to which each pixel point in each first image sample belongs;
wherein the performing, according to the position distribution of pixel points, information superposition on the pixel points located at a same position in the at least two first image samples to obtain a superposed second image sample comprises:
superposing, according to the position distribution of the pixel points and the feature categories to which the pixel points in each first image sample belong, the feature categories of the pixel points located at a same position in the at least two first image samples, to obtain the superposed second image sample;
wherein the information superposition is superposition of the feature categories of the pixel points; and
the pixel point information of the pixel point at any position in the second image sample is information of the superposed feature category.
3. The method of claim 1, wherein the feature categories to which the pixel points belong comprise the first feature category and a second feature category;
if the feature categories to which the pixel points at a same position in the at least two first image samples belong are not all the first feature category, the feature category of the pixel point at that position in the second image sample is the second feature category; and
the pixel points belonging to the first feature category in the second image sample satisfying a condition comprises:
the pixel points belonging to the first feature category and the pixel points belonging to the second feature category in the second image sample satisfying a condition.
4. The method of claim 3, wherein the pixel points belonging to the first feature category and the pixel points belonging to the second feature category in the second image sample satisfying a condition comprises:
a ratio of the number of pixel points belonging to the first feature category to the number of pixel points belonging to the second feature category in the second image sample being greater than a set ratio.
5. The method according to claim 3, further comprising, before the performing information superposition on the pixel points located at a same position in the at least two first image samples according to the position distribution of pixel points to obtain a superposed second image sample:
performing grayscale conversion and binarization on each first image sample, wherein, in the processed first image sample, the feature category corresponding to a pixel point with a value of 0 is the first feature category, and the feature category corresponding to a pixel point with a value of 255 is the second feature category.
6. The method according to claim 5, wherein the performing information superposition on the pixel points located at a same position in the at least two first image samples according to the position distribution of pixel points to obtain a superposed second image sample comprises:
superposing, according to the position distribution of the pixel points, the values of the pixel points located at a same position in the at least two first image samples, to obtain the superposed second image sample, wherein, if the values of the pixel points at a same position in the at least two first image samples are all 0, the value of the pixel point at that position in the second image sample is 0; and if the values of the pixel points at a same position in the at least two first image samples are not all 0, the value of the pixel point at that position in the second image sample is 255.
7. The method of claim 4, wherein the pixel points belonging to the first feature category and the pixel points belonging to the second feature category in the second image sample satisfying a condition, and the determining that the at least two first image samples belong to a remnant image, comprise:
determining a sum of the values of all pixel points in the second image sample, and determining a product of the number of rows of pixel points and the number of columns of pixel points in the second image sample; and
determining that the at least two first image samples belong to a remnant image if a ratio of the sum to the product is greater than a set threshold.
8. The method according to claim 1, wherein the at least two first image samples are at least two first fingerprint images, the at least two first fingerprint images being fingerprint images of a same finger of a same user.
9. A remnant determination apparatus, comprising:
an image obtaining unit, configured to obtain at least two first image samples to be analyzed, wherein the at least two first image samples are images of a same sample object;
a pixel superposition unit, configured to perform, according to the position distribution of pixel points, information superposition on the pixel points located at a same position in the at least two first image samples, to obtain a superposed second image sample, wherein, if the feature categories to which the pixel points at a same position in the at least two first image samples belong are all a first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category; and
a remnant determining unit, configured to determine that the at least two first image samples belong to a remnant image when the pixel points belonging to the first feature category in the second image sample satisfy a condition.
10. An electronic device, comprising:
a processor and a memory;
the processor is configured to obtain at least two first image samples to be analyzed, wherein the at least two first image samples are images of a same sample object; perform, according to the position distribution of pixel points, information superposition on the pixel points located at a same position in the at least two first image samples, to obtain a superposed second image sample, wherein, if the feature categories to which the pixel points at a same position in the at least two first image samples belong are all a first feature category, the feature category to which the pixel point at that position in the second image sample belongs is the first feature category; and if the pixel points belonging to the first feature category in the second image sample satisfy a condition, determine that the at least two first image samples belong to a remnant image; and
the memory is configured to store programs required by the processor to execute the above operations.
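The decision rule recited in claims 4 and 7 can be sketched as follows, assuming the superposed second image sample is a 0/255 binary matrix as in claims 5 and 6; the function name `is_remnant_image` and the threshold value used here are illustrative assumptions only.

```python
def is_remnant_image(second_sample, threshold):
    """Claim 7 as written: sum the values of all pixel points in the
    superposed second image sample, divide by the product of the number
    of rows and the number of columns, and compare with a set threshold.
    """
    rows = len(second_sample)
    cols = len(second_sample[0])
    total = sum(sum(row) for row in second_sample)
    return total / (rows * cols) > threshold

sample = [[0, 255], [255, 255]]       # three of four pixels are 255
print(is_remnant_image(sample, 128))  # 765 / 4 = 191.25 > 128 -> True
```

Since every pixel is either 0 or 255, the ratio ranges from 0 to 255 and effectively measures the proportion of second-feature-category pixels in the superposed sample.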
CN202010250328.4A 2020-04-01 2020-04-01 Remnant determining method and device and electronic equipment Pending CN111461016A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010250328.4A CN111461016A (en) 2020-04-01 2020-04-01 Remnant determining method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN111461016A true CN111461016A (en) 2020-07-28

Family

ID=71680492

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010250328.4A Pending CN111461016A (en) 2020-04-01 2020-04-01 Remnant determining method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111461016A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1573796A (en) * 2003-06-13 2005-02-02 索尼株式会社 Image verification system and image verification method
CN105205439A (en) * 2015-02-13 2015-12-30 比亚迪股份有限公司 Method for calculating area of fingerprint overlapping region and electronic device
US20160350580A1 (en) * 2014-02-14 2016-12-01 Crucialtec Co., Ltd. Electronic device comprising minimum sensing area and fingerprint information processing method thereof
CN106485125A (en) * 2016-10-21 2017-03-08 上海与德信息技术有限公司 A kind of fingerprint identification method and device
US20180107858A1 (en) * 2014-08-26 2018-04-19 Gingy Technology Inc. Fingerprint identification method and fingerprint identification device
CN108121946A (en) * 2017-11-15 2018-06-05 大唐微电子技术有限公司 A kind of Pre-processing Method for Fingerprint Image and device
CN110572636A (en) * 2019-07-23 2019-12-13 RealMe重庆移动通信有限公司 camera contamination detection method and device, storage medium and electronic equipment


Similar Documents

Publication Publication Date Title
Scherhag et al. Detection of face morphing attacks based on PRNU analysis
US11830230B2 (en) Living body detection method based on facial recognition, and electronic device and storage medium
JP7386545B2 (en) Method for identifying objects in images and mobile device for implementing the method
CN108229297B (en) Face recognition method and device, electronic equipment and computer storage medium
Mahmood et al. Copy‐move forgery detection technique for forensic analysis in digital images
KR101896357B1 (en) Method, device and program for detecting an object
Türkyılmaz et al. License plate recognition system using artificial neural networks
CN107886082B (en) Method and device for detecting mathematical formulas in images, computer equipment and storage medium
CN110008997B (en) Image texture similarity recognition method, device and computer readable storage medium
CN110852311A (en) Three-dimensional human hand key point positioning method and device
McBride et al. A comparison of skin detection algorithms for hand gesture recognition
CN113011144A (en) Form information acquisition method and device and server
CN112396050B (en) Image processing method, device and storage medium
CN109389110B (en) Region determination method and device
US20230147685A1 (en) Generalized anomaly detection
CN112101386A (en) Text detection method and device, computer equipment and storage medium
CN110210467B (en) Formula positioning method of text image, image processing device and storage medium
CN110533704B (en) Method, device, equipment and medium for identifying and verifying ink label
CN114155363A (en) Converter station vehicle identification method and device, computer equipment and storage medium
Shu et al. Face spoofing detection based on multi-scale color inversion dual-stream convolutional neural network
CN114241463A (en) Signature verification method and device, computer equipment and storage medium
Joshi et al. First steps toward CNN based source classification of document images shared over messaging app
Belhedi et al. Adaptive scene‐text binarisation on images captured by smartphones
JP2006323779A (en) Image processing method and device
US20230069960A1 (en) Generalized anomaly detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination