CN111680549A - Paper pattern recognition method - Google Patents

Paper pattern recognition method

Info

Publication number
CN111680549A
CN111680549A (application CN202010348238.9A; granted as CN111680549B)
Authority
CN
China
Prior art keywords
pattern image
paper pattern
paper
identified
image
Prior art date
Legal status
Granted
Application number
CN202010348238.9A
Other languages
Chinese (zh)
Other versions
CN111680549B (en)
Inventor
陈端
曾绍群
胡庆磊
黄凯
李宁
李梦婷
李培
Current Assignee
Xiaophoton Wuhan Technology Co ltd
Huazhong University of Science and Technology
Original Assignee
Convergence Wuhan Technology Co ltd
Huazhong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Convergence Wuhan Technology Co ltd and Huazhong University of Science and Technology
Priority to CN202010348238.9A
Publication of CN111680549A
Application granted
Publication of CN111680549B
Legal status: Active


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/80 - Recognising image objects characterised by unique random patterns
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 - Salient features, e.g. scale invariant feature transforms [SIFT]
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Inspection Of Paper Currency And Valuable Securities (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a paper pattern recognition method comprising the following steps: S1, capturing micron-resolution microscopic images of the internal fibers of a reference paper document and of a paper document to be identified under a transmitted light source, and using them respectively as a reference paper pattern image and a paper pattern image to be identified; S2, extracting and matching feature points of the reference paper pattern image and the paper pattern image to be identified to generate feature point matching pairs; S3, estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be identified from the feature point matching pairs, and obtaining the regions of interest of the two images; S4, enhancing the fiber texture of each region of interest; and S5, measuring similarity from the enhanced texture structures and outputting a recognition result. The method can withstand translation, rotation and scaling of the paper pattern image caused by illumination changes or by human and equipment operating deviations during acquisition, and also tolerates abnormal conditions such as dirt on the paper surface.

Description

Paper pattern recognition method
Technical Field
The invention relates to the field of article identification, and in particular to a paper pattern recognition method based on the microscopic texture structure of paper.
Background
With the rapid development of computer hardware and computer vision, image-processing-based techniques have attracted wide attention in the field of anti-counterfeit identification of articles. Important paper documents such as contracts, bills and admission tickets are difficult to protect because they lack distinctive marking features, and advances in printing technology and printing precision have greatly lowered both the technical threshold and the cost of forgery, making such documents a key target of counterfeiters.
Traditionally, important documents are authenticated and protected by signatures, seals, printed anti-counterfeit labels and the like. Although these methods are cheap and easy to implement, they are easily forged and attacked and offer little real anti-counterfeiting protection. Alternatively, special anti-counterfeit paper and inks can be used, or random fibers, random-bubble anti-counterfeit labels and similar features can be added to articles physically or chemically; however, these approaches have high cost and a high technical threshold, are difficult to popularize, and are generally used only for the packaging of expensive goods. In recent years, thanks to the rapid development of image acquisition, image processing and computer technology, identification methods based on the naturally unclonable features of the paper itself have become a topic of intense interest.
in particular, patent CN 102073865 a proposes an anti-counterfeit method using fiber texture of paper itself, which can realize anti-counterfeit recognition of paper without additional technical processing on paper, but the method judges whether paper is true or false based on the pattern recognition result of extracted texture feature points, and the extracted feature points depend on the position of paper at each acquisition, the relative position of acquisition equipment and paper, and the optical magnification of acquisition equipment; therefore, in order to ensure the accuracy of the recognition result, the acquisition conditions are required to be highly consistent when the image is acquired, and the translation, rotation and scaling of the paper cannot be resisted well, and in addition, when pollutants such as handwriting, water stain and the like appear on the surface of the paper, the extracted feature points can also be changed, so that the recognition result of the paper is inaccurate;
compared with the above patent CN 102073865 a, the anti-counterfeiting method using physical characteristic recognition of a substance disclosed in patent CN 102955930 a is characterized in that the substance is light-permeable, and mode recognition is performed by acquiring an optically acquired physical characteristic image after transmission; the purpose of adopting the transmitted light is to realize the image acquisition of the physical characteristics of the image at low cost, and the problem of the method still exists in the patent CN 102073865A;
compared with the patent CN 102073865 a, the paper pattern recognition method disclosed in patent CN 110599665 a obtains the decontamination feature region image of the paper pattern to be verified through the Yolo-v2 model before extracting the feature points of the paper pattern image, and because the final recognition of the recognition method is still based on the feature vectors of the feature points of the collected paper pattern image, the recognition result is affected by the location, area size, location distribution of the remaining region after removing contaminants, and distribution of the feature points of the remaining region after removing contaminants;
in summary, the existing paper pattern identification method based on the natural unclonable feature of the paper pattern has two obvious limitations. Firstly, the paper pattern recognition method cannot well resist translation, rotation, scaling and the like of a paper pattern image caused by human and equipment operation deviation during paper pattern collection; secondly, the paper pattern recognition method cannot resist the dirt on the surface of the paper, and the recognition result is greatly influenced by the dirt.
Disclosure of Invention
In view of the deficiencies of the prior art, the invention provides a new paper pattern recognition method that combines a paper pattern acquisition scheme with a paper pattern recognition algorithm.
The technical solution adopted by the invention to solve the technical problem is to construct a paper pattern recognition method comprising the following steps:
s1, shooting internal fiber micron precision microscopic images of a reference paper file and a paper file to be identified under the condition of a transmission light source, and respectively taking the internal fiber micron precision microscopic images as a reference paper pattern image and a paper pattern image to be identified;
s2, extracting the characteristic points of the reference paper pattern image and the paper pattern image to be identified for matching, and generating a characteristic point matching pair;
s3, estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be identified according to the feature point matching pair, and respectively obtaining the interested areas of the reference paper pattern image and the paper pattern image to be identified;
s4, respectively enhancing the fiber textures of the regions of interest;
and S5, measuring the similarity according to the enhanced texture structure and outputting an identification result.
Preferably, in the paper pattern recognition method of the present invention, in step S1,
when the reference paper pattern image is acquired, the coordinate position, relative to the center point of the reference paper document, of the point on the document corresponding to the center of the acquired reference paper pattern image is recorded;
and when the paper pattern image to be identified is acquired, it is acquired from the corresponding area according to the recorded coordinate position.
Preferably, in the paper pattern recognition method of the present invention, when the paper pattern image to be identified is acquired, the recorded coordinate position lies within the field of view of the microscopic image.
Preferably, in the paper pattern recognition method of the present invention, the microscopic images of the reference paper document and the paper document to be identified are captured with accurate focusing.
Preferably, in the paper print identification method according to the present invention, the step S2 includes:
s2-1, carrying out image preprocessing on the reference paper pattern image and the paper pattern image to be identified;
and S2-2, extracting texture feature points of the preprocessed reference paper pattern image and the paper pattern image to be recognized, calculating feature vectors of the feature points, and matching the feature points according to the feature vectors to generate feature point matching pairs.
Preferably, in the paper print identification method of the present invention, the matching feature points according to feature vectors includes:
and matching the feature points according to the similarity of the feature vectors corresponding to the extracted feature points.
Preferably, in the paper print identification method of the present invention, the matching feature points according to the similarity of the feature vectors corresponding to the extracted feature points includes:
extracting feature points from the reference paper pattern image and from the paper pattern image to be identified, computing the similarity between the feature vectors of the feature points of the two images, selecting a threshold, and generating a feature point matching pair whenever the computed similarity is greater than the threshold.
Preferably, in the paper print identification method of the present invention, the feature points are SURF or SIFT feature points.
Preferably, in the paper print identification method according to the present invention, the step S3 includes:
s3-1, rejecting wrong feature point matching pairs in the feature point matching pairs obtained in the step S2, and estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be identified according to the remaining effective feature point matching pairs;
s3-2, transforming the paper pattern image to be recognized by using the transformation matrix;
and S3-3, overlaying the transformed paper pattern image to be identified on the reference paper pattern image, and cropping the inscribed rectangle of the overlapped part from each of the two paper pattern images as its region of interest.
Preferably, in the paper print recognition method according to the present invention,
the step S3-1 includes: rejecting wrong feature point matching pairs among the matching pairs obtained in step S2 using the MSAC algorithm, and, taking the reference paper pattern image as the standard, estimating an affine transformation matrix from the paper pattern image to be identified to the reference paper pattern image from the remaining valid matching pairs;
the step S3-2 includes: applying the affine transformation matrix to the paper pattern image to be identified;
the step S3-3 includes: overlaying the transformed paper pattern images, selecting a threshold S, and, if the ratio of the area of the overlapped part after transformation to the total area of the reference paper pattern image or of the paper pattern image to be identified is larger than S, cropping the maximum inscribed rectangle of the overlapped part from each of the two transformed images as its region of interest.
Preferably, in the paper print identification method according to the present invention, the step S3 further includes:
if the number of valid feature point matching pairs in step S3-1 is too small to estimate the transformation matrix, or the area of the region of interest is too small, the reference paper pattern image and the paper pattern image to be identified as preprocessed in step S2-1 are used as the regions of interest.
Preferably, in the paper print identification method according to the present invention, the step S4 includes:
enhancing the fiber texture of each of the two regions of interest with Gabor filters at multiple angles, where the Gabor filter at each angle simultaneously produces an amplitude response matrix and a phase-angle response matrix; the amplitude response matrices and phase-angle response matrices of the multiple angles are then superposed to produce one final amplitude response matrix and one final phase-angle response matrix.
Preferably, in the paper pattern recognition method of the present invention, before the step S4, the method further includes:
adjusting the regions of interest of the reference paper pattern image and the paper pattern image to be identified to the same size according to preset image size parameters.
Preferably, in the paper print identification method according to the present invention, the step S5 includes:
mapping the enhanced texture structures of the regions of interest of the paper pattern image to be identified and of the reference paper pattern image into binary (0/1) digital space, generating a similarity index using the Hamming distance, and measuring the similarity of the bit streams generated from the reference paper pattern image and the paper pattern image to be identified.
Preferably, in the paper pattern recognition method of the present invention, mapping the enhanced texture structures into binary (0/1) digital space, generating a similarity index using the Hamming distance, and measuring the similarity of the bit streams generated from the reference paper pattern image and the paper pattern image to be identified includes:
computing the mean of the final amplitude response matrix and of the final phase-angle response matrix of the region of interest of the reference paper pattern image and of the paper pattern image to be identified, comparing each element of a response matrix with its mean, assigning 1 where the element is larger than the mean and 0 where it is smaller, unrolling the two binary matrices by rows or by columns, and concatenating the resulting bit streams to form the digital paper pattern of the reference paper pattern image and that of the paper pattern image to be identified;
computing the Hamming distance between the digital paper patterns of the reference paper pattern image and the paper pattern image to be identified, and taking the ratio of the Hamming distance to the total length of the digital paper pattern as the similarity index between the reference paper pattern and the paper pattern to be identified;
and selecting a threshold t: if the similarity index is greater than t, recognition fails, and if it is less than t, recognition succeeds.
The invention provides a paper pattern recognition method based on the micron-resolution microstructure of the fibers inside paper, which recognizes paper patterns mainly from the randomly interwoven texture of the microscopic fibers. Compared with the prior art, the method has at least the following beneficial effects: (1) the microscopic, transmitted-illumination acquisition scheme directly captures a micron-resolution image of the fiber structure from the paper surface down to 50 micrometers below it, and this fiber texture is stable and insensitive to illumination changes and surface dirt; (2) because recognition uses micron-resolution fiber texture features, the paper pattern calibration method that estimates the transformation matrix from feature points is stable and can withstand translation, rotation and scaling of the paper pattern image caused by illumination changes or by human and equipment operating deviations during acquisition; (3) because micron-resolution fiber texture features are not easily lost when the paper surface is dirty, the recognition algorithm can still calibrate and recognize paper patterns in the presence of contaminants, so the method tolerates adverse conditions such as dirt on the paper surface.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flow chart of a paper print identification method of the present invention;
FIG. 2 is a flow chart of obtaining a reference paper print image and a region of interest of a paper print image to be identified according to the present invention;
FIG. 3 is a schematic diagram of an example of a region of interest for acquiring a reference paper print image and a paper print image to be recognized according to the present invention;
fig. 4 is a schematic diagram of an example of enhancing fiber texture of a region of interest using a multi-angle Gabor filter.
Detailed Description
For a more clear understanding of the technical features, objects and effects of the present invention, embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
As shown in fig. 1, the invention constructs a paper pattern recognition method based on the microscopic fiber texture inside paper. The method is easy to implement, requires no additional printing operations, withstands paper translation, rotation and image scaling during acquisition, and remains accurate when various contaminants appear on the paper surface. The method comprises the following steps:
step S1: and shooting microscopic images of micron precision of internal fibers of the reference paper file and the paper file to be identified under the condition of a transmission light source, and taking the microscopic images as a reference paper pattern image and a paper pattern image to be identified respectively.
Specifically, in step S1, the reference paper pattern image needs to be collected and stored in advance for the next paper pattern recognition. The paper pattern image to be identified is used for identifying and matching with the reference paper pattern image, and when the paper pattern image to be identified and the reference paper pattern image are collected from the same area of the same piece of paper, the paper pattern matching is successful; when the paper pattern image to be identified and the reference paper pattern image are collected from different areas of the same piece of paper or different pieces of paper, the paper pattern matching fails.
Here, "the same area of the same piece of paper" means that the acquired reference paper pattern image and the paper pattern image to be identified share a common area on the same paper document; the two acquired regions are not required to coincide exactly, i.e. translation and rotation caused by manual acquisition are allowed when acquiring the paper pattern images.
Step S1 specifically includes: when the reference paper pattern image is acquired, recording the coordinate position, relative to the center point of the reference paper document, of the point on the document corresponding to the center of the acquired image; and when the paper pattern image to be identified is acquired, acquiring it from the corresponding area according to the recorded coordinate position.
When the paper pattern image to be identified is acquired, it suffices that the recorded coordinate position relative to the center point of the reference paper document lies within the field of view of the microscope.
In addition, when the microscopic images of the reference paper document and the paper document to be identified are acquired, only accurate focusing needs to be ensured at each acquisition; the magnification does not have to be highly consistent, and a mismatch in magnification between the reference paper pattern image and the paper pattern image to be identified is allowed. Therefore, in this embodiment, the microscopic images of the reference paper document and the paper document to be identified are captured with accurate focusing.
The microscopic, transmitted-illumination acquisition scheme of step S1 captures a micron-resolution image of the fibers inside the paper; its texture features are stable and insensitive to abnormal conditions such as illumination changes and contamination of the paper surface.
Step S2: and extracting the characteristic points of the reference paper pattern image and the paper pattern image to be identified for matching to generate a characteristic point matching pair.
Specifically, step S2 includes:
step S2-1: carrying out image preprocessing on the reference paper pattern image and the paper pattern image to be identified;
the purpose of image preprocessing is to simplify data of a reference paper pattern image and a paper pattern image to be identified, enable the reference paper pattern image and the paper pattern image to be identified to have the same size and data format, and effectively avoid identification errors caused by inconsistent image data formats. Specifically, the reference paper pattern image and the paper pattern image to be recognized are grayed and adjusted to the same size, and in this embodiment, the reference paper pattern image and the paper pattern image to be recognized are grayed and adjusted to 640 × 640.
Step S2-2: and extracting texture feature points of the preprocessed reference paper pattern image and the paper pattern image to be recognized, calculating feature vectors of the feature points, and matching the feature points according to the feature vectors to generate feature point matching pairs.
Feature points generally refer to points that are invariant to rotation, translation or affine transformation. A feature-matching approach effectively withstands translation, rotation and shear of the acquired images, and because it is insensitive to illumination changes, noise, viewpoint changes and the like, it improves the robustness of the algorithm.
In this embodiment, matching feature points according to feature vectors means matching them according to the similarity of the feature vectors of the extracted feature points. Specifically, feature points are extracted from the reference paper pattern image and from the paper pattern image to be identified, the similarity between the feature vectors of their feature points is computed, a threshold is selected, and a feature point matching pair is generated whenever the computed similarity exceeds the threshold. In some embodiments the feature points are SURF or SIFT feature points; in this embodiment they are SURF feature points, which effectively withstand translation, rotation, scaling and illumination changes during paper pattern image acquisition.
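A minimal sketch of step S2-2 is given below. The patent allows SURF or SIFT feature points; this sketch uses OpenCV's SIFT and a brute-force matcher with Lowe's ratio test as a practical stand-in for the descriptor-similarity threshold described above, so the exact matching rule, the function name and the ratio value are assumptions:

```python
import cv2

def match_feature_points(ref_img, qry_img, ratio=0.7):
    """Extract feature points from both images and build matching pairs."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(ref_img, None)   # key points + feature vectors
    kp_qry, des_qry = sift.detectAndCompute(qry_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    # Keep a pair only when the best match is clearly better than the second-best one,
    # a stand-in for the "similarity greater than a threshold" rule in the patent text.
    pairs = [m for m, n in matcher.knnMatch(des_qry, des_ref, k=2)
             if m.distance < ratio * n.distance]
    return kp_ref, kp_qry, pairs
```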
Unlike the prior art, the feature point matching of step S2 serves to estimate the transformation matrix in step S3 and to prepare for obtaining the regions of interest of the reference paper pattern image and the paper pattern image to be identified, rather than to perform paper pattern recognition directly from the feature vectors of the feature points.
Step S3: estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be identified from the feature point matching pairs, and obtaining the regions of interest of the reference paper pattern image and the paper pattern image to be identified;
specifically, step S3 includes:
step S3-1: rejecting wrong feature point matching pairs in the feature point matching pairs obtained in the step S2, and estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be recognized according to the remaining effective feature point matching pairs so that the corresponding effective feature point matching pairs between the reference paper pattern image and the paper pattern image to be recognized are consistent as much as possible in spatial position after transformation;
step S3-2: transforming the paper pattern image to be identified by using a transformation matrix;
step S3-3: and (4) stacking the converted paper pattern image to be identified and the reference paper pattern image, and respectively intercepting the inscribed rectangle of the overlapped part from the two paper pattern images to be used as an interested area so as to realize paper pattern calibration. The interested areas are two, one is of the reference paper pattern image, and the other is of the paper pattern image to be identified.
The purpose of rejecting matching pairs in step S3-1 is to reduce the influence of noise on feature point selection, to eliminate mismatched pairs caused by noise interference, and to ensure the accuracy of the transformation matrix estimated in step S3-1 between the paper pattern image to be identified and the reference paper pattern image.
When the reference paper pattern image and the paper pattern image to be identified are acquired from the same area of the same sheet, the transformation matrix of step S3-1 automatically compensates the translation, rotation and scaling of the paper pattern to be identified relative to the reference paper pattern.
For the regions of interest described in step S3-3, when the reference paper pattern image and the paper pattern image to be identified are acquired from the same area of the same sheet, the obtained regions of interest are the common region captured in the two acquisitions.
In this embodiment, preferably, step S3-1 includes: rejecting wrong feature point matching pairs among those obtained in step S2 using the MSAC algorithm, and, taking the reference paper pattern image as the standard, estimating an affine transformation matrix from the paper pattern image to be identified to the reference paper pattern image from the remaining valid matching pairs;
step S3-2 includes: applying the affine transformation to the paper pattern image to be identified, so that after transformation the corresponding valid feature point pairs of the two images coincide as closely as possible in spatial position;
step S3-3 includes: overlaying the transformed paper pattern image to be identified on the reference paper pattern image so that the two images share a sufficiently large common area, selecting a threshold S, and, if the ratio of the area of the overlapped part after transformation to the total area of the reference paper pattern image or of the paper pattern image to be identified is larger than S, cropping the maximum inscribed rectangle of the overlapped part from each of the two transformed images as its region of interest. In this embodiment, S is 20%.
In some embodiments, step S3 further includes: if the number of valid feature point matching pairs in step S3-1 is too small to estimate the transformation matrix, or the area of the region of interest is too small, using the reference paper pattern image and the paper pattern image to be identified as preprocessed in step S2-1 as the regions of interest.
Specifically, in this embodiment, since estimating an affine transformation matrix requires three pairs of feature points, if fewer than 3 valid matching pairs remain in step S3-1, which is not enough to estimate the affine transformation matrix, or if the ratio of the overlapped area between the transformed paper pattern to be identified and the reference paper pattern is less than 20%, the reference paper pattern image and the paper pattern image to be identified as preprocessed in step S2-1 are used as the regions of interest.
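The sketch below illustrates steps S3-1 to S3-3 under this embodiment. OpenCV exposes no MSAC estimator directly, so RANSAC-based `cv2.estimateAffine2D` is used as a stand-in, and the maximal inscribed rectangle of the overlap is approximated by the bounding box of the warped-image mask; both simplifications, and all names and defaults, are assumptions rather than the patented procedure:

```python
import cv2
import numpy as np

def regions_of_interest(ref_img, qry_img, kp_ref, kp_qry, pairs, s=0.20):
    """Estimate the affine transform (image to identify -> reference) and crop the ROIs."""
    if len(pairs) < 3:                         # an affine matrix needs at least 3 point pairs
        return ref_img, qry_img                # fall back to the preprocessed images
    src = np.float32([kp_qry[m.queryIdx].pt for m in pairs])   # points in the image to identify
    dst = np.float32([kp_ref[m.trainIdx].pt for m in pairs])   # matching points in the reference
    matrix, _inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)  # rejects bad pairs
    if matrix is None:
        return ref_img, qry_img
    h, w = ref_img.shape[:2]
    warped = cv2.warpAffine(qry_img, matrix, (w, h))                        # step S3-2
    overlap = cv2.warpAffine(np.full_like(qry_img, 255), matrix, (w, h)) > 0
    if overlap.mean() < s:                     # overlapped area ratio below threshold S (20%)
        return ref_img, qry_img
    ys, xs = np.where(overlap)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    return ref_img[y0:y1, x0:x1], warped[y0:y1, x0:x1]                      # step S3-3 (approx.)
```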
Unlike the prior art, this way of cropping the region of interest achieves registration between the acquired paper pattern images without extra operations such as printing rectangles or watermarks.
Also unlike the prior art, it means that paper pattern recognition does not depend on extracting global feature points: when local graffiti, water stains or printed characters appear on the paper surface, recognition accuracy is still ensured without removing the dirt in advance.
The step S3 is to obtain the region of interest by using the feature point matching pairs extracted in the step S2, so that the translation and rotation of the paper, the illumination change of the environment, the relative position change between the paper and the image acquisition device, and the change of the optical magnification of image acquisition can be effectively resisted when acquiring the paper pattern.
The flow for obtaining the regions of interest of the reference paper pattern image and the paper pattern image to be identified in step S3 is shown in fig. 2, and fig. 3 shows an example of the regions of interest when the reference paper pattern and the paper pattern to be identified are acquired from the same area of the same piece of paper.
The stability of the paper pattern calibration in steps S2 and S3, which estimates the transformation between images from feature points of the microscopic fiber images, is guaranteed by the stability of the micron-resolution fiber texture acquired in step S1; the calibration algorithm can therefore withstand translation, rotation and scaling of the paper pattern images caused by illumination changes or by human and equipment operating deviations during acquisition.
Step S4: the fiber texture of the region of interest is separately enhanced.
Specifically, step S4 includes:
the fiber textures of two interested areas are respectively enhanced by adopting Gabor filters at multiple angles, simultaneously, the Gabor filter at each angle generates an amplitude response matrix and a phase angle response matrix, then the amplitude response matrix and the phase angle response matrix at the multiple angles are superposed, and finally, a corresponding amplitude response matrix and a corresponding phase angle response matrix are generated by a paper pattern image.
Most existing paper pattern anti-counterfeiting techniques enhance the paper pattern with a single-direction Gabor filter and then perform the final recognition with eigenvectors or singular values of the resulting matrix, which loses paper pattern detail and reduces recognition accuracy.
Enhancement with multi-angle Gabor filtering, by contrast, accounts for the random orientation of the fibers on the paper surface and inside the paper, retains more of the paper texture information, and therefore improves recognition accuracy.
Specifically, the fiber texture of the regions of interest of the reference paper pattern image and the paper pattern image to be identified is enhanced with multi-angle Gabor filters. A Gabor filter achieves optimal localization in the spatial and frequency domains simultaneously, so it describes well the local structure associated with spatial frequency, spatial position and orientation selectivity. In this embodiment, Gabor filters in four directions are applied to each paper pattern: the spatial frequency of all four filters is set to 0.1 and their orientation parameters are set to 0°, 30°, 60° and 90° respectively, so each paper pattern yields four amplitude responses and four phase-angle responses; the four amplitude responses and the four phase-angle responses are each superposed to obtain a combined amplitude response and a combined phase-angle response as the final response of one paper pattern. Note that the amplitude and phase-angle response matrices have the same size as the input region of interest; because the overlapping regions of different paper pattern images differ, the cropped regions of interest may differ in size, so, to keep the recognition results of different paper patterns comparable, the regions of interest of the reference paper pattern image and the paper pattern image to be identified must be resized to the same size, given by preset image size parameters, before enhancement. In this embodiment the regions of interest are each resized to 32 × 32 before the texture is enhanced. Fig. 4 shows an example of enhancing the fiber texture of a region of interest with multi-angle Gabor filters.
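A sketch of this multi-angle Gabor enhancement for one region of interest is given below, assuming scikit-image's complex Gabor filter (`skimage.filters.gabor`); the resizing to 32 × 32, the spatial frequency 0.1 and the four orientations follow the embodiment above, while the function name and the choice of scikit-image are assumptions:

```python
import cv2
import numpy as np
from skimage.filters import gabor

def enhance_roi(roi, size=(32, 32), frequency=0.1, angles_deg=(0, 30, 60, 90)):
    """Return the superposed amplitude and phase-angle responses of one region of interest."""
    roi = cv2.resize(roi, size).astype(float)            # common size before enhancement
    amp_sum = np.zeros(roi.shape)
    phase_sum = np.zeros(roi.shape)
    for deg in angles_deg:                               # one complex Gabor filter per angle
        real, imag = gabor(roi, frequency=frequency, theta=np.deg2rad(deg))
        amp_sum += np.hypot(real, imag)                  # amplitude response at this angle
        phase_sum += np.arctan2(imag, real)              # phase-angle response at this angle
    return amp_sum, phase_sum
```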
The micron-resolution microscopic texture features of the fibers inside the paper acquired by the method are stable and remain so under illumination changes and surface dirt, so the recognition algorithm based on the fiber texture image in step S4 can withstand abnormal conditions such as illumination changes and paper contamination.
Step S5: and measuring the similarity according to the enhanced texture structure and outputting an identification result.
Specifically, measuring similarity from the enhanced texture structures in step S5 includes:
mapping the enhanced texture structures of the regions of interest of the paper pattern image to be identified and of the reference paper pattern image into binary (0/1) digital space, generating a similarity index using the Hamming distance, and measuring the similarity of the bit streams generated from the reference paper pattern image and the paper pattern image to be identified.
Preferably, mapping the enhanced texture structures into binary digital space, generating a similarity index using the Hamming distance, and measuring the similarity of the bit streams generated from the reference paper pattern image and the paper pattern image to be identified includes:
computing the mean of the final amplitude response matrix and of the final phase-angle response matrix of the region of interest of the reference paper pattern image and of the paper pattern image to be identified, comparing each element of a response matrix with its mean, assigning 1 where the element is larger than the mean and 0 where it is smaller, unrolling the two binary matrices by rows or by columns, and concatenating the resulting bit streams to form the digital paper pattern of the reference paper pattern image and that of the paper pattern image to be identified;
computing the Hamming distance between the digital paper patterns of the reference paper pattern image and the paper pattern image to be identified, and taking the ratio of the Hamming distance to the total length of the digital paper pattern as the similarity index between the reference paper pattern and the paper pattern to be identified;
and selecting a threshold t: if the similarity index is greater than t, recognition fails, and if it is less than t, recognition succeeds.
Specifically, the means of the final amplitude response matrix and of the phase-angle response matrix of a region of interest are computed; each element of a response matrix is then compared with its mean, taking 1 if it is larger and 0 if it is smaller. In this embodiment, a 32 × 32 region of interest of one paper pattern thus yields two 32 × 32 binary matrices, one for the amplitude response and one for the phase-angle response; unrolling the two matrices by rows or by columns and concatenating the resulting bit streams gives a bit stream of length 32 × 32 × 2 = 2048, which serves as the digital paper pattern of the reference paper pattern image or of the paper pattern image to be identified.
In information coding, the Hamming distance is the number of positions at which two codewords of equal length differ. In this embodiment, the Hamming distance between the digital paper patterns of the reference paper pattern image and the paper pattern image to be identified is computed, and its ratio to the total digital paper pattern length of 2048 is used as the similarity index between the reference paper pattern and the paper pattern to be identified; for example, if the two digital paper patterns differ in 80 bit positions, i.e. the Hamming distance is 80, the similarity index is about 0.039. If the two images are acquired from the same area of the same sheet, the similarity index of their digital paper patterns is close to 0; if they are acquired from different sheets or from different areas of the same sheet, randomness drives the similarity index towards 0.5. In this embodiment the threshold t is set to 0.25: when the similarity index is greater than 0.25 recognition fails, and when it is less than 0.25 recognition succeeds. The threshold may be chosen based on extensive experimental results.
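Under the embodiment just described, the binarization, concatenation and Hamming-distance comparison could look like the sketch below; the response matrices `amp_ref`, `phase_ref`, `amp_qry` and `phase_qry` are assumed to come from the Gabor step above, and the function names and threshold handling are illustrative:

```python
import numpy as np

def digital_pattern(amp, phase):
    """Binarize the two response matrices against their means and concatenate the bits."""
    bits_amp = (amp > amp.mean()).astype(np.uint8).ravel()      # 1 above the mean, 0 below
    bits_phase = (phase > phase.mean()).astype(np.uint8).ravel()
    return np.concatenate([bits_amp, bits_phase])               # 32*32*2 = 2048 bits

def similarity_index(code_a, code_b):
    """Normalized Hamming distance between two digital paper patterns."""
    return np.count_nonzero(code_a != code_b) / code_a.size

code_ref = digital_pattern(amp_ref, phase_ref)   # reference ROI responses (from the Gabor step)
code_qry = digital_pattern(amp_qry, phase_qry)   # to-be-identified ROI responses
print("recognition succeeds" if similarity_index(code_ref, code_qry) < 0.25 else "recognition fails")
```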
In summary, the invention provides a paper pattern recognition method based on the micron-resolution microstructure of the fibers inside paper, which recognizes paper patterns mainly from the randomly interwoven texture of the microscopic fibers. Compared with the prior art, the method has at least the following beneficial effects: (1) the microscopic, transmitted-illumination acquisition scheme directly captures a micron-resolution image of the fiber structure from the paper surface down to 50 micrometers below it, and this fiber texture is stable and insensitive to illumination changes and surface dirt; (2) because recognition uses micron-resolution fiber texture features, the paper pattern calibration method that estimates the transformation matrix from feature points is stable and can withstand translation, rotation and scaling of the paper pattern image caused by human and equipment operating deviations during acquisition; (3) because micron-resolution fiber texture features are not easily lost when the paper surface is dirty, the recognition algorithm can still calibrate and recognize paper patterns in the presence of contaminants, so the method tolerates adverse conditions such as dirt on the paper surface.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (15)

1. A paper pattern recognition method is characterized by comprising the following steps:
S1, capturing micron-resolution microscopic images of the fibers inside a reference paper document and a paper document to be identified under a transmitted light source, and using them respectively as a reference paper pattern image and a paper pattern image to be identified;
S2, extracting and matching feature points of the reference paper pattern image and the paper pattern image to be identified to generate feature point matching pairs;
S3, estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be identified from the feature point matching pairs, and obtaining the regions of interest of the two images;
S4, enhancing the fiber texture of each region of interest;
and S5, measuring similarity from the enhanced texture structures and outputting a recognition result.
2. The paper print recognition method according to claim 1, wherein, in step S1,
when the reference paper pattern image is acquired, the coordinate position, relative to the center point of the reference paper document, of the point on the document corresponding to the center of the acquired reference paper pattern image is recorded;
and when the paper pattern image to be identified is acquired, it is acquired from the corresponding area according to the recorded coordinate position.
3. The paper print identification method according to claim 2, wherein the recorded coordinate position is within a field of view of a photomicrograph when the paper print image to be identified is acquired.
4. The paper print identification method according to claim 1, characterized in that microscopic images of the reference paper document and the paper document to be identified are taken with accurate focusing.
5. The method according to claim 1, wherein the step S2 includes:
s2-1, carrying out image preprocessing on the reference paper pattern image and the paper pattern image to be identified;
and S2-2, extracting texture feature points of the preprocessed reference paper pattern image and the paper pattern image to be recognized, calculating feature vectors of the feature points, and matching the feature points according to the feature vectors to generate feature point matching pairs.
6. The paper print recognition method according to claim 5, wherein the matching feature points according to feature vectors comprises:
and matching the feature points according to the similarity of the feature vectors corresponding to the extracted feature points.
7. The paper print recognition method according to claim 6, wherein the matching of feature points according to the similarity of feature vectors corresponding to the extracted feature points comprises:
extracting feature points from the reference paper pattern image and from the paper pattern image to be identified, computing the similarity between the feature vectors of the feature points of the two images, selecting a threshold, and generating a feature point matching pair whenever the computed similarity is greater than the threshold.
8. The paper print identification method according to claim 6, wherein the feature points are SURF or SIFT feature points.
9. The method according to claim 5, wherein the step S3 includes:
s3-1, rejecting wrong feature point matching pairs in the feature point matching pairs obtained in the step S2, and estimating a transformation matrix between the reference paper pattern image and the paper pattern image to be identified according to the remaining effective feature point matching pairs;
s3-2, transforming the paper pattern image to be recognized by using the transformation matrix;
and S3-3, overlaying the transformed paper pattern image to be identified on the reference paper pattern image, and cropping the inscribed rectangle of the overlapped part from each of the two paper pattern images as its region of interest.
10. The paper print recognition method of claim 9,
the step S3-1 includes: rejecting wrong feature point matching pairs among the matching pairs obtained in step S2 using the MSAC algorithm, and, taking the reference paper pattern image as the standard, estimating an affine transformation matrix from the paper pattern image to be identified to the reference paper pattern image from the remaining valid matching pairs;
the step S3-2 includes: applying the affine transformation matrix to the paper pattern image to be identified;
the step S3-3 includes: overlaying the transformed paper pattern images, selecting a threshold S, and, if the ratio of the area of the overlapped part after transformation to the total area of the reference paper pattern image or of the paper pattern image to be identified is larger than S, cropping the maximum inscribed rectangle of the overlapped part from each of the two transformed images as its region of interest.
11. The method according to claim 9, wherein the step S3 further includes:
if the number of valid feature point matching pairs in step S3-1 is too small to estimate the transformation matrix, or the area of the region of interest is too small, the reference paper pattern image and the paper pattern image to be identified as preprocessed in step S2-1 are used as the regions of interest.
12. The method according to claim 1, wherein the step S4 includes:
enhancing the fiber texture of each of the two regions of interest with Gabor filters at multiple angles, where the Gabor filter at each angle simultaneously produces an amplitude response matrix and a phase-angle response matrix; the amplitude response matrices and phase-angle response matrices of the multiple angles are then superposed to produce one final amplitude response matrix and one final phase-angle response matrix.
13. The method of claim 12, wherein the step S4 is preceded by:
and adjusting the regions of interest of the reference paper pattern image and the paper pattern image to be identified to the same size according to preset image size parameters.
14. The method according to claim 13, wherein the step S5 includes:
mapping the enhanced texture structures of the regions of interest of the paper pattern image to be identified and of the reference paper pattern image into binary (0/1) digital space, generating a similarity index using the Hamming distance, and measuring the similarity of the bit streams generated from the reference paper pattern image and the paper pattern image to be identified.
15. The method according to claim 14, wherein mapping the enhanced texture structures into binary (0/1) digital space, generating a similarity index using the Hamming distance, and measuring the similarity of the bit streams generated from the reference paper pattern image and the paper pattern image to be identified comprises:
computing the mean of the final amplitude response matrix and of the final phase-angle response matrix of the region of interest of the reference paper pattern image and of the paper pattern image to be identified, comparing each element of a response matrix with its mean, assigning 1 where the element is larger than the mean and 0 where it is smaller, unrolling the two binary matrices by rows or by columns, and concatenating the resulting bit streams to form the digital paper pattern of the reference paper pattern image and that of the paper pattern image to be identified;
computing the Hamming distance between the digital paper patterns of the reference paper pattern image and the paper pattern image to be identified, and taking the ratio of the Hamming distance to the total length of the digital paper pattern as the similarity index between the reference paper pattern and the paper pattern to be identified;
and selecting a threshold t: if the similarity index is greater than t, recognition fails, and if it is less than t, recognition succeeds.
CN202010348238.9A 2020-04-28 2020-04-28 Paper grain identification method Active CN111680549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010348238.9A CN111680549B (en) 2020-04-28 2020-04-28 Paper grain identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010348238.9A CN111680549B (en) 2020-04-28 2020-04-28 Paper grain identification method

Publications (2)

Publication Number Publication Date
CN111680549A true CN111680549A (en) 2020-09-18
CN111680549B CN111680549B (en) 2023-12-05

Family

ID=72452212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010348238.9A Active CN111680549B (en) 2020-04-28 2020-04-28 Paper grain identification method

Country Status (1)

Country Link
CN (1) CN111680549B (en)

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100067691A1 (en) * 2008-04-25 2010-03-18 Feng Lin Document certification and authentication system
CN101604385A (en) * 2009-07-09 2009-12-16 深圳大学 Palmprint identification method and palmprint recognition device
CN101894256A (en) * 2010-07-02 2010-11-24 西安理工大学 Iris identification method based on odd-symmetric 2D Log-Gabor filter
US20120148164A1 (en) * 2010-12-08 2012-06-14 Electronics And Telecommunications Research Institute Image matching devices and image matching methods thereof
CN102073865A (en) * 2010-12-24 2011-05-25 兆日科技(深圳)有限公司 Anti-counterfeiting method and system using autologous fiber textures of paper
US20120275711A1 (en) * 2011-04-28 2012-11-01 Sony Corporation Image processing device, image processing method, and program
CN102651074A (en) * 2012-02-22 2012-08-29 大连理工大学 Texture feature-based printed paper identification method
CN103049905A (en) * 2012-12-07 2013-04-17 中国人民解放军海军航空工程学院 Method for realizing image registration of synthetic aperture radar (SAR) by using three components of monogenic signals
CN103295241A (en) * 2013-06-26 2013-09-11 中国科学院光电技术研究所 Frequency domain saliency target detection method based on Gabor wavelets
WO2015160340A1 (en) * 2014-04-16 2015-10-22 Halliburton Energy Services, Inc. Ultrasonic signal time-frequency decomposition for borehole evaluation or pipeline inspection
CN103927527A (en) * 2014-04-30 2014-07-16 长安大学 Human face feature extraction method based on single training sample
US20180039831A1 (en) * 2015-02-13 2018-02-08 Paper Dna Ag Method of authentication using surface paper texture
WO2016146265A1 (en) * 2015-03-17 2016-09-22 Zynaptiq Gmbh Methods for extending frequency transforms to resolve features in the spatio-temporal domain
CN104834909A (en) * 2015-05-07 2015-08-12 长安大学 Image characteristic description method based on Gabor synthetic characteristic
CN106022391A (en) * 2016-05-31 2016-10-12 哈尔滨工业大学深圳研究生院 Hyperspectral image characteristic parallel extraction and classification method
CN106875543A (en) * 2017-01-25 2017-06-20 杭州视氪科技有限公司 Banknote recognition system and method for visually impaired people based on RGB-D cameras
US20180211401A1 (en) * 2017-01-26 2018-07-26 Samsung Electronics Co., Ltd. Stereo matching method and apparatus, image processing apparatus, and training method therefor
WO2018192023A1 (en) * 2017-04-21 2018-10-25 深圳大学 Method and device for hyperspectral remote sensing image classification
US20200098119A1 (en) * 2017-05-29 2020-03-26 Olympus Corporation Image processing device, image processing method, and image processing program
CN107292273A (en) * 2017-06-28 2017-10-24 西安电子科技大学 Palmprint ROI matching method based on extended eight-neighborhood feature and double Gabor filters
CN108596197A (en) * 2018-05-15 2018-09-28 汉王科技股份有限公司 Seal matching method and device
CN110599665A (en) * 2018-06-13 2019-12-20 深圳兆日科技股份有限公司 Paper pattern recognition method and device, computer equipment and storage medium
CN110738222A (en) * 2018-07-18 2020-01-31 深圳兆日科技股份有限公司 Image matching method and device, computer equipment and storage medium
CN109598205A (en) * 2018-11-09 2019-04-09 国网山东省电力公司淄博供电公司 Method for fingerprint feature extraction and compressed encoding based on Gabor transform
CN110084754A (en) * 2019-06-25 2019-08-02 江苏德劭信息科技有限公司 Image superimposition method based on an improved SIFT feature point matching algorithm
CN110472479A (en) * 2019-06-28 2019-11-19 广州中国科学院先进技术研究所 Finger vein identification method based on SURF feature point extraction and local LBP coding
CN110941989A (en) * 2019-10-18 2020-03-31 北京达佳互联信息技术有限公司 Image verification method, image verification device, video verification method, video verification device, equipment and storage medium
CN110930398A (en) * 2019-12-09 2020-03-27 嘉兴学院 Log-Gabor similarity-based full-reference video quality evaluation method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SHIHAB HAMAD KHALEEFAH; MOHAMMAD FAIDZUL NASRUDIN; SALAMA A. MOSTAFA: "Fingerprinting of Deformed Paper Images Acquired by Scanners", pages 393 - 397 *
李振宏; 吴慧中: "Geometric distortion correction algorithm for watermarked images based on SIFT transform", 计算机工程与设计 (Computer Engineering and Design), no. 12, pages 231-233 *
项楠: "Frequent bill fraud cases: paper patterns put an extra lock on bills", 《中国防伪报道》 (China Anti-Counterfeiting Report), no. 03, page 108 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022082431A1 (en) * 2020-10-20 2022-04-28 Beijing Tripmonkey Technology Limited Systems and methods for extracting information from paper media based on depth information

Also Published As

Publication number Publication date
CN111680549B (en) 2023-12-05

Similar Documents

Publication Publication Date Title
US10872265B2 (en) Database for detecting counterfeit items using digital fingerprint records
CN103761799B (en) Bill anti-counterfeiting method and device based on texture image features
Mushtaq et al. Digital image forgeries and passive image authentication techniques: a survey
DK2024899T3 (en) Means of use of the material surface microstructure as a unique identifier
US8190901B2 (en) Layered security in digital watermarking
CN101582162B (en) Art identifying method based on texture analysis
CN104464079B (en) Multiple Currencies face amount recognition methods based on template characteristic point and topological structure thereof
Kashyap et al. An evaluation of digital image forgery detection approaches
CN106327534B (en) Tire inner wall texture recognition method based on locating blocks
JP2011507101A (en) Identification and verification of unknown documents by eigenimage processing
JP2009104663A (en) Counterfeiting detection method and image detection method
CN104969268A (en) Authentication of security documents and mobile device to carry out the authentication
CN102903075A (en) Robust watermarking method based on image feature point global correction
CN102609947B (en) Forgery detection method for spliced and distorted digital photos
Ferrer et al. Signature verification using local directional pattern (LDP)
CN111680549B (en) Paper grain identification method
Wu et al. A printer forensics method using halftone dot arrangement model
Rabah et al. The supatlantique scanned documents database for digital image forensics purposes
Benhamza et al. Image forgery detection review
CN102646194B (en) Method for performing printer type evidence obtainment by utilizing character edge features
Bollimpalli et al. SIFT based robust image watermarking resistant to resolution scaling
Yohannan et al. Detection of copy-move forgery based on Gabor filter
Hildebrandt et al. High-resolution printed amino acid traces: a first-feature extraction approach for fingerprint forgery detection
Khuspe et al. Robust image forgery localization and recognition in copy-move using bag of features and SVM
Duraipandi et al. A grid based iris biometric watermarking using wavelet transform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Room 01, 21st Floor, Building 1, Huigu Space, No. 206 Laowuhuang Road, Guandong Street, Wuhan Donghu New Technology Development Zone, Wuhan City, Hubei Province, 430223

Patentee after: Xiaophoton (Wuhan) Technology Co.,Ltd.

Patentee after: HUAZHONG University OF SCIENCE AND TECHNOLOGY

Address before: 430000 science and technology building, 243 Luoyu Road, Donghu Development Zone, Wuhan City, Hubei Province

Patentee before: CONVERGENCE TECHNOLOGY Co.,Ltd.

Patentee before: HUAZHONG University OF SCIENCE AND TECHNOLOGY