CN111104857A - Identity recognition method and system based on gait energy diagram

Identity recognition method and system based on gait energy diagram

Info

Publication number
CN111104857A
Authority
CN
China
Prior art keywords
pedestrian
image
gait
gait energy
energy map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911131622.7A
Other languages
Chinese (zh)
Inventor
熊九龙
李忠
张玘
叶湘滨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN201911131622.7A priority Critical patent/CN111104857A/en
Publication of CN111104857A publication Critical patent/CN111104857A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • G06V40/25Recognition of walking or running movements, e.g. gait recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147Distances to closest patterns, e.g. nearest neighbour classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/467Encoded features or binary features, e.g. local binary patterns [LBP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an identity recognition method and system based on a gait energy map. The method comprises: obtaining a pedestrian foreground image sequence of a pedestrian to be identified in a normal walking state, extracting a gait cycle and generating a gait energy map; dividing the gait energy map into a plurality of sub-region images of fixed size and calculating the LBP value of each pixel point in each sub-region image; computing the histogram of the LBP values of all pixel points in each sub-region image and connecting the statistical histograms of all sub-region images into a one-dimensional feature vector, which serves as the LBP feature of the gait energy map to be identified; and inputting the LBP feature of the gait energy map to be identified into a nearest neighbor classifier for distance measurement, and performing identity prediction and recognition according to the distance measurement result. The method addresses the problem that the gait energy map is easily affected by external conditions such as weather and illumination changes; the extracted local binary pattern features have rotation invariance and illumination invariance, which effectively improves the correct pedestrian identification rate of gait recognition in real, complex application scenarios.

Description

Identity recognition method and system based on gait energy diagram
Technical Field
The invention relates to the fields of biometric recognition, image processing and computer vision, in particular to an identity recognition method and system based on a gait energy map, and specifically to an identity recognition method and system based on Local Binary Pattern (LBP) features of the gait energy map.
Background
In the modern digital society, more and more real-time application scenarios, such as international border crossings, financial transactions, criminal investigation and evidence collection, computer security and access control, require fast, safe and reliable authentication of a pedestrian's identity. Conventional biometrics such as the face, fingerprint and iris have their own drawbacks and require the active cooperation of the person to be identified. Gait recognition is a new generation of biometric recognition technology: it works at long range, is contactless, tolerates low resolution and is difficult to disguise or imitate, which makes it very suitable for identifying pedestrians at a distance. Intelligent video surveillance systems that integrate gait recognition technology have great research value and market demand, mainly in the fields of military affairs, public security, criminal investigation and smart homes.
At present, gait-based pedestrian identity recognition is mainly realized by two strategies: sensors and visual cameras. Conventional methods based on physical sensors require auxiliary equipment to collect gait information and have a limited application range. Methods based on visual cameras collect gait information of pedestrians through an ordinary camera without their active cooperation, and are therefore highly practical. Gait feature extraction describes the spatio-temporal dynamics of a pedestrian's walking posture with a set of data. The commonly used feature models fall into two categories: model-based and appearance-based. Model-based feature extraction methods build a mathematical model of the basic structure of the human body and extract the model parameters as gait features. Although such methods are robust and highly discriminative, they are complex, time-consuming and computationally expensive. Appearance-based feature extraction methods, also called model-free methods, segment the pedestrian contour directly from the video images and measure gait similarity based on the autocorrelation of the contour sequence. They do not need to construct a mathematical model, have much lower computational complexity and offer better real-time performance.
The most popular appearance-based representation at present is the gait energy image (GEI): a gait cycle is determined and the pedestrian contour images within that cycle are averaged into a gray-level image that represents the gait over the whole cycle. However, the gait energy map has limited representation capability, a large image dimension and a great deal of redundant information; in addition, the pedestrian foreground images obtained by detection and segmentation in a real scene are easily affected by external conditions such as weather and illumination changes, so the computed gait energy map contains a large amount of background noise.
To overcome these problems, secondary feature extraction must be performed on the gait energy map: first, to reduce the image dimension and avoid the curse of dimensionality; second, to extract features of the gait energy map that are insensitive to external factors such as illumination changes, thereby improving the pedestrian identification rate.
The Chinese patent application CN104794449A discloses a gait energy map acquisition and identity recognition method based on human-body HOG features: each frame in the human gait video image sequence is segmented into the human contour and human body parts, the HOG feature descriptors of the human contour and of the human parts are calculated separately, the gait energy image of the contour HOG features and the gait energy image of the part HOG features are obtained respectively, and the two are combined into a gait energy image based on human-body HOG features. The resulting image maintains good invariance to geometric and photometric deformation and is insensitive to illumination changes; however, the generation of the HOG feature descriptor is tedious, so real-time performance is poor, and the descriptor is sensitive to noise.
Disclosure of Invention
In gait-based pedestrian identity recognition under real application scenarios, the segmented pedestrian contour images are easily affected by external conditions such as weather and illumination changes, so the computed gait energy map is not robust. To address this problem, the invention provides a method for acquiring Local Binary Pattern (LBP) features from the gait energy map and performing identity recognition based on the LBP features extracted from it. In real application scenarios, environmental changes such as illumination and shadow easily produce a large amount of background noise, and the generated gait energy map therefore also contains a large amount of noise, which is unfavorable for gait feature extraction and identity recognition. The aim of the invention is to extract feature descriptors from the gait energy map that are insensitive to rotation and illumination changes, reduce the influence of noise on the extraction of effective gait features, and improve the pedestrian identification rate.
In order to achieve the above object, the present invention provides an identity recognition method based on a gait energy map, comprising the following steps:
s1, acquiring a pedestrian foreground image sequence of the pedestrian to be identified in the normal walking state, extracting a gait cycle for the pedestrian foreground image sequence and generating a gait energy map;
s2, dividing the gait energy map into a plurality of subarea images with fixed sizes, and calculating to obtain LBP values of all pixel points in each subarea image;
s3, counting histograms of LBP values of all pixel points in each subregion image, carrying out normalization processing, and connecting the statistical histograms of all subregion images in the gait energy map into a one-dimensional characteristic vector to be used as LBP characteristics of the whole gait energy map to be recognized;
and S4, inputting the LBP characteristics of the gait energy diagram to be identified into the nearest neighbor classifier for distance measurement, and performing identity prediction identification according to the distance measurement result.
Further preferably, in step S1, the specific implementation process of extracting a gait cycle from the pedestrian foreground image sequence and generating a gait energy map includes:
s1.1, sequentially carrying out morphological denoising, size normalization and gravity center alignment on foreground images of all pedestrians in a pedestrian foreground image sequence to obtain a pedestrian contour binary image sequence;
s1.2, extracting a gait cycle for the normalized pedestrian contour binary image sequence based on the width change of a shank region to obtain a gait energy map.
Further preferably, in step S1.1, the morphological denoising process includes:
and (3) sequentially carrying out corrosion treatment, expansion treatment, opening operation treatment and closing operation treatment on the pedestrian foreground image by adopting a circular structural element with the radius of 2, so as to realize denoising and smoothing of the pedestrian foreground image.
Further preferably, in step S1.1, the size normalization process is:
and determining four boundary points of the leftmost boundary point, the rightmost boundary point, the highest boundary point and the lowest boundary point in each pedestrian foreground image by traversing each pixel point of each pedestrian foreground image, selecting a pedestrian outline minimum rectangle with width W and height H from the frame, uniformly scaling the pedestrian outline minimum rectangle to the height of a P pixel, and obtaining the pedestrian outline image with the width scaled to the P multiplied by W/H pixel in an equal proportion.
Further preferably, in step S1.1, the process of aligning the centers of gravity is as follows:
s1.1.1, calculating to obtain the gravity center of each pedestrian contour image:
$$x_c = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad y_c = \frac{1}{N}\sum_{i=1}^{N} y_i$$

where $x_i$ and $y_i$ respectively denote the horizontal and vertical coordinates of the pixel points with value 1 in the pedestrian contour image; $N$ denotes the total number of pixel points with value 1 in the pedestrian contour image; $x_c$ and $y_c$ respectively denote the horizontal and vertical coordinates of the center of gravity of the pedestrian contour image;
S1.1.2, creating a standard template with all pixel values 0 and a size of P × P pixels, whose center-of-gravity coordinates are (P/2, P/2); the center of gravity of the pedestrian contour image is then aligned with the center of gravity of the standard template and the contour image is placed in the template, thereby obtaining the pedestrian contour binary image sequence.
Further preferably, in step S1.2, the process of extracting the gait cycle based on the change in width of the lower leg region includes:
S1.2.1, setting the region whose height ranges from 0 to 0.28H in the pedestrian contour binary image as the shank (lower-leg) region, traversing the pixel points in this region row by row, and recording the coordinates of the first and last pixel points with value 1 in each row as boundary points;
s1.2.2, subtracting the abscissa of the two boundary pixel points obtained in each line, taking the absolute value of the two boundary pixel points, and taking the maximum value of all the absolute values as the width value of the current pedestrian contour binary image;
s1.2.3, taking the time span between two pedestrian contour binary images with the maximum width value in the pedestrian contour binary image sequence as a half gait cycle.
Further preferably, in step S1.2, the specific process of obtaining the gait energy map is as follows:
averaging all pedestrian contour binary images subjected to size normalization and gravity center alignment in one gait cycle to obtain a gait energy map:
$$G(x, y) = \frac{1}{N}\sum_{t=1}^{N} B_t(x, y)$$

where $G(x, y)$ is the gait energy map, $N$ is the number of pedestrian contour binary images in one gait cycle, $B_t(x, y)$ is the $t$-th pedestrian contour binary image in the gait cycle, $(x, y)$ are the pixel coordinates, and $t$ is the frame index within the gait cycle.
Further preferably, in step S2, the calculating process of obtaining the LBP value of each pixel point in each sub-region image includes:
s2.1, selecting any central pixel point in the subregion image as a circle center, and comparing 8 pixel points in a circular neighborhood with the radius of 2;
s2.2, after comparison, marking the gray values of 8 pixel points in the neighborhood of the central pixel point as 1 or 0, and combining according to a clockwise sequence to obtain an 8-bit binary number as an LBP value of the central pixel point:
$$LBP(x_c, y_c) = \sum_{p=0}^{7} s(g_p - g_c)\, 2^p$$

$$s(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}$$

where $(x_c, y_c)$ are the coordinates of the center pixel point, $g_c$ is the gray value of the center pixel, $g_p$ is the gray value of the $p$-th pixel in the neighborhood of the center pixel, and $s(x)$ denotes the sign function.
Further preferably, in step S4, the specific implementation process of inputting the LBP features of the gait energy map into the nearest neighbor classifier for distance measurement and performing identity prediction and identification according to the distance measurement result includes:
S4.1, inputting the LBP feature $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ of the gait energy map to be identified, obtained in step S3, into the nearest neighbor classifier;
S4.2, calculating the Euclidean distances between the LBP feature $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ of the gait energy map to be identified and all samples in the sample library of the nearest neighbor classifier:
$$d(G, P(i)) = \sqrt{\sum_{j=1}^{n}\left(g^{(j)} - p(i)^{(j)}\right)^2}$$
where $P(i) = (p(i)^{(1)}, p(i)^{(2)}, p(i)^{(3)}, \ldots, p(i)^{(n)})$ denotes the $i$-th sample in the sample library of the nearest neighbor classifier;
and S4.3, selecting a sample which is closest to the LBP characteristic of the gait energy image to be identified in the sample library, predicting the class of the sample as the class of the LBP characteristic of the gait energy image to be identified, and identifying the identity of the pedestrian corresponding to the gait energy image to be identified.
In order to achieve the above object, the present invention further provides an identity recognition system based on a gait energy map, comprising a memory and a processor, wherein the memory stores a gait energy map-based identity recognition program and the processor executes the steps of the above method when running the program.
The gait energy map-based identity recognition method and system provided by the invention solve the problem that the gait energy map is easily influenced by external environments such as weather and illumination changes; the extracted Local Binary Pattern (LBP) features have rotation invariance and illumination invariance, and the correct pedestrian identification rate of gait recognition in real, complex application scenarios is effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of an identity recognition method based on a gait energy map according to an embodiment of the invention;
FIG. 2 is a schematic flow chart of extracting a gait cycle from a pedestrian foreground image sequence and generating a gait energy map according to an embodiment of the invention;
FIG. 3 is a diagram illustrating a morphological denoising result of a foreground image of a pedestrian according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart illustrating center of gravity alignment according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of the gait cycle extraction based on the width change of the calf region according to the embodiment of the invention;
FIG. 6 is a schematic diagram of a human body part scale according to an embodiment of the present invention;
FIG. 7 is a gait energy map in an embodiment of the invention;
fig. 8 is a schematic flow chart illustrating a process of calculating and obtaining an LBP value of each pixel point in each sub-region image according to the embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating a comparison between a center pixel point and a pixel point in a circular neighborhood according to an embodiment of the present invention;
FIG. 10 is a schematic flow chart illustrating identity prediction and identification according to a distance measurement result in an embodiment of the present invention;
fig. 11 is an image obtained by combining LBP values calculated by pixel points of each sub-region of the gait energy map in the embodiment of the invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that all the directional indicators (such as up, down, left, right, front and rear) in the embodiments of the present invention are only used to explain the relative positional relationships, movements, etc. between the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indicators change accordingly.
In addition, the descriptions related to "first", "second", etc. in the present invention are only for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "connected," "secured," and the like are to be construed broadly, and for example, "secured" may be a fixed connection, a removable connection, or an integral part; the connection can be mechanical connection, electrical connection, physical connection or wireless communication connection; they may be directly connected or indirectly connected through intervening media, or they may be connected internally or in any other suitable relationship, unless expressly stated otherwise. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In addition, the technical solutions in the embodiments of the present invention may be combined with each other, but it must be based on the realization of those skilled in the art, and when the technical solutions are contradictory or cannot be realized, such a combination of technical solutions should not be considered to exist, and is not within the protection scope of the present invention.
As shown in fig. 1, an identity recognition method based on a gait energy map comprises the following steps:
s1, acquiring a pedestrian foreground image sequence of the pedestrian to be identified in the normal walking state, extracting a gait cycle for the pedestrian foreground image sequence and generating a gait energy map;
s2, dividing the gait energy map into a plurality of subarea images with fixed sizes, and calculating to obtain LBP values of all pixel points in each subarea image;
s3, counting histograms of LBP values of all pixel points in each subregion image, carrying out normalization processing, and connecting the statistical histograms of all subregion images in the gait energy map into a one-dimensional characteristic vector to be used as LBP characteristics of the whole gait energy map to be recognized;
and S4, inputting the LBP characteristics of the gait energy diagram to be identified into the nearest neighbor classifier for distance measurement, and performing identity prediction identification according to the distance measurement result.
Although the gait energy map is easily influenced by external environments such as weather and illumination changes, the extracted Local Binary Pattern (LBP) features have rotation invariance and illumination invariance, which effectively improves the correct pedestrian identification rate of gait recognition in real, complex application scenarios; moreover, the computation is simple, the data volume is small and the real-time performance is strong, so the identification efficiency can be effectively improved.
In step S1, there may be one or more pedestrians to be identified. In this embodiment, the pedestrian foreground image sequence of the pedestrian to be identified in the normal walking state is obtained by capturing foreground images of the pedestrians at a shooting angle of 90 degrees (side view).
Referring to fig. 2, in step S1, the specific implementation process of extracting a gait cycle from the pedestrian foreground image sequence and generating a gait energy map includes:
s1.1, sequentially carrying out morphological denoising, size normalization and gravity center alignment on foreground images of all pedestrians in a pedestrian foreground image sequence to obtain a pedestrian contour binary image sequence;
s1.2, extracting a gait cycle for the normalized pedestrian contour binary image sequence based on the width change of a shank region to obtain a gait energy map.
In step S1.1, the morphological denoising process is as follows: erosion, dilation, opening and closing operations are applied to the pedestrian foreground image in sequence, using a circular structuring element with a radius of 2, so as to denoise and smooth the pedestrian foreground image. That is, as shown in fig. 3, fig. 3(a) is the original pedestrian foreground image, fig. 3(b) is the image after erosion, fig. 3(c) is the image after dilation, fig. 3(d) is the image after the opening operation, and fig. 3(e) is the image after the closing operation. The morphologically denoised image is smoother and fuller and closer to the actual pedestrian contour.
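By way of illustration only, a minimal sketch of this denoising step is given below in Python with OpenCV; the language, library and function name are assumptions of this sketch, not part of the claimed method, and the circular structuring element of radius 2 is approximated by a 5 × 5 elliptical kernel.

```python
import cv2

def denoise_foreground(mask):
    """Sketch of step S1.1 morphological denoising: erosion, dilation,
    opening and closing applied in sequence with a circular structuring
    element of radius 2 (approximated by a 5x5 elliptical kernel)."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    out = cv2.erode(mask, kernel)                        # fig. 3(b)
    out = cv2.dilate(out, kernel)                        # fig. 3(c)
    out = cv2.morphologyEx(out, cv2.MORPH_OPEN, kernel)  # fig. 3(d)
    out = cv2.morphologyEx(out, cv2.MORPH_CLOSE, kernel) # fig. 3(e)
    return out
```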
In step S1.1, the size normalization process is:
By traversing every pixel point of each pedestrian foreground image, the leftmost, rightmost, highest and lowest boundary points are determined, the minimum bounding rectangle of the pedestrian contour, with width W and height H, is cropped out, and this rectangle is uniformly scaled so that its height becomes P pixels, the width being scaled in equal proportion to P × W/H pixels, thereby obtaining the pedestrian contour image; in this embodiment, P is 240.
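A minimal sketch of this size normalization, again assuming Python with NumPy and OpenCV and a hypothetical function name, could look as follows:

```python
import cv2
import numpy as np

def normalize_size(mask, P=240):
    """Crop the minimum bounding rectangle of the silhouette (width W,
    height H) and scale it so that its height becomes P pixels and its
    width becomes P*W/H pixels."""
    ys, xs = np.nonzero(mask)                    # foreground pixel coordinates
    crop = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    H, W = crop.shape
    new_w = max(1, int(round(P * W / H)))
    return cv2.resize(crop, (new_w, P), interpolation=cv2.INTER_NEAREST)
```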
Referring to fig. 4, in step S1.1, the process of center of gravity alignment is:
s1.1.1, calculating to obtain the gravity center of each pedestrian contour image:
$$x_c = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad y_c = \frac{1}{N}\sum_{i=1}^{N} y_i$$

where $x_i$ and $y_i$ respectively denote the horizontal and vertical coordinates of the pixel points with value 1 in the pedestrian contour image; $N$ denotes the total number of pixel points with value 1 in the pedestrian contour image; $x_c$ and $y_c$ respectively denote the horizontal and vertical coordinates of the center of gravity of the pedestrian contour image;
S1.1.2, creating a standard template with all pixel values 0 and a size of 240 × 240 pixels, whose center-of-gravity coordinates are (120, 120); finally, the center of gravity of the pedestrian contour image is aligned with the center of gravity of the standard template and the contour image is placed in the template, thereby obtaining the pedestrian contour binary image sequence.
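A minimal sketch of this center-of-gravity alignment (hypothetical helper name, NumPy assumed) is:

```python
import numpy as np

def align_to_template(silhouette, P=240):
    """Place the scaled silhouette inside a PxP all-zero template so that
    its center of gravity coincides with the template center (P/2, P/2)."""
    ys, xs = np.nonzero(silhouette)
    xc, yc = xs.mean(), ys.mean()                 # center of gravity
    dx, dy = int(round(P / 2 - xc)), int(round(P / 2 - yc))
    template = np.zeros((P, P), dtype=silhouette.dtype)
    for y, x in zip(ys, xs):
        ty, tx = y + dy, x + dx
        if 0 <= ty < P and 0 <= tx < P:           # clip pixels falling outside
            template[ty, tx] = silhouette[y, x]
    return template
```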
Referring to fig. 5, in step S1.2, the process of extracting the gait cycle based on the width change of the calf region is as follows:
S1.2.1, according to medical knowledge of human body structure, the length of the shank (lower leg) is about 0.28H, where H is the body height, as shown in FIG. 6; therefore, the region whose height ranges from 0 to 0.28H in the pedestrian contour binary image is set as the shank region, the pixel points in this region are traversed row by row, and the coordinates of the first and last pixel points with value 1 in each row are recorded as boundary points;
s1.2.2, subtracting the abscissa of the two boundary pixel points obtained in each line, taking the absolute value of the two boundary pixel points, and taking the maximum value of all the absolute values as the width value of the current pedestrian contour binary image;
s1.2.3, taking the time span between two pedestrian contour binary images with the maximum width value in the pedestrian contour binary image sequence as a half gait cycle.
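The width-based gait cycle estimation described in steps S1.2.1 to S1.2.3 can be sketched as follows (Python/NumPy assumed; the peak-picking detail is one possible reading of the step, not the only one):

```python
import numpy as np

def calf_width(binary_img, calf_ratio=0.28):
    """Width of the shank region (lowest 0.28*H of the silhouette):
    the maximum left-to-right distance between boundary points per row."""
    ys, _ = np.nonzero(binary_img)
    top, bottom = ys.min(), ys.max()
    calf_top = bottom - int(round(calf_ratio * (bottom - top + 1)))
    width = 0
    for row in range(calf_top, bottom + 1):
        cols = np.nonzero(binary_img[row])[0]
        if cols.size:
            width = max(width, cols[-1] - cols[0])
    return width

def half_gait_cycle(silhouette_sequence):
    """Frame span between two successive maxima of the calf-width signal,
    taken as half a gait cycle."""
    w = np.array([calf_width(img) for img in silhouette_sequence])
    peaks = [i for i in range(1, len(w) - 1)
             if w[i] >= w[i - 1] and w[i] >= w[i + 1]]
    return peaks[1] - peaks[0] if len(peaks) >= 2 else len(w)
```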
In step S1.2, the specific process of obtaining the gait energy map is as follows:
averaging all pedestrian contour binary images subjected to size normalization and gravity center alignment in one gait cycle to obtain a gait energy map shown in fig. 7:
$$G(x, y) = \frac{1}{N}\sum_{t=1}^{N} B_t(x, y)$$

where $G(x, y)$ is the gait energy map, $N$ is the number of pedestrian contour binary images in one gait cycle, $B_t(x, y)$ is the $t$-th pedestrian contour binary image in the gait cycle, $(x, y)$ are the pixel coordinates, and $t$ is the frame index within the gait cycle.
In step S2, the specific process of dividing the gait energy map into a plurality of sub-region images with fixed sizes is as follows: since the gait energy map is 240 × 240 in size, the gait energy map is divided into 225 sub-area images of 16 × 16 size.
Referring to fig. 8, in step S2, the process of calculating the LBP value of each pixel point in each sub-region image includes:
S2.1, selecting any central pixel point in the sub-region image as the circle center and comparing it with the 8 pixel points in its circular neighborhood of radius 2, as shown in FIG. 9; here a central pixel point means any pixel whose radius-2 circular neighborhood lies entirely within the sub-region image;
s2.2, after comparison, marking the gray values of 8 pixel points in the neighborhood of the central pixel point as 1 or 0, and combining according to a clockwise sequence to obtain an 8-bit binary number as an LBP value of the central pixel point:
$$LBP(x_c, y_c) = \sum_{p=0}^{7} s(g_p - g_c)\, 2^p$$

$$s(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}$$

where $(x_c, y_c)$ are the coordinates of the center pixel point, $g_c$ is the gray value of the center pixel, $g_p$ is the gray value of the $p$-th pixel in the neighborhood of the center pixel, and $s(x)$ denotes the sign function.
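A minimal sketch of this LBP computation (Python/NumPy assumed; nearest-pixel sampling is used on the radius-2 circle, and the bit ordering chosen here is one possible convention for the combination described above):

```python
import numpy as np

def lbp_image(gray, radius=2, points=8):
    """LBP value for every pixel whose radius-2 circular neighborhood lies
    inside the image: compare the 8 sampled neighbors with the center gray
    value g_c and pack the sign bits s(g_p - g_c) into an 8-bit code."""
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.uint8)
    angles = [2.0 * np.pi * p / points for p in range(points)]
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            gc, code = gray[y, x], 0
            for p, a in enumerate(angles):
                ny = int(round(y + radius * np.sin(a)))  # nearest-pixel sample
                nx = int(round(x + radius * np.cos(a)))
                if gray[ny, nx] >= gc:                   # s(g_p - g_c) = 1
                    code |= 1 << p
            out[y, x] = code
    return out
```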
In step S3, the histogram normalization is implemented as follows: an image histogram is a statistical table of the image pixel distribution, in which the abscissa represents the pixel gray level and the ordinate represents the number of pixels (or the percentage of all pixels) at each gray level. The gait energy map is a gray-level image; after its histogram is obtained, the histogram is normalized, i.e. the ordinate is changed into the probability that the current count accounts for the total count: the count of each gray level (each abscissa) in the histogram is divided by the total number of pixels, establishing a new mapping. The abscissa of the normalized histogram is therefore unchanged, and the ordinate becomes the quotient of the original ordinate and the total number of pixels.
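The block histograms and their concatenation into the one-dimensional LBP feature vector of step S3 can be sketched as follows (16 × 16 blocks on a 240 × 240 map give 225 sub-regions; the function name is hypothetical):

```python
import numpy as np

def lbp_feature_vector(lbp_img, block=16):
    """Normalized 256-bin histogram of each 16x16 sub-region, concatenated
    into one one-dimensional feature vector (225 * 256 values for 240x240)."""
    h, w = lbp_img.shape
    feats = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = lbp_img[y:y + block, x:x + block]
            hist, _ = np.histogram(patch, bins=256, range=(0, 256))
            feats.append(hist / hist.sum())   # divide by the pixel count
    return np.concatenate(feats)
```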
Referring to fig. 10, in step S4, the specific implementation process of inputting the LBP features of the gait energy map into the nearest neighbor classifier for distance measurement and performing identity prediction and identification according to the distance measurement result includes:
S4.1, inputting the LBP feature $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ of the gait energy map to be identified, obtained in step S3, into the nearest neighbor classifier; FIG. 11 shows the image obtained by combining the LBP values calculated for the pixel points of each sub-region of the gait energy map;
S4.2, calculating the Euclidean distances between the LBP feature $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ of the gait energy map to be identified and all samples in the sample library of the nearest neighbor classifier:
$$d(G, P(i)) = \sqrt{\sum_{j=1}^{n}\left(g^{(j)} - p(i)^{(j)}\right)^2}$$
where $P(i) = (p(i)^{(1)}, p(i)^{(2)}, p(i)^{(3)}, \ldots, p(i)^{(n)})$ denotes the $i$-th sample in the sample library of the nearest neighbor classifier;
and S4.3, selecting a sample which is closest to the LBP characteristic of the gait energy image to be identified in the sample library, predicting the class of the sample as the class of the LBP characteristic of the gait energy image to be identified, and identifying the identity of the pedestrian corresponding to the gait energy image to be identified.
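A minimal sketch of the nearest neighbor identity prediction of step S4 (Euclidean distance to every sample in the library, smallest distance wins; names hypothetical):

```python
import numpy as np

def nearest_neighbor_identity(query, gallery, labels):
    """Predict the identity of a query LBP feature vector as the label of
    the sample library entry with the smallest Euclidean distance."""
    dists = [np.sqrt(np.sum((query - p) ** 2)) for p in gallery]
    return labels[int(np.argmin(dists))]
```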
In this embodiment, 4 gait energy maps extracted from gait sequences in the normal walking state are selected as the training set, and another 2 gait energy maps extracted from gait sequences in the normal walking state are selected as the test set. Local Binary Pattern (LBP) features are extracted from the gait energy maps in both the training set and the test set, the LBP features of all gait energy maps in the training set are input into the nearest neighbor classifier as the sample library, the Euclidean distances between the LBP feature of a gait energy map in the test set and all sample features in the training set are then calculated, and, according to the nearest neighbor rule, the class of the sample corresponding to the minimum Euclidean distance is predicted as the identity recognition result.
The Euclidean distance between the LBP feature of each gait energy map in the test set and all LBP features in the sample library is calculated. Given a sample $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ and the $i$-th sample in the sample library $P(i) = (p(i)^{(1)}, p(i)^{(2)}, p(i)^{(3)}, \ldots, p(i)^{(n)})$, the Euclidean distance is calculated as in equation (5):
$$d(G, P(i)) = \sqrt{\sum_{j=1}^{n}\left(g^{(j)} - p(i)^{(j)}\right)^2} \tag{5}$$
and selecting the sample closest to the sample in the sample library, and predicting the class of the sample as the class of the sample to be detected so as to identify the identity of the pedestrian.
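Putting the pieces together, a hypothetical end-to-end usage of the helper functions sketched in the previous examples (all names are assumptions of these sketches defined in the same module, not a library API) might read:

```python
def identify(foreground_sequence, gallery_features, gallery_labels):
    """End-to-end sketch: S1 preprocessing and gait cycle, S2/S3 LBP feature
    of the gait energy map, S4 nearest neighbor identity prediction."""
    frames = []
    for f in foreground_sequence:
        sil = (denoise_foreground(f) > 0).astype("uint8")  # binary 0/1 silhouette
        frames.append(align_to_template(normalize_size(sil)))
    half = half_gait_cycle(frames)
    cycle = frames[:2 * half]                              # one full gait cycle
    gei = gait_energy_image(cycle)
    feature = lbp_feature_vector(lbp_image(gei))
    return nearest_neighbor_identity(feature, gallery_features, gallery_labels)
```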
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An identity recognition method based on a gait energy map is characterized by comprising the following steps:
s1, acquiring a pedestrian foreground image sequence of the pedestrian to be identified in the normal walking state, extracting a gait cycle for the pedestrian foreground image sequence and generating a gait energy map;
s2, dividing the gait energy map into a plurality of subarea images with fixed sizes, and calculating to obtain LBP values of all pixel points in each subarea image;
s3, counting histograms of LBP values of all pixel points in each subregion image, carrying out normalization processing, and connecting the statistical histograms of all subregion images in the gait energy map into a one-dimensional characteristic vector to be used as LBP characteristics of the whole gait energy map to be recognized;
and S4, inputting the LBP characteristics of the gait energy diagram to be identified into the nearest neighbor classifier for distance measurement, and performing identity prediction identification according to the distance measurement result.
2. The gait energy map-based identity recognition method according to claim 1, wherein in step S1, the specific implementation process of extracting a gait cycle from the pedestrian foreground image sequence and generating the gait energy map is as follows:
s1.1, sequentially carrying out morphological denoising, size normalization and gravity center alignment on foreground images of all pedestrians in a pedestrian foreground image sequence to obtain a pedestrian contour binary image sequence;
s1.2, extracting a gait cycle for the normalized pedestrian contour binary image sequence based on the width change of a shank region to obtain a gait energy map.
3. The gait energy map-based identity recognition method according to claim 2, wherein in step S1.1, the morphological denoising process is:
Erosion, dilation, opening and closing operations are applied to the pedestrian foreground image in sequence, using a circular structuring element with a radius of 2, so as to denoise and smooth the pedestrian foreground image.
4. The gait energy map-based identity recognition method according to claim 2, wherein in step S1.1, the size normalization process is:
and determining four boundary points of the leftmost boundary point, the rightmost boundary point, the highest boundary point and the lowest boundary point in each pedestrian foreground image by traversing each pixel point of each pedestrian foreground image, selecting a pedestrian outline minimum rectangle with width W and height H from the frame, uniformly scaling the pedestrian outline minimum rectangle to the height of a P pixel, and obtaining the pedestrian outline image with the width scaled to the P multiplied by W/H pixel in an equal proportion.
5. The gait energy map-based identity recognition method according to claim 4, wherein in step S1.1, the process of aligning the center of gravity is as follows:
s1.1.1, calculating to obtain the gravity center of each pedestrian contour image:
$$x_c = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad y_c = \frac{1}{N}\sum_{i=1}^{N} y_i$$

where $x_i$ and $y_i$ respectively denote the horizontal and vertical coordinates of the pixel points with value 1 in the pedestrian contour image; $N$ denotes the total number of pixel points with value 1 in the pedestrian contour image; $x_c$ and $y_c$ respectively denote the horizontal and vertical coordinates of the center of gravity of the pedestrian contour image;
S1.1.2, creating a standard template with all pixel values 0 and a size of P × P pixels, whose center-of-gravity coordinates are (P/2, P/2); the center of gravity of the pedestrian contour image is then aligned with the center of gravity of the standard template and the contour image is placed in the template, thereby obtaining the pedestrian contour binary image sequence.
6. The gait energy map-based identity recognition method according to claim 4, wherein in step S1.2, the process of extracting gait cycles based on the width change of the calf region comprises:
S1.2.1, setting the region whose height ranges from 0 to 0.28H in the pedestrian contour binary image as the shank (lower-leg) region, traversing the pixel points in this region row by row, and recording the coordinates of the first and last pixel points with value 1 in each row as boundary points;
s1.2.2, subtracting the abscissa of the two boundary pixel points obtained in each line, taking the absolute value of the two boundary pixel points, and taking the maximum value of all the absolute values as the width value of the current pedestrian contour binary image;
s1.2.3, taking the time span between two pedestrian contour binary images with the maximum width value in the pedestrian contour binary image sequence as a half gait cycle.
7. The gait energy map-based identity recognition method according to claim 4, wherein in step S1.2, the specific process of obtaining the gait energy map is as follows:
averaging all pedestrian contour binary images subjected to size normalization and gravity center alignment in one gait cycle to obtain a gait energy map:
$$G(x, y) = \frac{1}{N}\sum_{t=1}^{N} B_t(x, y)$$

where $G(x, y)$ is the gait energy map, $N$ is the number of pedestrian contour binary images in one gait cycle, $B_t(x, y)$ is the $t$-th pedestrian contour binary image in the gait cycle, $(x, y)$ are the pixel coordinates, and $t$ is the frame index within the gait cycle.
8. An identity recognition method based on gait energy map according to any one of claims 1 to 7, characterized in that in S2, the process of calculating the LBP value of each pixel point in each subregion image is:
s2.1, selecting any central pixel point in the subregion image as a circle center, and comparing 8 pixel points in a circular neighborhood with the radius of 2;
s2.2, after comparison, marking the gray values of 8 pixel points in the neighborhood of the central pixel point as 1 or 0, and combining according to a clockwise sequence to obtain an 8-bit binary number as an LBP value of the central pixel point:
$$LBP(x_c, y_c) = \sum_{p=0}^{7} s(g_p - g_c)\, 2^p$$

$$s(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}$$

where $(x_c, y_c)$ are the coordinates of the center pixel point, $g_c$ is the gray value of the center pixel, $g_p$ is the gray value of the $p$-th pixel in the neighborhood of the center pixel, and $s(x)$ denotes the sign function.
9. The gait energy map-based identity recognition method according to any one of claims 1 to 7, wherein in step S4, the specific implementation process of inputting the LBP features of the gait energy map into the nearest neighbor classifier for distance measurement and performing identity prediction recognition according to the distance measurement result is as follows:
S4.1, inputting the LBP feature $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ of the gait energy map to be identified, obtained in step S3, into the nearest neighbor classifier;
S4.2, calculating the Euclidean distances between the LBP feature $G = (g^{(1)}, g^{(2)}, g^{(3)}, \ldots, g^{(n)})$ of the gait energy map to be identified and all samples in the sample library of the nearest neighbor classifier:
$$d(G, P(i)) = \sqrt{\sum_{j=1}^{n}\left(g^{(j)} - p(i)^{(j)}\right)^2}$$
where $P(i) = (p(i)^{(1)}, p(i)^{(2)}, p(i)^{(3)}, \ldots, p(i)^{(n)})$ denotes the $i$-th sample in the sample library of the nearest neighbor classifier;
and S4.3, selecting a sample which is closest to the LBP characteristic of the gait energy image to be identified in the sample library, predicting the class of the sample as the class of the LBP characteristic of the gait energy image to be identified, and identifying the identity of the pedestrian corresponding to the gait energy image to be identified.
10. A gait energy map-based identity recognition system, comprising a memory and a processor, wherein the memory stores a gait energy map-based identity recognition program, and the processor executes the steps of any one of the methods of claims 1 to 9 when running the program.
CN201911131622.7A 2019-11-19 2019-11-19 Identity recognition method and system based on gait energy diagram Pending CN111104857A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911131622.7A CN111104857A (en) 2019-11-19 2019-11-19 Identity recognition method and system based on gait energy diagram

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911131622.7A CN111104857A (en) 2019-11-19 2019-11-19 Identity recognition method and system based on gait energy diagram

Publications (1)

Publication Number Publication Date
CN111104857A true CN111104857A (en) 2020-05-05

Family

ID=70420608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911131622.7A Pending CN111104857A (en) 2019-11-19 2019-11-19 Identity recognition method and system based on gait energy diagram

Country Status (1)

Country Link
CN (1) CN111104857A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111968063A (en) * 2020-09-07 2020-11-20 北京凌云光技术集团有限责任公司 Morphological image filtering device and method
CN117132948A (en) * 2023-10-27 2023-11-28 南昌理工学院 Scenic spot tourist flow monitoring method, system, readable storage medium and computer

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100108744A (en) * 2009-03-30 2010-10-08 연세대학교 산학협력단 Method and system of backpack removal for gait recognition
CN103942577A (en) * 2014-04-29 2014-07-23 上海复控华龙微***技术有限公司 Identity identification method based on self-established sample library and composite characters in video monitoring
US20160217319A1 (en) * 2012-10-01 2016-07-28 The Regents Of The University Of California Unified face representation for individual recognition in surveillance videos and vehicle logo super-resolution system
CN109919137A (en) * 2019-03-28 2019-06-21 广东省智能制造研究所 A kind of pedestrian's structured features expression

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100108744A (en) * 2009-03-30 2010-10-08 연세대학교 산학협력단 Method and system of backpack removal for gait recognition
US20160217319A1 (en) * 2012-10-01 2016-07-28 The Regents Of The University Of California Unified face representation for individual recognition in surveillance videos and vehicle logo super-resolution system
CN103942577A (en) * 2014-04-29 2014-07-23 上海复控华龙微***技术有限公司 Identity identification method based on self-established sample library and composite characters in video monitoring
CN109919137A (en) * 2019-03-28 2019-06-21 广东省智能制造研究所 A kind of pedestrian's structured features expression

Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
AMER G. BINSAADOON ET AL.: "Kernel-Based Fuzzy Local Binary Pattern for Gait Recognition", pages 35 - 40 *
CSDN: "机器学习——降维、聚类、分类、回归", pages 1 - 10, Retrieved from the Internet <URL:http://t.csdn.cn/9ToWP> *
ZHONG LI ET AL.: "Gait Energy Image Based on Static Region Alignment for Pedestrian Gait Recognition", no. 49, pages 1 - 6, XP058884957, DOI: 10.1145/3387168.3387201 *
刘志勇; 冯国灿; 陈伟福: "Gait recognition based on local binary patterns and discriminative common vectors", Computer Science (计算机科学), vol. 40, no. 09, pages 262 - 265 *
刘志勇 et al.: "Gait recognition based on local binary patterns and discriminative common vectors" *
刘文婷; 卢新明: "Gait recognition based on hierarchical fusion of LBP and HOG features", Computer Engineering and Applications (计算机工程与应用), vol. 54, no. 24, pages 168 - 175 *
姜佳楠: "Research on gait recognition methods based on video sequences", China Master's Theses Full-text Database (Information Science and Technology), no. 08, pages 138 - 584 *
宋克臣; 颜云辉; 陈文辉; 张旭: "Research and prospect of local binary pattern methods", Acta Automatica Sinica (自动化学报), vol. 39, no. 06, pages 730 - 744 *
宋克臣 et al.: "Research and prospect of local binary pattern methods" *
蒋忠仁: "Professional English for Computer Applications (计算机应用专业英语)", Chongqing: Chongqing University Press, pages 139 - 140 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111968063A (en) * 2020-09-07 2020-11-20 北京凌云光技术集团有限责任公司 Morphological image filtering device and method
CN111968063B (en) * 2020-09-07 2024-01-26 凌云光技术股份有限公司 Morphological image filtering device and method
CN117132948A (en) * 2023-10-27 2023-11-28 南昌理工学院 Scenic spot tourist flow monitoring method, system, readable storage medium and computer
CN117132948B (en) * 2023-10-27 2024-01-30 南昌理工学院 Scenic spot tourist flow monitoring method, system, readable storage medium and computer

Similar Documents

Publication Publication Date Title
CN110427905B (en) Pedestrian tracking method, device and terminal
US20210065384A1 (en) Target tracking method, device, system and non-transitory computer readable storage medium
JP4216668B2 (en) Face detection / tracking system and method for detecting and tracking multiple faces in real time by combining video visual information
CN105740780B (en) Method and device for detecting living human face
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
Merad et al. Fast people counting using head detection from skeleton graph
JP2017016593A (en) Image processing apparatus, image processing method, and program
CN108416291B (en) Face detection and recognition method, device and system
Everingham et al. Automated person identification in video
Lu et al. Multimodal facial feature extraction for automatic 3D face recognition
Shen et al. Adaptive pedestrian tracking via patch-based features and spatial–temporal similarity measurement
CN110222661B (en) Feature extraction method for moving target identification and tracking
JP2011113313A (en) Attitude estimation device
Bedagkar-Gala et al. Gait-assisted person re-identification in wide area surveillance
Almaadeed et al. Partial shoeprint retrieval using multiple point-of-interest detectors and SIFT descriptors
Sasikala et al. Feature extraction of real-time image using Sift algorithm
CN111104857A (en) Identity recognition method and system based on gait energy diagram
JP2021071769A (en) Object tracking device and object tracking method
Wang et al. A gradient based weighted averaging method for estimation of fingerprint orientation fields
Qi et al. Segmentation of fingerprint images using the gradient vector field
CN114373203A (en) Picture archiving method and device, terminal equipment and computer readable storage medium
KR100711223B1 (en) Face recognition method using Zernike/LDA and recording medium storing the method
Campadelli et al. A color based method for face detection
KR100540889B1 (en) Tracking Method for The Moving Object using Basic Points
CN117037272B (en) Method and system for monitoring fall of old people

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination