CN114067421B - Personnel duplicate removal identification method, storage medium and computer equipment - Google Patents

Personnel duplicate removal identification method, storage medium and computer equipment Download PDF

Info

Publication number
CN114067421B
CN114067421B CN202210046435.4A CN202210046435A CN114067421B CN 114067421 B CN114067421 B CN 114067421B CN 202210046435 A CN202210046435 A CN 202210046435A CN 114067421 B CN114067421 B CN 114067421B
Authority
CN
China
Prior art keywords
face
person
information
similarity
face information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210046435.4A
Other languages
Chinese (zh)
Other versions
CN114067421A (en
Inventor
周耿城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Zhongyun Information Technology Co ltd
Original Assignee
Guangdong Zhongyun Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Zhongyun Information Technology Co ltd filed Critical Guangdong Zhongyun Information Technology Co ltd
Priority to CN202210046435.4A priority Critical patent/CN114067421B/en
Publication of CN114067421A publication Critical patent/CN114067421A/en
Application granted granted Critical
Publication of CN114067421B publication Critical patent/CN114067421B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/10Movable barriers with registering means
    • G07C9/15Movable barriers with registering means with arrangements to prevent the passage of more than one individual at a time
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/30Individual registration on entry or exit not involving the use of a pass
    • G07C9/32Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a person duplicate removal identification method, a storage medium and computer equipment, and relates to the technical field of image processing, wherein the person duplicate removal identification method can effectively perform person distinguishing duplicate removal identification aiming at persons in a list or strangers of persons in a non-list by maintaining a cache queue T; by adopting the face frame position overlapping degree and matching with the similarity double threshold value of the face algorithm, on one hand, the problem of person tracking loss caused by video decoding frame dropping or low detection rate of the face algorithm and the like of the person can be solved, and extra wrong passing records generated when the person passes in the list are effectively reduced; on the other hand, the similarity double thresholds of the face algorithm are adopted to realize effective duplication removal on both static targets and non-static targets; no matter the overlapping degree of the positions of the face frames or the similarity calculation of the face algorithm, the calculation process has low requirements on calculation force and can run smoothly on most equipment.

Description

Personnel duplicate removal identification method, storage medium and computer equipment
Technical Field
The invention relates to the technical field of image processing, in particular to a person duplicate removal identification method, a storage medium and computer equipment.
Background
With the maturity and development of face recognition technology, the application of face recognition in intelligent access control systems is more and more extensive, and a plurality of corresponding face recognition access control systems appear in the market. However, when people walk in front of the camera, people often take a snapshot of two face photos, so that the face recognition intelligent access control system can generate face image records of the same person but with a small difference. For this purpose, it is necessary to minimize such recordings by means of a deduplication algorithm. At present, the main face image deduplication algorithms mainly include a deduplication method based on traditional image features (such as a color histogram and an LBP histogram), a fast deduplication method based on motion matching, and a deduplication algorithm based on tracking.
The method for removing the duplicate based on the traditional image characteristics can remove 50% of repeated human faces, and the false duplicate removal ratio is low. However, due to the fact that light differences of different video monitoring scenes are too large, all application scenes cannot be compatible by using the same threshold value, and the method is not universal; the fast duplicate removal method based on motion matching needs to set two parameter values of a pixel difference threshold and a difference ratio, mainly removes static repeated targets, and cannot process duplicate removal of non-static targets; the deduplication algorithm based on tracking requires three parameters to be set: tracking frequency, deduplication frequency and retention frequency, and the deduplication effect depends on the detection algorithm and the tracking algorithm. If the detection frequency is low or the detection rate of the detection algorithm is poor, the person can be lost; and because of having the minimum detection frequency requirement, have higher requirement to machine performance.
Meanwhile, in the above several duplicate removal algorithms, no consideration is given to the classification of people (i.e. people in a list or strangers) in the actual access control application, and no special targeted treatment is performed; particularly, when a person in the list just appears on a camera screen, the captured face may not reach the recognition similarity threshold, and the person is determined as a stranger passing record; when the user walks to the middle area of the video monitoring, the face captured by the capturing device can reach the threshold value of the recognition similarity, and then the user can be judged as a passing record of the people in the list; when people in the list leave the camera screen soon, the face captured by the camera may not reach the recognition similarity threshold, and the person is judged to be a stranger passing record; when people in the list pass through the camera, two different types of traffic records occur; these are additional false passage records that are generated without consideration of the classification of people that may be present in the application.
Disclosure of Invention
The present invention is directed to a person duplicate removal recognition method, a storage medium, and a computer device, so as to solve the problems in the background art.
In order to solve the technical problems, the invention provides the following technical scheme: a person duplicate removal identification method comprises the following steps:
s1, creating an empty buffer queue
Figure 404081DEST_PATH_IMAGE001
Buffer queue
Figure 946721DEST_PATH_IMAGE001
The queue element data comprises face images
Figure 297591DEST_PATH_IMAGE002
Position of human face frame
Figure 900695DEST_PATH_IMAGE003
Face pose score
Figure 286252DEST_PATH_IMAGE004
Face feature code
Figure 274418DEST_PATH_IMAGE005
Class bit
Figure 762101DEST_PATH_IMAGE006
And second-level time stamp of element addition time
Figure 400765DEST_PATH_IMAGE007
S2, circularly taking out the decoded picture from the network camera; suppose that one frame of decoded pictures is
Figure 403536DEST_PATH_IMAGE008
And obtaining the second-level time stamp of the current time
Figure 183890DEST_PATH_IMAGE009
(ii) a Checking buffer queues
Figure 840614DEST_PATH_IMAGE001
The second-level time stamps of all queue elements in the queue, if the second-level time stamp of the current time is
Figure 262243DEST_PATH_IMAGE010
Subtract a queue element
Figure 261515DEST_PATH_IMAGE011
Time stamp of second order
Figure 90849DEST_PATH_IMAGE012
The obtained difference
Figure 476793DEST_PATH_IMAGE013
Further determining the queue element
Figure 398704DEST_PATH_IMAGE011
Is classified into a plurality of groups
Figure 191611DEST_PATH_IMAGE014
The value of (a) is,
Figure 893030DEST_PATH_IMAGE015
the time required for a person to normally pass in front of the camera is accurate to seconds; if it is
Figure 721786DEST_PATH_IMAGE016
Then push the queue element
Figure 938183DEST_PATH_IMAGE011
And from the cache queue
Figure 605887DEST_PATH_IMAGE001
Delete the queue element
Figure 958550DEST_PATH_IMAGE011
(ii) a If it is
Figure 163927DEST_PATH_IMAGE017
Then directly from the buffer queue
Figure 624410DEST_PATH_IMAGE001
Delete the teamColumn element
Figure 310313DEST_PATH_IMAGE011
S3, displaying the picture
Figure 305904DEST_PATH_IMAGE018
Face detection is carried out to obtain pictures
Figure 419533DEST_PATH_IMAGE018
Extracting face feature codes of face images of all the face information to obtain face feature code information corresponding to the face images;
s4, aligning the pictures respectively
Figure 886554DEST_PATH_IMAGE018
Performing face feature code first similarity on face feature codes extracted from all face information and face feature codes of people in the list one by one
Figure 354449DEST_PATH_IMAGE019
Calculating and sorting the similarity values in a descending order; if the maximum similarity value is larger than or equal to the similarity threshold value of the same person judged by the algorithm
Figure 944354DEST_PATH_IMAGE020
If not, the person corresponding to the face information is regarded as a stranger in the non-list;
s5, aligning the pictures respectively
Figure 431674DEST_PATH_IMAGE018
All face information and buffer queue in the system
Figure 635258DEST_PATH_IMAGE001
The overlapping degree of the positions of the face frames of all the internal elements is carried out one by one
Figure 663436DEST_PATH_IMAGE021
Second similarity with face feature code
Figure 104918DEST_PATH_IMAGE022
Calculating;
s6, if one element exists
Figure 88578DEST_PATH_IMAGE023
With certain face information
Figure 776529DEST_PATH_IMAGE024
Meet the overlapping degree of the face frame position
Figure 682434DEST_PATH_IMAGE021
Threshold value
Figure 432615DEST_PATH_IMAGE025
Or a second similarity of the face feature codes
Figure 81317DEST_PATH_IMAGE022
Threshold value
Figure 203036DEST_PATH_IMAGE026
Then the face information is considered
Figure 835666DEST_PATH_IMAGE024
The corresponding personnel are already present, and classification judgment is further carried out according to the personnel types;
s7, if one element does not exist
Figure 986161DEST_PATH_IMAGE023
With certain face information
Figure 574273DEST_PATH_IMAGE024
Meet the overlapping degree of the face frame position
Figure 536285DEST_PATH_IMAGE021
Threshold value
Figure 430996DEST_PATH_IMAGE025
Or a second similarity of the face feature codes
Figure 970085DEST_PATH_IMAGE022
Threshold value
Figure 932361DEST_PATH_IMAGE026
(ii) a An element is newly created
Figure 556901DEST_PATH_IMAGE027
To convert the face information
Figure 72945DEST_PATH_IMAGE024
And time stamp
Figure 819538DEST_PATH_IMAGE028
Supplement to elements
Figure 749452DEST_PATH_IMAGE027
(ii) a Further carrying out classification judgment according to personnel types and setting new elements
Figure 327200DEST_PATH_IMAGE027
Corresponding classification zone bit
Figure 447869DEST_PATH_IMAGE029
(ii) a Will new elements
Figure 433322DEST_PATH_IMAGE027
Store in buffer queue
Figure 293385DEST_PATH_IMAGE001
In step (2), the addition of the new cache record is completed.
Further, in step S2,
Figure 304901DEST_PATH_IMAGE030
is a positive integer; in step S5, the face frame position overlap degree
Figure 149365DEST_PATH_IMAGE021
Calculating the second similarity of face feature code for duplication elimination of the first feature
Figure 777746DEST_PATH_IMAGE031
A second feature calculation is performed for deduplication.
Further, the degree of overlap of the positions of the face frames in step S5
Figure 995936DEST_PATH_IMAGE021
The calculation method of (2) is as follows:
assume a rectangle
Figure 79434DEST_PATH_IMAGE032
The vertex coordinates of the upper left corner are respectively
Figure 199361DEST_PATH_IMAGE033
Figure 607740DEST_PATH_IMAGE034
Having a height and a width of
Figure 805287DEST_PATH_IMAGE035
And
Figure 299116DEST_PATH_IMAGE036
rectangular shape
Figure 149965DEST_PATH_IMAGE037
The vertex coordinates of the upper left corner are respectively
Figure 752465DEST_PATH_IMAGE038
Figure 837756DEST_PATH_IMAGE039
Having a height and a width of
Figure 102039DEST_PATH_IMAGE040
And
Figure 746522DEST_PATH_IMAGE041
duplication removing first characteristic face frame position overlapping degree
Figure 946739DEST_PATH_IMAGE042
The calculation formula of (a) is as follows:
Figure 437352DEST_PATH_IMAGE043
duplication removing first characteristic face frame position overlapping degree
Figure 909970DEST_PATH_IMAGE042
Has a numerical value range of
Figure 420461DEST_PATH_IMAGE044
Further, overlapping two rectangular edges is not considered overlapping.
Further, in step S6:
s61, determining face information
Figure 224254DEST_PATH_IMAGE045
Is a person in the list, and element
Figure 195888DEST_PATH_IMAGE046
Inner pushing zone bit
Figure 774378DEST_PATH_IMAGE047
If the value of (2) is 0, the face information is immediately pushed
Figure 462065DEST_PATH_IMAGE045
And modifying the elements
Figure 102785DEST_PATH_IMAGE046
Inner pushing zone bit
Figure 491659DEST_PATH_IMAGE047
Has a value of 1;
s62, if the face information
Figure 186000DEST_PATH_IMAGE045
Is a person in the list, and element
Figure 138697DEST_PATH_IMAGE046
Inner pushing zone bit
Figure 633924DEST_PATH_IMAGE047
If the value of (1) is less than the predetermined threshold, the face information does not need to be pushed
Figure 664997DEST_PATH_IMAGE045
S63, determining face information
Figure 612885DEST_PATH_IMAGE045
If the person is not in the list, the judgment element is not needed
Figure 308817DEST_PATH_IMAGE046
Inner pushing zone bit
Figure 285916DEST_PATH_IMAGE047
A value of (d);
s64, then face information
Figure 547751DEST_PATH_IMAGE045
Performing face pose scoring
Figure 448847DEST_PATH_IMAGE048
And (6) calculating and judging.
Further, in step S64, the face pose score
Figure 194609DEST_PATH_IMAGE048
The calculation method is as follows:
face pose score
Figure 130341DEST_PATH_IMAGE048
According to the face pose angle
Figure 506619DEST_PATH_IMAGE049
Performing calculations, respectively, of
Figure 169379DEST_PATH_IMAGE050
Figure 996131DEST_PATH_IMAGE051
And
Figure 114265DEST_PATH_IMAGE052
three angles, all of which have values in the range
Figure 395865DEST_PATH_IMAGE053
Within the range of degrees, the closer any angle value is to zero, the better the face pose angle is, and the corresponding face pose score
Figure 333870DEST_PATH_IMAGE054
The higher;
concrete face pose score
Figure 830890DEST_PATH_IMAGE054
The formula for the calculation is as follows:
Figure 537951DEST_PATH_IMAGE055
wherein the face pose score
Figure 438786DEST_PATH_IMAGE054
Has a numerical value range of
Figure 398176DEST_PATH_IMAGE056
Figure 894428DEST_PATH_IMAGE057
To calculate the absolute value function of a real number.
Further, in step S64:
s641, if the face information
Figure 399538DEST_PATH_IMAGE058
Face pose score of
Figure 339118DEST_PATH_IMAGE059
Higher than element
Figure 257575DEST_PATH_IMAGE060
Inner face pose score
Figure 830345DEST_PATH_IMAGE061
Using the face information
Figure 511998DEST_PATH_IMAGE058
Face image of
Figure 100110DEST_PATH_IMAGE062
Face pose score
Figure 965916DEST_PATH_IMAGE063
Position of human face frame
Figure 611359DEST_PATH_IMAGE064
Face feature code
Figure 944729DEST_PATH_IMAGE065
Replacement elements
Figure 641426DEST_PATH_IMAGE060
Information of the corresponding item in (1);
s642, if the face information
Figure 268895DEST_PATH_IMAGE058
Face pose score of
Figure 449520DEST_PATH_IMAGE066
Not higher than element
Figure 105766DEST_PATH_IMAGE060
Inner face pose score
Figure 679754DEST_PATH_IMAGE061
Then only the face information is used
Figure 732203DEST_PATH_IMAGE058
Face frame position of
Figure 190152DEST_PATH_IMAGE064
Face feature code
Figure 175605DEST_PATH_IMAGE065
Replacement elements
Figure 206581DEST_PATH_IMAGE060
Information of the corresponding item in (1);
s643, update element
Figure 271625DEST_PATH_IMAGE060
Time stamp of
Figure 364091DEST_PATH_IMAGE067
As a time stamp
Figure 879084DEST_PATH_IMAGE068
Further, in step S7, if there is no one of the elements
Figure 619643DEST_PATH_IMAGE060
And the face information
Figure 783036DEST_PATH_IMAGE058
Meet the overlapping degree of the face frame position
Figure 710153DEST_PATH_IMAGE069
Threshold value
Figure 367799DEST_PATH_IMAGE070
Or a second similarity of the face feature codes
Figure 485451DEST_PATH_IMAGE071
Threshold value
Figure 152453DEST_PATH_IMAGE072
Then the face information is considered
Figure 810491DEST_PATH_IMAGE058
Corresponding personnel do not appear, and the face information needs to be added
Figure 525906DEST_PATH_IMAGE058
Recording for new buffer; newly building an element
Figure 159934DEST_PATH_IMAGE073
To convert the face information
Figure 689796DEST_PATH_IMAGE058
Face frame position of
Figure 159246DEST_PATH_IMAGE074
Human face pose angle
Figure 237758DEST_PATH_IMAGE075
Human face image
Figure 25109DEST_PATH_IMAGE076
Face feature code
Figure 773759DEST_PATH_IMAGE077
Time stamp of the order of seconds
Figure 573267DEST_PATH_IMAGE078
Supplement to elements
Figure 91445DEST_PATH_IMAGE073
(ii) a Further carrying out classification judgment according to the personnel types, if the face information
Figure 49224DEST_PATH_IMAGE058
If the corresponding person is a person in the list, the face information is immediately pushed
Figure 285643DEST_PATH_IMAGE058
And new elements are combined
Figure 298296DEST_PATH_IMAGE073
Classification markSign position
Figure 105459DEST_PATH_IMAGE079
Is set to 1; if the face information
Figure 706384DEST_PATH_IMAGE058
If the corresponding person is a stranger of persons in the non-list, the new element is added
Figure 497802DEST_PATH_IMAGE073
Push flag bit
Figure 302428DEST_PATH_IMAGE079
Is set to 0; and new elements are added
Figure 576814DEST_PATH_IMAGE073
Store in buffer queue
Figure 433128DEST_PATH_IMAGE080
In step (2), the addition of the new cache record is completed.
The invention also provides a storage medium for person deduplication identification, on which computer instructions are stored, which, when executed by a processor, implement the steps of a person deduplication identification method as described in any one of the above.
The invention also provides computer equipment for person duplicate removal identification, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein when the processor runs the computer program, the person duplicate removal identification method is realized.
Compared with the prior art, the invention has the following beneficial effects:
1. the invention carries out the operation on all the faces recognized by each frame and maintains a buffer queue
Figure 912175DEST_PATH_IMAGE080
Can effectively distinguish and duplicate-removing personnel aiming at the personnel in the list or strangers of the personnel in the non-list(ii) a Because other traditional characteristics are not adopted, the required duplication removing characteristics are face pose scores and face similarity values obtained by the position of a face frame and the face pose angle; the characteristics are all information possessed by a general face algorithm, are irrelevant to a camera scene and have universality; by adopting the face frame position overlapping degree and the similarity double threshold value of the face algorithm, on one hand, the problem of person tracking loss caused by frame dropping of video decoding or low detection rate of the face algorithm and the like of the person can be solved, and the extra wrong passing record generated when the person passes in the list is effectively reduced by matching and tracking the second similarity threshold value of the face algorithm again; on the other hand, by adopting the similarity double thresholds of the face algorithm, even if the personnel are still for a long time, the personnel can be effectively compared, and the static target and the non-static target can be effectively removed; no matter the overlapping degree of the positions of the face frames or the similarity calculation of the face algorithm, the calculation process has low requirements on calculation force and can run smoothly on most equipment;
2. when the person in the list just appears in the camera screen, the captured face may not reach the recognition similarity threshold and can be judged as a stranger passing record, but due to the adoption of the delay method
Figure 546858DEST_PATH_IMAGE081
Second push, so push will not be done immediately at this time, and only buffer queue will exist
Figure 449647DEST_PATH_IMAGE080
Internal; when the user walks to the middle area of video monitoring, the captured face possibly reaches the threshold value of the recognition similarity, the user can be judged to be the passing record of the people in the list, at the moment, according to the judgment logic of the invention, the information of the people in the list can be immediately pushed, and the information is cached in a cache queue
Figure 189112DEST_PATH_IMAGE080
Matching the cache records, updating the data of the records, and updating the classification mark of the records to be 1; when people in the list are about to leave the camera picture, the face captured by the camera may not reach the threshold of the recognition similarity, and the face can be judgedMaking a pass record of strangers, and then according to the cache queue
Figure 886946DEST_PATH_IMAGE080
The internal cache records can be normally matched with the cache records meeting the conditions, the classification mark is 1, and the system cannot push the stranger information; therefore, the invention can ensure that only one correct pass record is generated when a single person passes by the system to the maximum extent.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic illustration of the present invention when the rectangles are not overlapping;
FIG. 2 is a schematic illustration of the invention when rectangles overlap;
FIG. 3 is a schematic diagram of the face pose angles of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
A person deduplication identification method as shown in fig. 1-3, comprises the following steps:
s1, creating an empty buffer queue
Figure 301883DEST_PATH_IMAGE080
Buffer queue
Figure 37281DEST_PATH_IMAGE080
The queue element data comprises face images
Figure 998246DEST_PATH_IMAGE082
Position of human face frame
Figure 923656DEST_PATH_IMAGE083
Face pose score
Figure 936091DEST_PATH_IMAGE084
Face feature code
Figure 672801DEST_PATH_IMAGE085
Class bit
Figure 19648DEST_PATH_IMAGE086
And second-level time stamp of element addition time
Figure 426495DEST_PATH_IMAGE087
S2, circularly taking out the decoded picture from the network camera; suppose that one frame of decoded pictures is
Figure 183234DEST_PATH_IMAGE088
And obtaining the second-level time stamp of the current time
Figure 893225DEST_PATH_IMAGE089
(ii) a Checking buffer queues
Figure 605447DEST_PATH_IMAGE080
The second-level time stamps of all queue elements in the queue, if the second-level time stamp of the current time is
Figure 505449DEST_PATH_IMAGE089
Subtract a queue element
Figure 365547DEST_PATH_IMAGE090
Time stamp of second order
Figure 195623DEST_PATH_IMAGE091
The obtained difference
Figure 84605DEST_PATH_IMAGE092
Further determining the queue element
Figure 466044DEST_PATH_IMAGE090
Is classified into a plurality of groups
Figure 155132DEST_PATH_IMAGE093
The value of (a) is,
Figure 249169DEST_PATH_IMAGE094
the time required for a person to normally pass in front of the camera is accurate to seconds; if it is
Figure 569246DEST_PATH_IMAGE095
Then push the queue element
Figure 440910DEST_PATH_IMAGE090
And from the cache queue
Figure 879688DEST_PATH_IMAGE080
Delete the queue element
Figure 418778DEST_PATH_IMAGE090
(ii) a If it is
Figure 649562DEST_PATH_IMAGE096
Then directly from the buffer queue
Figure 734155DEST_PATH_IMAGE080
Delete the queue element
Figure 920639DEST_PATH_IMAGE090
In step S2, the process proceeds,
Figure 717830DEST_PATH_IMAGE094
is a positive integer; in step S5, the face frame position overlap degree
Figure 385095DEST_PATH_IMAGE097
Calculating the second similarity of face feature code for duplication elimination of the first feature
Figure 968703DEST_PATH_IMAGE098
Computing for the de-duplication second features;
s3, displaying the picture
Figure 43564DEST_PATH_IMAGE099
Face detection is carried out to obtain pictures
Figure 929881DEST_PATH_IMAGE099
Extracting face feature codes of face images of all the face information to obtain face feature code information corresponding to the face images;
s4, aligning the pictures respectively
Figure 180431DEST_PATH_IMAGE099
Performing face feature code first similarity on face feature codes extracted from all face information and face feature codes of people in the list one by one
Figure 774177DEST_PATH_IMAGE100
Calculating and sorting the similarity values in a descending order; if the maximum similarity value is larger than or equal to the similarity threshold value of the same person judged by the algorithm
Figure 36885DEST_PATH_IMAGE101
If not, the person corresponding to the face information is regarded as a stranger in the non-list;
s5, aligning the pictures respectively
Figure 139494DEST_PATH_IMAGE099
All face information and buffer queue in the system
Figure 656283DEST_PATH_IMAGE102
The overlapping degree of the positions of the face frames of all the internal elements is carried out one by one
Figure 211553DEST_PATH_IMAGE097
Second similarity with face feature code
Figure 527219DEST_PATH_IMAGE103
Calculating;
s6, if one element exists
Figure 133001DEST_PATH_IMAGE104
With certain face information
Figure 47391DEST_PATH_IMAGE105
Meet the overlapping degree of the face frame position
Figure 525778DEST_PATH_IMAGE097
Threshold value
Figure 918237DEST_PATH_IMAGE106
Or a second similarity of the face feature codes
Figure 995438DEST_PATH_IMAGE103
Threshold value
Figure 343379DEST_PATH_IMAGE107
Then the face information is considered
Figure 195278DEST_PATH_IMAGE105
The corresponding personnel are already present, and classification judgment is further carried out according to the personnel types;
s61, determining face information
Figure 422113DEST_PATH_IMAGE105
Is a person in the list, and element
Figure 631118DEST_PATH_IMAGE108
Inner pushing zone bit
Figure 816205DEST_PATH_IMAGE109
If the value of (2) is 0, the face information is immediately pushed
Figure 316853DEST_PATH_IMAGE105
And modifying the elements
Figure 657972DEST_PATH_IMAGE108
Inner pushing zone bit
Figure 193255DEST_PATH_IMAGE109
Has a value of 1;
s62, if the face information
Figure 865421DEST_PATH_IMAGE105
Is a person in the list, and element
Figure 181261DEST_PATH_IMAGE108
Inner pushing zone bit
Figure 981863DEST_PATH_IMAGE109
If the value of (1) is less than the predetermined threshold, the face information does not need to be pushed
Figure 229441DEST_PATH_IMAGE105
S63, determining face information
Figure 214246DEST_PATH_IMAGE105
If the person is not in the list, the judgment element is not needed
Figure 65051DEST_PATH_IMAGE108
Inner pushing zone bit
Figure 550570DEST_PATH_IMAGE109
A value of (d);
s64, then face information
Figure 992269DEST_PATH_IMAGE105
Performing face pose scoring
Figure 761165DEST_PATH_IMAGE110
Calculating and judging;
s641, if the face information
Figure 734353DEST_PATH_IMAGE105
Face pose score ofNumber of
Figure 148196DEST_PATH_IMAGE111
Higher than element
Figure 376225DEST_PATH_IMAGE108
Inner face pose score
Figure 850111DEST_PATH_IMAGE112
Using the face information
Figure 743685DEST_PATH_IMAGE105
Face image of
Figure 305426DEST_PATH_IMAGE113
Face pose score
Figure 777702DEST_PATH_IMAGE111
Position of human face frame
Figure 420033DEST_PATH_IMAGE114
Face feature code
Figure 336654DEST_PATH_IMAGE115
Replacement elements
Figure 89370DEST_PATH_IMAGE108
Information of the corresponding item in (1);
s642, if the face information
Figure 407836DEST_PATH_IMAGE105
Face pose score of
Figure 225981DEST_PATH_IMAGE111
Not higher than element
Figure 240158DEST_PATH_IMAGE108
Inner face pose score
Figure 186778DEST_PATH_IMAGE112
Then only the face message is usedInformation processing device
Figure 165278DEST_PATH_IMAGE105
Face frame position of
Figure 617780DEST_PATH_IMAGE116
Face feature code
Figure 952535DEST_PATH_IMAGE117
Replacement elements
Figure 222286DEST_PATH_IMAGE108
Information of the corresponding item in (1);
s643, update element
Figure 31854DEST_PATH_IMAGE108
Time stamp of
Figure 592940DEST_PATH_IMAGE118
As a time stamp
Figure 567855DEST_PATH_IMAGE119
S7, if one element does not exist
Figure 338501DEST_PATH_IMAGE108
And the face information
Figure 757504DEST_PATH_IMAGE105
Meet the overlapping degree of the face frame position
Figure 676443DEST_PATH_IMAGE120
Threshold value
Figure 565685DEST_PATH_IMAGE121
Or a second similarity of the face feature codes
Figure 467918DEST_PATH_IMAGE122
Threshold value
Figure 811268DEST_PATH_IMAGE123
Then the face information is considered
Figure 969284DEST_PATH_IMAGE105
Corresponding personnel do not appear, and the face information needs to be added
Figure 516858DEST_PATH_IMAGE105
Recording for new buffer; newly building an element
Figure 219853DEST_PATH_IMAGE124
To convert the face information
Figure 347869DEST_PATH_IMAGE105
Face frame position of
Figure 874189DEST_PATH_IMAGE125
Human face pose angle
Figure 378970DEST_PATH_IMAGE126
Human face image
Figure 157094DEST_PATH_IMAGE127
Face feature code
Figure 20842DEST_PATH_IMAGE128
Time stamp of the order of seconds
Figure 906677DEST_PATH_IMAGE129
Supplement to elements
Figure 308407DEST_PATH_IMAGE124
(ii) a Further carrying out classification judgment according to the personnel types, if the face information
Figure 412591DEST_PATH_IMAGE105
If the corresponding person is a person in the list, the face information is immediately pushed
Figure 252550DEST_PATH_IMAGE105
And new elements are combined
Figure 927863DEST_PATH_IMAGE124
Classification zone bit
Figure 76608DEST_PATH_IMAGE130
Is set to 1; if the face information
Figure 600129DEST_PATH_IMAGE105
If the corresponding person is a stranger of persons in the non-list, the new element is added
Figure 840402DEST_PATH_IMAGE124
Push flag bit
Figure 952195DEST_PATH_IMAGE130
Is set to 0; and new elements are added
Figure 398356DEST_PATH_IMAGE124
Store in buffer queue
Figure 178098DEST_PATH_IMAGE131
In step (2), the addition of the new cache record is completed.
Face frame position overlap degree in step S5
Figure 588582DEST_PATH_IMAGE132
The calculation method of (2) is as follows:
assume a rectangle
Figure 905364DEST_PATH_IMAGE133
The vertex coordinates of the upper left corner are respectively
Figure 229034DEST_PATH_IMAGE134
Figure 235776DEST_PATH_IMAGE135
Having a height and a width of
Figure 882190DEST_PATH_IMAGE136
And
Figure 541978DEST_PATH_IMAGE137
rectangular shape
Figure 36767DEST_PATH_IMAGE138
The vertex coordinates of the upper left corner are respectively
Figure 302187DEST_PATH_IMAGE139
Figure 825751DEST_PATH_IMAGE140
Having a height and a width of
Figure 420158DEST_PATH_IMAGE141
And
Figure 980597DEST_PATH_IMAGE142
duplication removing first characteristic face frame position overlapping degree
Figure 871546DEST_PATH_IMAGE143
The calculation formula of (a) is as follows:
Figure 639830DEST_PATH_IMAGE144
duplication removing first characteristic face frame position overlapping degree
Figure 551073DEST_PATH_IMAGE143
Has a numerical value range of
Figure 480506DEST_PATH_IMAGE145
The overlap of two rectangular edges is not considered to be an overlap.
In step S64, the face pose score
Figure 615957DEST_PATH_IMAGE146
The calculation method is as follows:
face pose score
Figure 873834DEST_PATH_IMAGE146
According to the face pose angle
Figure 49728DEST_PATH_IMAGE147
Performing calculations, respectively, of
Figure 528775DEST_PATH_IMAGE148
Figure 142951DEST_PATH_IMAGE149
And
Figure 620248DEST_PATH_IMAGE150
three angles, all of which have values in the range
Figure 825625DEST_PATH_IMAGE151
Within the range of degrees, the closer any angle value is to zero, the better the face pose angle is, and the corresponding face pose score
Figure 971792DEST_PATH_IMAGE146
The higher;
concrete face pose score
Figure 920817DEST_PATH_IMAGE146
The formula for the calculation is as follows:
Figure 874126DEST_PATH_IMAGE152
wherein the face pose score
Figure 247475DEST_PATH_IMAGE146
Has a numerical value range of
Figure 217624DEST_PATH_IMAGE153
Figure 134250DEST_PATH_IMAGE154
To calculate the absolute value function of a real number.
The invention also provides a storage medium for person deduplication identification, on which computer instructions are stored, which, when executed by a processor, implement the steps of a person deduplication identification method as described in any one of the above.
The invention also provides computer equipment for person duplicate removal identification, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein when the processor runs the computer program, the person duplicate removal identification method is realized.
In the face recognition intelligent access control system, when people in a name list are recognized, the recognition records of the related people in the name list are immediately pushed, and the door opening operation is carried out; for stranger records of people not on the list, delay can be adopted
Figure 942065DEST_PATH_IMAGE155
Quasi-real-time pushing is carried out in a second mode, namely a stranger leaves the range of the camera picture
Figure 748965DEST_PATH_IMAGE155
Pushing the record after the second; the delay pushing of strangers does not affect the main functions of the intelligent face access control system; wherein
Figure 424321DEST_PATH_IMAGE155
Generally equal to the time it takes a person to normally pass in front of the camera (i.e. the time that occurs within the frame of the camera, accurate to seconds); because the focal lengths of the lenses of the cameras are different and the installation heights and the inclination angles of the cameras are different,
Figure 646972DEST_PATH_IMAGE155
the value range of (1) can be any positive integer, because the time of each person appearing in the range of the camera is not fixed, the person possibly lingers for a long time; herein, the
Figure 588454DEST_PATH_IMAGE155
Emphasis is placed on the time consumed by a person in normal passage;
suppose the picture in the network camera is an image
Figure 306535DEST_PATH_IMAGE156
(ii) a Creating an empty buffer queue
Figure 890359DEST_PATH_IMAGE157
Buffer queue
Figure 984878DEST_PATH_IMAGE157
The queue element data comprises face images
Figure 77604DEST_PATH_IMAGE158
Position of human face frame
Figure 294009DEST_PATH_IMAGE159
Face pose score
Figure 943957DEST_PATH_IMAGE160
Face feature code
Figure 980655DEST_PATH_IMAGE161
Class bit
Figure 255311DEST_PATH_IMAGE162
And second-level time stamp of element addition time
Figure 580774DEST_PATH_IMAGE163
Face image
Figure 401840DEST_PATH_IMAGE158
Is not larger than the size of the image
Figure 838161DEST_PATH_IMAGE156
Size of face frame position
Figure 303215DEST_PATH_IMAGE159
Is not larger than the image
Figure 534000DEST_PATH_IMAGE164
Pixel range of (2), classification flag bit
Figure 630311DEST_PATH_IMAGE165
The value of (A) can be 0, 1; when in use
Figure 791497DEST_PATH_IMAGE166
When the queue element is not the push cache record, the queue element is described as the push cache record; when in use
Figure 116916DEST_PATH_IMAGE167
When the queue element is pushed, the queue element is indicated to be a pushed cache record;
the invention starts to connect the network camera first, and cyclically takes out the decoded picture from the network camera for duplication removal and identification; suppose that one frame of decoded pictures is
Figure 64409DEST_PATH_IMAGE168
And obtaining the second-level time of the current moment
Figure 645087DEST_PATH_IMAGE169
(ii) a Checking buffer queues
Figure 213616DEST_PATH_IMAGE170
The second-level time stamps of all queue elements in the queue, if the second-level time stamp of the current time is
Figure 196139DEST_PATH_IMAGE169
Subtract a queue element
Figure 296955DEST_PATH_IMAGE171
Time stamp of second order
Figure 771927DEST_PATH_IMAGE172
The obtained difference
Figure 82303DEST_PATH_IMAGE173
Figure 97891DEST_PATH_IMAGE174
A positive integer), the queue element is further determined
Figure 35380DEST_PATH_IMAGE171
Is classified into a plurality of groups
Figure 187055DEST_PATH_IMAGE175
A value of (d); if it is
Figure 41402DEST_PATH_IMAGE176
Then push the queue element
Figure 668164DEST_PATH_IMAGE171
And from the cache queue
Figure 54326DEST_PATH_IMAGE170
Delete the queue element
Figure 859541DEST_PATH_IMAGE171
(ii) a If it is
Figure 517579DEST_PATH_IMAGE177
Then directly from the buffer queue
Figure 323342DEST_PATH_IMAGE170
Delete the queue element
Figure 942721DEST_PATH_IMAGE171
For the picture
Figure 207004DEST_PATH_IMAGE168
Face detection is carried out to obtain pictures
Figure 789964DEST_PATH_IMAGE168
Extracting face feature codes of face images of all face information (including face frame positions and face posture scores) to obtain face feature code information corresponding to the face images;
respectively align the pictures
Figure 391319DEST_PATH_IMAGE168
All the face information in the system is processed as follows:
assuming one of face information
Figure 240514DEST_PATH_IMAGE178
Comprises the following steps: face frame position
Figure 189496DEST_PATH_IMAGE179
Face pose score
Figure 738073DEST_PATH_IMAGE180
Human face image
Figure 922023DEST_PATH_IMAGE181
Face feature code
Figure 214031DEST_PATH_IMAGE182
(ii) a Carrying out first similarity of the face feature codes on the face feature codes and the face feature codes of the persons in the list one by one
Figure 337061DEST_PATH_IMAGE183
Calculating and sorting the similarity values in a descending order; if the maximum similarity value is larger than or equal to the similarity threshold value of the same person judged by the algorithm
Figure 668821DEST_PATH_IMAGE184
If not, the person corresponding to the face information is regarded as a stranger in the non-list;
the face information of the face is processed
Figure 978716DEST_PATH_IMAGE178
Respectively associated with the buffer queue
Figure 401678DEST_PATH_IMAGE185
The overlapping degree of the positions of the face frames of all the internal elements is carried out one by one
Figure 65532DEST_PATH_IMAGE186
(duplication-removing first feature) and second similarity of face feature code
Figure 3580DEST_PATH_IMAGE187
(de-duplication second feature) calculation;
the detailed process of calculating the face frame position overlapping degree (duplication removing first characteristic) is as follows:
because the face frame is generally rectangular, and the sides of the rectangle are generally parallel to the coordinate axes of the pixel points; therefore, the problem of the overlapping degree of the face frames is calculated, namely the problem of the overlapping degree of the two rectangles is calculated; assume a rectangle
Figure 713788DEST_PATH_IMAGE188
The vertex coordinates of the upper left corner are respectively
Figure 151859DEST_PATH_IMAGE189
Figure 627976DEST_PATH_IMAGE190
Having a height and a width of
Figure 773310DEST_PATH_IMAGE191
And
Figure 182431DEST_PATH_IMAGE192
rectangular shape
Figure 449652DEST_PATH_IMAGE193
The vertex coordinates of the upper left corner are respectively
Figure 678644DEST_PATH_IMAGE194
Figure 25405DEST_PATH_IMAGE195
Having a height and a width of
Figure 981644DEST_PATH_IMAGE196
And
Figure 970639DEST_PATH_IMAGE197
(ii) a To calculate the overlapping degree problem of two rectangles, firstly, whether the two rectangles are overlapped (the edge overlapping is not considered as overlapping) is judged; because the overlapping conditions of the two rectangles are too many, if the two rectangles are listed one by one according to each condition, the two rectangles are too complicated; therefore, the condition that the two rectangles are not overlapped can be judged in a reverse thinking mode, and then the judgment of the overlapping condition is obtained by taking the inverse;
if the center-most rectangle is a rectangle, as shown in FIG. 1
Figure 23612DEST_PATH_IMAGE198
The surrounding rectangle is a rectangle
Figure 962012DEST_PATH_IMAGE199
Not in line with rectangle
Figure 176948DEST_PATH_IMAGE198
In various cases where the rectangular shape is superimposed, when any one of the following conditions is satisfied, the rectangular shape is shown in the figure
Figure 930319DEST_PATH_IMAGE198
And a rectangle
Figure 868325DEST_PATH_IMAGE199
Non-overlapping:
Figure 138644DEST_PATH_IMAGE200
the inference based on the above conditions is based on the method of negation, and the rectangle is known
Figure 379794DEST_PATH_IMAGE198
And a rectangle
Figure 894612DEST_PATH_IMAGE199
Overlapping, the following four conditions need to be satisfied simultaneously:
Figure 104932DEST_PATH_IMAGE201
if rectangular
Figure 996189DEST_PATH_IMAGE198
And a rectangle
Figure 967210DEST_PATH_IMAGE199
If the two points are overlapped, two intersection points are necessary; as shown in FIG. 3, two points of intersection (where the solid circles are located) are assumed to be points respectively
Figure 856193DEST_PATH_IMAGE202
And point
Figure 99616DEST_PATH_IMAGE203
The formula is as follows:
Figure 605156DEST_PATH_IMAGE204
if rectangular
Figure 292668DEST_PATH_IMAGE198
And a rectangle
Figure 81113DEST_PATH_IMAGE199
Overlap, dot
Figure 551640DEST_PATH_IMAGE205
And point
Figure 479172DEST_PATH_IMAGE206
The following conditions hold:
Figure 490032DEST_PATH_IMAGE207
combining the above equations and conditions, rectangles
Figure 452308DEST_PATH_IMAGE198
And a rectangle
Figure 605077DEST_PATH_IMAGE199
Overlap, the following set of conditions holds:
Figure 316860DEST_PATH_IMAGE208
when it is rectangular
Figure 510124DEST_PATH_IMAGE198
And a rectangle
Figure 837774DEST_PATH_IMAGE199
When overlapping, the overlapping area is rectangular
Figure 887293DEST_PATH_IMAGE209
Area of (2)
Figure 682995DEST_PATH_IMAGE210
Comprises the following steps:
Figure 707328DEST_PATH_IMAGE211
as shown in fig. 2, rectangular
Figure 856209DEST_PATH_IMAGE198
And a rectangle
Figure 679111DEST_PATH_IMAGE199
Total area of formed region
Figure 455399DEST_PATH_IMAGE212
Equal to rectangle
Figure 23920DEST_PATH_IMAGE198
Area of (2)
Figure 522337DEST_PATH_IMAGE213
And a rectangle
Figure 40589DEST_PATH_IMAGE214
Area of (2)
Figure 360848DEST_PATH_IMAGE215
And then subtracting the rectangle
Figure 67033DEST_PATH_IMAGE216
Area of (2)
Figure 247002DEST_PATH_IMAGE217
Namely, the following steps are provided:
Figure 269060DEST_PATH_IMAGE218
rectangle
Figure 410589DEST_PATH_IMAGE198
And a rectangle
Figure 696438DEST_PATH_IMAGE199
Degree of overlap
Figure 259360DEST_PATH_IMAGE219
Is rectangular
Figure 744484DEST_PATH_IMAGE216
Area of (2)
Figure 253881DEST_PATH_IMAGE220
And a rectangle
Figure 382991DEST_PATH_IMAGE198
And a rectangle
Figure 656762DEST_PATH_IMAGE199
Total area of formed region
Figure 599885DEST_PATH_IMAGE221
The area ratio of (A) to (B) is as follows:
Figure 661569DEST_PATH_IMAGE222
combining the above equations, the following equations can be derived:
Figure 309569DEST_PATH_IMAGE224
rectangle
Figure 957504DEST_PATH_IMAGE198
And a rectangle
Figure 503767DEST_PATH_IMAGE199
Degree of overlap
Figure 307299DEST_PATH_IMAGE219
Has a numerical value range of
Figure 485037DEST_PATH_IMAGE225
(ii) a In the invention, the position overlapping degree of the face frame with the first characteristic is removed
Figure 80103DEST_PATH_IMAGE219
The optimal threshold value of (2) is 0.1;
if one of the elements is present
Figure 382171DEST_PATH_IMAGE226
And the face information
Figure 726744DEST_PATH_IMAGE227
Meet the overlapping degree of the face frame position
Figure 552797DEST_PATH_IMAGE219
Threshold value
Figure 369362DEST_PATH_IMAGE228
Or a second similarity of the face feature codes
Figure 780232DEST_PATH_IMAGE229
Threshold value
Figure 253462DEST_PATH_IMAGE230
Then the face information is considered
Figure 337617DEST_PATH_IMAGE227
The corresponding personnel are already present, and classification judgment is further carried out according to the personnel types; if the face information
Figure 29413DEST_PATH_IMAGE227
The corresponding person is a person in the list, and the elements
Figure 193159DEST_PATH_IMAGE226
Inner pushing zone bit
Figure 189852DEST_PATH_IMAGE231
If the value of (2) is 0, the face information is immediately pushed
Figure 419392DEST_PATH_IMAGE227
And modifying the elements
Figure 285019DEST_PATH_IMAGE226
Inner pushing zone bit
Figure 473079DEST_PATH_IMAGE231
Is 1, then face pose score is performed
Figure 570003DEST_PATH_IMAGE232
Calculating; if the face information
Figure 231013DEST_PATH_IMAGE227
Is a person in the list, and element
Figure 238246DEST_PATH_IMAGE226
Inner pushing zone bit
Figure 859600DEST_PATH_IMAGE231
If the value of (1) is less than the threshold value, the face pose score is directly performed
Figure 99830DEST_PATH_IMAGE232
Computing(ii) a If the face information
Figure 343909DEST_PATH_IMAGE227
If the corresponding person is not in the list, the judgment element is not needed
Figure 61990DEST_PATH_IMAGE226
Inner pushing zone bit
Figure 684694DEST_PATH_IMAGE231
By directly performing face pose scoring
Figure 716896DEST_PATH_IMAGE232
Calculating;
wherein the face pose score
Figure 615148DEST_PATH_IMAGE232
The calculation method of (2) is as follows:
face pose score
Figure 701533DEST_PATH_IMAGE232
According to the face pose angle
Figure 617060DEST_PATH_IMAGE233
Performing calculations, respectively, of
Figure 746761DEST_PATH_IMAGE234
Figure 165765DEST_PATH_IMAGE235
And
Figure 960069DEST_PATH_IMAGE236
three angles (as shown in fig. 3), all of which have values in the range
Figure 94383DEST_PATH_IMAGE237
Within the range of degrees, the closer any angle value is to zero, the better the face pose angle is, and the corresponding face pose score
Figure 996616DEST_PATH_IMAGE232
The higher; in the invention, the face pose score
Figure 532776DEST_PATH_IMAGE232
The formula for the calculation is as follows:
Figure 916225DEST_PATH_IMAGE238
face pose score
Figure 6677DEST_PATH_IMAGE232
Has a numerical value range of
Figure 998688DEST_PATH_IMAGE239
Figure 861125DEST_PATH_IMAGE240
To calculate the absolute value function of the real number;
if the face information
Figure 454355DEST_PATH_IMAGE241
Face pose score of
Figure 103209DEST_PATH_IMAGE242
Higher than element
Figure 149842DEST_PATH_IMAGE243
Inner face pose score
Figure 395014DEST_PATH_IMAGE244
Using the face information
Figure 967602DEST_PATH_IMAGE241
Face image of
Figure 301155DEST_PATH_IMAGE245
Face pose score
Figure 311864DEST_PATH_IMAGE242
Position of human face frame
Figure 682981DEST_PATH_IMAGE246
Face feature code
Figure 488787DEST_PATH_IMAGE247
Replacement elements
Figure 289128DEST_PATH_IMAGE243
And updating the element
Figure 898404DEST_PATH_IMAGE243
Time stamp of
Figure 628820DEST_PATH_IMAGE248
As a second-order time stamp
Figure 543210DEST_PATH_IMAGE249
If the face information
Figure 585776DEST_PATH_IMAGE241
Face pose score of
Figure 506463DEST_PATH_IMAGE242
Not higher than element
Figure 852173DEST_PATH_IMAGE243
Inner face pose score
Figure 666026DEST_PATH_IMAGE244
Then only the face information is used
Figure 461467DEST_PATH_IMAGE241
Face frame position of
Figure 657617DEST_PATH_IMAGE250
Face feature code
Figure 606903DEST_PATH_IMAGE251
Replacement elements
Figure 863095DEST_PATH_IMAGE243
And updating the element
Figure 880254DEST_PATH_IMAGE243
Time stamp of
Figure 642744DEST_PATH_IMAGE252
As a second-order time stamp
Figure 428958DEST_PATH_IMAGE253
If one of the elements is not present
Figure 316104DEST_PATH_IMAGE243
And the face information
Figure 868227DEST_PATH_IMAGE241
Meet the overlapping degree of the face frame position
Figure 940268DEST_PATH_IMAGE254
Threshold value
Figure 330057DEST_PATH_IMAGE255
Or a second similarity of the face feature codes
Figure 928053DEST_PATH_IMAGE256
Threshold value
Figure 385715DEST_PATH_IMAGE257
Then the face information is considered
Figure 583010DEST_PATH_IMAGE241
Corresponding personnel do not appear, and the face information needs to be added
Figure 809728DEST_PATH_IMAGE241
Recording for new buffer; newly building an element
Figure 106853DEST_PATH_IMAGE258
To convert the face information
Figure 51812DEST_PATH_IMAGE241
Face frame position of
Figure 465655DEST_PATH_IMAGE259
Human face pose angle
Figure 227772DEST_PATH_IMAGE260
Human face image
Figure 433149DEST_PATH_IMAGE261
Face feature code
Figure 382982DEST_PATH_IMAGE262
Time stamp of the order of seconds
Figure 580008DEST_PATH_IMAGE263
Supplement to elements
Figure 315407DEST_PATH_IMAGE264
(ii) a Further carrying out classification judgment according to the personnel types, if the face information
Figure 838490DEST_PATH_IMAGE241
If the corresponding person is a person in the list, the face information is immediately pushed
Figure 981811DEST_PATH_IMAGE241
And new elements are combined
Figure 997176DEST_PATH_IMAGE264
Classification zone bit
Figure 52993DEST_PATH_IMAGE265
Is set to 1; if the face information
Figure 600173DEST_PATH_IMAGE241
If the corresponding person is a stranger of persons in the non-list, the new element is added
Figure 597092DEST_PATH_IMAGE264
Push flag bit
Figure 424937DEST_PATH_IMAGE266
Is set to 0; and new elements are added
Figure 397578DEST_PATH_IMAGE264
Store in buffer queue
Figure 581571DEST_PATH_IMAGE267
In the middle, the addition of a new cache record is completed;
wherein
Figure 430975DEST_PATH_IMAGE268
The reason is that a general face recognition algorithm has a similarity recommendation threshold value which is used for judging whether the similarity of the face feature codes extracted by two face images can be judged as the same person; however, a single threshold value appears, and the similarity values of the face feature codes extracted from two face images of the same person do not meet the requirement of the threshold value, so that the person can be judged as two persons; for this purpose, the invention sets a second similarity threshold
Figure 185878DEST_PATH_IMAGE269
Or is appropriately lower than the similarity recommendation threshold of the same person determined by the algorithm
Figure 18884DEST_PATH_IMAGE270
All the faces recognized by each frame are subjected to the operation, and a buffer queue is maintained
Figure 716521DEST_PATH_IMAGE267
The person distinguishing duplicate removal identification can be effectively carried out on the persons in the list or strangers of the persons in the non-list; because other traditional characteristics are not adopted, the required duplication removing characteristics are face pose scores and face similarity values obtained by the position of a face frame and the face pose angle; all of these featuresThe face recognition method is information which is possessed by a general face algorithm, is irrelevant to a camera scene, and has universality; by adopting the face frame position overlapping degree and the similarity double threshold value of the face algorithm, on one hand, the problem of person tracking loss caused by frame dropping of video decoding or low detection rate of the face algorithm and the like of the person can be solved, and the extra wrong passing record generated when the person passes in the list is effectively reduced by matching and tracking the second similarity threshold value of the face algorithm again; on the other hand, by adopting the similarity double thresholds of the face algorithm, even if the personnel are still for a long time, the personnel can be effectively compared, and the static target and the non-static target can be effectively removed; no matter the overlapping degree of the positions of the face frames or the similarity calculation of the face algorithm, the calculation process has low requirements on calculation force and can run smoothly on most equipment;
by combining the advantages, when people in the list just appear on the camera screen, the captured face may not reach the recognition similarity threshold, and the person is judged to be a stranger passing record due to the adoption of the delay
Figure 857482DEST_PATH_IMAGE271
Second push, so push will not be done immediately at this time, and only buffer queue will exist
Figure 969406DEST_PATH_IMAGE267
Internal; when the user walks to the middle area of video monitoring, the captured face possibly reaches the threshold value of the recognition similarity, the user can be judged to be the passing record of the people in the list, at the moment, according to the judgment logic of the invention, the information of the people in the list can be immediately pushed, and the information is cached in a cache queue
Figure 407651DEST_PATH_IMAGE267
Matching the cache records, updating the data of the records, and updating the classification mark of the records to be 1; when people in the list leave the camera screen soon, the face captured by the camera may not reach the recognition similarity threshold, and the person is judged to be a stranger passing record, and at the moment, the person is judged to be a stranger passing record according to the cache queue
Figure 401021DEST_PATH_IMAGE267
The internal cache records can be normally matched with the cache records meeting the conditions, the classification mark is 1, and the system cannot push the stranger information; therefore, the invention can ensure that only one correct pass record is generated when a single person passes by the system to the maximum extent.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A person duplicate removal identification method is characterized by comprising the following steps: the method comprises the following steps:
s1, creating an empty buffer queue
Figure 369213DEST_PATH_IMAGE001
Buffer queue
Figure 568245DEST_PATH_IMAGE001
The queue element data comprises face images
Figure 144719DEST_PATH_IMAGE002
Position of human face frame
Figure 336666DEST_PATH_IMAGE003
Face pose score
Figure 940254DEST_PATH_IMAGE004
Face feature code
Figure 293875DEST_PATH_IMAGE005
Class bit
Figure 92066DEST_PATH_IMAGE006
And second-level time stamp of element addition time
Figure 822125DEST_PATH_IMAGE007
S2, circularly taking out the decoded picture from the network camera; suppose that one frame of decoded pictures is
Figure 772895DEST_PATH_IMAGE008
And obtaining the second-level time stamp of the current time
Figure 297417DEST_PATH_IMAGE009
(ii) a Checking buffer queues
Figure 848484DEST_PATH_IMAGE010
The second-level time stamps of all queue elements in the queue, if the second-level time stamp of the current time is
Figure 116654DEST_PATH_IMAGE009
Subtract a queue element
Figure 420465DEST_PATH_IMAGE011
Time stamp of second order
Figure 381468DEST_PATH_IMAGE012
The obtained difference
Figure 154252DEST_PATH_IMAGE013
Further determining the queue element
Figure 976846DEST_PATH_IMAGE011
Is classified into a plurality of groups
Figure 151475DEST_PATH_IMAGE014
The value of (a) is,
Figure 17800DEST_PATH_IMAGE015
the time required for a person to normally pass in front of the camera is accurate to seconds; if it is
Figure 543459DEST_PATH_IMAGE016
Then push the queue element
Figure 405629DEST_PATH_IMAGE011
And from the cache queue
Figure 434765DEST_PATH_IMAGE017
Delete the queue element
Figure 737571DEST_PATH_IMAGE011
(ii) a If it is
Figure 219368DEST_PATH_IMAGE018
Then directly from the buffer queue
Figure 649343DEST_PATH_IMAGE017
Delete the queue element
Figure 532985DEST_PATH_IMAGE011
S3, displaying the picture
Figure 6692DEST_PATH_IMAGE019
Face detection is carried out to obtain pictures
Figure 225053DEST_PATH_IMAGE019
Extracting face feature codes of face images of all the face information to obtain face feature code information corresponding to the face images;
s4, aligning the pictures respectively
Figure 176828DEST_PATH_IMAGE019
Performing face feature code first similarity on face feature codes extracted from all face information and face feature codes of people in the list one by one
Figure 446135DEST_PATH_IMAGE020
Calculating and sorting the similarity values in a descending order; if the maximum similarity value is larger than or equal to the similarity threshold value of the same person judged by the algorithm
Figure 90743DEST_PATH_IMAGE021
If not, the person corresponding to the face information is regarded as a stranger in the non-list;
s5, aligning the pictures respectively
Figure 297865DEST_PATH_IMAGE019
All face information and buffer queue in the system
Figure 318911DEST_PATH_IMAGE017
The overlapping degree of the positions of the face frames of all the internal elements is carried out one by one
Figure 911566DEST_PATH_IMAGE022
Second similarity with face feature code
Figure 973413DEST_PATH_IMAGE023
Calculating;
s6, if one element exists
Figure 917098DEST_PATH_IMAGE024
With certain face information
Figure 741835DEST_PATH_IMAGE025
Meet the overlapping degree of the face frame position
Figure 454576DEST_PATH_IMAGE026
Threshold value
Figure 926140DEST_PATH_IMAGE027
Or a second similarity of the face feature codes
Figure 357121DEST_PATH_IMAGE028
Threshold value
Figure 454390DEST_PATH_IMAGE029
Then the face information is considered
Figure 802064DEST_PATH_IMAGE030
The corresponding personnel are already present, and classification judgment is further carried out according to the personnel types;
s7, if one element does not exist
Figure 428217DEST_PATH_IMAGE024
With certain face information
Figure 346494DEST_PATH_IMAGE030
Meet the overlapping degree of the face frame position
Figure 247454DEST_PATH_IMAGE026
Threshold value
Figure 669208DEST_PATH_IMAGE027
Or face feature code secondDegree of similarity
Figure 748154DEST_PATH_IMAGE028
Threshold value
Figure 888148DEST_PATH_IMAGE029
(ii) a An element is newly created
Figure 592799DEST_PATH_IMAGE031
To convert the face information
Figure 134639DEST_PATH_IMAGE030
And time stamp
Figure 885951DEST_PATH_IMAGE032
Supplement to elements
Figure 513241DEST_PATH_IMAGE031
(ii) a Further carrying out classification judgment according to personnel types and setting new elements
Figure 756004DEST_PATH_IMAGE031
Corresponding classification zone bit
Figure 886771DEST_PATH_IMAGE033
(ii) a Will new elements
Figure 307519DEST_PATH_IMAGE031
Store in buffer queue
Figure 422105DEST_PATH_IMAGE034
In step (2), the addition of the new cache record is completed.
2. The person deduplication identification method of claim 1, wherein: in the step S2, in step S2,
Figure 202980DEST_PATH_IMAGE035
is a positive integer; in thatIn step S5, the overlap of the face frame positions
Figure 984991DEST_PATH_IMAGE036
Calculating the second similarity of face feature code for duplication elimination of the first feature
Figure 75175DEST_PATH_IMAGE028
A second feature calculation is performed for deduplication.
3. The person deduplication identification method of claim 1, wherein: face frame position overlap degree in step S5
Figure 942637DEST_PATH_IMAGE036
The calculation method of (2) is as follows:
assume a rectangle
Figure 527202DEST_PATH_IMAGE037
The vertex coordinates of the upper left corner are respectively
Figure 648873DEST_PATH_IMAGE038
Figure 660692DEST_PATH_IMAGE039
Having a height and a width of
Figure 484291DEST_PATH_IMAGE040
And
Figure 138126DEST_PATH_IMAGE041
rectangular shape
Figure 621628DEST_PATH_IMAGE042
The vertex coordinates of the upper left corner are respectively
Figure 69927DEST_PATH_IMAGE043
Figure 380822DEST_PATH_IMAGE044
Having a height and a width of
Figure 323502DEST_PATH_IMAGE045
And
Figure 403453DEST_PATH_IMAGE046
duplication removing first characteristic face frame position overlapping degree
Figure 288233DEST_PATH_IMAGE036
The calculation formula of (a) is as follows:
Figure 86424DEST_PATH_IMAGE048
duplication removing first characteristic face frame position overlapping degree
Figure 65751DEST_PATH_IMAGE036
Has a numerical value range of
Figure 209DEST_PATH_IMAGE049
4. A person deduplication identification method according to claim 3, wherein: overlapping two rectangular edges is not considered overlapping.
5. The person deduplication identification method of claim 1, wherein: in step S6:
s61, determining face information
Figure 790310DEST_PATH_IMAGE050
Is a person in the list, and element
Figure 75798DEST_PATH_IMAGE051
Inner pushing zone bit
Figure 94701DEST_PATH_IMAGE052
If the value of (2) is 0, the face information is immediately pushed
Figure 149244DEST_PATH_IMAGE050
And modifying the elements
Figure 110247DEST_PATH_IMAGE051
Inner pushing zone bit
Figure 617452DEST_PATH_IMAGE052
Has a value of 1;
s62, if the face information
Figure 941510DEST_PATH_IMAGE050
Is a person in the list, and element
Figure 116140DEST_PATH_IMAGE051
Inner pushing zone bit
Figure 982464DEST_PATH_IMAGE052
If the value of (1) is less than the predetermined threshold, the face information does not need to be pushed
Figure 242544DEST_PATH_IMAGE050
S63, determining face information
Figure 603250DEST_PATH_IMAGE050
If the person is not in the list, the judgment element is not needed
Figure 632386DEST_PATH_IMAGE051
Inner pushing zone bit
Figure 669612DEST_PATH_IMAGE052
A value of (d);
s64, then face information
Figure 151409DEST_PATH_IMAGE050
Performing face pose scoring
Figure 814340DEST_PATH_IMAGE053
And (6) calculating and judging.
6. The person deduplication identification method of claim 5, wherein: in step S64, the face pose score
Figure 963562DEST_PATH_IMAGE053
The calculation method is as follows:
face pose score
Figure 437268DEST_PATH_IMAGE053
According to the face pose angle
Figure 406361DEST_PATH_IMAGE054
Performing calculations, respectively, of
Figure 843290DEST_PATH_IMAGE055
Figure 847018DEST_PATH_IMAGE056
And
Figure 226047DEST_PATH_IMAGE057
three angles, all of which have values in the range
Figure 948015DEST_PATH_IMAGE058
Within the range of degrees, the closer any angle value is to zero, the better the face pose angle is, and the corresponding face pose score
Figure 949820DEST_PATH_IMAGE053
The higher;
concrete face pose score
Figure 808054DEST_PATH_IMAGE053
The formula for the calculation is as follows:
Figure 623563DEST_PATH_IMAGE059
wherein the face pose score
Figure 567249DEST_PATH_IMAGE053
Has a numerical value range of
Figure 611559DEST_PATH_IMAGE060
Figure 589879DEST_PATH_IMAGE061
To calculate the absolute value function of a real number.
7. The person deduplication identification method of claim 5, wherein: in step S64:
s641, if the face information
Figure 576290DEST_PATH_IMAGE062
Face pose score of
Figure 256539DEST_PATH_IMAGE063
Higher than element
Figure 353808DEST_PATH_IMAGE064
Inner face pose score
Figure 921055DEST_PATH_IMAGE065
Using the face information
Figure 78367DEST_PATH_IMAGE062
Face image of
Figure 731066DEST_PATH_IMAGE066
Face pose score
Figure 382758DEST_PATH_IMAGE063
Position of human face frame
Figure 70091DEST_PATH_IMAGE067
Face feature code
Figure 398304DEST_PATH_IMAGE068
Replacement elements
Figure 790496DEST_PATH_IMAGE064
Information of the corresponding item in (1);
s642, if the face information
Figure 495147DEST_PATH_IMAGE062
Face pose score of
Figure 36987DEST_PATH_IMAGE063
Not higher than element
Figure 536101DEST_PATH_IMAGE064
Inner face pose score
Figure 179703DEST_PATH_IMAGE065
Then only the face information is used
Figure 156886DEST_PATH_IMAGE062
Face frame position of
Figure 818812DEST_PATH_IMAGE067
Face feature code
Figure 488828DEST_PATH_IMAGE068
Replacement elements
Figure 852682DEST_PATH_IMAGE064
Information of the corresponding item in (1);
s643, update element
Figure 633556DEST_PATH_IMAGE064
Time stamp of
Figure 415567DEST_PATH_IMAGE069
As a time stamp
Figure 272796DEST_PATH_IMAGE070
8. The person deduplication identification method of claim 1, wherein: in step S7, if there is no one of the elements
Figure 874678DEST_PATH_IMAGE064
And the face information
Figure 459244DEST_PATH_IMAGE062
Meet the overlapping degree of the face frame position
Figure 830182DEST_PATH_IMAGE071
Threshold value
Figure 100057DEST_PATH_IMAGE072
Or a second similarity of the face feature codes
Figure 923657DEST_PATH_IMAGE073
Threshold value
Figure 311913DEST_PATH_IMAGE074
Then the face information is considered
Figure 271778DEST_PATH_IMAGE062
Corresponding personnel do not appear, and the face information needs to be added
Figure 205230DEST_PATH_IMAGE062
Recording for new buffer; newly building an element
Figure 155607DEST_PATH_IMAGE075
To convert the face information
Figure 81974DEST_PATH_IMAGE062
Face frame position of
Figure 161926DEST_PATH_IMAGE076
Human face pose angle
Figure 531858DEST_PATH_IMAGE077
Human face image
Figure 64471DEST_PATH_IMAGE078
Face feature code
Figure 528950DEST_PATH_IMAGE079
Time stamp of the order of seconds
Figure 994567DEST_PATH_IMAGE080
Supplement to elements
Figure 36865DEST_PATH_IMAGE075
(ii) a Further carrying out classification judgment according to the personnel types, if the face information
Figure 56774DEST_PATH_IMAGE062
If the corresponding person is a person in the list, the face information is immediately pushed
Figure 590523DEST_PATH_IMAGE062
And new elements are combined
Figure 645067DEST_PATH_IMAGE075
Classification zone bit
Figure 91223DEST_PATH_IMAGE081
Is set to 1; if the face information
Figure 129586DEST_PATH_IMAGE062
If the corresponding person is a stranger of persons in the non-list, the new element is added
Figure 201447DEST_PATH_IMAGE075
Push flag bit
Figure 110498DEST_PATH_IMAGE081
Is set to 0; and new elements are added
Figure 738007DEST_PATH_IMAGE075
Store in buffer queue
Figure 466929DEST_PATH_IMAGE082
In step (2), the addition of the new cache record is completed.
9. A storage medium for person deduplication identification, having computer instructions stored thereon, wherein the computer instructions, when executed by a processor, implement the steps of a person deduplication identification method of any one of claims 1-8.
10. Computer device for person deduplication identification, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements a person deduplication identification method as claimed in any of claims 1-8.
CN202210046435.4A 2022-01-17 2022-01-17 Personnel duplicate removal identification method, storage medium and computer equipment Active CN114067421B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210046435.4A CN114067421B (en) 2022-01-17 2022-01-17 Personnel duplicate removal identification method, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210046435.4A CN114067421B (en) 2022-01-17 2022-01-17 Personnel duplicate removal identification method, storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN114067421A CN114067421A (en) 2022-02-18
CN114067421B true CN114067421B (en) 2022-04-19

Family

ID=80231019

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210046435.4A Active CN114067421B (en) 2022-01-17 2022-01-17 Personnel duplicate removal identification method, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN114067421B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110084103A (en) * 2019-03-15 2019-08-02 深圳英飞拓科技股份有限公司 A kind of same pedestrian's analysis method and system based on face recognition technology
CN110222627A (en) * 2019-05-31 2019-09-10 深圳算子科技有限公司 A kind of face amended record method
WO2020094091A1 (en) * 2018-11-07 2020-05-14 杭州海康威视数字技术股份有限公司 Image capturing method, monitoring camera, and monitoring system
CN113255621A (en) * 2021-07-13 2021-08-13 浙江大华技术股份有限公司 Face image filtering method, electronic device and computer-readable storage medium
CN113269127A (en) * 2021-06-10 2021-08-17 北京睿芯高通量科技有限公司 Face recognition and pedestrian re-recognition monitoring method and system for real-time automatic database building

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10691925B2 (en) * 2017-10-28 2020-06-23 Altumview Systems Inc. Enhanced face-detection and face-tracking for resource-limited embedded vision systems
EP3706035A1 (en) * 2019-03-07 2020-09-09 Koninklijke Philips N.V. Device, system and method for tracking and/or de-identification of faces in video data
US11334746B2 (en) * 2019-05-01 2022-05-17 EMC IP Holding Company LLC Facial recognition for multi-stream video using high probability group
CN112241687A (en) * 2020-09-16 2021-01-19 安徽超视野智能科技有限公司 Face recognition method and system with strange face library function
CN113537107A (en) * 2021-07-23 2021-10-22 山东浪潮通软信息科技有限公司 Face recognition and tracking method, device and equipment based on deep learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020094091A1 (en) * 2018-11-07 2020-05-14 杭州海康威视数字技术股份有限公司 Image capturing method, monitoring camera, and monitoring system
CN110084103A (en) * 2019-03-15 2019-08-02 深圳英飞拓科技股份有限公司 A kind of same pedestrian's analysis method and system based on face recognition technology
CN110222627A (en) * 2019-05-31 2019-09-10 深圳算子科技有限公司 A kind of face amended record method
CN113269127A (en) * 2021-06-10 2021-08-17 北京睿芯高通量科技有限公司 Face recognition and pedestrian re-recognition monitoring method and system for real-time automatic database building
CN113255621A (en) * 2021-07-13 2021-08-13 浙江大华技术股份有限公司 Face image filtering method, electronic device and computer-readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Face Identification With Top-Push Constrained Generalized Low-Rank Approximation of Matrices;YUANJIAN CHEN .et.al;《IEEE ACCESS》;20191114;第7卷;160998-161007 *
多路实时监控视频数据处理与分析***的设计与实现;杜爽;《全国优秀硕士学位论文全文数据库 信息科技辑》;20220115(第1期);I136-1957 *

Also Published As

Publication number Publication date
CN114067421A (en) 2022-02-18

Similar Documents

Publication Publication Date Title
WO2019218824A1 (en) Method for acquiring motion track and device thereof, storage medium, and terminal
Bäuml et al. Multi-pose face recognition for person retrieval in camera networks
Kart et al. How to make an rgbd tracker?
US11676390B2 (en) Machine-learning model, methods and systems for removal of unwanted people from photographs
WO2021139167A1 (en) Method and apparatus for facial recognition, electronic device, and computer readable storage medium
Avrithis et al. Broadcast news parsing using visual cues: A robust face detection approach
Yen et al. Facial feature extraction using genetic algorithm
CN113537107A (en) Face recognition and tracking method, device and equipment based on deep learning
Tsai et al. Robust in-plane and out-of-plane face detection algorithm using frontal face detector and symmetry extension
CN111445442A (en) Crowd counting method and device based on neural network, server and storage medium
CN109146913B (en) Face tracking method and device
CN114067421B (en) Personnel duplicate removal identification method, storage medium and computer equipment
e Souza et al. Survey on visual rhythms: A spatio-temporal representation for video sequences
Vural et al. Multi-view fast object detection by using extended haar filters in uncontrolled environments
CN108334811B (en) Face image processing method and device
Wei et al. Omni-face detection for video/image content description
Nakashima et al. Privacy protection for social video via background estimation and CRF-based videographer's intention modeling
CN112258575B (en) Method for quickly identifying object in synchronous positioning and map construction
CN115115976A (en) Video processing method and device, electronic equipment and storage medium
US10147199B2 (en) Method and apparatus for determining an orientation of a video
Duanmu et al. A multi-view pedestrian tracking framework based on graph matching
Nigam et al. EgoTracker: Pedestrian tracking with re-identification in egocentric videos
Schmidt Feris et al. Detection and tracking of facial features in video sequences
Shen et al. Towards intelligent photo composition-automatic detection of unintentional dissection lines in environmental portrait photos
CN117935171B (en) Target tracking method and system based on gesture key points

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant