CN112241683B - Method and system for identifying and judging fellow persons - Google Patents

Method and system for identifying and judging fellow persons

Info

Publication number
CN112241683B
CN112241683B (application CN202010973449.1A)
Authority
CN
China
Prior art keywords
person
matching
personnel
face
snapshot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010973449.1A
Other languages
Chinese (zh)
Other versions
CN112241683A (en)
Inventor
罗想
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Tianyi Network Co ltd
Original Assignee
Sichuan Tianyi Network Service Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Tianyi Network Service Co ltd filed Critical Sichuan Tianyi Network Service Co ltd
Priority to CN202010973449.1A
Publication of CN112241683A
Application granted
Publication of CN112241683B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53 Recognition of crowd images, e.g. recognition of crowd congestion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a method for identifying and judging fellow persons. By collecting face image information, matching face features, querying the snapshot records of the matched persons, extracting the devices the matched persons passed, querying the person records captured by those devices, finding the persons to be analyzed who share the same snapshot records as the matched persons, and then generating and matching their paths, the method detects fellow persons under both a global path and a local path. The method aims to solve the technical problems in the prior art that identifying and judging the fellow persons of tag persons is difficult and inefficient.

Description

Method and system for identifying and judging fellow persons
Technical Field
The invention relates to the technical field of face recognition, and in particular to a method and system for identifying and judging fellow persons.
Background
At present, confirming relationships among people in a fixed area faces many difficulties. In community settings, for example, fellow-person relationships are usually confirmed through means such as security-guard registration of entry and exit records, which suffers from high labor cost, low reliability, complex relationships among the people at a fixed site, diverse trajectory data, and a high degree of chance. In the prior art there are family-tree calculation algorithms that compute genealogies from family membership and network data to identify interpersonal relationships, but confirming fellow-person relationships still presents certain difficulties.
With the rapid development of face recognition technology, person identity recognition has been extended to many scenarios and has shown particularly notable results in the field of intelligent security. A tag-person information base is therefore established; tag persons are persons carrying attribute tags maintained by third-party systems, the national fugitive database, or user-defined tagging. For example, a discredited (distrusted) person also belongs to the tag persons. Tag persons can be recognized through face recognition technology, but determining their fellow persons is still difficult. Persons who travel together with a tag person a certain number of times can be considered to have some relationship with that tag person. Judging the relationship between fellow persons and tag persons therefore yields more information and facilitates person monitoring and trajectory queries.
Therefore, how to accurately identify and judge the fellow persons of tag persons is a technical problem that urgently needs to be solved.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The main object of the present invention is to provide a method for identifying and judging fellow persons, so as to solve the technical problems in the prior art that identifying and judging the fellow persons of tag persons is difficult and inefficient.
In order to achieve the above object, a first aspect of the present invention provides a method for identifying and judging fellow persons, comprising the following steps:
collecting face image information;
storing the collected face image information and the corresponding face image features in a face snapshot library, and calculating face feature values from the face image features;
comparing the face feature value of the person to be identified with the face feature values in the face archive library, calculating the similarity, screening out the persons whose similarity is greater than or equal to a display threshold, and adding them to a face-matched person list;
querying the face snapshot library according to the face-matched person list and the time range in which the person's face was collected, obtaining the snapshot records of the matched persons, and generating a snapshot record list;
extracting, from the snapshot records, the list of snapshot devices the matched persons passed, querying the person records within the corresponding time range on those devices, and matching the persons in the person record list who share the same snapshot records with the person to be identified;
performing path matching between the paths of the matched persons and of the person to be identified, and judging whether they are fellow persons.
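The disclosure does not fix any concrete data structures for the snapshot library, the archive library, or the record lists. Purely as an illustration, the following minimal Python sketch models the records the above steps operate on; the class and field names (ArchivePerson, SnapshotRecord, archive_id, device_id, captured_at, feature) are assumptions introduced here, not part of the claimed method.

```python
# Illustrative data model only; names and fields are assumptions, not defined by the patent.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class ArchivePerson:
    archive_id: str       # ID of the person's profile in the face archive library
    feature: List[float]  # stored face feature value


@dataclass
class SnapshotRecord:
    archive_id: str        # matched archive ID of the captured person
    device_id: str         # snapshot (capture) device that produced the record
    captured_at: datetime  # snapshot time
    feature: List[float]   # face feature value extracted from the snapshot image


def path_of(records: List[SnapshotRecord]) -> List[str]:
    """A person's path, taken here as the time-ordered sequence of device IDs."""
    return [r.device_id for r in sorted(records, key=lambda r: r.captured_at)]
```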
Preferably, in the method for identifying and judging fellow persons, the snapshot device list the persons passed is extracted by jointly de-duplicating two fields, the archive ID of the person and the ID of the device the person passed, to obtain the device list the matched persons passed.
Preferably, in the method for identifying and judging fellow persons, the person record list within the corresponding time range on the snapshot device list is queried from the snapshot records of the time range, the device list, and the corresponding devices and time periods, yielding the person record list of persons who passed the devices.
Preferably, the method for identifying and judging fellow persons further includes a path generation step: generating the paths of the face-matched persons and of the persons to be analyzed according to the person record list.
Preferably, the path matching includes global path matching:
the similarity of the whole path between the face-matched person and the person to be analyzed is greater than the path-matching similarity threshold; if the global path matching between the person to be analyzed and the face-matched person succeeds, the person to be analyzed is determined to be a fellow person.
Preferably, the path matching includes local path matching:
the local path lengths of the face-matched person and of the person to be analyzed are greater than the local path length matching threshold, and the local path similarity is greater than the path-matching similarity threshold; if a local path match exists between the person to be analyzed and the face-matched person, the person to be analyzed is determined to be a fellow person.
In a second aspect, the present invention provides a fellow person identification and judgment system, which includes:
a face image acquisition device: collects face image information;
a face feature value calculation unit: stores the collected face image information and the corresponding face image features in the face snapshot library, and calculates face feature values from the face image features;
a face feature matching unit: compares the face feature value of the person to be identified with the face feature values in the face archive library, calculates the similarity, screens out the persons whose similarity is greater than or equal to the display threshold, and adds them to the face-matched person list;
a snapshot record generation unit: queries the face snapshot library according to the face-matched person list and the time range in which the person's face was collected, obtains the snapshot records of the matched persons, and generates a snapshot record list;
a device-snapshot person matching unit: extracts from the snapshot records the list of snapshot devices the matched persons passed, queries the person records within the corresponding time range on those devices, and matches the persons in the person record list who share the same snapshot records with the person to be identified;
a path identification unit: performs path matching between the paths of the matched persons and of the person to be identified, and judges whether they are fellow persons.
The invention realizes detection of fellow persons under the global path and the local path through the steps of collecting face image information, matching face features, querying the snapshot records of the matched persons, extracting the devices the matched persons passed, querying the person records captured by those devices, finding the persons to be analyzed who share the same snapshot records as the matched persons, and generating and matching paths, so as to solve the technical problems in the prior art that identifying and judging the fellow persons of tag persons is difficult and inefficient.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram illustrating the principle of steps of a method for identifying and determining a fellow passenger according to the present invention;
FIG. 2 is a schematic diagram illustrating the principle process of the fellow person identification and judgment method according to the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
The invention provides an embodiment with reference to fig. 1, which is a schematic diagram illustrating the step principle of the fellow person identification and judgment method provided by the invention.
As shown in fig. 1, in this embodiment, a method for identifying and judging fellow persons includes the following steps:
collecting face image information;
storing the collected face image information and the corresponding face image features in a face snapshot library, and calculating face feature values from the face image features;
comparing the face feature value of the person to be identified with the face feature values in the face archive library, calculating the similarity, screening out the persons whose similarity is greater than or equal to a display threshold, and adding them to a face-matched person list;
querying the face snapshot library according to the face-matched person list and the time range in which the person's face was collected, obtaining the snapshot records of the matched persons, and generating a snapshot record list;
extracting, from the snapshot records, the list of snapshot devices the matched persons passed, querying the person records within the corresponding time range on those devices, and matching the persons in the person record list who share the same snapshot records with the person to be identified;
performing path matching between the paths of the matched persons and of the person to be identified, and judging whether they are fellow persons.
It is to be understood that the snapshot device list the persons passed is extracted by jointly de-duplicating the two fields of the person's archive ID and the ID of the device the person passed, which yields the device list the matched persons passed. The person record list within the corresponding time range on those devices is then obtained by querying the snapshot records for the time range, the device list, and the corresponding devices and time periods.
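A minimal sketch of these two operations, assuming each snapshot record is a plain dictionary with archive_id, device_id and captured_at keys (assumed names; the patent does not prescribe a storage format):

```python
def devices_passed(matched_snapshots):
    """Jointly de-duplicate on (archive_id, device_id): each device that a given
    matched person passed is kept only once."""
    seen = set()
    device_ids = []
    for rec in matched_snapshots:
        key = (rec["archive_id"], rec["device_id"])
        if key not in seen:
            seen.add(key)
            device_ids.append(rec["device_id"])
    return device_ids


def person_records_at_devices(all_snapshots, device_ids, start, end):
    """Return every snapshot record taken by the listed devices within [start, end]."""
    wanted = set(device_ids)
    return [rec for rec in all_snapshots
            if rec["device_id"] in wanted and start <= rec["captured_at"] <= end]
```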
Further, the method for identifying and judging fellow persons also includes a path generation step: generating the paths of the face-matched persons and of the persons to be analyzed according to the person record list.
In a possible embodiment:
path matching includes global path matching: the similarity of the whole path between the human face matching person and the person to be analyzed is larger than the threshold value of the similarity of path matching. And if the global path matching between the person to be analyzed and the human face matching person is successful, determining that the person to be analyzed is the person in the same row.
Path matching includes local path matching: the local path length of the human face matching personnel and the local path length of the personnel to be analyzed are larger than the local path length matching threshold, and the local path similarity is larger than the path matching similarity threshold. And if the local path matching exists between the person to be analyzed and the person for face matching, determining that the person to be analyzed is the person in the same row.
In another embodiment, as shown in fig. 2, the implementation principle process of the present invention is described as follows:
Face feature matching:
The face feature value is compared with the face feature values in the face archive library; persons whose similarity is greater than or equal to the display threshold (confidence) are screened out and added to the face-matched person list. If the face-matched person list is not empty, the snapshot records of the matched persons are queried next.
Querying the snapshot records of the matched persons:
The face snapshot library is queried according to the face-matched person list and the start-stop time range in the fellow-person parameters to obtain the snapshot records of the matched persons and generate a snapshot record list. If the snapshot record list is not empty, the program continues to extract the devices the matched persons passed.
Extracting the devices the matched persons passed:
According to the snapshot records of the matched persons, the two fields of the person's archive ID and the ID of the device the person passed are jointly de-duplicated to obtain the device list the matched persons passed.
Querying the person records captured by the devices:
The person record list of persons who passed the devices is obtained from the start-stop time range in the fellow-person parameters, the device list the matched persons passed, and the snapshot records of the corresponding devices and time periods.
Querying the persons to be analyzed who share the same snapshot records as the matched persons:
The complete snapshot records of the persons to be analyzed are queried from the snapshot record table according to the start-stop time range in the fellow-person parameters and the person record list of persons who passed the devices.
Path generation:
and generating paths of the human face matching personnel and the personnel to be analyzed according to the personnel record list.
Path matching:
global path matching: the similarity of the whole path between the human face matching person and the person to be analyzed is larger than the threshold value of the path matching similarity. And if the global path matching between the person to be analyzed and the human face matching person is successful, determining that the person to be analyzed is the person in the same row.
Local path matching: the local path length of the human face matching person and the local path length of the person to be analyzed are larger than the local path length matching threshold, and the local path similarity is larger than the path matching similarity threshold. And if the local path matching exists between the person to be analyzed and the person for face matching, determining that the person to be analyzed is the person in the same row.
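The disclosure states the decision criteria (global path similarity above the path-matching similarity threshold; local segments longer than the local path length matching threshold with similarity above the same threshold) but does not define the similarity measure itself. The sketch below assumes a longest-common-subsequence based similarity over device-ID sequences and illustrative threshold values; these choices are assumptions for illustration only.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two device-ID sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]


def path_similarity(a, b):
    """LCS length normalised by the longer path (assumed similarity measure)."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))


def is_fellow_person(target_path, candidate_path,
                     sim_threshold=0.7, local_len_threshold=3):
    """Global match: whole-path similarity >= sim_threshold.
    Local match: some pair of windows, each longer than local_len_threshold,
    whose similarity >= sim_threshold. Both thresholds are illustrative."""
    if path_similarity(target_path, candidate_path) >= sim_threshold:
        return True  # global path match
    n = local_len_threshold + 1
    for i in range(len(target_path) - n + 1):
        for j in range(len(candidate_path) - n + 1):
            if path_similarity(target_path[i:i + n], candidate_path[j:j + n]) >= sim_threshold:
                return True  # local path match
    return False
```

A candidate whose path satisfies either branch would be flagged as a fellow person of the target person.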
In the face recognition process, a face image is extracted from the picture captured by a front-end camera and its features are extracted; the face archive library stores structured face archive data, while the face snapshot library stores the face images and face features captured by the front-end cameras. The identity of a person is determined by matching face feature values; the person's movement path within a certain period is reconstructed from the face image feature value, the device ID and the snapshot time; and finally, whether a person is a fellow person of the target person is determined by matching the path similarity between the target person and the person to be analyzed.
In this embodiment, detection of fellow persons under the global path and the local path is realized by collecting face image information, matching face features, querying the snapshot records of the matched persons, extracting the devices the matched persons passed, querying the person records captured by those devices, finding the persons to be analyzed who share the same snapshot records as the matched persons, and generating and matching paths, so as to solve the technical problems in the prior art that identifying and judging the fellow persons of tag persons is difficult and inefficient.
The methods, systems, and modules disclosed herein may be implemented in other ways. For example, the above-described embodiments are merely illustrative, and for example, the division of the modules may be merely a logical division, and an actual implementation may have another division, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be referred to as an indirect coupling or communication connection through some interfaces, systems or modules, and may be in an electrical, mechanical or other form.
The modules described as discrete components may or may not be physically separate, and the components shown as modules may or may not be physical modules, may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The foregoing is illustrative of the preferred embodiments of this invention; it is to be understood that the invention is not limited to the precise forms disclosed herein, and that various other combinations, modifications, and environments may be resorted to within the scope of the inventive concept, either as described above or as apparent to those skilled in the relevant art. Modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (5)

1. A method for identifying and judging fellow persons, characterized by comprising the following steps:
collecting face image information;
storing the collected face image information and the corresponding face image features in a face snapshot library, and calculating face feature values from the face image features;
comparing the face feature value of the person to be identified with the face feature values in the face archive library, calculating the similarity, screening out the persons whose similarity is greater than or equal to a display threshold, and adding them to a face-matched person list;
querying the face snapshot library according to the face-matched person list and the time range in which the person's face was collected, obtaining the snapshot records of the matched persons, and generating a snapshot record list;
extracting, from the snapshot records, the list of snapshot devices the matched persons passed, querying the person records within the corresponding time range on those devices, and matching the persons in the person record list who share the same snapshot records with the person to be identified;
performing path matching between the paths of the matched persons and of the person to be identified, and judging whether they are fellow persons;
the path matching comprises global path matching: the similarity of the whole path between the face-matched person and the person to be analyzed is greater than a path-matching similarity threshold; if the global path matching between the person to be analyzed and the face-matched person succeeds, the person to be analyzed is determined to be a fellow person;
the path matching comprises local path matching: the local path lengths of the face-matched person and of the person to be analyzed are greater than a local path length matching threshold, and the local path similarity is greater than the path-matching similarity threshold; if a local path match exists between the person to be analyzed and the face-matched person, the person to be analyzed is determined to be a fellow person.
2. The method for identifying and judging fellow persons according to claim 1, wherein the snapshot device list the persons passed is extracted by jointly de-duplicating the two fields of the person's archive ID and the ID of the device the person passed, to obtain the device list the matched persons passed.
3. The method for identifying and judging fellow persons according to claim 1, wherein the person record list within the corresponding time range on the snapshot device list is obtained by querying the snapshot records of the time range, the device list, and the corresponding devices and time periods, yielding the person record list of persons who passed the devices.
4. The method for identifying and judging fellow persons according to claim 1, further comprising a path generation step: generating the paths of the face-matched persons and of the persons to be analyzed according to the person record list.
5. A fellow person identification and judgment system, characterized by comprising:
a face image acquisition device: collects face image information;
a face feature value calculation unit: stores the collected face image information and the corresponding face image features in the face snapshot library, and calculates face feature values from the face image features;
a face feature matching unit: compares the face feature value of the person to be identified with the face feature values in the face archive library, calculates the similarity, screens out the persons whose similarity is greater than or equal to the display threshold, and adds them to the face-matched person list;
a snapshot record generation unit: queries the face snapshot library according to the face-matched person list and the time range in which the person's face was collected, obtains the snapshot records of the matched persons, and generates a snapshot record list;
a device-snapshot person matching unit: extracts from the snapshot records the list of snapshot devices the matched persons passed, queries the person records within the corresponding time range on those devices, and matches the persons in the person record list who share the same snapshot records with the person to be identified;
a path identification unit: performs path matching between the paths of the matched persons and of the person to be identified and judges whether they are fellow persons, wherein the path matching comprises global path matching: the similarity of the whole path between the face-matched person and the person to be analyzed is greater than a path-matching similarity threshold, and if the global path matching between the person to be analyzed and the face-matched person succeeds, the person to be analyzed is determined to be a fellow person; and the path matching comprises local path matching: the local path lengths of the face-matched person and of the person to be analyzed are greater than a local path length matching threshold, and the local path similarity is greater than the path-matching similarity threshold, and if a local path match exists between the person to be analyzed and the face-matched person, the person to be analyzed is determined to be a fellow person.
CN202010973449.1A 2020-09-16 2020-09-16 Method and system for identifying and judging fellow persons Active CN112241683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010973449.1A CN112241683B (en) 2020-09-16 2020-09-16 Method and system for identifying and judging fellow persons

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010973449.1A CN112241683B (en) 2020-09-16 2020-09-16 Method and system for identifying and judging fellow persons

Publications (2)

Publication Number Publication Date
CN112241683A CN112241683A (en) 2021-01-19
CN112241683B (en) 2022-07-05

Family

ID=74170963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010973449.1A Active CN112241683B (en) 2020-09-16 2020-09-16 Method and system for identifying and judging fellow persons

Country Status (1)

Country Link
CN (1) CN112241683B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006031282A (en) * 2004-07-14 2006-02-02 Glory Ltd Person identification system
CN109117714A (en) * 2018-06-27 2019-01-01 北京旷视科技有限公司 A kind of colleague's personal identification method, apparatus, system and computer storage medium
CN111461153A (en) * 2019-01-22 2020-07-28 刘宏军 Crowd characteristic deep learning method
CN109934177A (en) * 2019-03-15 2019-06-25 艾特城信息科技有限公司 Pedestrian recognition methods, system and computer readable storage medium again
CN110084103A (en) * 2019-03-15 2019-08-02 深圳英飞拓科技股份有限公司 A kind of same pedestrian's analysis method and system based on face recognition technology
CN111104915A (en) * 2019-12-23 2020-05-05 云粒智慧科技有限公司 Method, device, equipment and medium for peer analysis

Also Published As

Publication number Publication date
CN112241683A (en) 2021-01-19

Similar Documents

Publication Publication Date Title
Stringa et al. Real-time video-shot detection for scene surveillance applications
US20220092881A1 (en) Method and apparatus for behavior analysis, electronic apparatus, storage medium, and computer program
CN109635146B (en) Target query method and system based on image characteristics
JP5602135B2 (en) Method and system for automatic personal annotation of video content
US8570376B1 (en) Method and system for efficient sampling of videos using spatiotemporal constraints for statistical behavior analysis
WO2018180588A1 (en) Facial image matching system and facial image search system
CN110139075B (en) Video data processing method, video data processing device, computer equipment and storage medium
WO2020259099A1 (en) Information processing method and device, and storage medium
CN109800691A (en) Demographics method and system based on face recognition technology
US20100318566A1 (en) Behavior history retrieval apparatus and behavior history retrieval method
CN111209776A (en) Method, device, processing server, storage medium and system for identifying pedestrians
CN111078922A (en) Information processing method and device and storage medium
CN113343913A (en) Target determination method, target determination device, storage medium and computer equipment
CN113949823A (en) Video concentration method and device
CN112241683B (en) Method and system for identifying and judging fellow persons
CN112418063A (en) Face recognition method and device, electronic equipment and storage medium
CN112268554A (en) Regional range loitering detection method and system based on path trajectory analysis
CN112559583B (en) Method and device for identifying pedestrians
CN109871456A (en) A kind of detention house personnel relationship analysis method, device and electronic equipment
CN112241686A (en) Trajectory comparison matching method and system based on feature vectors
CN114817638A (en) Backtracking system for fixed scene monitoring video
CN114092809A (en) Object identification method and device and electronic equipment
CN113886631A (en) Video archive generation method and device and storage medium
CN112149071A (en) Identity authentication method, storage medium, processor and system
CN112001280B (en) Real-time and online optimized face recognition system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Third floor, No.10, Jiuxing Avenue, high tech Zone, Chengdu, Sichuan 610041

Patentee after: Sichuan Tianyi Network Co.,Ltd.

Address before: Third floor, No.10, Jiuxing Avenue, high tech Zone, Chengdu, Sichuan 610041

Patentee before: SICHUAN TIANYI NETWORK SERVICE Co.,Ltd.
