CN114067270B - Vehicle tracking method and device, computer equipment and storage medium - Google Patents

Vehicle tracking method and device, computer equipment and storage medium Download PDF

Info

Publication number
CN114067270B
CN114067270B (Application No. CN202111372602.6A)
Authority
CN
China
Prior art keywords
vehicle
monitoring
information
detection frame
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111372602.6A
Other languages
Chinese (zh)
Other versions
CN114067270A (en)
Inventor
Zhu Ziwei
Zhang Xingming
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202111372602.6A priority Critical patent/CN114067270B/en
Publication of CN114067270A publication Critical patent/CN114067270A/en
Application granted granted Critical
Publication of CN114067270B publication Critical patent/CN114067270B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses a vehicle tracking method, a vehicle tracking device, computer equipment and a storage medium, wherein the vehicle tracking method comprises the following steps: acquiring vehicle running monitoring videos sent by a plurality of monitoring devices; detecting vehicles in the vehicle running monitoring videos by adopting a preset vehicle-specific target detection algorithm to obtain feature information of the vehicles, wherein the feature information comprises an image within a detection box (bounding box) of the vehicle; extracting vehicle features (feature maps) from the detection box images and fusing them into AMOC-RNN time series features to obtain a vehicle feature set acquired by each monitoring device; carrying out similarity comparison on vehicle features in the vehicle feature sets acquired by a plurality of monitoring devices installed along the same geographical route; and identifying the same vehicle among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. For the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices along a road, the tracking result can be fed back in real time.

Description

Vehicle tracking method and device, computer equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a vehicle tracking method, a vehicle tracking device, computer equipment and a storage medium.
Background
A dense "Skynet" surveillance system now covers every corner of a city. A large number of cameras installed along streets and lanes form a monitoring network, which is an effective weapon for public security organs to fight street crime and a strong backing for urban public security; combined with traffic monitoring, it also serves as a vehicle monitoring system. Vehicles passing a controlled cross section are captured and video-recorded, and license plate information, vehicle appearance information, facial image information of the driver and front passenger (road-segment equipment), passing time information, passing location information and vehicle speed information are obtained. Based on this information, the passing trajectory of a vehicle can be analyzed, evidence can be collected for behaviors such as speeding, running red lights (intersection equipment) and failing to drive according to guide signs, and suspect vehicles can be investigated, prevented and controlled to assist in case handling.
Although a traffic video monitoring system can provide supervision without blind spots, some sudden incidents and illegal vehicles travelling at high speed in the videos cannot be located at any moment; they cannot be observed, retrieved and reported in time by watching the monitoring screen with the naked eye, and the traffic department cannot receive a report at the first moment, so that information about hit-and-run drivers or hidden dangers of existing traffic incidents reaches the traffic management department with delay. When the same vehicle appears in different cameras along a route, the comparison can only be made manually, and staff cannot be dispatched to the scene for interception or tracking at the first moment. For example, when a suspicious vehicle speeds along a highway network extending in all directions, its direction of travel cannot be determined; if the position of the vehicle cannot be compared in time across the video monitoring pictures at different locations, the information is delayed, which brings difficulty to the tracking and interception work of case handling.
Disclosure of Invention
The embodiment of the invention provides a vehicle tracking method and device, computer equipment and a storage medium.
In order to solve the above technical problem, the embodiment of the present invention adopts a technical solution that: there is provided a vehicle tracking method comprising the steps of:
acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
detecting a vehicle in the vehicle running monitoring video by adopting a preset Fast R-CNN algorithm to obtain characteristic information of the vehicle, wherein the characteristic information comprises a detection box (bounding box) image of the vehicle;
extracting vehicle features from the detection frame image, and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device;
carrying out similarity comparison on vehicle characteristics in a vehicle characteristic set acquired by a plurality of monitoring devices installed along the same geographical route;
and identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the target of the vehicle.
Further, the detecting the vehicle in the vehicle running monitoring video by using a preset Fast R-CNN algorithm to obtain the characteristic information of the vehicle includes:
detecting the vehicle in the vehicle running monitoring video by using a trained Fast R-CNN network based on a convolutional neural network to obtain a detection frame (bounding box) image of the vehicle;
extracting vehicle information from the detection frame image, wherein the vehicle information includes: a vehicle code, time information, position information of the monitoring device, and coordinate information of the vehicle.
Further, the extracting vehicle features from the detection frame image and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device includes:
extracting image features of each frame of detection frame image by a convolutional neural network image feature extraction method to serve as re-identification features;
pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
Further, after the vehicle driving monitoring videos sent by the multiple monitoring devices are obtained, the method further includes:
collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialization background model;
judging whether the (t+1)-th monitoring frame exhibits jitter according to the initialization background model;
and when jitter occurs, triggering a digital video de-jitter algorithm to remove the video jitter.
Further, identifying the same vehicle in the multiple monitoring devices according to the similarity comparison result to track the vehicle includes:
when a plurality of vehicle characteristics with similarity greater than a preset value exist across the plurality of monitoring devices, judging whether the monitoring devices corresponding to the plurality of vehicle characteristics are adjacent according to the serial numbers of the monitoring devices on the same road;
when the positions of the monitoring devices corresponding to the plurality of vehicle characteristics are adjacent, judging whether the difference between the time information of the vehicles corresponding to those adjacent vehicle characteristics is smaller than a preset time period value;
and identifying the vehicles whose time difference is smaller than the preset time period value as the same vehicle so as to track the vehicle.
An embodiment of the present invention provides a vehicle tracking apparatus, including:
the acquisition module is used for acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
the processing module is used for detecting the vehicles in the vehicle running monitoring video by adopting a preset Fast R-CNN algorithm to obtain the characteristic information of the vehicles, wherein the characteristic information comprises detection frame images of the vehicles;
the processing module is further configured to extract vehicle features from the detection frame image and perform time series feature fusion to obtain a vehicle feature set acquired by each monitoring device;
the processing module is further used for comparing the similarity of the vehicle characteristics in the vehicle characteristic set acquired by the monitoring devices installed on the same geographical line;
and the execution module is used for identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle.
Further, the processing module comprises:
the first acquisition submodule is used for detecting the vehicle in the vehicle running monitoring video by utilizing a trained Fast R-CNN network based on a convolutional neural network to obtain a detection frame image of the vehicle;
a first processing sub-module, configured to extract vehicle information from the detection frame image, where the vehicle information includes: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
Further, the processing module comprises:
the second acquisition submodule is used for extracting image features of each frame of detection frame image as re-identification features through a convolutional neural network image feature extraction method;
the second processing submodule is used for pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and the first execution submodule is used for taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
Further, still include:
the third acquisition submodule is used for collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialization background model;
the third processing submodule is used for judging whether the (t+1)-th monitoring frame exhibits jitter according to the initialized background model;
and the second execution submodule is used for triggering a digital video de-jitter algorithm to remove the video jitter when jitter occurs.
Further, the execution module includes:
the fourth processing submodule is used for judging whether the monitoring equipment pointed by the vehicle characteristics are adjacent or not according to the serial numbers of the monitoring equipment on the same road when the vehicle characteristics with the similarity larger than the preset value exist in the monitoring equipment;
the fifth processing submodule is used for judging whether time information of vehicles pointed by the plurality of vehicle characteristics adjacent to the position is smaller than a preset time period value or not when the positions of the monitoring equipment pointed by the plurality of vehicle characteristics are adjacent;
and the third execution submodule is used for identifying a plurality of vehicles smaller than the preset time period value as the same vehicle so as to track the vehicles.
Embodiments also provide a storage medium storing computer readable instructions, which when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle tracking method described above.
The embodiment of the invention has the beneficial effects that: similarity comparison is carried out on vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical route, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. For the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices along a road, the tracking result can be fed back in real time.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic basic flow chart of a vehicle tracking method according to an embodiment of the present invention;
fig. 2 is a block diagram of a basic structure of a vehicle tracking device according to an embodiment of the present invention;
fig. 3 is a block diagram of a basic structure of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
In some of the flows described in the present specification, the claims, and the above drawings, a number of operations are included that occur in a particular order, but it should be clearly understood that these operations may be performed out of the order in which they appear herein or in parallel. Operation numbers such as 101 and 102 are used merely to distinguish the various operations, and the numbers themselves do not represent any required order of execution. In addition, the flows may include more or fewer operations, and these operations may be performed sequentially or in parallel. It should be noted that the descriptions of "first", "second", and the like herein are used to distinguish different messages, devices, modules, and so on; they do not represent a sequential order, nor do they require that the objects qualified by "first" and "second" be of different types.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
As will be understood by those skilled in the art, a "terminal" as used herein includes both devices having only a wireless signal receiver without transmit capability and devices having receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device with a single-line display, a multi-line display, or no multi-line display; a PCS (Personal Communications Service) device, which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; or a conventional laptop and/or palmtop computer or other appliance that has and/or includes a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. As used herein, a "terminal device" may also be a communication terminal, a web terminal, or a music/video playing terminal, such as a PDA, an MID (Mobile Internet Device) and/or a mobile phone with a music/video playing function, or a smart TV, a set-top box, and the like.
As shown in fig. 1, fig. 1 is a schematic basic flow chart of a vehicle tracking method according to an embodiment of the present invention, including the following steps:
s1, obtaining vehicle running monitoring videos sent by a plurality of monitoring devices;
the monitoring apparatus is a photographing device installed in a highway or a street for photographing a running vehicle, such as a camera or the like.
S2, detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain characteristic information of the vehicle, wherein the characteristic information comprises a detection box (bounding box) image of the vehicle;
the feature information is used to indicate information used to identify a vehicle in the vehicle driving monitoring video, such as a vehicle code of the vehicle, a face image of a driver, a license plate number, and the like, and may further include time information at the time of photographing, position information of a monitoring device, and coordinate information of the vehicle. In the embodiment of the invention, the detection frame image of the vehicle is obtained by detecting the vehicle, and the information is extracted from the detection frame.
Specifically, the method comprises the following steps:
firstly, detecting the vehicle in the vehicle running monitoring video by using a trained Fast R-CNN fast target detection network based on a convolutional neural network to obtain a detection box (bounding box) image of the vehicle;
step two, extracting vehicle information from the detection frame image, wherein the vehicle information comprises: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
According to one embodiment of the invention, a trained convolutional-neural-network target detection method dedicated to vehicles is used to process the vehicles in the current video image, so as to obtain a vehicle detection box for each frame of image. The vehicle is then tracked continuously within the single video picture by using the Deep SORT method to obtain a sequence of detection boxes, until the vehicle disappears from the picture.
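For illustration only, the following is a minimal sketch of this per-frame detection step. It is not the patent's exact model: a torchvision Faster R-CNN detector pretrained on COCO stands in for the trained vehicle-specific Fast R-CNN network, the class ids kept as vehicles are illustrative, and the Deep SORT association that turns per-frame boxes into per-vehicle sequences is omitted.

```python
# Sketch of per-camera vehicle detection; the detector and class ids are assumptions.
import cv2
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

VEHICLE_CLASS_IDS = {3, 6, 8}  # COCO ids for car, bus, truck (illustrative choice)

@torch.no_grad()
def detect_vehicles(frame_rgb, score_thresh=0.6):
    """Return [x1, y1, x2, y2] detection boxes for vehicles in one RGB frame."""
    pred = detector([to_tensor(frame_rgb)])[0]
    boxes = []
    for box, label, score in zip(pred["boxes"], pred["labels"], pred["scores"]):
        if label.item() in VEHICLE_CLASS_IDS and score.item() >= score_thresh:
            boxes.append([int(v) for v in box.tolist()])
    return boxes

def detection_box_images(video_path):
    """Yield (frame_index, list of cropped detection-box images) for each frame."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        crops = [rgb[y1:y2, x1:x2] for x1, y1, x2, y2 in detect_vehicles(rgb)]
        yield idx, crops
        idx += 1
    cap.release()
```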
S3, extracting vehicle features from the detection frame image, and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device;
the embodiment of the invention comprises the following steps:
firstly, extracting image features of each frame of detection frame image as re-identification features by a convolutional neural network image feature extraction method;
step two, pooling the re-identification features through a pooling layer in the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and thirdly, taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
In the embodiment of the invention, the time sequence re-identification characteristic is an AMOC-RNN time sequence characteristic. In some embodiments, the monitoring device may monitor a plurality of vehicles, and the set of vehicle characteristics includes vehicle characteristics of the plurality of vehicles.
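As an illustration of the feature extraction and temporal fusion step, the sketch below assumes a ResNet-50 backbone with its classifier removed as a stand-in for the re-identification network, and uses average pooling over the frame axis as a stand-in for the AMOC-RNN temporal fusion; neither choice is prescribed here.

```python
# Sketch of per-frame re-identification features fused over time (assumed backbone).
import torch
import torchvision
from torchvision import transforms

backbone = torchvision.models.resnet50(weights="DEFAULT")
backbone.fc = torch.nn.Identity()  # keep the 2048-d globally pooled feature
backbone.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),                      # HxWx3 uint8 -> 3xHxW float in [0, 1]
    transforms.Resize((224, 224)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def fuse_track_feature(crops):
    """crops: detection-box images of one tracked vehicle -> single fused feature."""
    batch = torch.stack([preprocess(c) for c in crops])   # (T, 3, 224, 224)
    per_frame = backbone(batch)                           # (T, 2048) re-id features
    fused = per_frame.mean(dim=0)                         # temporal pooling over T
    return torch.nn.functional.normalize(fused, dim=0)    # unit-length feature vector
```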
S4, comparing the similarity of the vehicle characteristics in the vehicle characteristic set collected by a plurality of monitoring devices installed on the same geographical line;
in the embodiment of the invention, the same geographical route may be the same highway or the same street along which a plurality of cameras are arranged. When the similarity comparison is carried out on the vehicle features, the comparison may be performed using measures such as Euclidean distance or cosine similarity; the cosine similarity approach is described below.
The feature vectors of the two images can be regarded as two line segments in space, both starting from the origin ([0, 0, ...]) and pointing in different directions, with an included angle between them; the smaller the angle, the more similar the vectors. Assuming that the vehicle feature matrix A and the vehicle feature matrix B are two n-dimensional vectors, A = [A1, A2, ..., An] and B = [B1, B2, ..., Bn], then the cosine of the angle θ between A and B is equal to:
cos θ = (A · B) / (‖A‖ × ‖B‖) = (A1×B1 + A2×B2 + ... + An×Bn) / (√(A1² + A2² + ... + An²) × √(B1² + B2² + ... + Bn²))
The cosine of the included angle between the vehicle feature matrix A and the vehicle feature matrix B can be obtained with this formula. The closer the cosine value is to 1, the closer the angle is to 0 degrees, that is, the more similar the two vectors are. After the angle between the two vectors is obtained, the degree of similarity between them can be judged from its size. In this way, the similarity calculation between the vehicle feature matrix A and the vehicle feature matrix B is completed, so that similar images of the vehicle can be found with this algorithm.
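The cosine comparison above can be implemented directly; in the sketch below, the 0.85 matching threshold is an illustrative value, not one specified here.

```python
# Direct implementation of the cosine-similarity comparison described above.
import numpy as np

def cosine_similarity(a, b):
    """cos(theta) = (A . B) / (|A| * |B|); values closer to 1 mean more similar."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(query_feature, feature_set, sim_thresh=0.85):
    """Indices of features in another camera's set that match the query vehicle."""
    return [i for i, f in enumerate(feature_set)
            if cosine_similarity(query_feature, f) >= sim_thresh]
```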
And S5, identifying the same vehicle in the multiple monitoring devices according to the similarity comparison result so as to track the vehicle.
In some embodiments, there may be a plurality of vehicles with similar vehicle outlines, and in order to further accurately identify the same vehicle, the embodiment of the present invention performs accurate identification through the following steps:
step one, when a plurality of vehicle characteristics with similarity greater than a preset value exist across the plurality of monitoring devices, judging whether the monitoring devices corresponding to the plurality of vehicle characteristics are adjacent according to the serial numbers of the monitoring devices on the same road;
step two, when the positions of the monitoring devices corresponding to the plurality of vehicle characteristics are adjacent, judging whether the difference between the time information of the vehicles corresponding to those adjacent vehicle characteristics is smaller than a preset time period value;
and step three, identifying the vehicles whose time difference is smaller than the preset time period value as the same vehicle, so as to track the vehicle.
According to one embodiment of the invention, when images of a vehicle captured by any of the different cameras are retrieved, they can be looked up by the serial number assigned to the vehicle in the current video, and the current real-time geographic position of the vehicle can be queried from the serial number of the camera.
Through the above method, vehicles on different roads (i.e. vehicles whose positions are not adjacent) and vehicles appearing on the same road in different time periods are eliminated, which ensures the accuracy of vehicle tracking.
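The sketch below combines the three conditions above (feature similarity, adjacent device serial numbers, and a bounded time gap) into a single check; the similarity threshold, the adjacency rule based on consecutive serial numbers, and the time-window value are illustrative assumptions rather than figures specified here.

```python
# Sketch of the cross-camera identity check; thresholds and rules are assumptions.
from dataclasses import dataclass
import numpy as np

@dataclass
class Observation:
    camera_id: int          # serial number of the monitoring device along the road
    timestamp: float        # capture time, in seconds
    feature: np.ndarray     # fused re-identification feature of the vehicle

def is_same_vehicle(a: Observation, b: Observation,
                    sim_thresh: float = 0.85,
                    max_gap_seconds: float = 120.0) -> bool:
    """Same vehicle if features are similar, devices adjacent, and time gap small."""
    cos = float(np.dot(a.feature, b.feature) /
                (np.linalg.norm(a.feature) * np.linalg.norm(b.feature)))
    if cos < sim_thresh:
        return False
    if abs(a.camera_id - b.camera_id) != 1:   # adjacent devices on the same road
        return False
    return abs(a.timestamp - b.timestamp) <= max_gap_seconds
```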
In some embodiments, in order to obtain a clear vehicle running monitoring video, the method further includes a step of removing jitter from the vehicle running monitoring video captured by the monitoring device, which includes the following steps:
step one, collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialization background model;
step two, judging whether the (t+1)-th monitoring frame exhibits jitter according to the initialized background model;
and step three, when jitter occurs, triggering a digital video de-jitter algorithm to remove the video jitter.
According to one embodiment of the invention, under a single camera, after the monitoring video starts, whether the camera shakes is judged by using the pictures of the first 20 frames, and if the camera shakes, the pictures are stabilized by using a video shaking removal method.
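A possible sketch of this jitter check follows: the first t frames are averaged into a background model and the (t+1)-th frame is flagged as jittered when it deviates too much from that model. The deviation threshold is an illustrative value, and the actual stabilization (de-jitter) step is left out.

```python
# Sketch of the camera-shake check; the deviation threshold is an assumption.
import cv2
import numpy as np

def build_background_model(frames):
    """Mean of the first t grayscale frames, used as the initialization background."""
    gray = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).astype(np.float32) for f in frames]
    return np.mean(gray, axis=0)

def frame_is_jittered(background, frame, diff_thresh=18.0):
    """Mean absolute difference of the next frame against the background model."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    return float(np.mean(np.abs(gray - background))) > diff_thresh

def check_first_frames(video_path, t=20):
    """Read t+1 frames; return True if the (t+1)-th frame should trigger de-jitter."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    for _ in range(t + 1):
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()
    if len(frames) <= t:
        return False
    background = build_background_model(frames[:t])
    return frame_is_jittered(background, frames[t])
```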
According to the vehicle tracking method provided by the embodiment of the invention, similarity comparison is carried out on vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical route, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. For the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices along a road, the tracking result can be fed back in real time.
The embodiment of the invention also provides a vehicle tracking device. Referring to fig. 2, fig. 2 is a block diagram of a basic structure of the vehicle tracking device according to the present embodiment.
As shown in fig. 2, a vehicle tracking apparatus includes: an acquisition module 2100, a processing module 2200, and an execution module 2300. The acquisition module 2100 is configured to acquire vehicle running monitoring videos sent by multiple monitoring devices; the processing module 2200 is configured to detect vehicles in the vehicle running monitoring video by using a preset vehicle-specific target detection algorithm, so as to obtain feature information of the vehicle, where the feature information includes a detection box (bounding box) image of the vehicle; the processing module 2200 is further configured to extract vehicle features from the detection box image and perform time series feature fusion to obtain a vehicle feature set collected by each monitoring device; the processing module 2200 is further configured to perform similarity comparison on vehicle features in the vehicle feature sets acquired by a plurality of monitoring devices installed along the same geographical route; and the execution module 2300 is configured to identify the same vehicle among the multiple monitoring devices according to the similarity comparison result so as to track the vehicle.
According to the vehicle tracking device provided by the embodiment of the invention, similarity comparison is carried out on vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical route, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. For the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices along a road, the tracking result can be fed back in real time.
In some embodiments, the processing module comprises: the first acquisition submodule is used for detecting the vehicle in the vehicle running monitoring video by utilizing a trained Fast R-CNN network based on a convolutional neural network to obtain a detection frame (bounding box) image of the vehicle; a first processing sub-module, configured to extract vehicle information from the detection frame image, where the vehicle information includes: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
In some embodiments, the processing module comprises: the second acquisition submodule is used for extracting image features of each frame of detection frame image as re-identification features by a convolutional neural network image feature extraction method; the second processing submodule is used for pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features; and the first execution submodule is used for taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
In some embodiments, further comprising: the third acquisition submodule is used for acquiring t continuous monitoring picture images in preset quantity from the vehicle running monitoring video to serve as an initialization background model; the third processing submodule is used for judging whether the t +1 frame monitoring picture image shakes or not according to the initialized background model; and the second execution submodule is used for triggering the digital video de-jitter algorithm to remove the video jitter when the jitter occurs.
In some embodiments, the execution module comprises: the fourth processing submodule is used for judging whether the monitoring equipment pointed by the vehicle characteristics are adjacent or not according to the serial numbers of the monitoring equipment on the same road when the vehicle characteristics with the similarity larger than the preset value exist in the monitoring equipment; the fifth processing submodule is used for judging whether time information of vehicles pointed by the plurality of vehicle characteristics adjacent to the position is smaller than a preset time period value or not when the positions of the monitoring equipment pointed by the plurality of vehicle characteristics are adjacent; and the third execution submodule is used for identifying a plurality of vehicles smaller than the preset time period value as the same vehicle so as to track the vehicles.
In order to solve the above technical problem, an embodiment of the present invention further provides a computer device. Referring to fig. 3, fig. 3 is a block diagram of a basic structure of a computer device according to the present embodiment.
As shown in fig. 3, the internal structure of the computer device is schematically illustrated. As shown in fig. 3, the computer device includes a processor, a non-volatile storage medium, a memory, and a network interface connected through a system bus. The non-volatile storage medium of the computer device stores an operating system, a database in which control information sequences may be stored, and computer readable instructions that, when executed by the processor, cause the processor to implement a vehicle tracking method. The processor of the computer device is used for providing calculation and control capability and supporting the operation of the whole computer device. The memory of the computer device may have stored therein computer readable instructions that, when executed by the processor, may cause the processor to perform a vehicle tracking method. The network interface of the computer device is used for connecting and communicating with the terminal. Those skilled in the art will appreciate that the architecture shown in fig. 3 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In this embodiment, the processor is configured to execute specific contents of the obtaining module 2100, the processing module 2200, and the executing module 2300 in fig. 2, and the memory stores program codes and various data required for executing the modules. The network interface is used for data transmission to and from a user terminal or a server. The memory in the present embodiment stores program codes and data necessary for executing all the sub-modules in the vehicle tracking method, and the server can call the program codes and data of the server to execute the functions of all the sub-modules.
According to the computer device provided by the embodiment of the invention, similarity comparison is carried out on vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical route, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle, so that for the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices along a road, the tracking result can be fed back in real time.
The present invention also provides a storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle tracking method of any of the embodiments described above.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the computer program is executed. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and may be performed in other orders unless explicitly stated herein. Moreover, at least a portion of the steps in the flow chart of the figure may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, which are not necessarily performed in sequence, but may be performed alternately or alternately with other steps or at least a portion of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present invention, and it should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these should also be construed as falling within the protection scope of the present invention.

Claims (7)

1. A vehicle tracking method, comprising the steps of:
acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain the characteristic information of the vehicle, wherein the characteristic information comprises a detection frame image of the vehicle;
extracting vehicle features from the detection frame image, and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device;
carrying out similarity comparison on vehicle characteristics in a vehicle characteristic set acquired by a plurality of monitoring devices installed along the same geographical route;
identifying the same vehicle in the multiple monitoring devices according to the similarity comparison result so as to track the vehicle;
the target detection algorithm is a trained Fast R-CNN fast target detection network based on a convolutional neural network;
the extracting of vehicle features from the detection frame image for time series feature fusion to obtain a vehicle feature set collected by each monitoring device includes:
extracting image features of each frame of detection frame image by a convolutional neural network image feature extraction method to serve as re-identification features;
pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
all time sequence re-identification features in the vehicle running monitoring video are used as a vehicle feature set of the monitoring equipment;
wherein, according to the same vehicle in the comparison result of similarity discernment in a plurality of supervisory equipment in order to track the vehicle, include:
when a plurality of vehicle characteristics with similarity larger than a preset value exist in the plurality of monitoring devices, judging whether the monitoring devices pointed by the plurality of vehicle characteristics are adjacent or not according to the serial numbers of the monitoring devices on the same road;
when the positions of the monitoring equipment pointed by the vehicle characteristics are adjacent, judging whether time information of vehicles pointed by the vehicle characteristics adjacent to the positions is smaller than a preset time period value or not;
identifying a plurality of vehicles smaller than the preset time period value as the same vehicle so as to track the vehicles.
2. The vehicle tracking method according to claim 1, wherein the detecting the vehicle in the vehicle driving monitoring video by using a preset Fast R-CNN algorithm to obtain the characteristic information of the vehicle comprises:
detecting the vehicle in the vehicle running monitoring video by using a trained Fast R-CNN network based on a convolutional neural network to obtain a detection frame image of the vehicle;
extracting vehicle information from the detection frame image, wherein the vehicle information includes: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
3. The vehicle tracking method according to claim 1, wherein after acquiring the vehicle driving monitoring video transmitted by the plurality of monitoring devices, the method further comprises:
collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialization background model;
judging whether the (t+1)-th monitoring frame exhibits jitter according to the initialization background model;
and when jitter occurs, triggering a digital video de-jitter algorithm to remove the video jitter.
4. A vehicle tracking device, comprising:
the acquisition module is used for acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
the processing module is used for detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain the characteristic information of the vehicle, wherein the characteristic information comprises a detection frame image of the vehicle;
the processing module is further configured to extract vehicle features from the detection frame image and perform time series feature fusion to obtain a vehicle feature set acquired by each monitoring device;
the processing module is further used for carrying out similarity comparison on the vehicle characteristics in the vehicle characteristic set acquired by the monitoring equipment installed on the same geographical line;
the execution module is used for identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle;
the target detection algorithm is a trained Fast R-CNN fast target detection network based on a convolutional neural network;
the processing module comprises:
the second acquisition submodule is used for extracting the vehicle image characteristics of each frame of detection frame image through a convolutional neural network image characteristic extraction method to serve as re-identification characteristics;
the second processing submodule is used for pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
the first execution submodule is used for taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment;
wherein the execution module comprises:
the fourth processing submodule is used for judging whether the monitoring equipment pointed by the vehicle characteristics are adjacent or not according to the serial numbers of the monitoring equipment on the same road when the vehicle characteristics with the similarity larger than the preset value exist in the monitoring equipment;
the fifth processing submodule is used for judging whether time information of vehicles pointed by the plurality of vehicle characteristics adjacent to the position is smaller than a preset time period value or not when the positions of the monitoring equipment pointed by the plurality of vehicle characteristics are adjacent;
and the third execution submodule is used for identifying a plurality of vehicles smaller than the preset time period value as the same vehicle so as to track the vehicles.
5. The vehicle tracking device of claim 4, wherein the processing module comprises:
the first acquisition submodule is used for detecting the vehicle in the vehicle running monitoring video by utilizing a trained Fast R-CNN network based on a convolutional neural network to obtain a detection frame image of the vehicle;
the first processing submodule is used for extracting vehicle information from the detection frame image, wherein the vehicle information comprises: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
6. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to carry out the steps of the vehicle tracking method according to any one of claims 1 to 3.
7. A storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle tracking method of any one of claims 1 to 3.
CN202111372602.6A 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium Active CN114067270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111372602.6A CN114067270B (en) 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111372602.6A CN114067270B (en) 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114067270A CN114067270A (en) 2022-02-18
CN114067270B (en) 2022-09-09

Family

ID=80278364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111372602.6A Active CN114067270B (en) 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114067270B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820699B (en) * 2022-03-29 2023-07-18 小米汽车科技有限公司 Multi-target tracking method, device, equipment and medium
CN115171378B (en) * 2022-06-28 2023-10-27 武汉理工大学 High-precision detection tracking method for long-distance multiple vehicles based on road side radar
CN117312598B (en) * 2023-11-27 2024-04-09 广东利通科技投资有限公司 Evidence obtaining method, device, computer equipment and storage medium for fee evasion auditing

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007235952A (en) * 2006-02-28 2007-09-13 Alpine Electronics Inc Method and device for tracking vehicle
JP2014191664A (en) * 2013-03-27 2014-10-06 Fujitsu Ltd Vehicle tracking program, image transmission program, server device, information processing apparatus, and vehicle tracking method
CN106599918A (en) * 2016-12-13 2017-04-26 开易(深圳)科技有限公司 Vehicle tracking method and system
CN108629791A (en) * 2017-03-17 2018-10-09 北京旷视科技有限公司 Pedestrian tracting method and device and across camera pedestrian tracting method and device
CN110443828A (en) * 2019-07-31 2019-11-12 腾讯科技(深圳)有限公司 Method for tracing object and device, storage medium and electronic device
WO2020145882A1 (en) * 2019-01-09 2020-07-16 Hitachi, Ltd. Object tracking systems and methods for tracking a target object
CN113221750A (en) * 2021-05-13 2021-08-06 杭州飞步科技有限公司 Vehicle tracking method, device, equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111954886A (en) * 2019-06-14 2020-11-17 北京嘀嘀无限科技发展有限公司 System and method for object tracking
CN111126223B (en) * 2019-12-16 2023-04-18 山西大学 Video pedestrian re-identification method based on optical flow guide features
CN111310633B (en) * 2020-02-10 2023-05-05 江南大学 Parallel space-time attention pedestrian re-identification method based on video
CN111709328B (en) * 2020-05-29 2023-08-04 北京百度网讯科技有限公司 Vehicle tracking method and device and electronic equipment
CN112069969B (en) * 2020-08-31 2023-07-25 河北省交通规划设计研究院有限公司 Expressway monitoring video cross-mirror vehicle tracking method and system
CN113256690B (en) * 2021-06-16 2021-09-17 中国人民解放军国防科技大学 Pedestrian multi-target tracking method based on video monitoring

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007235952A (en) * 2006-02-28 2007-09-13 Alpine Electronics Inc Method and device for tracking vehicle
JP2014191664A (en) * 2013-03-27 2014-10-06 Fujitsu Ltd Vehicle tracking program, image transmission program, server device, information processing apparatus, and vehicle tracking method
CN106599918A (en) * 2016-12-13 2017-04-26 开易(深圳)科技有限公司 Vehicle tracking method and system
CN108629791A (en) * 2017-03-17 2018-10-09 北京旷视科技有限公司 Pedestrian tracting method and device and across camera pedestrian tracting method and device
WO2020145882A1 (en) * 2019-01-09 2020-07-16 Hitachi, Ltd. Object tracking systems and methods for tracking a target object
CN110443828A (en) * 2019-07-31 2019-11-12 腾讯科技(深圳)有限公司 Method for tracing object and device, storage medium and electronic device
CN113221750A (en) * 2021-05-13 2021-08-06 杭州飞步科技有限公司 Vehicle tracking method, device, equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Multi-camera multi-object tracking; Wenqian Liu et al.; Computer Vision and Pattern Recognition; 2017-09-20; pp. 1-7 *
Object Tracking and Behavior Recognition in Surveillance Video; Xu Shaoqiang; China Master's Theses Full-text Database (Information Science and Technology); 2021-05-15 (No. 05, 2021); I138-1201 *

Also Published As

Publication number Publication date
CN114067270A (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN114067270B (en) Vehicle tracking method and device, computer equipment and storage medium
US10930151B2 (en) Roadside parking management method, device, and system based on multiple cameras
CN110191424B (en) Specific suspect track generation method and apparatus
US10152858B2 (en) Systems, apparatuses and methods for triggering actions based on data capture and characterization
CN105702048B (en) Highway front truck illegal road occupation identifying system based on automobile data recorder and method
CN108229335A (en) It is associated with face identification method and device, electronic equipment, storage medium, program
CN102610102B (en) Suspect vehicle inspection and control method and system
AU2009243916B2 (en) A system and method for electronic surveillance
CN105574506A (en) Intelligent face tracking system and method based on depth learning and large-scale clustering
WO2016202027A1 (en) Object movement trajectory recognition method and system
CN107613244A (en) A kind of navigation channel monitoring objective acquisition methods and device
US20210089784A1 (en) System and Method for Processing Video Data from Archive
US20220392233A1 (en) Traffic information providing method and device, and computer program stored in medium in order to execute method
CN112861673A (en) False alarm removal early warning method and system for multi-target detection of surveillance video
Davies et al. A progress review of intelligent CCTV surveillance systems
Gochoo et al. Fisheye8k: A benchmark and dataset for fisheye camera object detection
CN113901946A (en) Abnormal behavior detection method and device, electronic equipment and storage medium
Lin et al. Moving camera analytics: Emerging scenarios, challenges, and applications
EP3244344A1 (en) Ground object tracking system
CN114913470B (en) Event detection method and device
Glasl et al. Video based traffic congestion prediction on an embedded system
CN113469080A (en) Individual, group and scene interactive collaborative perception method, system and equipment
KR20050034224A (en) A system for automatic parking violation regulation, parking control,and disclosure and roundup of illegal vehicles using wireless communication
CN110717386A (en) Method and device for tracking affair-related object, electronic equipment and non-transitory storage medium
CN113435352B (en) Civilized city scoring method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant