CN113593219A - Traffic flow statistical method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113593219A
CN113593219A
Authority
CN
China
Prior art keywords
matching degree
vehicle
determining
video stream
stream data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110732356.4A
Other languages
Chinese (zh)
Other versions
CN113593219B (en)
Inventor
路金诚
张伟
谭啸
孙昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110732356.4A
Publication of CN113593219A
Application granted
Publication of CN113593219B
Legal status: Active

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G08G1/0137 Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G1/065 Traffic control systems for road vehicles by counting the vehicles in a section of the road or in a parking area, i.e. comparing incoming count with outgoing count

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a traffic flow statistical method and apparatus, an electronic device, and a storage medium, and relates to the field of computer technologies, in particular to artificial intelligence fields such as computer vision and intelligent transportation. The specific implementation scheme is as follows: acquire traffic video stream data within a preset time period; parse the video stream data to determine the driving track of each vehicle in it; determine the driving direction of each vehicle according to the matching degree between its driving track and each reference track; and determine the traffic flow in each direction within the preset time period according to the driving directions. Because each vehicle's driving direction is determined from the reference track corresponding to each driving direction, and the per-direction traffic flow is derived from those directions, the traffic flow in each direction can be acquired more accurately.

Description

Traffic flow statistical method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, in particular to artificial intelligence fields such as computer vision and intelligent transportation, and more specifically to a traffic flow statistical method, an apparatus, an electronic device, and a storage medium.
Background
As artificial intelligence technology has continued to develop and mature, it has come to play an extremely important role in many fields of daily life; for example, it has made significant progress in the field of intelligent transportation. How to accurately acquire traffic flow has therefore become a hot research direction.
Disclosure of Invention
The disclosure provides a traffic flow statistical method and device, electronic equipment and a storage medium.
According to a first aspect of the present disclosure, there is provided a traffic flow statistical method, including:
acquiring traffic video stream data in a preset time period;
analyzing the video stream data to determine a driving track of each vehicle in the video stream data;
determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track;
and determining the traffic flow of each direction in the preset time period according to the driving direction of each vehicle.
According to a second aspect of the present disclosure, there is provided a traffic flow statistic apparatus including:
the first acquisition module is used for acquiring traffic video stream data in a preset time period;
the first determining module is used for analyzing the video stream data to determine the running track of each vehicle in the video stream data;
the second determining module is used for determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track;
and the third determining module is used for determining the traffic flow in each direction in the preset time period according to the driving direction of each vehicle.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method according to the first aspect.
The traffic flow statistical method, the traffic flow statistical device, the electronic equipment and the storage medium have the following beneficial effects:
in the embodiment of the disclosure, traffic video stream data in a preset time period is obtained first, then the video stream data is analyzed to determine a driving track of each vehicle in the video stream data, a driving direction of each vehicle is determined according to the matching degree of the driving track of each vehicle and each reference track, and finally, a traffic flow in each direction in the preset time period is determined according to the driving direction of each vehicle. Therefore, the driving direction of each vehicle is determined based on the reference track corresponding to each driving direction, and the traffic flow of each direction is further determined, so that the traffic flow of each direction can be more accurately acquired.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is a schematic flow chart of a traffic flow statistical method according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a traffic flow statistical method according to another embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a traffic flow statistic device according to an embodiment of the disclosure;
fig. 4 is a schematic structural diagram of a traffic flow statistic device according to an embodiment of the disclosure;
fig. 5 is a block diagram of an electronic device for implementing a statistical method of traffic flow according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The embodiment of the disclosure relates to the technical field of artificial intelligence such as computer vision, intelligent transportation and the like.
Artificial Intelligence (AI) is a new technical science that studies and develops theories, methods, technologies, and application systems for simulating, extending, and expanding human intelligence.
An intelligent transportation system is a comprehensive transportation system that effectively and comprehensively applies advanced technologies (information technology, computer technology, data communication technology, sensor technology, electronic control technology, automatic control theory, operations research, artificial intelligence, and so on) to transportation, service control, and vehicle manufacturing, strengthening the links among vehicles, roads, and users, thereby ensuring safety, improving efficiency, improving the environment, and saving energy.
Computer vision means using cameras and computers in place of human eyes to perform machine vision tasks such as identification, tracking, and measurement of targets, with further image processing so that the result is better suited to human observation or to transmission to instruments for detection.
Fig. 1 is a schematic flow chart of a traffic flow statistical method according to an embodiment of the present disclosure;
it should be noted that the execution subject of the traffic flow statistics method in this embodiment is a traffic flow statistics device, which may be implemented in a software and/or hardware manner, and the device may be configured in an electronic device, and the electronic device may include but is not limited to a terminal, a server, and the like.
As shown in fig. 1, the traffic flow statistical method includes:
s101: and acquiring traffic video stream data in a preset time period.
Wherein the preset time period may be thirty seconds, one minute, five minutes, 1 signal lamp cycle, 3 signal lamp cycles, etc., which is not limited by the present disclosure.
The traffic video stream data may be a video stream of vehicle driving acquired by an electronic device with a video recording function. The video stream data may include a plurality of consecutive frames of images of the vehicle.
S102: the video stream data is parsed to determine a travel track of each vehicle in the video stream data.
Optionally, the optical flow method may be used to analyze the consecutive frames of images of the vehicle running in the video stream data to determine the running track of each vehicle in the video stream data.
The optical flow method is a concept in object motion detection that describes the apparent motion of an observed object, surface, or edge caused by relative motion between the object and the observer. It is widely used in pattern recognition, computer vision, and other image-processing fields, for motion detection, object segmentation, time-to-collision and object-expansion calculation, motion-compensated coding, and stereo measurement through object surfaces and edges.
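As an illustrative sketch of the optical flow idea (not the patent's implementation), a minimal Lucas-Kanade-style estimator recovers a single global translation between two smooth grayscale frames by solving the brightness-constancy equations in least squares; all names and the synthetic data are assumptions for demonstration:

```python
import numpy as np

def lucas_kanade_translation(prev, curr):
    """Estimate one global (dx, dy) translation between two grayscale
    frames from the brightness-constancy constraint Ix*dx + Iy*dy = -It,
    solved in least squares over all pixels."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    Ix = np.gradient(prev, axis=1)   # spatial gradient along x (columns)
    Iy = np.gradient(prev, axis=0)   # spatial gradient along y (rows)
    It = curr - prev                 # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    (dx, dy), *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return dx, dy

# synthetic pair: a smooth blob shifted one pixel to the right
xs = np.arange(32)
X, Y = np.meshgrid(xs, xs)
frame1 = np.exp(-((X - 15) ** 2 + (Y - 15) ** 2) / 20.0)
frame2 = np.exp(-((X - 16) ** 2 + (Y - 15) ** 2) / 20.0)
dx, dy = lucas_kanade_translation(frame1, frame2)
```

A production system would compute flow per window or use a pyramidal variant rather than one global translation.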
Or, the method of inter-frame difference can also be used for analyzing the running images of a plurality of continuous frames of vehicles in the video stream data so as to determine the running track of each vehicle in the video stream data.
The inter-frame difference method obtains the outline of a moving target by performing a difference operation on two adjacent frames in a video image sequence, and adapts well to scenes containing multiple moving targets or a moving camera. When an object moves in the monitored scene, adjacent frames differ noticeably; subtracting the two frames yields the absolute value of their brightness difference, and judging whether this value exceeds a threshold makes it possible to analyze the motion characteristics of the video or image sequence and determine whether object motion is present.
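A minimal inter-frame difference sketch (illustrative, not the patent's code): subtract adjacent frames, take the absolute luminance difference, and threshold it into a binary motion mask:

```python
import numpy as np

def motion_mask(frame_a, frame_b, thresh=25):
    """Binary motion mask from the absolute difference of two frames."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return diff > thresh

# synthetic frames: a bright 4x4 'vehicle' patch moves 3 px to the right
f1 = np.zeros((20, 20), dtype=np.uint8)
f1[8:12, 4:8] = 200
f2 = np.zeros((20, 20), dtype=np.uint8)
f2[8:12, 7:11] = 200
mask = motion_mask(f1, f2)
```

Only the non-overlapping parts of the patch light up in the mask, which is exactly the moving-object outline the method describes.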
S103: and determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track.
The reference tracks are typical vehicle driving tracks, one per driving direction. For example, if the signalized intersection in the video stream data is a four-way intersection, there are twelve reference trajectories: a left turn, going straight, and a right turn for each of the four approach directions.
Optionally, a reference data set corresponding to an intersection associated with the video stream data may be obtained first, where the reference data set includes driving tracks and driving directions of a plurality of vehicles at the intersection; and then determining a reference track corresponding to each driving direction in the intersection according to the driving tracks of the vehicles with the same driving direction at the intersection.
For example, the driving tracks of the vehicles in the same driving direction at the intersection may be subjected to track clustering to determine a reference track corresponding to each driving direction at the intersection. Or, model training may be performed on the traveling tracks of the vehicles in the same traveling direction at the intersection to determine a reference track corresponding to each traveling direction at the intersection.
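One crude way to aggregate same-direction tracks into a reference track, offered as a sketch under the assumption that a simple resample-and-average is an acceptable stand-in for the clustering or model training mentioned above (`reference_track` and its parameters are illustrative names):

```python
import numpy as np

def reference_track(tracks, n_points=20):
    """Resample each (x, y) track to n_points and average them into a
    single representative track for one driving direction."""
    resampled = []
    for t in tracks:
        t = np.asarray(t, dtype=float)
        src = np.arange(len(t))
        dst = np.linspace(0, len(t) - 1, n_points)
        xs = np.interp(dst, src, t[:, 0])
        ys = np.interp(dst, src, t[:, 1])
        resampled.append(np.stack([xs, ys], axis=1))
    return np.mean(resampled, axis=0)

# two nearly parallel straight-ahead tracks observed at the intersection
ref = reference_track([[(0, 0), (5, 0), (10, 0)],
                       [(0, 2), (5, 2), (10, 2)]])
```

A real pipeline would parameterize by arc length and use a proper trajectory-clustering algorithm so outlier tracks do not skew the average.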
Optionally, because different driving directions in an intersection have different numbers of lanes, and because the speed limits imposed on vehicles differ across the road sections where intersections are located, the driving tracks of vehicles may also differ. Therefore, the present disclosure may also determine the type of the road section to which the intersection associated with the video stream data belongs, together with the number of lanes corresponding to each driving direction in the intersection, and then determine the reference track corresponding to each driving direction from the road-section type and the per-direction lane counts.
The road-section type of the intersection associated with the video stream data characterizes the type of road on which the intersection is located, such as an arterial road, an expressway, or a secondary road. Since the vehicle speed limit differs across road types, the reference track corresponding to each driving direction can be determined from the type of the road section to which the intersection belongs together with the number of lanes in each driving direction; the present disclosure does not limit this.
Alternatively, the matching degree of the travel track of each vehicle with each reference track may be calculated using a euclidean distance formula, a hausdorff distance formula, or the like, and the disclosure is not limited thereto.
The driving direction of a vehicle is the direction corresponding to the reference track with the highest matching degree to that vehicle's driving track.
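To make the matching step concrete, here is a small sketch using the symmetric Hausdorff distance mentioned above; the reference-track names are illustrative:

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two trajectories (point sets)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # pairwise distances, then the worst-case nearest-neighbour gap each way
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def classify_direction(track, references):
    """Return the driving direction whose reference track lies closest."""
    return min(references, key=lambda name: hausdorff(track, references[name]))

references = {
    "straight": [(0, 0), (0, 5), (0, 10)],
    "right_turn": [(0, 0), (1, 4), (5, 6), (10, 6)],
}
direction = classify_direction([(0.5, 0), (0.4, 5), (0.6, 10)], references)
```

Swapping `hausdorff` for a Euclidean point-to-point distance changes only the distance function, not the argmin structure.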
S104: and determining the traffic flow in each direction within a preset time period according to the driving direction of each vehicle.
Optionally, vehicles in the same driving direction within a preset time period may be classified to count the traffic flow in each direction within the preset time period. Or, every time the driving direction of one vehicle is obtained, adding one to the traffic flow corresponding to the driving direction until the traffic flow in each direction in the preset time period is obtained.
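The counting step itself reduces to a tally over the classified directions; a minimal sketch covering both variants (grouping after the fact, or incrementing as each vehicle is classified, which `Counter` supports equally):

```python
from collections import Counter

def flow_by_direction(directions):
    """Count vehicles per driving direction within the preset time period."""
    return Counter(directions)

flows = flow_by_direction(
    ["left", "straight", "left", "right", "straight", "straight"]
)
```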
In the embodiment of the disclosure, traffic video stream data in a preset time period is obtained first, then the video stream data is analyzed to determine a running track of each vehicle in the video stream data, the running direction of each vehicle is determined according to the matching degree of the running track of each vehicle and each reference track, and finally the traffic flow of each direction of an intersection in the preset time period is determined according to the running direction of each vehicle. Therefore, the driving direction of each vehicle is determined based on the reference track corresponding to each driving direction, and the traffic flow of each direction is further determined, so that the traffic flow of each direction can be more accurately acquired.
As can be seen from the above analysis, in the present disclosure, the driving direction of each vehicle can be determined based on the reference trajectory corresponding to each driving direction, and thus the traffic flow in each direction can be determined. In a possible implementation manner, the driving track of the vehicle may be obtained according to the color histogram, the detection frame, and the position of the vehicle, the traffic flow in each direction may be determined based on the matching degree between the driving track of the vehicle and each reference track, and the signal lamp may be further controlled according to the traffic flow. The above process is described in detail below with reference to fig. 2.
Fig. 2 is a schematic flow chart of a traffic flow statistical method according to yet another embodiment of the present disclosure.
As shown in fig. 2, the traffic flow statistical method includes:
s201: and acquiring traffic video stream data in a preset time period.
The specific implementation form of step S201 may refer to detailed descriptions of other embodiments in the present disclosure, and is not described herein again.
S202: determining a first vehicle contained in an ith frame of image in the video stream data and a first color histogram, a first detection frame and a first position corresponding to the first vehicle, and a second color histogram, a second detection frame and a second position corresponding to a second vehicle contained in an (i + 1) th frame of image and the second vehicle, wherein i is a natural number.
Optionally, the target detection network may be used to detect each frame of image in the video stream data to obtain a detection frame of the vehicle and a position of the vehicle in each frame of image, and extract a color of the vehicle in the detection frame to obtain a color histogram of each vehicle.
The target detection network may be a Convolutional Neural Network (CNN), a region-based Convolutional Neural Network (R-CNN), or the like, which is not limited in this disclosure.
It is understood that the number of the first vehicles in the ith frame image may be one, or may be multiple, and likewise, the number of the second vehicles included in the (i + 1) th frame image may be one, or may be multiple, which is not limited by the disclosure.
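A sketch of the per-detection color-histogram step (illustrative; the patent does not fix the bin count or normalization): crop the detection box from the frame and build an L1-normalized per-channel histogram so boxes of different sizes remain comparable:

```python
import numpy as np

def box_color_histogram(frame, box, bins=8):
    """Histogram of the pixels inside a detection box (x, y, w, h),
    concatenated over the 3 color channels and L1-normalised."""
    x, y, w, h = box
    crop = frame[y:y + h, x:x + w]
    hist = np.concatenate([
        np.histogram(crop[..., c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ]).astype(float)
    return hist / hist.sum()

# a uniformly red 'vehicle' crop inside a black frame
frame = np.zeros((40, 40, 3), dtype=np.uint8)
frame[10:20, 10:20, 0] = 255
hist = box_color_histogram(frame, (10, 10, 10, 10))
```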
S203: and determining a first matching degree between the first color histogram and the second color histogram, a second matching degree between the first detection frame and the second detection frame, and a third matching degree between the first position and the second position.
Optionally, the first matching degree, the second matching degree, and the third matching degree may be obtained by using the following formulas:
d_appearance(i, j) = 1 - cosine(C_i, C_j)    (1)
d_motion(i, j) = exp{-ω_1 · [((X_i - X_j)/W_i)^2 + ((Y_i - Y_j)/H_i)^2]}    (2)
d_shape(i, j) = exp{-ω_2 · [|H_i - H_j|/(H_i + H_j) + |W_i - W_j|/(W_i + W_j)]}    (3)
where formula (1) computes the first matching degree, formula (2) computes the second matching degree, and formula (3) computes the third matching degree; d_appearance(i, j) is the first matching degree, d_motion(i, j) is the second matching degree, and d_shape(i, j) is the third matching degree; i denotes the first vehicle in the i-th frame image and j the second vehicle in the (i + 1)-th frame image; cosine denotes the cosine similarity; C_i and C_j are the vectors corresponding to the first and second color histograms; X_i and Y_i are the abscissa and ordinate of the first position, and X_j and Y_j the abscissa and ordinate of the second position; W_i and H_i are the length and width of the first detection frame, and W_j and H_j the length and width of the second detection frame; ω_1 and ω_2 are reference coefficients.
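The three cues can be sketched in code as follows. The exponential forms and the default weights `w1`/`w2` are one plausible reading of formulas (2) and (3), whose original images are not reproduced in this text, so treat them as assumptions rather than the patent's exact expressions:

```python
import numpy as np

def appearance_distance(ci, cj):
    """Formula (1): 1 - cosine similarity of the two color-histogram vectors."""
    ci = np.asarray(ci, dtype=float)
    cj = np.asarray(cj, dtype=float)
    return 1.0 - ci.dot(cj) / (np.linalg.norm(ci) * np.linalg.norm(cj))

def motion_match(pos_i, pos_j, size_i, w1=0.5):
    """Motion cue: position offset normalised by the first box's size."""
    (xi, yi), (xj, yj) = pos_i, pos_j
    wi, hi = size_i
    return float(np.exp(-w1 * (((xi - xj) / wi) ** 2 + ((yi - yj) / hi) ** 2)))

def shape_match(size_i, size_j, w2=1.5):
    """Shape cue: relative width/height difference of the two boxes."""
    wi, hi = size_i
    wj, hj = size_j
    return float(np.exp(-w2 * (abs(hi - hj) / (hi + hj)
                               + abs(wi - wj) / (wi + wj))))
```

Identical detections score 0 appearance distance and 1.0 on the motion and shape cues; any offset or size change decays the latter two toward 0.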
Or, the first cosine similarity between the first color histogram and the second color histogram, the second cosine similarity between the first detection frame and the second detection frame, and the third cosine similarity between the first position and the second position may be calculated, and then the first cosine similarity is used as the first matching degree, the second cosine similarity is used as the second matching degree, and the third cosine similarity is used as the third matching degree.
S204: and determining a fourth matching degree between the first vehicle and the second vehicle according to the first matching degree, the second matching degree and the third matching degree.
Optionally, an average of the first matching degree, the second matching degree, and the third matching degree may be used as a fourth matching degree of the first vehicle and the second vehicle. Or, the first matching degree, the second matching degree and the third matching degree are assigned with the same weight, and the weighted sum of the first matching degree, the second matching degree and the third matching degree is used as the fourth matching degree.
Optionally, different ways of fusing the first, second, and third matching degrees yield different fourth matching degrees. Therefore, the current matching degree fusion mode may be determined first, and the first, second, and third matching degrees then fused on that basis to obtain the fourth matching degree.
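The fusion step is then a weighted combination whose weight triple plays the role of the "current matching degree fusion mode"; a minimal sketch with illustrative weights:

```python
def fuse_matches(m1, m2, m3, weights=(1.0, 1.0, 1.0)):
    """Weighted average of the appearance, detection-frame and position
    matching degrees; changing the weight triple changes the fusion mode
    (e.g. up-weight m2/m3 on roads where adjacent-frame motion is small)."""
    w1, w2, w3 = weights
    return (w1 * m1 + w2 * m2 + w3 * m3) / (w1 + w2 + w3)
```

With equal weights this reduces to the plain average described above; unequal weights implement the road-type- or box-area-dependent modes.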
Optionally, because vehicle speeds differ across road types, the degree to which the position, shape, and other attributes of the same vehicle differ between two consecutive frames also varies. The present disclosure may therefore determine the current matching degree fusion mode according to the type of the road section to which the intersection associated with the video stream data belongs.
For example, if the road section to which the intersection belongs has many lanes, the position of the same vehicle changes little between adjacent frames and the shapes of its detection frames differ little, so the weights of the second matching degree (between the first and second detection frames) and the third matching degree (between the first and second positions) can be appropriately increased. Conversely, since different vehicles may share the same color, the weight of the matching degree determined from the color histogram can be appropriately reduced.
And/or, since the matching reliability determined based on the color histogram and the shape is different when the areas of the vehicles in the images are different, the present disclosure may also determine the current matching degree fusion mode according to the area of the first detection frame in the ith frame image.
For example, if the area of the first detection frame in the i-th frame image is large, the color histogram of the first vehicle can represent its color information accurately and completely, so the first matching degree determined from the color histogram can be given a higher weight; this disclosure does not limit the weighting.
And/or, the current matching degree fusion mode can be determined according to the area of the second detection frame in the (i + 1) th frame image.
It should be noted that the above examples are merely illustrative and are not intended to limit how the first, second, and third matching degrees are weighted in the present disclosure.
S205: and under the condition that the fourth matching degree meets the set condition, determining the driving track of the vehicle in the time period corresponding to the ith frame image and the (i + 1) th frame image according to the second position and the first position.
Further, after the driving track of the vehicle in the time period corresponding to the ith frame image and the (i + 1) th frame image is determined, it is indicated that the vehicle is a tracked vehicle and has a corresponding driving track.
It should be noted that, for vehicle pairs whose fourth matching degree does not satisfy the set condition, matching may be attempted again with another method, for example a cross-comparison matching method, so as to determine the driving track of each vehicle between the i-th and (i + 1)-th frame images.
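For the fallback comparison, "cross-comparison matching" plausibly refers to intersection-over-union (IoU) of the detection frames; a sketch under that assumption:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) detection boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # overlap extents along each axis, clipped at zero
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0
```

Pairs whose IoU clears a threshold can then be linked into a track even when the fused appearance/motion/shape score missed the set condition.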
S206: and determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track.
S207: and determining the traffic flow in each direction within a preset time period according to the driving direction of each vehicle.
S208: and controlling signal lamps of the intersection related to the video stream data according to the traffic flow in each direction.
It can be understood that if the traffic flow in a certain direction is small, the green time in that direction can be shortened and the red time increased; conversely, if the traffic flow in a direction is large, its green time needs to be increased and its red time shortened.
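A hypothetical sketch of this control step (the patent does not specify a timing algorithm, so the proportional scheme and all names here are assumptions): split one cycle's green time across directions in proportion to the measured flow, subject to a minimum green:

```python
def green_split(flows, cycle=120.0, min_green=10.0):
    """Allocate green time per direction proportionally to its flow,
    after reserving a minimum green for every direction."""
    total = sum(flows.values())
    spare = cycle - min_green * len(flows)
    if total == 0:
        # no traffic observed: fall back to an even split
        return {d: cycle / len(flows) for d in flows}
    return {d: min_green + spare * f / total for d, f in flows.items()}

plan = green_split({"north_south": 30, "east_west": 10}, cycle=90.0)
```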
In the embodiment of the disclosure, the driving track of a vehicle is determined through the color histogram feature, the detection frame feature and the position feature of the vehicle in video stream data, then the driving direction of each vehicle is determined according to the matching degree of the driving track of each vehicle and each reference track, then the traffic flow in each direction in a preset time period is determined according to the driving direction of each vehicle, and finally the signal lamp is controlled according to the traffic flow in each direction. Therefore, the vehicle is tracked based on the color histogram feature, the detection frame feature and the position feature of the vehicle to determine the running track of the vehicle, and then the traffic flow in each direction is determined based on the matching degree of the determined running track and each reference track, so that the signal lamp can be accurately controlled, the traffic efficiency of the vehicle is further improved, and the condition of road congestion is avoided.
Fig. 3 is a schematic structural diagram of a traffic flow statistics apparatus according to an embodiment of the present disclosure.
As shown in fig. 3, the traffic flow statistics apparatus 300 includes: a first obtaining module 310, a first determining module 320, a second determining module 330, and a third determining module 340.
The first acquisition module is used for acquiring traffic video stream data in a preset time period.
The first determining module is used for analyzing the video stream data to determine the driving track of each vehicle in the video stream data.
And the second determining module is used for determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track.
And the third determining module is used for determining the traffic flow of each direction of the intersection in a preset time period according to the driving direction of each vehicle.
It should be noted that the explanation of the traffic flow statistics method is also applicable to the traffic flow statistics device of the present embodiment, and is not repeated herein.
The traffic flow counting device in the embodiment of the disclosure firstly obtains traffic video stream data in a preset time period, then analyzes the video stream data to determine a running track of each vehicle in the video stream data, determines a running direction of each vehicle according to the matching degree of the running track of each vehicle and each reference track, and finally determines the traffic flow in each direction in the preset time period according to the running direction of each vehicle. Therefore, the driving direction of each vehicle is determined based on the reference track corresponding to each driving direction, and the traffic flow of each direction is further determined, so that the traffic flow of each direction can be accurately acquired.
In some embodiments of the present disclosure, as shown in fig. 4, fig. 4 is a schematic structural diagram of a traffic flow statistics apparatus according to an embodiment of the present disclosure, where the traffic flow statistics apparatus 400 includes: a first obtaining module 410, a first determining module 420, a second determining module 430, a third determining module 440, a control module 450, a second obtaining module 460, and a third obtaining module 470.
In a possible implementation manner, the second obtaining module 460 is specifically configured to:
acquiring a reference data set corresponding to an intersection related to video stream data, wherein the reference data set comprises running tracks and running directions of a plurality of vehicles in the intersection;
and determining a reference track corresponding to each driving direction in the intersection according to the driving tracks of the vehicles with the same driving direction at the intersection.
In a possible implementation manner, the third obtaining module 470 is specifically configured to:
determining the type of a road section to which the intersection belongs and the number of lanes corresponding to each driving direction in the intersection, wherein the intersection is associated with the video stream data;
and determining a reference track corresponding to each driving direction in the intersection according to the type of the road section and the number of lanes corresponding to each driving direction.
In one possible implementation, the first determining module 420 includes:
a first determining unit 4201, configured to determine a first vehicle included in an i-th frame image in the video stream data and a first color histogram, a first detection frame, and a first position corresponding to the first vehicle, and a second vehicle included in an (i + 1)-th frame image and a second color histogram, a second detection frame, and a second position corresponding to the second vehicle, where i is a natural number;
a second determining unit 4202, configured to determine a first matching degree between the first color histogram and the second color histogram, a second matching degree between the first detection frame and the second detection frame, and a third matching degree between the first position and the second position;
a third determining unit 4203, configured to determine a fourth matching degree between the first vehicle and the second vehicle according to the first matching degree, the second matching degree, and the third matching degree;
a fourth determining unit 4204, configured to determine, when the fourth matching degree satisfies the setting condition, a driving trajectory of the vehicle in a period corresponding to the i-th frame image and the i + 1-th frame image according to the second position and the first position.
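The four matching degrees handled by units 4201-4204 can be illustrated as follows. The concrete similarity measures (histogram intersection, detection-frame IoU, a distance-based position score) and the fusion weights are assumed stand-ins, since the disclosure leaves the measures and the fusion mode open:

```python
import numpy as np

def histogram_match(h1, h2):
    # First matching degree: histogram intersection of the two color
    # histograms, normalized so identical histograms score 1.0.
    h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
    return np.minimum(h1, h2).sum() / max(h1.sum(), 1e-9)

def box_match(b1, b2):
    # Second matching degree: intersection-over-union of the detection
    # frames, each given as (x1, y1, x2, y2).
    xa, ya = max(b1[0], b2[0]), max(b1[1], b2[1])
    xb, yb = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area(b1) + area(b2) - inter)

def position_match(p1, p2, scale=100.0):
    # Third matching degree: center distance mapped to (0, 1]; `scale`
    # is an assumed pixel scale, not specified by the disclosure.
    d = ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5
    return 1.0 / (1.0 + d / scale)

def fused_match(m1, m2, m3, weights=(0.4, 0.3, 0.3)):
    # Fourth matching degree: one possible fusion mode, a weighted sum.
    return weights[0] * m1 + weights[1] * m2 + weights[2] * m3
```

If the fused score exceeds a set threshold, the first and second vehicles are treated as the same vehicle and their positions extend one driving track.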
In a possible implementation manner, the third determining unit 4203 is specifically configured to:
determining a current matching degree fusion mode;
and fusing the first matching degree, the second matching degree and the third matching degree based on the current matching degree fusion mode to determine a fourth matching degree.
In a possible implementation manner, the third determining unit 4203 is further configured to:
determining a current matching degree fusion mode according to the type of the road section to which the intersection associated with the video stream data belongs;
and/or,
determining a current matching degree fusion mode according to the area of the first detection frame in the ith frame image;
and/or,
and determining the current matching degree fusion mode according to the area of the second detection frame in the (i + 1) th frame image.
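A hypothetical selection rule of the kind described above, keyed on detection-frame area, might look as follows; the threshold and the weight tuples are invented for illustration only (for a small, distant vehicle the color histogram is less reliable, so weight shifts toward the position matching degree):

```python
def choose_fusion_weights(box_area, image_area, small_ratio=0.01):
    # Returns (histogram, box, position) weights for the fused matching
    # degree. Threshold and weights are assumed, not from the disclosure.
    if box_area / image_area < small_ratio:
        return (0.2, 0.3, 0.5)  # small detection frame: trust position more
    return (0.4, 0.3, 0.3)
```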
In a possible implementation manner, the control module 450 is specifically configured to:
and controlling signal lamps of the intersection related to the video stream data according to the traffic flow in each direction.
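As one illustrative control policy (the disclosure does not specify the control law), the per-direction flows could be turned into green-time allocations by splitting a fixed signal cycle in proportion to flow, with a guaranteed minimum green per direction:

```python
def allocate_green_time(flow_by_direction, cycle_s=120, min_green_s=10):
    """flow_by_direction: dict mapping direction -> vehicle count for the
    preset time period. Returns green seconds per direction; cycle length
    and minimum green are assumed example parameters."""
    total = sum(flow_by_direction.values())
    n = len(flow_by_direction)
    spare = cycle_s - n * min_green_s  # time left after minimum greens
    return {
        d: min_green_s + (spare * f / total if total else spare / n)
        for d, f in flow_by_direction.items()
    }
```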
It is understood that the traffic flow statistics apparatus 400 in fig. 4 of the present embodiment may have the same functions and structures as the traffic flow statistics apparatus 300 in the foregoing embodiment; likewise, the first obtaining module 410 may correspond to the first obtaining module 310, the first determining module 420 to the first determining module 320, the second determining module 430 to the second determining module 330, and the third determining module 440 to the third determining module 340 in the foregoing embodiment.
It should be noted that the explanation of the traffic flow statistics method is also applicable to the traffic flow statistics device of the present embodiment, and is not repeated herein.
The traffic flow counting device in the embodiment of the disclosure determines the driving tracks of vehicles through the color histogram features, the detection frame features and the position features of the vehicles in the video stream data, determines the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track, determines the traffic flow in each direction within a preset time period according to the driving direction of each vehicle, and controls the signal lamps according to the traffic flow in each direction. Therefore, the vehicle is tracked based on the color histogram feature, the detection frame feature and the position feature of the vehicle to determine the running track of the vehicle, and then the traffic flow in each direction is determined based on the matching degree of the determined running track and each reference track, so that the signal lamp can be accurately controlled, the traffic efficiency of the vehicle is improved, and the condition of road congestion is avoided.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
FIG. 5 illustrates a schematic block diagram of an example electronic device 500 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 5, the device 500 includes a computing unit 501 which may perform various appropriate actions and processes in accordance with a computer program stored in a Read Only Memory (ROM) 502 or a computer program loaded from a storage unit 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the device 500 can also be stored. The computing unit 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
A number of components in the device 500 are connected to the I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, or the like; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508, such as a magnetic disk, optical disk, or the like; and a communication unit 509 such as a network card, modem, wireless communication transceiver, etc. The communication unit 509 allows the device 500 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 501 may be a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 501 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 501 executes the respective methods and processes described above, such as the traffic flow statistics method. For example, in some embodiments, the traffic flow statistics method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into the RAM 503 and executed by the computing unit 501, one or more steps of the traffic flow statistics method described above may be performed. Alternatively, in other embodiments, the computing unit 501 may be configured to perform the traffic flow statistics method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special- or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), the internet, and blockchain networks.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and addresses the drawbacks of difficult management and weak service scalability in traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server incorporating a blockchain.
In the embodiment of the disclosure, traffic video stream data in a preset time period is obtained first, then the video stream data is analyzed to determine a driving track of each vehicle in the video stream data, a driving direction of each vehicle is determined according to the matching degree of the driving track of each vehicle and each reference track, and finally, a traffic flow in each direction in the preset time period is determined according to the driving direction of each vehicle. Therefore, the driving direction of each vehicle is determined based on the reference track corresponding to each driving direction, and the traffic flow of each direction is further determined, so that the traffic flow of each direction can be more accurately acquired.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (17)

1. A traffic flow statistical method comprises the following steps:
acquiring traffic video stream data in a preset time period;
analyzing the video stream data to determine a driving track of each vehicle in the video stream data;
determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track;
and determining the traffic flow of each direction in the preset time period according to the driving direction of each vehicle.
2. The method of claim 1, wherein before determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track, further comprising:
acquiring a reference data set corresponding to an intersection associated with the video stream data, wherein the reference data set comprises driving tracks and driving directions of a plurality of vehicles in the intersection;
and determining a reference track corresponding to each driving direction in the intersection according to the driving tracks of the vehicles with the same driving direction at the intersection.
3. The method of claim 1, wherein before determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track, further comprising:
determining the type of a road section to which the intersection belongs and the number of lanes corresponding to each driving direction in the intersection, wherein the intersection is associated with the video stream data;
and determining a reference track corresponding to each driving direction in the intersection according to the type of the road section and the number of lanes corresponding to each driving direction.
4. The method of claim 1, wherein said parsing the video stream data to determine a driving trajectory for each vehicle in the video stream data comprises:
determining a first vehicle contained in an ith frame of image in the video stream data and a first color histogram, a first detection frame and a first position corresponding to the first vehicle, and a second vehicle contained in an (i + 1) th frame of image and a second color histogram, a second detection frame and a second position corresponding to the second vehicle, wherein i is a natural number;
determining a first matching degree between the first color histogram and the second color histogram, a second matching degree between the first detection frame and the second detection frame, and a third matching degree between the first position and the second position;
determining a fourth matching degree between the first vehicle and the second vehicle according to the first matching degree, the second matching degree and the third matching degree;
and under the condition that the fourth matching degree meets a set condition, determining the driving track of the vehicle in the time period corresponding to the ith frame image and the (i + 1) th frame image according to the second position and the first position.
5. The method of claim 4, wherein said determining a fourth degree of match between the first vehicle and the second vehicle based on the first degree of match, the second degree of match, and the third degree of match comprises:
determining a current matching degree fusion mode;
and fusing the first matching degree, the second matching degree and the third matching degree based on the current matching degree fusion mode to determine the fourth matching degree.
6. The method of claim 5, wherein the determining a current matching degree fusion pattern comprises:
determining the current matching degree fusion mode according to the type of the road section to which the intersection associated with the video stream data belongs;
and/or,
determining the current matching degree fusion mode according to the area of the first detection frame in the ith frame image;
and/or,
and determining the current matching degree fusion mode according to the area of the second detection frame in the (i + 1) th frame image.
7. The method of any of claims 1-6, wherein after said determining the traffic flow in each direction in said preset time period, further comprising:
and controlling the signal lamps of the intersections related to the video stream data according to the traffic flow in each direction.
8. A traffic flow statistics apparatus comprising:
the first acquisition module is used for acquiring traffic video stream data in a preset time period;
the first determining module is used for analyzing the video stream data to determine the running track of each vehicle in the video stream data;
the second determining module is used for determining the driving direction of each vehicle according to the matching degree of the driving track of each vehicle and each reference track;
and the third determining module is used for determining the traffic flow in each direction in the preset time period according to the driving direction of each vehicle.
9. The apparatus of claim 8, further comprising:
the second acquisition module is used for acquiring a reference data set corresponding to the intersection associated with the video stream data, wherein the reference data set comprises the driving tracks and the driving directions of a plurality of vehicles in the intersection;
the second obtaining module is further configured to determine a reference track corresponding to each driving direction in the intersection according to the driving tracks of the vehicles in the same driving direction at the intersection.
10. The apparatus of claim 8, further comprising:
the third acquisition module is used for determining the type of the road section to which the intersection belongs and the number of lanes corresponding to each driving direction in the intersection, wherein the intersection is associated with the video stream data;
the third obtaining module is further configured to determine a reference track corresponding to each driving direction in the intersection according to the type of the road segment and the number of lanes corresponding to each driving direction.
11. The apparatus of claim 8, wherein the first determining means comprises:
a first determining unit, configured to determine a first vehicle included in an ith frame image of the video stream data and a first color histogram, a first detection frame, and a first position corresponding to the first vehicle, and a second vehicle included in an (i + 1)th frame image and a second color histogram, a second detection frame, and a second position corresponding to the second vehicle, where i is a natural number;
a second determining unit, configured to determine a first matching degree between the first color histogram and the second color histogram, a second matching degree between the first detection frame and the second detection frame, and a third matching degree between the first position and the second position;
a third determining unit, configured to determine a fourth matching degree between the first vehicle and the second vehicle according to the first matching degree, the second matching degree, and the third matching degree;
and a fourth determining unit, configured to determine, according to the second position and the first position, a travel track of the vehicle in a time period corresponding to the i-th frame image and the i + 1-th frame image when the fourth matching degree satisfies a set condition.
12. The apparatus of claim 11, wherein the third determining unit is specifically configured to:
determining a current matching degree fusion mode;
and fusing the first matching degree, the second matching degree and the third matching degree based on the current matching degree fusion mode to determine the fourth matching degree.
13. The apparatus of claim 12, wherein the third determining unit is further configured to:
determining the current matching degree fusion mode according to the type of the road section to which the intersection associated with the video stream data belongs;
and/or,
determining the current matching degree fusion mode according to the area of the first detection frame in the ith frame image;
and/or,
and determining the current matching degree fusion mode according to the area of the second detection frame in the (i + 1) th frame image.
14. The apparatus of any of claims 8-13, further comprising:
and the control module is used for controlling signal lamps of the intersection related to the video stream data according to the traffic flow in each direction.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-7.
17. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-7.
CN202110732356.4A 2021-06-30 2021-06-30 Traffic flow statistical method and device, electronic equipment and storage medium Active CN113593219B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110732356.4A CN113593219B (en) 2021-06-30 2021-06-30 Traffic flow statistical method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113593219A true CN113593219A (en) 2021-11-02
CN113593219B CN113593219B (en) 2023-02-28

Family

ID=78245177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110732356.4A Active CN113593219B (en) 2021-06-30 2021-06-30 Traffic flow statistical method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113593219B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102208036A (en) * 2010-03-31 2011-10-05 爱信艾达株式会社 Vehicle position detection system
JP2018055597A (en) * 2016-09-30 2018-04-05 株式会社東芝 Vehicle type discrimination device and vehicle type discrimination method
CN108009494A (en) * 2017-11-30 2018-05-08 中山大学 A kind of intersection wireless vehicle tracking based on unmanned plane
CN109584558A (en) * 2018-12-17 2019-04-05 长安大学 A kind of traffic flow statistics method towards Optimization Control for Urban Traffic Signals
CN110335139A (en) * 2019-06-21 2019-10-15 深圳前海微众银行股份有限公司 Appraisal procedure, device, equipment and readable storage medium storing program for executing based on similarity
US20190325738A1 (en) * 2018-04-18 2019-10-24 Here Global B.V. Lane-level geometry and traffic information
CN110674723A (en) * 2019-09-19 2020-01-10 北京三快在线科技有限公司 Method and device for determining driving track of unmanned vehicle
EP3614225A1 (en) * 2018-08-23 2020-02-26 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for positioning autonomous vehicle
KR102144975B1 (en) * 2019-11-08 2020-08-14 주식회사 알체라 Machine learning system and method for operating machine learning system
CN111652912A (en) * 2020-06-10 2020-09-11 北京嘀嘀无限科技发展有限公司 Vehicle counting method and system, data processing equipment and intelligent shooting equipment
CN111932901A (en) * 2019-05-13 2020-11-13 阿里巴巴集团控股有限公司 Road vehicle tracking detection apparatus, method and storage medium
CN112309126A (en) * 2020-10-30 2021-02-02 杭州海康威视数字技术股份有限公司 License plate detection method and device, electronic equipment and computer readable storage medium
CN112597822A (en) * 2020-12-11 2021-04-02 国汽(北京)智能网联汽车研究院有限公司 Vehicle track determination method and device, electronic equipment and computer storage medium
CN112990124A (en) * 2021-04-26 2021-06-18 湖北亿咖通科技有限公司 Vehicle tracking method and device, electronic equipment and storage medium

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114093170A (en) * 2021-11-26 2022-02-25 阿波罗智联(北京)科技有限公司 Generation method, system and device of annunciator control scheme and electronic equipment
CN114093170B (en) * 2021-11-26 2023-01-24 阿波罗智联(北京)科技有限公司 Generation method, system and device of annunciator control scheme and electronic equipment
CN114582125A (en) * 2022-03-02 2022-06-03 北京百度网讯科技有限公司 Method, device, equipment and storage medium for identifying road traffic direction
CN114582125B (en) * 2022-03-02 2023-08-29 北京百度网讯科技有限公司 Method, device, equipment and storage medium for identifying road traffic direction
CN114822049A (en) * 2022-03-23 2022-07-29 山东省交通规划设计院集团有限公司 Vehicle flow direction monitoring and analyzing method and system
CN114495509A (en) * 2022-04-08 2022-05-13 四川九通智路科技有限公司 Method for monitoring tunnel running state based on deep neural network
CN114495509B (en) * 2022-04-08 2022-07-12 四川九通智路科技有限公司 Method for monitoring tunnel running state based on deep neural network
CN115547036A (en) * 2022-08-31 2022-12-30 北京罗克维尔斯科技有限公司 Track filtering method and device, electronic equipment, storage medium and vehicle

Also Published As

Publication number Publication date
CN113593219B (en) 2023-02-28

Similar Documents

Publication Publication Date Title
CN113593219B (en) Traffic flow statistical method and device, electronic equipment and storage medium
US11314973B2 (en) Lane line-based intelligent driving control method and apparatus, and electronic device
CN111709328B (en) Vehicle tracking method and device and electronic equipment
Zhang et al. High speed automatic power line detection and tracking for a UAV-based inspection
CN103246896B (en) A kind of real-time detection and tracking method of robustness vehicle
CN113221677B (en) Track abnormality detection method and device, road side equipment and cloud control platform
CN103065325B (en) A kind of method for tracking target based on the polymerization of color Distance geometry Iamge Segmentation
CN113012176B (en) Sample image processing method and device, electronic equipment and storage medium
Yang et al. Multiple object tracking with kernelized correlation filters in urban mixed traffic
CN112560580B (en) Obstacle recognition method, device, system, storage medium and electronic equipment
CN112528786A (en) Vehicle tracking method and device and electronic equipment
CN113221768A (en) Recognition model training method, recognition method, device, equipment and storage medium
CN115641359B (en) Method, device, electronic equipment and medium for determining movement track of object
CN113177968A (en) Target tracking method and device, electronic equipment and storage medium
CN113763427B (en) Multi-target tracking method based on coarse-to-fine shielding processing
CN112528927B (en) Confidence determining method based on track analysis, road side equipment and cloud control platform
CN111161325A (en) Three-dimensional multi-target tracking method based on Kalman filtering and LSTM
CN114037966A (en) High-precision map feature extraction method, device, medium and electronic equipment
CN113592903A (en) Vehicle track recognition method and device, electronic equipment and storage medium
CN115620081A (en) Training method of target detection model, target detection method and device
CN113705381A (en) Target detection method and device in foggy days, electronic equipment and storage medium
CN101877135B (en) Moving target detecting method based on background reconstruction
CN105279761B (en) A kind of background modeling method based on sample local density outlier detection
CN115953434A (en) Track matching method and device, electronic equipment and storage medium
CN111783613A (en) Anomaly detection method, model training method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant