CN111696360A - Parking monitoring method, system and camera - Google Patents


Info

Publication number
CN111696360A
Authority
CN
China
Prior art keywords
vehicle, image frame, parking, time, target
Legal status
Pending
Application number
CN201910199802.2A
Other languages
Chinese (zh)
Inventor
林常榕
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201910199802.2A
Publication of CN111696360A


Classifications

    • G PHYSICS; G08 SIGNALLING; G08G TRAFFIC CONTROL SYSTEMS; G08G1/00 Traffic control systems for road vehicles
    • G08G1/0175 Detecting movement of traffic to be counted or controlled, identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G08G1/14 Indicating individual free spaces in parking areas; G08G1/149 coupled to means for restricting the access to the parking space, e.g. authorization, access barriers, indicative lights


Abstract

The application provides a parking monitoring method, a parking monitoring system and a camera. The method includes: acquiring image frames captured by at least one camera of a target road section, the image frames containing images of the vehicles tracked by the camera within the target road section; when a first vehicle is detected from the image frames to enter a parking area of the target road section, recording the entry time of the first vehicle and the ID of the first vehicle; when a second vehicle is detected from the image frames to leave the parking area of the target road section, recording the leaving time of the second vehicle and the ID of the second vehicle; and, when the ID of the first vehicle is the same as the ID of the second vehicle, generating parking data based on the entry time and the leaving time. The embodiment avoids the trouble of collecting parking data manually and improves both the efficiency and the accuracy of data acquisition.

Description

Parking monitoring method, system and camera
Technical Field
The present application relates to the field of monitoring technologies, and in particular, to a parking monitoring method and system, and a camera.
Background
In recent years, with the rapid growth of China's economy, the number of private car owners has kept increasing. The growing traffic volume puts heavy pressure on parking: existing parking lots can satisfy neither regular nor temporary parking demand, so roadside parking schemes have been introduced. In current urban traffic management in China, however, roadside parking charging is not yet well established. The charging modes in use, parking meters and manual supervision, are neither accurate nor efficient. Manual supervision in particular leaves large financial loopholes and cash losses and is inconvenient to manage, and the toll collector interacts directly with the vehicle owner, which wastes the time of both parties.
Disclosure of Invention
In view of the above, the present application provides a parking monitoring method, system and camera.
Specifically, the method is realized through the following technical scheme:
in a first aspect, an embodiment of the present application provides a parking monitoring method, where the method includes:
acquiring image frames acquired by at least one camera of a target road section, wherein the image frames comprise images of vehicles in the target road section tracked by the camera;
recording an entry time of a first vehicle and an ID of the first vehicle when the first vehicle is detected from the image frame to enter a parking area of the target road section;
when a second vehicle is detected to leave the parking area of the target road section from the image frame, recording the leaving time of the second vehicle and the ID of the second vehicle;
generating parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
In one possible embodiment, the recording the entry time of the first vehicle and the ID of the first vehicle when the first vehicle is detected to enter the parking area of the target section from the image frame includes:
tracking the first vehicle based on an ID of the first vehicle when the first vehicle is detected to reach the target road segment from the image frames; when the first vehicle is detected to arrive at a parking area, recording the time when the first vehicle arrives at the parking area as the entering time;
associating the entry time with the ID of the first vehicle.
In one possible embodiment, before the tracking the first vehicle based on the ID of the first vehicle, the method further comprises:
obtaining vehicle characteristics of the first vehicle when the first vehicle reaches the target road section to obtain first vehicle characteristic information, wherein the first vehicle characteristic information comprises license plate information;
and generating the ID of the first vehicle based on the license plate information.
In one possible embodiment, after the generating the ID of the first vehicle based on the license plate information, the method further includes:
in the process of tracking the first vehicle, when the currently acquired ID of the first vehicle is inconsistent with the previously tracked ID of the first vehicle, comparing whether the currently acquired first vehicle characteristic information matches the previously tracked first vehicle characteristic information in all items other than the license plate information;
and if so, correcting the currently acquired ID of the first vehicle to the previously tracked ID of the first vehicle.
In one possible embodiment, after the recording of the time when the first vehicle arrives at the parking area as the entry time when the first vehicle arrives at the parking area, the method further includes:
stopping tracking of the first vehicle after determining that the first vehicle is parked in a parking area.
In one possible embodiment, the recording of the departure time of the second vehicle and the ID of the second vehicle when the second vehicle is detected to depart from the parking area of the target section from the image frame includes:
tracking the second vehicle based on the ID of the second vehicle when the second vehicle is detected to move within the parking area of the target road segment;
when the second vehicle is detected to leave the parking area, recording the time when the second vehicle leaves the parking area as the leaving time;
associating the departure time with the ID of the second vehicle.
In one possible embodiment, the generating parking data based on the entry time and the exit time includes:
extracting a first target image frame corresponding to the entry time from the captured image frames based on the ID of the first vehicle;
extracting a second target image frame corresponding to the departure time from the captured image frames based on the ID of the second vehicle;
taking the first target image frame and the second target image frame as the parking data; and/or a first composite image obtained by compositing the first target image frame and the second target image frame is used as parking data.
In one possible embodiment, the generating parking data based on the entry time and the exit time includes:
extracting a third target image frame from the captured image frames based on the ID of the first vehicle, the third target image frame including: before the entry time, an image frame when the first vehicle arrives at the target road section, and an image frame corresponding to the entry time;
extracting a fourth target image frame from the captured image frames based on the ID of the second vehicle, the fourth target image frame including: the image frame corresponding to the leaving time, and the image frame when the second vehicle leaves the target road section after the leaving time;
extracting an image frame of the first vehicle stopped in the parking area from the acquired image frames as a fifth target image frame;
taking the third target image frame, the fourth target image frame, and the fifth target image frame as the parking data; and/or taking a second composite image obtained by combining the third target image frame, the fourth target image frame and the fifth target image frame as parking data.
In a possible embodiment, after the step of generating parking data based on the entry time and the exit time, the method further comprises:
saving the parking data, associating the parking data with the ID of the first vehicle or the ID of the second vehicle when saving the parking data;
displaying the parking data and/or transmitting the parking data to an external device.
In a second aspect, an embodiment of the present application provides a camera, including:
an image acquisition module to acquire image frames, wherein the image frames include images of vehicles within the target road segment tracked by the camera;
the first information recording module is used for recording the entering time of the first vehicle and the ID of the first vehicle when the first vehicle is detected to enter the parking area of the target road section from the image frame;
the second information recording module is used for recording the leaving time of a second vehicle and the ID of the second vehicle when the second vehicle is detected to leave the parking area of the target road section from the image frame;
a parking data generation module to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
In a third aspect, an embodiment of the present application provides a parking monitoring system, where the system includes: a camera erected on a target road section, and a processing platform;
the camera, comprising:
an image acquisition module to acquire image frames, wherein the image frames include images of vehicles within the target road segment tracked by the camera;
the first information recording module is used for recording the entering time of the first vehicle and the ID of the first vehicle when the first vehicle is detected to enter the parking area of the target road section from the image frame;
the second information recording module is used for recording the leaving time of a second vehicle and the ID of the second vehicle when the second vehicle is detected to leave the parking area of the target road section from the image frame;
the processing platform comprises:
and a parking data generation module for acquiring an entry time of a first vehicle and an ID of the first vehicle and an exit time of a second vehicle and an ID of the second vehicle from the camera, and generating parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
In a fourth aspect, an embodiment of the present application provides a parking monitoring system, where the system includes: a camera erected on a target road section, and a processing platform;
the camera includes:
the image acquisition module is used for acquiring image frames and sending the image frames to a processing platform, wherein the image frames comprise images of vehicles in the target road section tracked by the camera;
the processing platform comprises:
the first information recording module is used for recording the entering time of the first vehicle and the ID of the first vehicle when the first vehicle is detected to enter the parking area of the target road section from the image frame;
the second information recording module is used for recording the leaving time of a second vehicle and the ID of the second vehicle when the second vehicle is detected to leave the parking area of the target road section from the image frame;
a parking data generation module to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
The embodiment of the application has the following beneficial effects:
In this embodiment, from the acquired image frames captured by at least one camera of the target road section, the entry time of a first vehicle and the ID of the first vehicle can be recorded when the first vehicle is detected to enter the parking area of the target road section, and the exit time of a second vehicle and the ID of the second vehicle can be recorded when the second vehicle is detected to exit the parking area of the target road section; then, when the ID of the first vehicle is the same as the ID of the second vehicle, parking data is generated automatically based on the corresponding entry time and exit time. This avoids the trouble of collecting parking data manually and improves both the efficiency and the accuracy of data acquisition.
Drawings
FIG. 1 is a flow chart illustrating the steps of one embodiment of a parking monitoring method in accordance with an exemplary embodiment;
FIG. 2 is a schematic view of a camera deployment shown in an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram illustrating trigger line identification in accordance with an exemplary embodiment of the present application;
FIG. 4 is a hardware block diagram of the camera of the present application;
FIG. 5 is a block diagram illustrating the structure of one embodiment of a camera according to an exemplary embodiment of the present application;
FIG. 6 is a block diagram illustrating an exemplary embodiment of a parking monitoring system according to the present application;
FIG. 7 is a block diagram illustrating the structure of an embodiment of a parking monitoring system according to another exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a parking monitoring method according to an exemplary embodiment of the present application is shown, where the embodiment of the present application may include the following steps:
Step 101: acquiring image frames captured by at least one camera of a target road section.
The image frames include images of the vehicles that the camera tracks within the target road section.
In one possible embodiment, a single camera may be mounted at either end of the target road section, or a camera may be mounted at each end of the target road section. As an example, the target road section may be a parking toll section that includes a parking area, with one end of the section serving as the entrance and the other end as the exit; the cameras may then include a camera installed at the entrance and a camera installed at the exit of the parking toll section. For example, as shown in fig. 2, the first camera is installed at the entrance of the parking toll section and the second camera at its exit; the two cameras face in opposite directions and jointly monitor the roadside parking toll section.
In a possible embodiment, the acquiring the image frames captured by the at least one camera of the target road segment includes:
image frames acquired by a camera at an entrance of a parking toll road section are acquired.
In this scenario, the camera at the entrance may monitor the entire parking toll road section, and the image frame is an image including the entire parking toll road section, that is, the camera at the entrance may monitor vehicles entering the parking area of the toll road section and vehicles leaving the parking area of the toll road section.
In another possible embodiment, the acquiring the image frames captured by the at least one camera of the target road segment includes:
and acquiring image frames acquired by a camera at an exit of the parking charging section.
In this scenario, the camera at the exit may monitor the entire parking toll section, and the image frame is an image including the entire parking toll section, i.e., the camera at the exit may monitor vehicles entering the parking area of the toll section and vehicles leaving the parking area of the toll section.
In another possible embodiment, the acquiring the image frames captured by the at least one camera of the target road segment includes:
and acquiring image frames acquired by cameras at an entrance and an exit of the parking charging section.
In this scenario, the camera at the entrance and the camera at the exit jointly monitor the entire parking toll section. When a vehicle entering or leaving the parking area is located within the monitoring area of the camera at the entrance, that camera is responsible for detecting and tracking the vehicle; when the vehicle is located within the monitoring area of the camera at the exit, the camera at the exit is responsible for detecting and tracking it; and if the monitoring areas of the two cameras overlap and the vehicle is located in the overlapping region, both cameras detect and track the vehicle simultaneously.
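For ease of understanding, a minimal Python sketch of such a hand-off is given below. It assumes that each camera's monitoring area is known as an interval of positions along the road section; the interval values and camera names are illustrative assumptions, not part of this application.

```python
# Coverage of each camera along the target road section, in metres from the entrance.
# These intervals are illustrative assumptions, not values from this application.
ENTRANCE_CAM_COVERAGE = (0.0, 60.0)
EXIT_CAM_COVERAGE = (45.0, 100.0)

def responsible_cameras(vehicle_position_m: float) -> list:
    """Return which camera(s) should detect and track a vehicle at this position."""
    cameras = []
    lo, hi = ENTRANCE_CAM_COVERAGE
    if lo <= vehicle_position_m <= hi:
        cameras.append("entrance_camera")
    lo, hi = EXIT_CAM_COVERAGE
    if lo <= vehicle_position_m <= hi:
        cameras.append("exit_camera")
    return cameras  # both cameras when the vehicle is in the overlapping region

print(responsible_cameras(50.0))   # ['entrance_camera', 'exit_camera']
print(responsible_cameras(10.0))   # ['entrance_camera']
```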
Step 102: when a first vehicle is detected from the image frame to enter a parking area of the target road section, recording the entry time of the first vehicle and the ID of the first vehicle.
The embodiment may perform image analysis on the obtained image frames to determine whether a vehicle enters the parking area of the target road segment, and when it is detected that a vehicle (i.e., a first vehicle) enters the parking area of the target road segment, the entry time of the first vehicle into the parking area of the target road segment and the ID of the first vehicle may be recorded through the analysis result of the image.
In a possible implementation manner of this embodiment, step 102 may include the following sub-steps:
and a substep S11 of tracking the first vehicle based on the ID of the first vehicle when the first vehicle arriving at the target road segment is detected from the image frame.
In one possible embodiment, one or more trigger lines may be pre-marked for the target road section. The trigger lines may include an entrance trigger line of the target road section; as shown in fig. 3, trigger line 1 is the entrance trigger line of a roadside parking toll section.
When the first vehicle is detected to trigger the entrance trigger line (here, triggering a trigger line may refer to any moment, or the whole interval, from when the head of the first vehicle touches the trigger line to when its tail leaves the trigger line), it may be determined that the first vehicle has reached the target road section. The first vehicle may then be locked as a tracking target and tracked, based on its ID, with a preset tracking algorithm, which is not limited in this embodiment.
It should be noted that a trigger line may be a virtual trigger line or a physical trigger line. For example, a physical trigger line may be a physical marking line, an electromagnetic induction line, or the like. A virtual trigger line may be the geographic edge of a virtually divided area and may be determined from geographic location; for example, when the geographic location of the first vehicle lies on the geographic edge of the divided area, it is determined that the first vehicle triggers the entrance trigger line.
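As a non-limiting illustration, the following Python sketch shows one way a virtual trigger line could be evaluated, assuming the trigger line is a horizontal line in image coordinates and the detector supplies a bounding box for the vehicle; the coordinate values are made up for the example.

```python
def triggers_line(bbox, line_y: int) -> bool:
    """True while the vehicle's bounding box straddles a horizontal virtual trigger
    line, i.e. from the moment the head touches the line until the tail has left it.
    bbox = (x1, y1, x2, y2) in image coordinates, y increasing downwards."""
    _, y_top, _, y_bottom = bbox
    return y_top <= line_y <= y_bottom

# Entrance trigger line drawn at image row 400 (illustrative value).
assert triggers_line((120, 380, 260, 470), line_y=400)       # vehicle on the line
assert not triggers_line((120, 200, 260, 320), line_y=400)   # vehicle not yet there
```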
In a possible embodiment of this embodiment, before the step of tracking the first vehicle based on the ID of the first vehicle, this embodiment may further include the following step:
and acquiring the vehicle characteristics of the first vehicle when the first vehicle reaches the target road section to obtain first vehicle characteristic information.
In one example, an image frame may be input into a trained deep neural network model to perform a first vehicle detection on the image frame by the deep neural network model, and to extract first vehicle feature information of the detected first vehicle.
As an example, the deep neural network model may include a region-based convolutional neural network (R-CNN), its upgraded version Fast R-CNN, and the like.
Illustratively, the deep neural network model is a model obtained by off-line training by using a deep learning algorithm, the image sample used for training may include a sample related to parking in the roadside parking toll road section, and before training, the image sample may be marked, and the mark may include a mark of the toll road section, a mark of a parking area in the toll road section, a mark of a trigger line of the parking area, a mark of an entrance and an exit of the toll road section, a vehicle mark and the like.
As one example, the first vehicle characteristic information may include, but is not limited to: position information, license plate information, body color, vehicle major brand, vehicle minor brand, etc.
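Purely for illustration, the first vehicle characteristic information might be carried in a small record such as the following Python sketch; the field names and sample values are assumptions, since the embodiment only enumerates the categories of information.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class VehicleFeatures:
    """One vehicle's characteristic information as a detector might return it.
    The field names are illustrative; the text above only names the categories."""
    bbox: Tuple[int, int, int, int]   # position information (x1, y1, x2, y2)
    plate: str                        # license plate information
    body_color: str
    major_brand: str
    minor_brand: str

features = VehicleFeatures(bbox=(120, 380, 260, 470), plate="浙A12345",
                           body_color="white", major_brand="BrandX",
                           minor_brand="ModelY")
```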
In a possible embodiment of this embodiment, after the step of obtaining the vehicle characteristic of the first vehicle when the first vehicle reaches the target road segment to obtain the first vehicle characteristic information, this embodiment may further include the steps of:
and generating the ID of the first vehicle based on the license plate information. In the present embodiment, the ID of the vehicle is a globally unique ID, and the same vehicle has the same ID.
In one possible implementation manner, the license plate information of the first vehicle may be generated into the ID of the first vehicle according to a preset generation rule.
The specific generation rule is not limited in this embodiment, for example, the license plate information may be generated into the ID according to a preset generation algorithm. For another example, the vehicle ID may be formed by setting in advance indicator characters corresponding to features such as a vehicle body color, a vehicle main brand, and a vehicle sub-brand, and combining the license plate information and the indicator characters corresponding to the vehicle body color, the vehicle main brand, and the vehicle sub-brand in a predetermined order.
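A minimal Python sketch of such a generation rule is given below. The indicator-character tables and the separator are illustrative assumptions; the embodiment deliberately leaves the concrete rule open.

```python
# Illustrative indicator-character tables; the application leaves the exact rule open.
COLOR_CODE = {"white": "W", "black": "B", "red": "R"}
BRAND_CODE = {"BrandX": "01", "BrandY": "02"}

def make_vehicle_id(plate: str, body_color: str,
                    major_brand: str, minor_brand: str) -> str:
    """Combine the license plate with indicator characters for body colour and
    brand, in a fixed order, to form a globally unique vehicle ID."""
    return "-".join([
        plate,
        COLOR_CODE.get(body_color, "X"),
        BRAND_CODE.get(major_brand, "00"),
        minor_brand,
    ])

print(make_vehicle_id("浙A12345", "white", "BrandX", "ModelY"))
# 浙A12345-W-01-ModelY
```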
In a possible implementation manner, after the step of generating the ID of the first vehicle based on the license plate information, the method may further include the steps of:
in the process of tracking the first vehicle, when the ID of the currently acquired first vehicle is inconsistent with the ID of the previously tracked first vehicle, comparing whether the currently acquired first vehicle characteristic information is the same as other information except the license plate information of the previously tracked first vehicle characteristic information; and if so, correcting the currently acquired ID of the first vehicle to the ID of the previously tracked first vehicle.
In this embodiment, while the first vehicle is being tracked, the currently acquired ID of the first vehicle may become inconsistent with the ID that has been tracked so far because the license plate number was misdetected. For example, when the first vehicle enters the parking area at its upper edge, occlusion of the vehicle or abnormal driving may cause the license plate number to be detected incorrectly, so that the ID currently generated for the first vehicle differs from the ID that the algorithm has been tracking. In that case the inconsistent ID may be corrected from the first vehicle characteristic information: if the first vehicle characteristic information corresponding to the inconsistent ID is highly similar to the previously tracked first vehicle characteristic information, or is completely the same except for the license plate information, the inconsistent ID may be corrected to the previously tracked ID of the first vehicle.
In practice, in the process of tracking the first vehicle, a confidence level of the camera may further be calculated from the erroneously detected IDs of the first vehicle; the confidence level describes how accurately the camera detects license plate information.
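The following Python sketch illustrates, under simplified assumptions, how the ID correction and a per-camera confidence figure could be kept together; the data structure and counting scheme are illustrative, not a prescribed implementation.

```python
from dataclasses import dataclass

@dataclass
class TrackedVehicle:
    vehicle_id: str
    body_color: str
    major_brand: str
    minor_brand: str

def corrected_id(current: TrackedVehicle, previous: TrackedVehicle, stats: dict) -> str:
    """If the newly generated ID differs from the previously tracked ID but every
    feature other than the plate matches, keep the previously tracked ID and count
    the event as a plate misread."""
    stats["reads"] = stats.get("reads", 0) + 1
    if current.vehicle_id == previous.vehicle_id:
        return current.vehicle_id
    same_except_plate = (current.body_color == previous.body_color
                         and current.major_brand == previous.major_brand
                         and current.minor_brand == previous.minor_brand)
    if same_except_plate:
        stats["misreads"] = stats.get("misreads", 0) + 1
        return previous.vehicle_id    # correct the inconsistent ID
    return current.vehicle_id

def plate_confidence(stats: dict) -> float:
    """Rough per-camera confidence: the share of plate reads that needed no correction."""
    reads = stats.get("reads", 0)
    return 1.0 if reads == 0 else 1.0 - stats.get("misreads", 0) / reads
```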
And a substep S12, recording the time when the first vehicle arrives at the parking area as the entering time when the first vehicle arrives at the parking area.
In a possible embodiment, the trigger line may also comprise a parking trigger line, as shown in fig. 3, the trigger line 3 being a parking trigger line for a parking area in a roadside parking charging section.
When the first vehicle is tracked to trigger the parking trigger line of the parking area, it may be determined that the first vehicle has arrived at the parking area, and the time when the first vehicle arrives at the parking area is recorded as the entry time T1, i.e., the time when the first vehicle triggers the parking trigger line is recorded as the entry time T1. For example, tracking is triggered when the first vehicle triggers trigger line 1 of fig. 3; when the first vehicle is then tracked to trigger line 3, it can be determined that the first vehicle has arrived at the parking area, and the time when the first vehicle touches trigger line 3 can be recorded as the entry time T1. T1 may be any moment, or the whole interval, from the time the head of the first vehicle reaches trigger line 3 to the time its tail leaves trigger line 3.
A substep S13 of associating the time of entry with the ID of the first vehicle.
When the entry time of the first vehicle arriving at the parking area is obtained, the entry time can be associated with the ID of the tracked first vehicle, and a binding relationship between the two is generated.
In a possible implementation manner of this embodiment, after the step of recording the time when the first vehicle arrives at the parking area as the entry time, the method may further include the following steps:
stopping tracking of the first vehicle after determining that the first vehicle is parked in a parking area.
In this embodiment, after the first vehicle is confirmed to be parked in the parking area (for example, the first vehicle is parked in a parking space, and the parking area includes a plurality of parking spaces), the first vehicle may not be tracked any more, so as to avoid resource consumption caused by tracking all the time. In practice, the ID of the first vehicle may be released when the first vehicle is no longer being tracked, wherein the released ID can be reused in subsequent detection processes.
Step 103: when a second vehicle is detected from the image frame to leave the parking area of the target road section, recording the leaving time of the second vehicle and the ID of the second vehicle.
The embodiment may further determine whether the second vehicle leaves the parking area of the target road section according to the image frame, and when it is detected that the second vehicle leaves the parking area of the target road section, the leaving time of the second vehicle leaving the parking area and the ID of the second vehicle may be recorded through an analysis result of the image.
In a possible implementation manner of this embodiment, step 103 may include the following sub-steps:
a substep S21 of tracking the second vehicle based on the ID of the second vehicle when it is detected that the second vehicle moves within the parking area of the target section.
In one possible embodiment, the second vehicle may be determined to be moving when a change in the position of the second vehicle in the target road segment is detected. At this time, the second vehicle may be locked as a tracking target, and a preset tracking algorithm is used to track the second vehicle, which is not limited in this embodiment.
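As an illustration only, such movement could be declared from a change of the detected bounding box between frames, as in the Python sketch below; the pixel threshold and box representation are assumptions made for the example.

```python
def has_moved(prev_bbox, curr_bbox, min_shift_px: float = 15.0) -> bool:
    """Treat a parked vehicle as moving when its bounding-box centre shifts by more
    than a small pixel threshold between frames; the threshold is an illustrative value."""
    (px1, py1, px2, py2), (cx1, cy1, cx2, cy2) = prev_bbox, curr_bbox
    prev_centre = ((px1 + px2) / 2.0, (py1 + py2) / 2.0)
    curr_centre = ((cx1 + cx2) / 2.0, (cy1 + cy2) / 2.0)
    shift = ((curr_centre[0] - prev_centre[0]) ** 2 +
             (curr_centre[1] - prev_centre[1]) ** 2) ** 0.5
    return shift > min_shift_px

print(has_moved((100, 300, 220, 400), (101, 301, 221, 401)))   # False, still parked
print(has_moved((100, 300, 220, 400), (160, 300, 280, 400)))   # True, pulling out
```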
In a possible implementation manner, when it is detected that the second vehicle moves in the parking area of the target road section, before tracking the second vehicle, second vehicle characteristic information of the second vehicle may be obtained through the deep neural network model, and the ID of the second vehicle may be generated according to license plate information in the second vehicle characteristic information. For a specific manner of obtaining the second vehicle characteristic information and generating the ID of the second vehicle, reference may be made to the manner of obtaining the first vehicle characteristic information and generating the ID of the first vehicle, which is not described herein again.
If the second vehicle characteristic information of the second vehicle is identical to the first vehicle characteristic information of the first vehicle, it can be determined that they are the same vehicle, and the ID of the second vehicle is the same as the ID of the first vehicle.
And a substep S22 of recording a time when the second vehicle leaves the parking area as the leaving time when it is detected that the second vehicle leaves the parking area.
In one possible embodiment, when the second vehicle is tracked to trigger the parking trigger line of the parking area, it may be determined that the second vehicle leaves the parking area, and the time when the second vehicle leaves the parking area is recorded as the leaving time T2, i.e., the time when the second vehicle triggers the parking trigger line is recorded as the leaving time T2. For example, when the second vehicle triggers trigger line 3 of fig. 3, it may be determined that the second vehicle leaves the parking area, and the time when the second vehicle triggers trigger line 3 may be recorded as the leaving time T2; T2 may be any moment, or the whole interval, from the time the head of the second vehicle reaches trigger line 3 to the time its tail leaves trigger line 3.
A substep S23 of associating the departure time with the ID of the second vehicle.
When the departure time of the second vehicle from the parking area is obtained, the departure time may be associated with the ID of the tracked second vehicle, and a binding relationship between the two may be generated.
Similar to the above-mentioned tracking of the first vehicle, in this embodiment, in the process of tracking the second vehicle, there may be a case where the currently acquired ID of the second vehicle is inconsistent with the previously always tracked ID due to false detection of the license plate number, and at this time, the inconsistent ID may be corrected according to the second vehicle characteristic information. For example, if the second vehicle characteristic information corresponding to the ID of the inconsistent second vehicle has a high similarity to the second vehicle characteristic information that has been tracked previously or is completely the same except for the license plate number, the inconsistent ID may be corrected to the ID of the second vehicle that has been tracked previously.
In one possible embodiment, after the second vehicle leaves the parking area, tracking of the second vehicle may continue until the second vehicle leaves the target road segment. Illustratively, the trigger line may also include an exit trigger line for the target road segment, as shown in trigger line 2 of fig. 3. When the second vehicle triggers the parking trigger line, tracking of the second vehicle may continue until the second vehicle crosses the exit trigger line, and the second vehicle is no longer tracked.
Step 104: when the ID of the first vehicle is the same as the ID of the second vehicle, generating parking data based on the entry time and the leaving time.
For example, the ID of the first vehicle may be compared with the ID of the second vehicle, and if the two are the same, it indicates that the first vehicle and the second vehicle are the same vehicle, at this time, the entry time associated with the ID of the first vehicle and the exit time associated with the ID of the second vehicle may be acquired, and the parking data may be generated according to the entry time and the exit time.
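For ease of understanding, the following Python sketch shows one minimal way the entry and exit events could be matched by vehicle ID to produce parking data; the class and field names, and the timestamps in the usage example, are illustrative.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class ParkingRecord:
    vehicle_id: str
    entry_time: Optional[float] = None   # timestamps in seconds since epoch
    exit_time: Optional[float] = None

class ParkingMonitor:
    """Keeps entry events keyed by vehicle ID and matches them against exit events."""

    def __init__(self) -> None:
        self.records: Dict[str, ParkingRecord] = {}

    def on_enter(self, vehicle_id: str, entry_time: float) -> None:
        # Step 102: first vehicle detected entering the parking area.
        self.records.setdefault(vehicle_id, ParkingRecord(vehicle_id)).entry_time = entry_time

    def on_leave(self, vehicle_id: str, exit_time: float) -> Optional[dict]:
        # Steps 103-104: second vehicle detected leaving; parking data is generated
        # only when its ID matches a recorded entry.
        record = self.records.get(vehicle_id)
        if record is None or record.entry_time is None:
            return None   # no matching entry event for this ID
        record.exit_time = exit_time
        return {"vehicle_id": vehicle_id,
                "entry_time": record.entry_time,
                "exit_time": exit_time,
                "duration_s": exit_time - record.entry_time}

monitor = ParkingMonitor()
monitor.on_enter("浙A12345-W-01-ModelY", 1_552_700_000.0)
print(monitor.on_leave("浙A12345-W-01-ModelY", 1_552_709_000.0))  # duration_s: 9000.0
```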
As an example, the parking data may include parking fees, which in one possible implementation of this embodiment may be determined using the following sub-steps:
determining a difference value between the entering time and the leaving time as a parking time; and determining parking cost according to the parking time.
Specifically, the parking duration of the vehicle in the parking area is obtained as the difference between the entry time and the leaving time, and the parking fee of the vehicle is obtained as the product of a preset charging unit price and the parking duration.
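A minimal Python sketch of this fee calculation, with an assumed unit price, is given below; the rate and the timestamps are illustrative values only.

```python
from datetime import datetime

RATE_PER_HOUR = 5.0   # illustrative charging unit price, in currency units per hour

def parking_fee(entry_time: datetime, leaving_time: datetime,
                rate_per_hour: float = RATE_PER_HOUR) -> float:
    """Parking fee = preset unit price x parking duration (leaving time - entry time)."""
    hours = (leaving_time - entry_time).total_seconds() / 3600.0
    return round(rate_per_hour * hours, 2)

print(parking_fee(datetime(2019, 3, 16, 9, 0), datetime(2019, 3, 16, 11, 30)))  # 12.5
```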
In another possible implementation manner of this embodiment, step 104 may include the following sub-steps:
a sub-step S31 of extracting a first target image frame corresponding to the entry time from the captured image frames based on the ID of the first vehicle;
in one possible embodiment, after determining the entry time of the first vehicle into the parking area of the target road segment, an image corresponding to the entry time may be used as the first target image frame in real time. In another possible embodiment, upon detection of the first vehicle, image frames containing the first vehicle may be associated with the ID of the first vehicle, wherein each image frame includes timestamp information. Then, according to the determined entry time, an image frame corresponding to the entry time is extracted from the image frames associated with the ID of the first vehicle as a first target image frame.
In one example, the image corresponding to the entry time may be an image corresponding to any time of the entry time.
In another example, the images corresponding to the entry time may include images corresponding to all times of the entry time, and then the images corresponding to all times may constitute the target video corresponding to the entry time.
A sub-step S32 of extracting a second target image frame corresponding to the departure time from the captured image frames based on the ID of the second vehicle;
in one possible embodiment, after determining the departure time of the second vehicle from the parking area of the target road segment, the image corresponding to the departure time may be used as the second target image frame in real time. In another possible embodiment, upon detecting the second vehicle, image frames containing the second vehicle may be associated with the ID of the second vehicle, wherein each image frame includes timestamp information. Then, according to the determined departure time, an image frame corresponding to the departure time is extracted from the image frames associated with the ID of the second vehicle as a second target image frame.
In one example, the image corresponding to the departure time may be an image corresponding to an arbitrary time of departure time.
In another example, the images corresponding to the departure time may include images corresponding to all times of the departure time, and then the images corresponding to all times may constitute the target video corresponding to the departure time.
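For illustration, extracting the target image frames from the frames associated with a vehicle ID could look like the following Python sketch; the frame representation is an assumption made for the example.

```python
from typing import Dict, List, Tuple

Frame = Tuple[float, bytes]   # (timestamp, encoded image) -- an illustrative representation

def frames_for_event(frames_by_id: Dict[str, List[Frame]], vehicle_id: str,
                     start: float, end: float) -> List[Frame]:
    """Return every stored frame for this vehicle whose timestamp falls within the
    event interval (e.g. from the head touching the trigger line to the tail leaving
    it). One frame can serve as the target image frame; the whole list can form the
    target video."""
    return [frame for frame in frames_by_id.get(vehicle_id, [])
            if start <= frame[0] <= end]
```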
In one possible implementation, the first target image frame or the second target image frame may be retrieved from a storage device of the camera. Specifically, a storage device such as an SD card, a USB drive, or a TF card may be inserted into the camera (the camera at the entrance and/or the camera at the exit); after the camera is powered on, or after a recording plan has been configured through the camera's web page, the camera continuously stores the captured video stream to the storage device. Alternatively, the user may store the video stream in an external device such as a local computer or a terminal device through a channel such as an IE browser, in which case the first target image frame or the second target image frame is retrieved from the external device.
A substep S33 of taking the first target image frame and the second target image frame as the parking data; and/or a first composite image obtained by compositing the first target image frame and the second target image frame is used as parking data.
In one embodiment, after the first target image frame and the second target image frame are obtained, the first target image frame and the second target image frame may be directly used as parking data. In another embodiment, the first target image frame and the second target image frame may be merged to obtain a first composite image as the parking data.
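As a non-limiting illustration, a side-by-side composite of the two target image frames could be produced as in the following sketch, which assumes both frames have the same resolution; the embodiment does not prescribe any particular compositing method.

```python
import numpy as np

def composite_side_by_side(entry_frame: np.ndarray, exit_frame: np.ndarray) -> np.ndarray:
    """Place the entry-time frame and the leaving-time frame side by side to form a
    single evidence image; assumes both frames have the same resolution."""
    if entry_frame.shape != exit_frame.shape:
        raise ValueError("frames must share one resolution for this simple composite")
    return np.hstack([entry_frame, exit_frame])

# Usage with dummy 480x640 RGB frames:
first = np.zeros((480, 640, 3), dtype=np.uint8)
second = np.zeros((480, 640, 3), dtype=np.uint8)
evidence = composite_side_by_side(first, second)   # shape (480, 1280, 3)
```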
In yet another possible implementation manner of this embodiment, step 104 may include the following sub-steps:
a sub-step S41 of extracting a third target image frame from the captured image frames based on the ID of the first vehicle, the third target image frame including: before the entry time, an image frame when the first vehicle arrives at the target road section, and an image frame corresponding to the entry time;
in the present embodiment, the image frame when the first vehicle reaches the target section and the image frame when the first vehicle reaches the parking area may be set as the third target image frame according to the ID of the first vehicle. For example, the image frame when the first vehicle reaches the entrance trigger line and the image frame when the first vehicle reaches the parking trigger line are taken as the third target image frame.
A sub-step S42 of extracting a fourth target image frame from the captured image frames based on the ID of the second vehicle, the fourth target image frame including: the image frame corresponding to the leaving time, and the image frame when the second vehicle leaves the target road section after the leaving time;
in the present embodiment, the image frame when the second vehicle leaves the parking area and the image frame when the second vehicle leaves the target link may be set as the fourth target image frame according to the ID of the second vehicle. For example, the image frame when the second vehicle leaves the parking trigger line and the image frame when the second vehicle leaves the exit trigger line are taken as the fourth target image frame.
A substep S43 of extracting an image frame of the first vehicle stopped in the parking area from the acquired image frames as a fifth target image frame;
in this embodiment, the image frame at any time may be selected as the fifth target image frame while the first vehicle stays in the parking area.
A substep S44 of taking the third target image frame, the fourth target image frame, and the fifth target image frame as the parking data; and/or taking a second composite image obtained by combining the third target image frame, the fourth target image frame and the fifth target image frame as parking data.
For example, in fig. 3, the image when the vehicle crosses trigger line 1, the image when the vehicle crosses trigger line 3 on entry, an image while the vehicle stays in the parking space area, the image when the vehicle crosses trigger line 3 on departure, and the image when the vehicle crosses trigger line 2 may be used as the parking data, and the image frames of these five moments may be combined into a second composite image describing the user's parking process, which is used as the parking data.
In a possible implementation manner of this embodiment, the following steps may be further included:
saving the parking data and associating the parking data with the ID of the first vehicle or the ID of the second vehicle.
For example, after obtaining the parking data, the parking data may be saved in a local or external device, and the parking data may be associated with the ID of the first vehicle or the ID of the second vehicle when the parking data is saved.
In a possible implementation manner of this embodiment, the following steps may be further included:
displaying the parking data and/or transmitting the parking data to an external device.
In one example, after the parking data is obtained, the parking data and the associated vehicle ID may be displayed via a connected display device.
In other examples, the parking data may be actively pushed to the vehicle owner, or sent to the vehicle owner upon request. For example, the parking fee may be displayed or sent to the vehicle owner; if the vehicle owner disputes the fee, the owner may request to view the corresponding first target image frame, second target image frame and/or composite image, which serve as evidence of the parking process and confirm the accuracy of the charging information.
In this embodiment, from the acquired image frames captured by at least one camera of the target road section, the entry time of a first vehicle and the ID of the first vehicle can be recorded when the first vehicle is detected to enter the parking area of the target road section, and the exit time of a second vehicle and the ID of the second vehicle can be recorded when the second vehicle is detected to exit the parking area of the target road section; then, when the ID of the first vehicle is the same as the ID of the second vehicle, parking data is generated automatically based on the corresponding entry time and exit time. This avoids the trouble of collecting parking data manually and improves both the efficiency and the accuracy of data acquisition.
Corresponding to the embodiment of the method, the application also provides an embodiment of the camera.
The camera embodiment may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, a logical camera is formed by the processor of the device in which the camera resides reading the corresponding computer program instructions from the nonvolatile memory into memory and running them. In terms of hardware, fig. 4 shows a hardware block diagram of the device in which the camera of the present application resides; in addition to the processor, memory, network interface, and nonvolatile memory shown in fig. 4, the device may further include other hardware according to the actual functions of the camera, which is not described again.
Referring to fig. 5, a block diagram of a structure of an embodiment of a camera according to an exemplary embodiment of the present application is shown, and the structure may specifically include the following modules:
an image acquisition module 501, configured to acquire image frames, where the image frames include images of vehicles in the target road segment tracked by the camera;
a first information recording module 502 for recording an entry time of a first vehicle and an ID of the first vehicle when the first vehicle is detected to enter a parking area of a target section from the image frame;
a second information recording module 503, configured to record a departure time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle departs from a parking area of the target road segment;
a parking data generation module 504 configured to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
In a possible embodiment of this embodiment, the first information recording module 502 includes:
a first tracking sub-module for tracking the first vehicle based on an ID of the first vehicle when the first vehicle arriving at the target road section is detected from the image frames;
the entry time determining submodule is used for recording the time when the first vehicle arrives at the parking area as the entry time when the first vehicle arrives at the parking area;
a first association submodule for associating the entry time with the ID of the first vehicle.
In a possible embodiment of this embodiment, the camera may further include the following modules:
the first vehicle characteristic information determining module is used for acquiring vehicle characteristics of the first vehicle when the first vehicle reaches the target road section to obtain first vehicle characteristic information, and the first vehicle characteristic information comprises license plate information;
and the first vehicle ID generation module is used for generating the ID of the first vehicle based on the license plate information.
In a possible embodiment of this embodiment, the following modules may be further included:
the ID correction module is used for comparing whether the currently acquired first vehicle characteristic information is the same as the previously tracked first vehicle characteristic information except the license plate information when the currently acquired ID of the first vehicle is inconsistent with the previously tracked ID of the first vehicle in the process of tracking the first vehicle; and if so, correcting the currently acquired ID of the first vehicle to the ID of the previously tracked first vehicle.
In a possible embodiment of this embodiment, the camera may further include the following modules:
a stop tracking module to stop tracking the first vehicle after determining that the first vehicle is stopped in a parking area.
In a possible embodiment of this embodiment, the second information recording module 503 may include the following sub-modules:
a second tracking sub-module configured to track the second vehicle based on an ID of the second vehicle when it is detected that the second vehicle moves within the parking area of the target section;
the leaving time determining submodule is used for recording the time when the second vehicle leaves the parking area as the leaving time when the second vehicle is detected to leave the parking area;
a second association submodule for associating the departure time with the ID of the second vehicle.
In a possible embodiment of this embodiment, the parking data generating module 504 includes:
a first target image frame extraction sub-module configured to extract a first target image frame corresponding to the entry time from the captured image frames based on the ID of the first vehicle;
a second target image frame extraction sub-module for extracting a second target image frame corresponding to the departure time from the captured image frames based on the ID of the second vehicle;
a first parking data determination submodule for taking the first target image frame and the second target image frame as the parking data; and/or a first composite image obtained by compositing the first target image frame and the second target image frame is used as parking data.
In another possible embodiment of this embodiment, the parking data generating module 504 includes:
a third target image frame extraction sub-module for extracting a third target image frame from the captured image frames based on the ID of the first vehicle, the third target image frame including: before the entry time, an image frame when the first vehicle arrives at the target road section, and an image frame corresponding to the entry time;
a fourth target image frame extraction sub-module for extracting a fourth target image frame from the captured image frames based on the ID of the second vehicle, the fourth target image frame including: the image frame corresponding to the leaving time, and the image frame when the second vehicle leaves the target road section after the leaving time;
a fifth target image frame extraction submodule for extracting an image frame of the first vehicle stopped in the parking area from the acquired image frames as a fifth target image frame;
a second parking data determination submodule for taking the third target image frame, the fourth target image frame, and the fifth target image frame as the parking data; and/or taking a second composite image obtained by combining the third target image frame, the fourth target image frame and the fifth target image frame as parking data.
In a possible embodiment of this embodiment, the camera further includes:
the data storage module is used for storing the parking data and associating the parking data with the ID of the first vehicle or the ID of the second vehicle when the parking data is stored;
the parking data display module is used for displaying the parking data;
and/or,
and the parking data sending module is used for sending the parking data to external equipment.
The camera provided in this embodiment may record, according to the image frames acquired by the image acquisition module, the entry time of the first vehicle and the ID of the first vehicle when the first information recording module detects that the first vehicle enters the parking area of the target road segment, and record the exit time of the second vehicle and the ID of the second vehicle when the second information recording module detects that the second vehicle exits the parking area of the target road segment, and then automatically generate parking data based on the corresponding entry time and exit time through the parking data generation module when the ID of the first vehicle is the same as the ID of the second vehicle. The trouble of manually acquiring parking data is avoided, and the data acquisition efficiency and the data acquisition accuracy are improved.
In this embodiment, the camera itself provides image acquisition, vehicle identification and tracking, determination of vehicle entry and exit times, and determination of parking data. This enriches the capabilities of the camera, greatly reduces the deployment complexity and cost overhead of the whole system, and makes the system convenient for ordinary users.
Referring to fig. 6, a block diagram of an embodiment of a parking monitoring system according to an exemplary embodiment of the present application is shown. The parking monitoring system may include: a camera 60 erected on a target road section, and a processing platform 70;
the camera 60 includes:
an image acquisition module 601, configured to acquire image frames, where the image frames include images of vehicles in the target road segment tracked by the camera;
a first information recording module 602, configured to record an entry time of a first vehicle and an ID of the first vehicle when the first vehicle is detected to enter a parking area of a target road segment from the image frame;
a second information recording module 603 configured to record a departure time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle departs from a parking area of the target road segment;
the processing platform 70 comprises:
a parking data generating module 701, configured to acquire an entry time and an ID of a first vehicle and an exit time and an ID of a second vehicle from the camera, and generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
In this embodiment, the camera provides image acquisition, vehicle identification and tracking, and determination of vehicle entry and exit times. This enriches the capabilities of the camera, greatly reduces the deployment complexity and cost overhead of the whole system, and makes the system convenient for ordinary users.
Referring to fig. 7, a block diagram of an embodiment of a parking monitoring system according to another exemplary embodiment of the present application is shown. The system may include: a camera 80 erected on a target road section, and a processing platform 90;
the camera 80 includes:
an image acquisition module 801, configured to acquire image frames and send the image frames to a processing platform, where the image frames include images of vehicles in the target road segment tracked by the camera;
the processing platform 90 comprises:
a first information recording module 901, configured to record an entry time of a first vehicle and an ID of the first vehicle when the first vehicle is detected to enter a parking area of a target road segment from the image frame;
a second information recording module 902, configured to record a departure time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle departs from a parking area of the target road segment;
a parking data generation module 903, configured to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
In this embodiment, the camera and the processing platform jointly carry out the acquisition of parking data. The camera is responsible for capturing the images; the processing platform records the entry time and ID of the first vehicle when the first vehicle is detected entering the parking area of the target road segment, records the exit time and ID of the second vehicle when the second vehicle is detected leaving the parking area, and, when the two IDs are the same, automatically generates parking data based on the corresponding entry time and exit time. This avoids the trouble of acquiring parking data manually and improves both the efficiency and the accuracy of data acquisition.
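A minimal sketch of the platform-side processing in this arrangement follows, assuming the detector and tracker have already reduced each frame to (vehicle ID, position) detections. The rectangular ParkingArea and the detect_transitions helper are illustrative assumptions, not components defined by the embodiment.

# Sketch of the fig. 7 split: the camera streams frames, and the platform
# turns per-frame detections into entry and exit events. The rectangular
# parking area and the pre-computed detections stand in for a real
# detector/tracker, which the embodiment leaves unspecified.
from dataclasses import dataclass
from typing import Dict, Iterable, List, Tuple


@dataclass
class ParkingArea:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def detect_transitions(
    frames: Iterable[Tuple[float, List[Tuple[str, float, float]]]],
    area: ParkingArea,
) -> List[Tuple[str, str, float]]:
    """Turn per-frame (vehicle_id, x, y) detections into entry/exit events."""
    inside: Dict[str, bool] = {}
    events: List[Tuple[str, str, float]] = []
    for timestamp, detections in frames:
        for vehicle_id, x, y in detections:
            now_inside = area.contains(x, y)
            if now_inside and not inside.get(vehicle_id, False):
                events.append((vehicle_id, "entry", timestamp))
            elif not now_inside and inside.get(vehicle_id, False):
                events.append((vehicle_id, "exit", timestamp))
            inside[vehicle_id] = now_inside
    return events


# Example: one vehicle enters the area at t=2.0 and leaves at t=5.0.
if __name__ == "__main__":
    area = ParkingArea(0, 0, 10, 5)
    frames = [
        (1.0, [("A12345", 15.0, 2.0)]),
        (2.0, [("A12345", 8.0, 2.0)]),
        (5.0, [("A12345", 12.0, 2.0)]),
    ]
    print(detect_transitions(frames, area))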
Since the system and camera embodiments correspond substantially to the method embodiments, reference may be made to the description of the method embodiments for the relevant details.
The above-described camera and system embodiments are merely illustrative. The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the present application, which one of ordinary skill in the art can understand and implement without inventive effort.
Embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the steps of the above-described method embodiments are implemented.
Embodiments of the present application further provide a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the above method embodiments when executing the program.
Embodiments of the subject matter and the functional operations described in this specification can be implemented in: digital electronic circuitry, tangibly embodied computer software or firmware, computer hardware including the structures disclosed in this specification and their structural equivalents, or a combination of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a tangible, non-transitory program carrier for execution by, or to control the operation of, data processing apparatus. Alternatively or additionally, the program instructions may be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode and transmit information to suitable receiver apparatus for execution by the data processing apparatus. The computer storage medium may be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform corresponding functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Computers suitable for executing computer programs include, for example, general and/or special purpose microprocessors, or any other type of central processing unit. Generally, a central processing unit will receive instructions and data from a read-only memory and/or a random access memory. The basic components of a computer include a central processing unit for implementing or executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer does not necessarily have such a device. Further, the computer may be embedded in another device, e.g., a vehicle-mounted terminal, a mobile telephone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., an internal hard disk or a removable disk), magneto-optical disks, and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. In other instances, features described in connection with one embodiment may be implemented as discrete components or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Further, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The above description is merely exemplary of the present application and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within its scope of protection.

Claims (12)

1. A method of parking monitoring, the method comprising:
acquiring image frames captured by at least one camera on a target road segment, wherein the image frames comprise images of vehicles in the target road segment tracked by the camera;
recording an entry time of a first vehicle and an ID of the first vehicle when it is detected from the image frame that the first vehicle enters a parking area of the target road segment;
recording an exit time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle leaves the parking area of the target road segment;
generating parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
2. The method of claim 1, wherein the recording of the entry time of the first vehicle and the ID of the first vehicle when it is detected from the image frame that the first vehicle enters the parking area of the target road segment comprises:
tracking the first vehicle based on the ID of the first vehicle when it is detected from the image frames that the first vehicle reaches the target road segment;
when it is detected that the first vehicle arrives at the parking area, recording the time when the first vehicle arrives at the parking area as the entry time;
associating the entry time with the ID of the first vehicle.
3. The method of claim 2, wherein prior to the tracking the first vehicle based on the ID of the first vehicle, the method further comprises:
acquiring vehicle characteristics of the first vehicle when the first vehicle reaches the target road segment to obtain first vehicle characteristic information, wherein the first vehicle characteristic information comprises license plate information;
and generating the ID of the first vehicle based on the license plate information.
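For illustration, one possible realization of this step is sketched below. Normalizing the recognized plate text and hashing it are assumptions of the example, since the claim only requires that the ID be generated based on the license plate information.

# Minimal sketch of generating a vehicle ID from the recognized plate.
# The normalization rules and the use of SHA-1 are illustrative assumptions.
import hashlib


def vehicle_id_from_plate(plate_text: str) -> str:
    normalized = plate_text.strip().upper().replace(" ", "")
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()[:16]


# Two readings of the same plate yield the same ID.
assert vehicle_id_from_plate("浙A 12345") == vehicle_id_from_plate("浙a12345")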
4. The method of claim 3, further comprising, after the generating the ID of the first vehicle based on the license plate information:
in the process of tracking the first vehicle, when the currently acquired ID of the first vehicle is inconsistent with the previously tracked ID of the first vehicle, comparing whether the currently acquired first vehicle characteristic information is the same as the previously tracked first vehicle characteristic information in respects other than the license plate information;
and if so, correcting the currently acquired ID of the first vehicle to the previously tracked ID of the first vehicle.
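A minimal sketch of this correction step follows. The specific feature fields (color, body type, brand) and the exact-match comparison are illustrative assumptions; a practical system might compare the features with similarity thresholds instead.

# Sketch of the ID-correction step: when the plate-derived ID of a tracked
# vehicle suddenly changes (e.g. a misread plate), the other vehicle features
# are compared, and if they match, the earlier ID is kept.
from dataclasses import dataclass


@dataclass
class VehicleFeatures:
    plate: str
    color: str
    body_type: str
    brand: str


def corrected_id(previous_id: str, previous: VehicleFeatures,
                 current_id: str, current: VehicleFeatures) -> str:
    if current_id == previous_id:
        return current_id
    # Compare every feature except the license plate.
    same_vehicle = (current.color == previous.color
                    and current.body_type == previous.body_type
                    and current.brand == previous.brand)
    # If the non-plate features agree, treat the new ID as a recognition
    # error and keep the previously tracked ID.
    return previous_id if same_vehicle else current_id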
5. The method of any of claims 2-4, wherein, after the recording of the time when the first vehicle arrives at the parking area as the entry time upon detecting that the first vehicle arrives at the parking area, the method further comprises:
stopping tracking of the first vehicle after determining that the first vehicle is parked in a parking area.
6. The method of claim 1, wherein the recording of the exit time of the second vehicle and the ID of the second vehicle when it is detected from the image frame that the second vehicle leaves the parking area of the target road segment comprises:
tracking the second vehicle based on the ID of the second vehicle when it is detected from the image frame that the second vehicle moves within the parking area of the target road segment;
when it is detected that the second vehicle leaves the parking area, recording the time when the second vehicle leaves the parking area as the exit time;
associating the exit time with the ID of the second vehicle.
7. The method of claim 1, wherein generating parking data based on the entry time and the exit time comprises:
extracting a first target image frame corresponding to the entry time from the captured image frames based on the ID of the first vehicle;
extracting a second target image frame corresponding to the exit time from the captured image frames based on the ID of the second vehicle;
taking the first target image frame and the second target image frame as the parking data; and/or taking a first composite image, obtained by compositing the first target image frame and the second target image frame, as the parking data.
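The sketch below illustrates one way such parking data could be assembled, assuming the image frames captured while tracking the vehicle are held in memory as (timestamp, image) pairs. The in-memory frame store and the use of NumPy for the side-by-side composite are assumptions of the example.

# Sketch of claim 7's evidence generation: pick the stored frames closest to
# the recorded entry and exit times and optionally stitch them side by side
# as the composite image (frames must share the same height for hstack).
from typing import Dict, List, Tuple
import numpy as np


def nearest_frame(frames: List[Tuple[float, np.ndarray]], t: float) -> np.ndarray:
    # frames: (timestamp, image) pairs captured while tracking the vehicle.
    return min(frames, key=lambda item: abs(item[0] - t))[1]


def parking_evidence(frames_by_id: Dict[str, List[Tuple[float, np.ndarray]]],
                     vehicle_id: str, entry_time: float, exit_time: float):
    frames = frames_by_id[vehicle_id]
    first_target = nearest_frame(frames, entry_time)    # frame at the entry time
    second_target = nearest_frame(frames, exit_time)    # frame at the exit time
    composite = np.hstack([first_target, second_target])  # first composite image
    return first_target, second_target, composite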
8. The method of claim 1, wherein generating parking data based on the entry time and the exit time comprises:
extracting a third target image frame from the captured image frames based on the ID of the first vehicle, the third target image frame including: the image frame captured when the first vehicle arrives at the target road segment and the image frame corresponding to the entry time;
extracting a fourth target image frame from the captured image frames based on the ID of the second vehicle, the fourth target image frame including: the image frame corresponding to the exit time and the image frame captured when the second vehicle leaves the target road segment;
extracting an image frame of the first vehicle stopped in the parking area from the acquired image frames as a fifth target image frame;
taking the third target image frame, the fourth target image frame, and the fifth target image frame as the parking data; and/or taking a second composite image, obtained by compositing the third target image frame, the fourth target image frame, and the fifth target image frame, as the parking data.
9. The method of any of claims 1-4 and 6-8, wherein after the step of generating parking data based on the entry time and the exit time, the method further comprises:
saving the parking data and, when saving the parking data, associating the parking data with the ID of the first vehicle or the ID of the second vehicle;
displaying the parking data and/or transmitting the parking data to an external device.
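As a hedged illustration of the saving step, the following sketch stores completed parking records in a JSON file keyed by the vehicle ID. The file format and the function name save_parking_data are assumptions of the example; the claim does not prescribe any particular storage backend.

# Minimal sketch of persisting parking data associated with a vehicle ID.
import json
from pathlib import Path


def save_parking_data(store_path: Path, vehicle_id: str, record: dict) -> None:
    # Load existing records (if any), associate the new record with the
    # vehicle ID, and write the store back to disk.
    store = json.loads(store_path.read_text()) if store_path.exists() else {}
    store.setdefault(vehicle_id, []).append(record)
    store_path.write_text(json.dumps(store, ensure_ascii=False, indent=2))


save_parking_data(Path("parking_data.json"), "A12345",
                  {"entry_time": 1680000000.0, "exit_time": 1680003600.0})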
10. A camera, characterized in that the camera comprises:
an image acquisition module, configured to acquire image frames, the image frames comprising images of vehicles in a target road segment tracked by the camera;
a first information recording module, configured to record an entry time of a first vehicle and an ID of the first vehicle when it is detected from the image frame that the first vehicle enters a parking area of the target road segment;
a second information recording module, configured to record an exit time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle leaves the parking area of the target road segment;
a parking data generation module to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
11. A parking monitoring system, the system comprising: a camera and a processing platform installed on a target road segment;
the camera, comprising:
an image acquisition module to acquire image frames, wherein the image frames include images of vehicles within the target road segment tracked by the camera;
a first information recording module, configured to record an entry time of a first vehicle and an ID of the first vehicle when it is detected from the image frame that the first vehicle enters a parking area of the target road segment;
a second information recording module, configured to record an exit time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle leaves the parking area of the target road segment;
the processing platform comprises:
a parking data generation module, configured to acquire the entry time of the first vehicle and the ID of the first vehicle and the exit time of the second vehicle and the ID of the second vehicle from the camera, and to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
12. A parking monitoring system, the system comprising: a camera and a processing platform installed on a target road segment;
the camera includes:
an image acquisition module, configured to acquire image frames and send the image frames to the processing platform, wherein the image frames comprise images of vehicles in the target road segment tracked by the camera;
the processing platform comprises:
a first information recording module, configured to record an entry time of a first vehicle and an ID of the first vehicle when it is detected from the image frame that the first vehicle enters a parking area of the target road segment;
a second information recording module, configured to record an exit time of a second vehicle and an ID of the second vehicle when it is detected from the image frame that the second vehicle leaves the parking area of the target road segment;
a parking data generation module to generate parking data based on the entry time and the exit time when the ID of the first vehicle is the same as the ID of the second vehicle.
CN201910199802.2A 2019-03-15 2019-03-15 Parking monitoring method, system and camera Pending CN111696360A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910199802.2A CN111696360A (en) 2019-03-15 2019-03-15 Parking monitoring method, system and camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910199802.2A CN111696360A (en) 2019-03-15 2019-03-15 Parking monitoring method, system and camera

Publications (1)

Publication Number Publication Date
CN111696360A true CN111696360A (en) 2020-09-22

Family

ID=72475378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910199802.2A Pending CN111696360A (en) 2019-03-15 2019-03-15 Parking monitoring method, system and camera

Country Status (1)

Country Link
CN (1) CN111696360A (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080101656A1 (en) * 2006-10-30 2008-05-01 Thomas Henry Barnes Method and apparatus for managing parking lots
WO2015193059A1 (en) * 2014-06-17 2015-12-23 Robert Bosch Gmbh Valet parking method and system
CN104157064A (en) * 2014-08-28 2014-11-19 王波兰 Park exit controlling device
US20180204457A1 (en) * 2015-09-11 2018-07-19 International Business Machines Corporation Determining a parking position based on visual and non-visual factors
JP2018538195A (en) * 2015-12-04 2018-12-27 ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー Method for autonomously parking a powered vehicle with internal monitoring, driver assistance system, and powered vehicle
CN105702080A (en) * 2016-04-07 2016-06-22 张勋 Cloud computing parking management information system for smart city
CN107330983A (en) * 2017-06-13 2017-11-07 智慧互通科技有限公司 A kind of Roadside Parking data processing method, apparatus and system
CN107767673A (en) * 2017-11-16 2018-03-06 智慧互通科技有限公司 A kind of Roadside Parking management method based on multiple-camera, apparatus and system
CN108090975A (en) * 2017-12-14 2018-05-29 深圳市捷顺科技实业股份有限公司 Parking management method and system, computer installation and readable storage medium storing program for executing
CN108154708A (en) * 2018-01-19 2018-06-12 北京悦畅科技有限公司 Parking lot management method, server, video camera and terminal device
CN108460540A (en) * 2018-03-23 2018-08-28 西安艾润物联网技术服务有限责任公司 Parking data management method, system, intelligent terminal and storage medium
CN108806268A (en) * 2018-06-13 2018-11-13 智慧互通科技有限公司 A kind of parking management method and system based on biometric image
CN109003338A (en) * 2018-06-22 2018-12-14 南京慧尔视智能科技有限公司 A kind of Roadside Parking self-clocking charging method and device
CN109086750A (en) * 2018-09-20 2018-12-25 智慧互通科技有限公司 Parking image processing method apparatus and system based on enhancing server

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112382104A (en) * 2020-11-13 2021-02-19 重庆盘古美天物联网科技有限公司 Roadside parking management method based on vehicle track analysis
CN112885108A (en) * 2020-12-23 2021-06-01 爱泊车美好科技有限公司 Vehicle change detection method and system on parking space based on deep learning algorithm
CN112885108B (en) * 2020-12-23 2022-02-01 爱泊车美好科技有限公司 Vehicle change detection method and system on parking space based on deep learning algorithm
CN115762172A (en) * 2022-11-02 2023-03-07 济南博观智能科技有限公司 Method, device, equipment and medium for identifying vehicles entering and exiting parking places

Similar Documents

Publication Publication Date Title
CN108986242B (en) Secondary license plate recognition system and method for expressway non-stop mobile payment lane
EP1975884B1 (en) Mobile object charging system and mobile object charging method by mobile object charging system
CN104574954B (en) A kind of vehicle auditing method, control device and system based on free streaming system
CN111444798B (en) Identification method and device for driving behavior of electric bicycle and computer equipment
CN111696360A (en) Parking monitoring method, system and camera
CN108986539A (en) Parking management system, method, vehicle information collecting device and management server
CN103632572A (en) Intelligent parking method and system
CN110853391A (en) Intelligent shared parking system
CN110826356B (en) Non-motor vehicle violation detection system, method and server
CN106780886B (en) Vehicle identification system and vehicle entrance and exit identification method
CN105654561B (en) Multilane free-flow vehicle matching process
CN110164164B (en) Method for enhancing accuracy of mobile phone navigation software for identifying complex road by utilizing camera shooting function
CN111369801B (en) Vehicle identification method, device, equipment and storage medium
CN113055823A (en) Method and device for sharing bicycle based on roadside parking management
CN113096150A (en) Method and system for generating travel track, storage medium and electronic device
CN111950471A (en) Target object identification method and device
KR20100033331A (en) System and method for recognizing car number
CN113920749A (en) Vehicle following identification method and system for parking lot and related device
CN109495849A (en) A kind of method and device of recognition and tracking vehicle
CN111291722A (en) Vehicle weight recognition system based on V2I technology
JPH08315196A (en) Method and system for specifying vehicle
KR20210066081A (en) Parking management system capable of recognizing thai car number and the method there of
CN112712626A (en) Vehicle verification method and device based on license plate information, computer equipment and storage medium
KR20020032049A (en) A fare collection a means
CN112330965B (en) Method and device for determining vehicle information, storage medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200922
