CN112926446A - Parabolic detection method and system - Google Patents

Parabolic detection method and system

Info

Publication number
CN112926446A
CN112926446A (application CN202110206045.4A)
Authority
CN
China
Prior art keywords
parabolic
curve
point cloud
confidence
camera
Prior art date
Legal status
Pending
Application number
CN202110206045.4A
Other languages
Chinese (zh)
Inventor
翁仁亮
钱扬
程鉴张
Current Assignee
Beijing Aibee Technology Co Ltd
Original Assignee
Beijing Aibee Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Aibee Technology Co Ltd
Priority to CN202110206045.4A
Publication of CN112926446A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/40: Scenes; Scene-specific elements in video content
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30241: Trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application discloses a parabolic detection method and a parabolic detection system. The method comprises: acquiring a camera-based parabolic detection result and a lidar-based parabolic detection result for a target scene; and fusing the camera-based result and the lidar-based result to obtain a parabolic detection fusion result for the target scene. Because each of the two detection results comprises a parabolic curve and a confidence, the confidences help improve the accuracy of the fusion result. The scheme exploits the advantages of both sensors and enlarges the coverage of parabolic detection: it not only compensates for the blind spots of manual monitoring, but also reduces patrol workload, improves supervision efficiency in parabolic detection scenarios, and lowers patrol and management costs.

Description

Parabolic detection method and system
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to a method and a system for detecting a parabola.
Background
In public places, abnormally throwing objects is extremely dangerous behavior. Taking an airport as an example, throwing prohibited articles from an upper floor of a terminal into an isolation area may, in minor cases, disturb the operational order of the airport and, in severe cases, cause major safety accidents. At present, whether such parabolic behavior occurs in public places is detected mainly from images taken by surveillance cameras deployed there.
At present, there is little research related to parabolic detection at home or abroad. In the prior art, parabolic behavior can generally be analyzed only after the fact from video recorded by a camera; it is difficult to detect in real time from camera video whether parabolic behavior is occurring in the monitored place, so real-time alarms for such behavior are hard to raise and the losses it causes are hard to reduce. In addition, some research monitors object intrusion by laser correlation, mainly perimeter intrusion alarm detectors that use a laser as the light source. Parabolic detection with a single sensor of either kind still suffers from a large number of false alarms and missed detections in practical applications.
Disclosure of Invention
Based on the above problems, the present application provides a method and a system for detecting a parabola, which can improve the coverage of the parabola detection and the accuracy of the detection.
The embodiment of the application discloses the following technical scheme:
in a first aspect, the present application provides a method of detecting a parabola, comprising:
acquiring a camera-based parabolic detection result and a laser radar-based parabolic detection result of a target scene; the camera-based parabolic detection result comprises: identifying a first parabolic curve and a first parabolic confidence corresponding to the first parabolic curve based on a two-dimensional image shot by a camera; the lidar based parabolic detection results include: a second parabolic curve identified based on the three-dimensional point cloud detected by the laser radar and a second parabolic confidence corresponding to the second parabolic curve;
and fusing the parabolic detection result based on the camera and the parabolic detection result based on the laser radar to obtain a parabolic detection fusion result of the target scene.
Optionally, the fusing the parabolic detection result based on the camera and the parabolic detection result based on the lidar to obtain a parabolic detection fusion result of the target scene, including:
when the first parabolic confidence coefficient and the second parabolic confidence coefficient are both larger than a first confidence coefficient threshold and smaller than a second confidence coefficient threshold, obtaining the distance between the first parabolic curve and the second parabolic curve; the first confidence threshold is less than the second confidence threshold;
when the distance is smaller than a preset distance threshold value, determining that the first parabolic curve and the second parabolic curve are matched, judging that parabolic behavior occurs in the target scene, and taking the first parabolic curve or the second parabolic curve as a parabolic curve corresponding to the parabolic behavior.
Optionally, the obtaining a distance between the first parabolic curve and the second parabolic curve includes:
and mapping the second parabolic curve to the coordinate system of the two-dimensional image by using a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain the distance between the first parabolic curve and the mapped second parabolic curve.
Optionally, the camera-based parabolic detection result further includes: a first parabolic time, wherein the first parabolic time is an earliest parabolic time corresponding to the first parabolic curve; the lidar based parabolic detection result further comprises: a second parabolic time, wherein the second parabolic time is an earliest parabolic time corresponding to the second parabolic curve;
after the determining that the first parabolic curve and the second parabolic curve match, the method further comprises:
and taking the earliest parabolic time in the first parabolic time and the second parabolic time as the fused parabolic time.
Optionally, after the determining that the first parabolic curve and the second parabolic curve match, the method further comprises:
and taking the maximum confidence coefficient of the first parabolic confidence coefficient and the second parabolic confidence coefficient as the fused parabolic confidence coefficient.
Optionally, the method further comprises:
and when the first parabolic confidence coefficient and/or the second parabolic confidence coefficient is/are larger than or equal to the second confidence coefficient threshold value, judging that the parabolic behavior occurs in the target scene, and taking the curve with the maximum parabolic confidence coefficient in the first parabolic curve and the second parabolic curve as the parabolic curve corresponding to the parabolic behavior.
Optionally, the method further comprises:
and carrying out combined calibration on the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud.
Optionally, obtaining the camera-based parabolic detection result comprises:
obtaining a two-dimensional image of the target scene captured by the camera;
performing motion detection on the two-dimensional image to generate a track image;
performing curve detection on the track image, and fitting to obtain a first parabolic curve;
and carrying out parabolic judgment on the first parabolic curve to obtain a first parabolic confidence corresponding to the first parabolic curve.
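The patent does not prescribe concrete algorithms for the motion detection, curve detection and parabola judgment steps above. As one hedged sketch of how such a camera-side pipeline might look, the following accumulates frame differences into a trajectory image, fits a parabola by least squares, and uses the inlier ratio as a stand-in for the first parabolic confidence (the function name, thresholds and confidence definition are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def detect_parabola_from_frames(frames, motion_thresh=25, fit_thresh=5.0):
    """frames: list of 2-D grayscale arrays. Accumulate moving pixels into a
    trajectory image, fit y = a*x^2 + b*x + c, and score the fit."""
    trajectory = np.zeros_like(frames[0], dtype=bool)
    for prev, cur in zip(frames, frames[1:]):
        diff = np.abs(cur.astype(int) - prev.astype(int))  # frame differencing
        trajectory |= diff > motion_thresh                  # accumulate motion trace
    ys, xs = np.nonzero(trajectory)
    if len(xs) < 3:
        return None, 0.0                                    # too few points to fit
    coeffs = np.polyfit(xs, ys, 2)                          # least-squares parabola fit
    residuals = np.abs(np.polyval(coeffs, xs) - ys)
    confidence = float(np.mean(residuals < fit_thresh))     # inlier ratio as confidence
    return coeffs, confidence
```

A trajectory that genuinely follows a parabola yields a high inlier ratio, while scattered noise or non-ballistic motion yields a low one, which is the spirit of the "parabolic judgment" step.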
Optionally, obtaining the lidar based parabolic detection result comprises:
obtaining a three-dimensional point cloud of the target scene detected by the laser radar;
preprocessing the three-dimensional point cloud to obtain queue point cloud data;
carrying out parabolic detection by utilizing the queue point cloud data, and fitting to obtain the second parabolic curve;
and carrying out parabolic judgment on the second parabolic curve to obtain a second parabolic confidence corresponding to the second parabolic curve.
Optionally, the preprocessing the three-dimensional point cloud to obtain queue point cloud data includes:
setting an interested area and a non-key area of the laser radar in a monitoring area of the target scene;
screening points in each frame of point cloud data of the three-dimensional point cloud by using a preset effective condition, discarding the points which do not meet the preset effective condition in each frame of point cloud data, and adding the rest points in each frame of point cloud data into a target queue by using a frame as a unit according to a time sequence to form queue point cloud data;
the preset effective conditions comprise: the points are located within the region of interest and outside the non-emphasized region.
In a second aspect, the present application provides a parabolic detection system comprising: a camera, a laser radar, a processor;
the camera is used for shooting a two-dimensional image of the target scene;
the laser radar is used for detecting the three-dimensional point cloud of the target scene;
the processor is used for obtaining a camera-based parabolic detection result of the target scene according to the two-dimensional image and obtaining a laser radar-based parabolic detection result of the target scene according to the three-dimensional point cloud; fusing the parabolic detection result based on the camera and the parabolic detection result based on the laser radar to obtain a parabolic detection fusion result of the target scene;
the camera-based parabolic detection result comprises: identifying a first parabolic curve and a first parabolic confidence corresponding to the first parabolic curve based on a two-dimensional image shot by a camera; the lidar based parabolic detection results include: and identifying a second parabolic curve and a second parabolic confidence corresponding to the second parabolic curve based on the three-dimensional point cloud detected by the laser radar.
Optionally, the system further comprises:
a display for obtaining the parabolic detection fusion result from the processor; and displaying the fusion result of the parabolic detection.
Compared with the prior art, the method has the following beneficial effects:
the parabolic detection method provided by the application comprises the following steps: acquiring a camera-based parabolic detection result and a laser radar-based parabolic detection result of a target scene; and fusing the parabolic detection result based on the camera and the parabolic detection result based on the laser radar to obtain a parabolic detection fusion result of the target scene. Because the two parts of parabolic detection results respectively comprise a parabolic curve and confidence coefficients, the confidence coefficients can assist in improving the accuracy of the fusion result. This scheme has exerted the advantage of two kinds of sensors, promotes the coverage that the thing detected, can not only compensate artifical control and easily have the not enough of control leak, also can reduce personnel and patrol work load, promotes the managerial efficiency in the thing detection scene, reduces and patrols and administrative cost.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a flowchart of a method for detecting a parabola according to an embodiment of the present application;
fig. 2 is a flowchart of another method for detecting a parabola according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a parabolic detection system according to an embodiment of the present disclosure.
Detailed Description
As described above, the prior art generally relies on a single sensor for parabolic detection. However, in a parabolic scene, a single sensor can hardly meet the accuracy requirements of parabolic detection, and its detection mode also limits the coverage of the scene. To solve these problems, the inventors studied and provide a new parabolic detection method. The method exploits the advantages of two sensors, enlarges the coverage of parabolic detection, and improves its accuracy by fusing the detection results of the two sensors.
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Exemplary method
Referring to fig. 1, a flowchart of a method for detecting a parabola according to an embodiment of the present application is shown. As shown in fig. 1, the method includes:
step 101: and acquiring a parabolic detection result based on a camera and a parabolic detection result based on the laser radar for the target scene.
In the parabolic detection method provided by this embodiment of the application, the parabolic detection results obtained from data collected by two sensors are fused. The two sensors are a camera and a lidar. The camera may be any type of camera capable of capturing RGB images, and the lidar may be any type of lidar capable of detecting objects in the scene to form a three-dimensional point cloud; this embodiment is not limited in either respect.
In order to realize the fusion of the parabolic detection results, in this step, it is first necessary to obtain a camera-based parabolic detection result for the target scene and a lidar-based parabolic detection result for the target scene. The implementation manner may be an active request to obtain the two parabolic detection results, or may be to receive the two parabolic detection results, and the implementation manner for obtaining the two parabolic detection results is not limited herein.
In addition, in the specific implementation of step 101, a parabolic detection result based on the camera may be obtained first, and then a parabolic detection result based on the laser radar may be obtained; or the parabolic detection result based on the laser radar is obtained first, and then the parabolic detection result based on the camera is obtained; it is also possible to obtain the above two parabolic detection results simultaneously. The execution order of obtaining the above two parabolic detection results is not limited here.
The camera-based parabolic detection result obtained by performing step 101 at least includes: the method comprises the steps of identifying a first parabolic curve based on a two-dimensional image shot by a camera and a first parabolic confidence corresponding to the first parabolic curve. The first parabolic confidence is a confidence level of the detection result of the first parabolic curve. The higher the confidence of the first parabola, the higher the confidence of the first parabola curve as the detection result.
The laser radar-based parabolic detection result obtained by performing step 101 includes at least: and identifying a second parabolic curve and a second parabolic confidence corresponding to the second parabolic curve based on the three-dimensional point cloud detected by the laser radar. The second parabolic confidence is a confidence level that the second parabolic curve is used as the detection result. The higher the confidence of the second parabola, the higher the confidence of the second parabola curve as the detection result.
Step 102: and fusing the parabolic detection result based on the camera and the parabolic detection result based on the laser radar to obtain a parabolic detection fusion result of the target scene.
In step 102, fusion is performed based on the two parabolic detection results obtained in step 101. It should be noted that the two detection results are not fused by direct mathematical operations; rather, the parabolic detection fusion result for the target scene is obtained based on the first parabolic confidence and the second parabolic confidence.
In a possible implementation, a first confidence threshold and a second confidence threshold are preset, where the first confidence threshold is lower than the second confidence threshold and both are positive. When the confidence of a parabolic curve is less than or equal to the first confidence threshold, the identified parabolic curve has low credibility; when the confidence is greater than the first confidence threshold and less than the second confidence threshold, the credibility is medium; when the confidence is greater than or equal to the second confidence threshold, the credibility is high.
In the technical solution of the present application, when both the first parabolic confidence and the second parabolic confidence are greater than the first confidence threshold and less than the second confidence threshold, both the first and second parabolic curves are of medium credibility. In this case, to ensure the fusion effect, the two parabolic curves need to verify each other, for which a distance threshold is set.
Specifically, the distance between the first parabolic curve and the second parabolic curve may be obtained, and then the distance may be compared with a preset distance threshold:
When the distance between the first parabolic curve and the second parabolic curve is smaller than the preset distance threshold, the two curves are determined to match: each curve verifies that the other is credible, and it can be judged that parabolic behavior has occurred in the target scene. Because the distance between the curves is below the threshold, their relative deviation is very small, so either the first or the second parabolic curve can be taken as the parabolic curve corresponding to the behavior. Conversely, if the distance is greater than or equal to the preset distance threshold, the two curves cannot verify each other; parabolic behavior in the target scene cannot be confirmed, and no high-accuracy parabolic detection result is obtained.
When at least one of the first parabolic confidence level and the second parabolic confidence level is greater than or equal to the second confidence level threshold, it is indicated that at least one parabolic curve is reliable. At the moment, verification is not needed by combining the distance, and the parabolic detection result which best accords with the parabolic behavior of the target scene can be determined only according to the confidence coefficient. Specifically, it is determined that a parabolic behavior occurs in the target scene, and a curve with the maximum parabolic confidence in the first parabolic curve and the second parabolic curve is used as a parabolic curve corresponding to the parabolic behavior. For example, if the first parabolic confidence corresponding to the first parabolic curve is greater than the second confidence threshold and the first parabolic confidence is greater than the second parabolic confidence, the first parabolic curve may be used as the finally determined parabolic curve, and the first parabolic curve may be used as the expression of the parabolic behavior detected in the target scene.
If the first and second parabolic confidences are both less than or equal to the first confidence threshold, or one of them is less than or equal to the first confidence threshold while the other is greater than the first confidence threshold but less than the second, then no sufficiently reliable parabolic curve exists. In that case no parabolic detection fusion result is output, or a prompt message reporting an error is output.
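The two-threshold decision logic described above can be sketched as follows. This is an illustrative reading of the text, not the patent's implementation; `curve_distance` is a caller-supplied hypothetical function that measures how far apart the two curves are in a common coordinate system:

```python
def fuse_detections(curve_cam, conf_cam, curve_lidar, conf_lidar,
                    t_low, t_high, dist_threshold, curve_distance):
    """Returns (parabola_detected, chosen_curve):
    - any confidence >= t_high: trust the higher-confidence curve directly;
    - both confidences in (t_low, t_high): require the curves to
      cross-validate by lying within dist_threshold of each other;
    - otherwise: no reliable detection."""
    if conf_cam >= t_high or conf_lidar >= t_high:
        return True, curve_cam if conf_cam >= conf_lidar else curve_lidar
    if t_low < conf_cam < t_high and t_low < conf_lidar < t_high:
        if curve_distance(curve_cam, curve_lidar) < dist_threshold:
            return True, curve_cam      # either matched curve may represent the event
    return False, None
```

Note that the distance check is only consulted in the medium-credibility band; a single high-confidence curve short-circuits the mutual verification.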
The parabolic detection method provided in the above embodiment comprises: acquiring a camera-based parabolic detection result and a lidar-based parabolic detection result for a target scene; and fusing the two results to obtain a parabolic detection fusion result for the target scene. Because each of the two detection results comprises a parabolic curve and a confidence, the confidences help improve the accuracy of the fusion result. The scheme exploits the advantages of both sensors and enlarges the coverage of parabolic detection: it not only compensates for the blind spots of manual monitoring, but also reduces patrol workload, improves supervision efficiency in parabolic detection scenarios, and lowers patrol and management costs.
The parabolic detection method described above mentions that when the first parabolic confidence of the first parabolic curve and the second parabolic confidence of the second parabolic curve are both greater than the first confidence threshold and less than the second confidence threshold, the distance between the two curves must be obtained. However, since the first parabolic curve lies in a two-dimensional coordinate system while the second lies in a three-dimensional coordinate system, the two curves must first be brought into the same coordinate system for the distance to be measured accurately. The embodiments of the present application therefore provide another parabolic detection method, described below with reference to the embodiments and the accompanying drawings.
Referring to fig. 2, a flowchart of another method for detecting a parabola according to the embodiment of the present application is shown. As shown in fig. 2, the parabola detection method includes:
step 201: and acquiring a parabolic detection result based on a camera and a parabolic detection result based on the laser radar for the target scene.
The camera-based parabolic detection result comprises a first parabolic curve, a first parabolic confidence corresponding to the first parabolic curve and a first parabolic time. Wherein the first parabolic time is the earliest parabolic time corresponding to the first parabolic curve. The parabolic detection result based on the laser radar comprises a second parabolic curve and a second parabolic confidence coefficient corresponding to the second parabolic curve, and further comprises a second parabolic time. Wherein the second parabolic time is the earliest parabolic time corresponding to the second parabolic curve.
Step 202: and carrying out combined calibration on the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud.
In this embodiment, jointly calibrating the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud means computing the conversion relationship between the two coordinate systems from the position of the camera in three-dimensional space, the position of the lidar in three-dimensional space, the mapping between pixels in the camera image and points in three-dimensional space, and so on. In this way, a transformation matrix from the coordinate system of the three-dimensional point cloud to the coordinate system of the two-dimensional image is finally obtained.
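Assuming a standard pinhole camera model, the joint calibration result can be represented as a 3x4 projection matrix composed from the camera intrinsics K and the lidar-to-camera extrinsics (R, t). This composition is a common-practice sketch, not a detail given in the patent:

```python
import numpy as np

def build_projection_matrix(K, R, t):
    """Compose a 3x4 matrix mapping lidar-frame 3-D points to image pixels:
    P = K @ [R | t], where (R, t) bring lidar coordinates into the camera
    frame and K holds the camera intrinsics obtained by calibration."""
    Rt = np.hstack([R, t.reshape(3, 1)])    # 3x4 extrinsic matrix [R | t]
    return K @ Rt
```
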
Step 203: judging whether the first parabolic confidence coefficient and the second parabolic confidence coefficient are both larger than a first confidence coefficient threshold and smaller than a second confidence coefficient threshold, if so, entering step 204; if not, step 209 is entered.
Step 204: and mapping the second parabolic curve to the coordinate system of the two-dimensional image by using a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain the distance between the first parabolic curve and the mapped second parabolic curve.
The second parabolic curve can be mapped to the coordinate system of the two-dimensional image according to the expression of the second parabolic curve and by combining the transformation matrix obtained in step 202, so that the mapped second parabolic curve and the first parabolic curve are in the same coordinate system. And further, the method is beneficial to obtaining more accurate curve distance.
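One hedged way to realize the mapping and distance measurement above is to sample points along each curve, project the lidar-frame samples into the image plane with the transformation matrix, and average nearest-point distances. The function names and the particular distance definition are illustrative assumptions rather than the patent's specification:

```python
import numpy as np

def project_points(P, pts3d):
    """Apply a 3x4 transformation/projection matrix P to Nx3 lidar-frame
    points, returning Nx2 pixel coordinates after the homogeneous divide."""
    homo = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    uvw = homo @ P.T
    return uvw[:, :2] / uvw[:, 2:3]

def curve_distance(pts2d_cam, pts3d_lidar, P):
    """Mean distance from each projected lidar-curve sample to its nearest
    camera-curve sample: one simple way to compare the two curves."""
    proj = project_points(P, pts3d_lidar)
    d = np.linalg.norm(proj[:, None, :] - pts2d_cam[None, :, :], axis=2)
    return float(d.min(axis=1).mean())
```

The resulting scalar can then be compared against the preset distance threshold in step 205.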
Step 205: and judging whether the distance between the first parabolic curve and the mapped second parabolic curve is smaller than a preset distance threshold, if so, entering the step 206, and if not, entering the step 211.
The distance between the first parabolic curve and the mapped second parabolic curve is obtained in step 204, and may be compared with a preset distance threshold in this step in order to determine whether the two parabolic curves can verify each other. The smaller the distance, the more likely it is to indicate a match between the two parabolic curves, and thus that the parabolic curves detected by the two different sensors (camera and lidar) are very close to each other, see step 206. On the contrary, it indicates that the two curves cannot be verified mutually, and at this time, the occurrence of the parabolic behavior cannot be determined, and since the confidence degrees of the two parabolic curves do not reach the second confidence degree threshold, a curve capable of expressing the parabolic behavior cannot be determined therefrom, see step 211.
Step 206: and determining that the first parabolic curve is matched with the mapped second parabolic curve, judging that the parabolic behavior occurs in the target scene, and taking the first parabolic curve or the second parabolic curve as a parabolic curve corresponding to the parabolic behavior.
Wherein the first parabolic curve is used as an expression of parabolic behavior in the two-dimensional image coordinate system; the second parabolic curve is used as a representation of parabolic behavior under the three-dimensional point cloud coordinate system. This step 206 is equivalent to achieving the fusion of the parabolic curves.
Step 207: the earliest parabolic time of the first parabolic time and the second parabolic time is taken as the fused parabolic time.
This step 207 is equivalent to achieving a fusion to the parabolic time. Since the first parabolic time is the earliest parabolic time corresponding to the first parabolic curve and the second parabolic time is the earliest parabolic time corresponding to the second parabolic curve, the earliest parabolic time of the first parabolic time and the second parabolic time is the earliest occurrence time recognized by the two sensors of the camera and the lidar for the parabolic behavior. By capturing the earliest time of the parabolic behavior, the method is beneficial to giving out an alarm prompt in time, and reduces the probability that the life and property are damaged due to the parabolic behavior.
Step 208: and taking the maximum confidence coefficient of the first parabolic confidence coefficient and the second parabolic confidence coefficient as the fused parabolic confidence coefficient.
This step 208 achieves the fusion of the confidences of the parabolic curves. As mentioned above, the first parabolic curve and the second parabolic curve match, that is, they have successfully verified each other. The curve with the higher confidence thus corroborates the one with the lower confidence. Therefore, the maximum of the two confidences can be used as the fused parabolic confidence, representing the confidence of the verified parabolic curve.
Through steps 206 to 208 above, once the two parabolic curves verify each other and a matching result is obtained, the fusion of the parabolic curve, the parabolic time, and the parabolic confidence is completed. The parabolic detection fusion result comprises these three fused parts.
Step 209: and judging whether at least one of the first parabolic confidence coefficient and the second parabolic confidence coefficient is greater than or equal to the second confidence coefficient threshold value, if so, entering the step 210, and if not, entering the step 211.
Step 210: and when the first parabolic confidence coefficient and/or the second parabolic confidence coefficient is/are larger than or equal to the second confidence coefficient threshold value, judging that the parabolic behavior occurs in the target scene, and taking the curve with the maximum parabolic confidence coefficient in the first parabolic curve and the second parabolic curve as the parabolic curve corresponding to the parabolic behavior.
Step 211: no parabolic behavior is determined to occur.
In the above embodiment, verification of the parabolic curves is achieved by comparing their confidences and by comparing the distance between them, and when the verification succeeds, the parabolic curve, the parabolic time, and the parabolic confidence are fused. Because both the camera and the lidar participate in detecting the parabolic behavior, the detection capability is enhanced, the accuracy of the result is improved, and false detections and missed detections become less likely. Meanwhile, real-time, efficient detection and timely output of the fusion result can assist the inspection of the target scene, improve supervision efficiency, and reduce the probability of accidents. For example, physical interception may be performed after the parabolic detection fusion result is obtained.
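The decision logic of steps 205 to 211 can be summarized in a short sketch. The threshold values below are illustrative placeholders, and using the camera curve as the fused curve in the matched case is one of the two options step 206 allows:

```python
# Decision logic of steps 205-211; thresholds are illustrative placeholders.
CONF_LOW, CONF_HIGH, DIST_MAX = 0.3, 0.8, 15.0

def fuse(cam, lidar, distance):
    """cam / lidar: dicts with 'curve', 'conf', 'time' from each detector.
    Returns a fusion result dict, or None when no parabolic behaviour is
    determined (step 211)."""
    c1, c2 = cam["conf"], lidar["conf"]
    # Steps 209-210: either sensor alone is confident enough.
    if c1 >= CONF_HIGH or c2 >= CONF_HIGH:
        best = cam if c1 >= c2 else lidar
        return {"curve": best["curve"], "conf": max(c1, c2), "time": best["time"]}
    # Steps 205-206: both moderately confident; require mutual verification
    # via the distance between the curves in the image coordinate system.
    if c1 > CONF_LOW and c2 > CONF_LOW and distance < DIST_MAX:
        return {"curve": cam["curve"],                   # step 206: either curve may be used
                "conf": max(c1, c2),                     # step 208: fused confidence
                "time": min(cam["time"], lidar["time"])} # step 207: earliest time
    return None
```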
In the above embodiments, before the fusion is performed, the camera-based and lidar-based parabolic detection results need to be obtained. Exemplary implementations are described below.
(I) Obtaining the camera-based parabolic detection result may include:
obtaining a two-dimensional image of a target scene shot by a camera;
performing motion detection on the two-dimensional image to generate a track image;
performing curve detection on the track image, and fitting to obtain a first parabolic curve;
and judging the parabola of the first parabolic curve to obtain a first parabolic confidence corresponding to the first parabolic curve.
The motion detection of the two-dimensional image to generate the track image may include:
detecting a motion area included in the two-dimensional image, and configuring corresponding time information for the motion area according to the sequence of the two-dimensional image in the target video stream;
and generating a track image corresponding to the moving target based on the moving areas corresponding to the same moving target and included in the two-dimensional images of the multiple frames.
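The two steps above might be sketched as follows, assuming simple frame differencing as the motion detector (the embodiment does not specify which motion detection algorithm is used) and stamping each moving pixel with its frame index as the time information:

```python
import numpy as np

def trajectory_image(frames, diff_thresh=25):
    """Accumulate per-frame motion regions into one trajectory image.
    Each pixel stores the index (time) of the last frame in which it moved;
    0 means the pixel never moved. frames: equally sized grayscale arrays
    in temporal order."""
    traj = np.zeros(frames[0].shape, dtype=np.int32)
    for t in range(1, len(frames)):
        moved = np.abs(frames[t].astype(int) - frames[t - 1].astype(int)) > diff_thresh
        traj[moved] = t  # stamp moving pixels with their time information
    return traj
```

A production system would instead use background modelling and blob tracking to separate multiple moving targets; this sketch only shows how the time-stamped trajectory image is accumulated.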
The curve detection is performed on the track image, and the fitting is performed to obtain a first parabolic curve, which may include:
and fitting a first parabolic curve corresponding to the moving target according to the moving pixel points in each moving area in the track image. This process can be subdivided into:
forming a foreground point set by using the moving pixel points in each moving area in the track image; sampling foreground points in the foreground point set to obtain a sample set; the samples in the sample set comprise a first preset number of foreground points; fitting a curve corresponding to the sample according to the position of each foreground point included in the sample; determining an adaptive value corresponding to the sample according to the relative position between each foreground point in the foreground point set and a curve corresponding to the sample; and determining a first parabolic curve according to the adaptive value corresponding to each sample in the sample set.
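The sample-and-score scheme above resembles a RANSAC-style fit. A sketch under that assumption, with the inlier count standing in for the "adaptive value" (the embodiment does not define the adaptive value precisely):

```python
import numpy as np

def fit_parabola_ransac(points, n_iters=200, inlier_tol=2.0, seed=0):
    """Fit y = a*x^2 + b*x + c to foreground points (N, 2) by repeatedly
    sampling 3 points (the minimum that determines a parabola), fitting a
    candidate curve, and scoring it by its inlier count.
    Returns (coefficients, inlier count of the best candidate)."""
    rng = np.random.default_rng(seed)
    x, y = points[:, 0], points[:, 1]
    best_coeffs, best_score = None, -1
    for _ in range(n_iters):
        idx = rng.choice(len(points), size=3, replace=False)  # draw a sample
        coeffs = np.polyfit(x[idx], y[idx], 2)                # candidate curve
        resid = np.abs(np.polyval(coeffs, x) - y)             # point-to-curve residuals
        score = int((resid < inlier_tol).sum())               # "adaptive value"
        if score > best_score:
            best_coeffs, best_score = coeffs, score
    return best_coeffs, best_score
```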
The parabolic judgment on the first parabolic curve to obtain a first parabolic confidence corresponding to the first parabolic curve may include:
determining, by a parabolic behavior recognition model, a confidence level of the first parabolic curve corresponding to a parabolic behavior according to the position information and the time information corresponding to each of the plurality of motion pixel points on the first parabolic curve; the time information corresponding to the motion pixel point is the time information corresponding to the motion area to which the motion pixel point belongs; determining whether the moving object corresponds to parabolic behavior according to the confidence.
(II) Obtaining the lidar-based parabolic detection result may include:
obtaining three-dimensional point cloud of a target scene detected by a laser radar;
preprocessing the three-dimensional point cloud to obtain queue point cloud data;
carrying out parabola detection by using the queue point cloud data, and fitting to obtain a second parabolic curve;
and judging the second parabolic curve to obtain a second parabolic confidence corresponding to the second parabolic curve.
Performing parabolic detection using the queue point cloud data and fitting to obtain the second parabolic curve, and judging the second parabolic curve to obtain the corresponding second parabolic confidence, are similar to the operations in part (I) above and are therefore not repeated here; please refer to the description of that part.
In the following, a detailed description is made about an implementation manner of the step of preprocessing the three-dimensional point cloud to obtain the queue point cloud data.
First, a region of interest and a non-key region may be set within the area of the target scene monitored by the laser radar. The region of interest is a region requiring attention: point cloud data falling inside it is highly useful for detecting parabolic curves and identifying parabolic behavior. The non-key region can be ignored: point cloud data falling inside it contributes very little to detecting parabolic curves and identifying parabolic behavior. Then, the points in each frame of point cloud data of the three-dimensional point cloud are screened with a preset effective condition, and points that do not satisfy it are discarded. The preset effective condition is that a point lies within the region of interest and outside the non-key region. That is, for the three-dimensional point cloud, points located within the region of interest and outside the non-key region are retained as valid points, and all other points are discarded. Only valid points thus remain after the screening.
The valid points collected within a preset time length may be regarded as "one frame"; for example, every 20 ms of valid point cloud data is taken as a frame. The remaining points in each frame of point cloud data are then added to a target queue, frame by frame, in time order. The length of the target queue may be set based on prior knowledge, for example to a length L, where L is a positive integer. If the number of frames to be added exceeds L, the frames that entered the target queue earliest are removed according to the first-in-first-out principle. The result is a target queue of length L, and the data in it is called queue point cloud data.
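The preprocessing just described can be sketched as follows. The axis-aligned boxes for the region of interest and the non-key region, the queue length, and the framing are all illustrative assumptions; a real deployment would configure them per scene:

```python
from collections import deque

import numpy as np

# Assumed axis-aligned boxes (xmin, ymin, zmin, xmax, ymax, zmax);
# placeholders for the per-scene region of interest and non-key region.
ROI = (0.0, 0.0, 0.0, 10.0, 10.0, 10.0)
NON_KEY = (4.0, 4.0, 0.0, 6.0, 6.0, 10.0)
L = 50                    # queue length set from prior knowledge

queue = deque(maxlen=L)   # first-in-first-out: oldest frame drops out

def in_box(pts, box):
    lo, hi = np.array(box[:3]), np.array(box[3:])
    return np.all((pts >= lo) & (pts <= hi), axis=1)

def push_frame(pts):
    """Keep only the valid points of one frame (inside the ROI, outside the
    non-key region), then append the frame to the target queue."""
    queue.append(pts[in_box(pts, ROI) & ~in_box(pts, NON_KEY)])
```

`deque(maxlen=L)` directly implements the first-in-first-out removal: appending a frame when the queue already holds L frames silently drops the earliest one.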
Based on the parabola detection method provided by the foregoing embodiment, correspondingly, the present application further provides a parabola detection system. The following describes an implementation of the parabolic detection system with reference to the embodiments and the accompanying drawings.
Exemplary System
Referring to fig. 3, the figure is a schematic structural diagram of a parabolic detection system according to an embodiment of the present application. As shown in fig. 3, a parabolic detection system 300 includes: camera 301, laser radar 302, processor 303;
a camera 301 for capturing a two-dimensional image of a target scene;
a laser radar 302 for detecting a three-dimensional point cloud of a target scene;
a processor 303, configured to obtain a parabolic detection result based on the camera 301 for the target scene according to the two-dimensional image, and obtain a parabolic detection result based on the laser radar 302 for the target scene according to the three-dimensional point cloud; a parabolic detection result based on the camera 301 and a parabolic detection result based on the laser radar 302 are fused to obtain a parabolic detection fusion result of the target scene;
the results of the camera 301-based parabolic detection include: a first parabolic curve identified based on a two-dimensional image captured by the camera 301 and a first parabolic confidence corresponding to the first parabolic curve; the parabolic detection results based on lidar 302 include: and identifying a second parabolic curve and a second parabolic confidence corresponding to the second parabolic curve based on the three-dimensional point cloud detected by the laser radar 302.
Because the two parabolic detection results each comprise a parabolic curve and a confidence, the confidences can help improve the accuracy of the fusion result. The parabolic detection system 300 exploits the advantages of both sensors and broadens the coverage of parabolic detection; it not only compensates for the monitoring gaps that manual surveillance is prone to, but also reduces patrol workload, improves management efficiency in the parabolic detection scene, and lowers patrol and management costs.
In one possible implementation, the processor 303 is specifically configured to:
when the first parabolic confidence coefficient and the second parabolic confidence coefficient are both larger than a first confidence coefficient threshold and smaller than a second confidence coefficient threshold, obtaining the distance between the first parabolic curve and the second parabolic curve; the first confidence threshold is less than the second confidence threshold;
when the distance is smaller than a preset distance threshold value, determining that the first parabolic curve and the second parabolic curve are matched, judging that parabolic behavior occurs in the target scene, and taking the first parabolic curve or the second parabolic curve as a parabolic curve corresponding to the parabolic behavior.
In one possible implementation, the processor 303 is specifically configured to:
and mapping the second parabolic curve to the coordinate system of the two-dimensional image by using a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain the distance between the first parabolic curve and the mapped second parabolic curve.
In one possible implementation, the camera-based parabolic detection result further includes: a first parabolic time, wherein the first parabolic time is an earliest parabolic time corresponding to the first parabolic curve; the lidar based parabolic detection result further comprises: a second parabolic time, wherein the second parabolic time is an earliest parabolic time corresponding to the second parabolic curve;
the processor 303 is further configured to:
and taking the earliest parabolic time in the first parabolic time and the second parabolic time as the fused parabolic time.
In one possible implementation, the processor 303 is further configured to:
and taking the maximum confidence coefficient of the first parabolic confidence coefficient and the second parabolic confidence coefficient as the fused parabolic confidence coefficient.
In one possible implementation, the processor 303 is further configured to:
and when the first parabolic confidence coefficient and/or the second parabolic confidence coefficient is/are larger than or equal to the second confidence coefficient threshold value, judging that the parabolic behavior occurs in the target scene, and taking the curve with the maximum parabolic confidence coefficient in the first parabolic curve and the second parabolic curve as the parabolic curve corresponding to the parabolic behavior.
In one possible implementation, the processor 303 is further configured to:
and carrying out combined calibration on the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud.
In one possible implementation, the processor 303 is specifically configured to:
obtaining a two-dimensional image of the target scene captured by the camera;
performing motion detection on the two-dimensional image to generate a track image;
performing curve detection on the track image, and fitting to obtain a first parabolic curve;
and carrying out parabolic judgment on the first parabolic curve to obtain a first parabolic confidence corresponding to the first parabolic curve.
In one possible implementation, the processor 303 is specifically configured to:
obtaining a three-dimensional point cloud of the target scene detected by the laser radar;
preprocessing the three-dimensional point cloud to obtain queue point cloud data;
carrying out parabolic detection by utilizing the queue point cloud data, and fitting to obtain the second parabolic curve;
and carrying out parabolic judgment on the second parabolic curve to obtain a second parabolic confidence corresponding to the second parabolic curve.
In one possible implementation, the processor 303 is specifically configured to:
setting an interested area and a non-key area of the laser radar in a monitoring area of the target scene;
screening points in each frame of point cloud data of the three-dimensional point cloud by using a preset effective condition, discarding the points which do not meet the preset effective condition in each frame of point cloud data, and adding the rest points in each frame of point cloud data into a target queue by using a frame as a unit according to a time sequence to form queue point cloud data;
the preset effective conditions comprise: the points are located within the region of interest and outside the non-emphasized region.
The parabolic detection system 300 further comprises:
a display 304 for obtaining the parabolic detection fusion result from the processor 303 and displaying it. By displaying the fusion result, the display 304 lets a user (for example, a person monitoring parabolic behavior in the scene) learn promptly and accurately that parabolic behavior has occurred, together with the parabolic time and the parabolic path, thereby improving management efficiency in the scene and reducing patrol and management costs.
It should be noted that, in the present specification, all the embodiments are described in a progressive manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus and system embodiments, since they are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described embodiments of the apparatus and system are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts suggested as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only one specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A method of detecting a parabola, comprising:
acquiring a camera-based parabolic detection result and a lidar-based parabolic detection result for a target scene; the camera-based parabolic detection result comprises: a first parabolic curve identified based on a two-dimensional image captured by a camera and a first parabolic confidence corresponding to the first parabolic curve; the lidar-based parabolic detection result comprises: a second parabolic curve identified based on a three-dimensional point cloud detected by a laser radar and a second parabolic confidence corresponding to the second parabolic curve;
and fusing the parabolic detection result based on the camera and the parabolic detection result based on the laser radar to obtain a parabolic detection fusion result of the target scene.
2. The method according to claim 1, wherein the fusing the camera-based parabolic detection result and the lidar-based parabolic detection result to obtain a parabolic detection fusion result for the target scene, comprises:
when the first parabolic confidence coefficient and the second parabolic confidence coefficient are both larger than a first confidence coefficient threshold and smaller than a second confidence coefficient threshold, obtaining the distance between the first parabolic curve and the second parabolic curve; the first confidence threshold is less than the second confidence threshold;
when the distance is smaller than a preset distance threshold value, determining that the first parabolic curve and the second parabolic curve are matched, judging that parabolic behavior occurs in the target scene, and taking the first parabolic curve or the second parabolic curve as a parabolic curve corresponding to the parabolic behavior.
3. The method of claim 2, wherein said obtaining a distance of said first parabolic curve from said second parabolic curve comprises:
and mapping the second parabolic curve to the coordinate system of the two-dimensional image by using a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain the distance between the first parabolic curve and the mapped second parabolic curve.
4. The method of claim 2, wherein the camera-based parabolic detection result further comprises: a first parabolic time, wherein the first parabolic time is an earliest parabolic time corresponding to the first parabolic curve; the lidar based parabolic detection result further comprises: a second parabolic time, wherein the second parabolic time is an earliest parabolic time corresponding to the second parabolic curve;
after the determining that the first parabolic curve and the second parabolic curve match, the method further comprises:
and taking the earliest parabolic time in the first parabolic time and the second parabolic time as the fused parabolic time.
5. The method of claim 2, wherein after said determining that said first parabolic curve and said second parabolic curve match, said method further comprises:
and taking the maximum confidence coefficient of the first parabolic confidence coefficient and the second parabolic confidence coefficient as the fused parabolic confidence coefficient.
6. The method of claim 2, further comprising:
and when the first parabolic confidence coefficient and/or the second parabolic confidence coefficient is/are larger than or equal to the second confidence coefficient threshold value, judging that the parabolic behavior occurs in the target scene, and taking the curve with the maximum parabolic confidence coefficient in the first parabolic curve and the second parabolic curve as the parabolic curve corresponding to the parabolic behavior.
7. The method of claim 4, further comprising:
and carrying out combined calibration on the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud to obtain a transformation matrix between the coordinate system of the two-dimensional image and the coordinate system of the three-dimensional point cloud.
8. The method of claim 1, wherein obtaining the camera-based parabolic detection result comprises:
obtaining a two-dimensional image of the target scene captured by the camera;
performing motion detection on the two-dimensional image to generate a track image;
performing curve detection on the track image, and fitting to obtain a first parabolic curve;
and carrying out parabolic judgment on the first parabolic curve to obtain a first parabolic confidence corresponding to the first parabolic curve.
9. The method of claim 1, wherein obtaining the lidar based parabolic detection result comprises:
obtaining a three-dimensional point cloud of the target scene detected by the laser radar;
preprocessing the three-dimensional point cloud to obtain queue point cloud data;
carrying out parabolic detection by utilizing the queue point cloud data, and fitting to obtain the second parabolic curve;
and carrying out parabolic judgment on the second parabolic curve to obtain a second parabolic confidence corresponding to the second parabolic curve.
10. The method of claim 9, wherein the pre-processing the three-dimensional point cloud to obtain queue point cloud data comprises:
setting an interested area and a non-key area of the laser radar in a monitoring area of the target scene;
screening points in each frame of point cloud data of the three-dimensional point cloud by using a preset effective condition, discarding the points which do not meet the preset effective condition in each frame of point cloud data, and adding the rest points in each frame of point cloud data into a target queue by using a frame as a unit according to a time sequence to form queue point cloud data;
the preset effective conditions comprise: the points are located within the region of interest and outside the non-emphasized region.
11. A parabolic detection system, comprising: a camera, a laser radar, a processor;
the camera is used for shooting a two-dimensional image of the target scene;
the laser radar is used for detecting the three-dimensional point cloud of the target scene;
the processor is used for obtaining a camera-based parabolic detection result of the target scene according to the two-dimensional image and obtaining a laser radar-based parabolic detection result of the target scene according to the three-dimensional point cloud; fusing the parabolic detection result based on the camera and the parabolic detection result based on the laser radar to obtain a parabolic detection fusion result of the target scene;
the camera-based parabolic detection result comprises: a first parabolic curve identified based on the two-dimensional image captured by the camera and a first parabolic confidence corresponding to the first parabolic curve; the lidar-based parabolic detection result comprises: a second parabolic curve identified based on the three-dimensional point cloud detected by the laser radar and a second parabolic confidence corresponding to the second parabolic curve.
12. The system of claim 11, further comprising:
a display for obtaining the parabolic detection fusion result from the processor; and displaying the fusion result of the parabolic detection.
CN202110206045.4A 2021-02-24 2021-02-24 Parabolic detection method and system Pending CN112926446A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110206045.4A CN112926446A (en) 2021-02-24 2021-02-24 Parabolic detection method and system

Publications (1)

Publication Number Publication Date
CN112926446A true CN112926446A (en) 2021-06-08

Family

ID=76171558

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110206045.4A Pending CN112926446A (en) 2021-02-24 2021-02-24 Parabolic detection method and system

Country Status (1)

Country Link
CN (1) CN112926446A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108509918A (en) * 2018-04-03 2018-09-07 中国人民解放军国防科技大学 Target detection and tracking method fusing laser point cloud and image
CN109444911A (en) * 2018-10-18 2019-03-08 哈尔滨工程大学 A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion
CN111340797A (en) * 2020-03-10 2020-06-26 山东大学 Laser radar and binocular camera data fusion detection method and system
CN112102409A (en) * 2020-09-21 2020-12-18 杭州海康威视数字技术股份有限公司 Target detection method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109887281B (en) Method and system for monitoring traffic incident
CN102693603B (en) Dual spectrum based intelligent monitoring system for forest fire prevention
Semertzidis et al. Video sensor network for real-time traffic monitoring and surveillance
US9520040B2 (en) System and method for real-time 3-D object tracking and alerting via networked sensors
CN110084165B (en) Intelligent identification and early warning method for abnormal events in open scene of power field based on edge calculation
CN109471128B (en) Positive sample manufacturing method and device
CN102833478B (en) Fault-tolerant background model
CN113671480A (en) Radar and video fusion traffic target tracking method, system, equipment and terminal
CN103646250B (en) Pedestrian monitoring method and device based on distance image head and shoulder features
CN111045000A (en) Monitoring system and method
CN112068111A (en) Unmanned aerial vehicle target detection method based on multi-sensor information fusion
CN111985365A (en) Straw burning monitoring method and system based on target detection technology
CN111091098A (en) Training method and detection method of detection model and related device
CN101569194A (en) Network surveillance system
CN113096158A (en) Moving object identification method and device, electronic equipment and readable storage medium
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN115083088A (en) Railway perimeter intrusion early warning method
CN115909223A (en) Method and system for matching WIM system information with monitoring video data
CN111382610B (en) Event detection method and device and electronic equipment
CN114415173A (en) Fog-penetrating target identification method for high-robustness laser-vision fusion
CN109708659B (en) Distributed intelligent photoelectric low-altitude protection system
CN112926446A (en) Parabolic detection method and system
CN105303825A (en) Violating inclined side parking evidence obtaining device and method
CN114912536A (en) Target identification method based on radar and double photoelectricity
CN110930362B (en) Screw safety detection method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination