CN113095345A - Data matching method and device and data processing equipment

Data matching method and device and data processing equipment

Info

Publication number
CN113095345A
Authority
CN
China
Prior art keywords
track
detection result
radar
video
tracks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010017517.7A
Other languages
Chinese (zh)
Inventor
张兆宇
底欣
王乐菲
田军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to CN202010017517.7A priority Critical patent/CN113095345A/en
Priority to JP2020200956A priority patent/JP2021111364A/en
Publication of CN113095345A publication Critical patent/CN113095345A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/251Fusion techniques of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present application provide a data matching method, a data matching apparatus, and a data processing device. The data matching method includes: determining the track, in the radar coordinate system, of each object in the radar detection result of the current frame and the track, in the video coordinate system, of each object in the video detection result of the current frame; matching the tracks of unmatched objects in the radar detection result of the current frame with the tracks of unmatched objects in the video detection result of the current frame according to the relevant information of the tracks; and recording the data of the radar object and the data of the video object corresponding to each pair of matched tracks as data of the same object.

Description

Data matching method and device and data processing equipment
Technical Field
The present application relates to the field of information technologies, and in particular, to a data matching method and apparatus, and a data processing device.
Background
To achieve intelligent traffic monitoring, millimeter-wave radars and cameras are deployed, for example on roads or at intersections, to detect vehicles, pedestrians and other objects. When an object is within the detection range of both the radar and the camera, both sensors detect it. It is therefore necessary to fuse the radar data and the camera data together so that the system can perform further analysis.
It should be noted that the above background description is only for the convenience of clear and complete description of the technical solutions of the present application and for the understanding of those skilled in the art. Such solutions are not considered to be known to the person skilled in the art merely because they have been set forth in the background section of the present application.
Disclosure of Invention
The inventors have found that, before data fusion can take place, objects in the radar data must be matched with objects in the video. In conventional methods, this matching is performed manually, which is very time-consuming.
In order to solve the above problems or other similar problems, embodiments of the present application provide a data matching method, apparatus, and data processing device.
According to an aspect of the embodiments of the present application, there is provided a data matching method, where the method includes:
determining the track of an object in the radar detection result of the current frame under the radar coordinate system and the track of the object in the video detection result of the current frame under the video coordinate system;
matching the track of the unmatched object in the radar detection result of the current frame with the track of the unmatched object in the video detection result of the current frame according to the relevant information of the tracks;
and recording the data of the radar object and the data of the video object corresponding to each matching track as the data of the same object.
According to another aspect of the embodiments of the present application, there is provided a data matching apparatus, wherein the apparatus includes:
the determining unit is used for determining the track of the object in the radar detection result of the current frame under the radar coordinate system and the track of the object in the video detection result of the current frame under the video coordinate system;
the matching unit is used for matching the track of the unmatched object in the radar detection result of the current frame with the track of the unmatched object in the video detection result of the current frame according to the relevant information of the track;
and the recording unit is used for recording the data of the radar object and the data of the video object corresponding to each matching track as the data of the same object.
According to still another aspect of the embodiments of the present application, there is provided a data processing apparatus, wherein the data processing apparatus includes the aforementioned data matching device.
One of the beneficial effects of the embodiments of the present application lies in: the tracks of objects in the radar detection result and the tracks of objects in the video detection result are determined from the two detection results, and the radar detection result and the video detection result are fused by matching these tracks.
Specific embodiments of the present application are disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the application may be employed. It should be understood that the embodiments of the present application are not so limited in scope. The embodiments of the application include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments, in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising", when used herein, specifies the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
Elements and features described in one drawing or one implementation of an embodiment of the application may be combined with elements and features shown in one or more other drawings or implementations. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views, and may be used to designate corresponding parts for use in more than one embodiment.
The accompanying drawings, which are included to provide a further understanding of the embodiments of the application, are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 is a schematic diagram of a data matching method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of one example of determining a trajectory ID for the radar trajectory of FIG. 1;
FIG. 3 is a schematic diagram of one example of FIG. 1 determining a track ID for a video track;
FIG. 4 is a schematic illustration of determining a track in an ith frame;
FIG. 5 is a schematic illustration of matching a radar track and a video track;
FIG. 6 is a schematic diagram of one example of matching in FIG. 5;
FIG. 7 is a schematic diagram of one example of matching a radar track and a video track;
FIG. 8 is a schematic diagram of another example of matching a radar track and a video track;
FIG. 9 is another schematic illustration of matching a radar track and a video track;
FIG. 10 is a schematic diagram of one example of matching in FIG. 9;
FIG. 11 is a schematic diagram of a data matching apparatus according to an embodiment of the present application;
fig. 12 is a schematic diagram of a data processing apparatus according to an embodiment of the present application.
Detailed Description
The foregoing and other features of the present application will become apparent from the following description, taken in conjunction with the accompanying drawings. In the description and drawings, particular embodiments of the application are disclosed in detail as being indicative of some of the embodiments in which the principles of the application may be employed, it being understood that the application is not limited to the described embodiments, but, on the contrary, is intended to cover all modifications, variations, and equivalents falling within the scope of the appended claims.
In the embodiments of the present application, the terms "first", "second", and the like are used for distinguishing different elements by reference, but do not denote a spatial arrangement, a temporal order, or the like of the elements, and the elements should not be limited by the terms. The term "and/or" includes any and all combinations of one or more of the associated listed terms. The terms "comprising," "including," "having," and the like, refer to the presence of stated features, elements, components, and do not preclude the presence or addition of one or more other features, elements, components, and elements.
In the embodiments of the present application, the singular forms "a" and "an" include the plural forms and should be construed broadly as "a kind of" or "a type of" rather than being limited to the meaning of "one"; furthermore, the term "the" should be understood to include both the singular and the plural, unless the context clearly dictates otherwise. Further, the term "according to" should be understood as "at least partially according to", and the term "based on" should be understood as "based at least partially on", unless the context clearly dictates otherwise.
When a radar and a camera are used to detect objects on a road, the data is recorded frame by frame. Each frame of the radar detection result contains the objects detected by the radar; each frame of the video detection result contains the objects detected by the camera. Radar detection provides the speed and position of an object, while video detection can identify whether the object is a vehicle or a pedestrian and attributes such as the color and brand of a vehicle. In the radar coordinate system, the coordinates of an object (simply referred to as a radar object) are represented as (x, y); in the video coordinate system, the coordinates (pixel position) of an object (simply referred to as a video object) are represented as (u, v).
According to the embodiments of the present application, the relationship between the two coordinate systems can be found, so that it can be determined which object in the radar detection result and which object in the video detection result correspond to the same object in the real world. For example, from the data of the radar and the camera, information about the road can be obtained, making traffic more intelligent.
Various embodiments of the present application will be described below with reference to the drawings. These embodiments are merely exemplary and are not intended to limit the embodiments of the present application.
First aspect of the embodiments
A first aspect of an embodiment of the present application provides a data matching method.
Fig. 1 is a schematic diagram of an example of a data matching method according to an embodiment of the present application, please refer to fig. 1, where the method includes:
101: determining the track of an object in the radar detection result of the current frame under the radar coordinate system and the track of the object in the video detection result of the current frame under the video coordinate system;
102: matching the track of the unmatched object in the radar detection result of the current frame with the track of the unmatched object in the video detection result of the current frame according to the relevant information of the tracks;
103: and recording the data of the radar object and the data of the video object corresponding to each matching track as the data of the same object.
In the embodiment of the present application, the tracks of objects in the radar detection result and the tracks of objects in the video detection result are determined from the respective detection results; by matching these tracks, the conversion relation between the radar coordinate system and the video coordinate system is obtained, and the radar detection result and the video detection result are fused.
As shown in fig. 1, in 101, in time-aligned frames of the radar and the camera, a track of an object in the radar detection result in the radar coordinate system and a track of an object in the video detection result in the video coordinate system are determined, that is, a track ID is assigned to the detected track. For convenience of explanation, in the embodiment of the present application, a frame to be processed in time-aligned frames of the radar and the camera is referred to as a current frame.
Fig. 2 is a schematic diagram of determining a trajectory of an object in a radar detection result in a radar coordinate system, taking that a current frame is an ith frame as an example, as shown in fig. 2, the method includes:
201: for each object (denoted as object j) in the current frame (the i-th frame), searching, according to the radar coordinates of the objects, for the object (denoted as object k) with the minimum distance to object j in at least one frame (for example, n frames) before the i-th frame;
202: comparing the distance (namely the distance between the object j and the object k) with a preset threshold (denoted as d);
203: if the distance is smaller than the threshold d, assigning the track ID of object k to object j; otherwise, assigning a new track ID to object j.
As shown in fig. 2, in 201, the distance between object j and object k may be calculated from the radar detection results in the radar coordinate system. For example, assume that the coordinates of object 1 in the radar detection result are (x_1, y_1) and the coordinates of object 2 in the radar detection result are (x_2, y_2); then the distance between object 1 and object 2 can be calculated by the following formula:

d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}

The distance d represents the straight-line distance between object 1 and object 2 in the radar coordinate system.
As shown in fig. 2, in 202, the preset threshold d may be set according to actual needs, which is not limited in the present application.
As shown in fig. 2, in 203, if the distance is smaller than the threshold d, that is, the distance between the object j and the object k is small enough, it indicates that the object j and the object k may be the same object in the real world, and then the trajectory ID of the object k is assigned to the object j; on the contrary, if the distance is not less than the threshold d, that is, the distance between the object j and the object k is relatively large, it indicates that the object j and the object k may not be the same object in the real world, and a new trajectory ID is assigned to the object j.
Fig. 3 is a schematic diagram of determining a trajectory of an object in a video detection result in a video coordinate system, and still taking the current frame as the ith frame as an example, as shown in fig. 3, the method includes:
301: for each object (denoted as object j) in the current frame (the i-th frame), searching, according to the video coordinates of the objects, for the object (denoted as object k) with the smallest distance to object j in at least one frame (for example, n frames) before the i-th frame;
302: comparing the distance (namely the distance between the object j and the object k) with a preset threshold (denoted as d);
303: if the distance is smaller than the threshold d, assigning the track ID of object k to object j; otherwise, assigning a new track ID to object j.
As shown in fig. 3, in 301, the distance between object j and object k may be calculated from the video detection results in the video coordinate system. For example, assume that the pixel coordinates of object 1 in the video detection result are (u_1, v_1) and the pixel coordinates of object 2 in the video detection result are (u_2, v_2); then the distance between object 1 and object 2 can be calculated by the following formula:

D = \sqrt{(u_1 - u_2)^2 + (v_1 - v_2)^2}

The distance D represents the pixel distance between object 1 and object 2 in the video coordinate system.
As shown in fig. 3, in 302, the preset threshold d may be set according to actual needs, which is not limited in the present application.
As shown in fig. 3, in 303, if the pixel distance is smaller than the threshold d, that is, the pixel distance between the object j and the object k is small enough, it indicates that the object j and the object k may be the same object in the real world, and then the trajectory ID of the object k is assigned to the object j; on the contrary, if the pixel distance is not less than the threshold d, that is, the pixel distance between the object j and the object k is relatively large, it indicates that the object j and the object k may not be the same object in the real world, and then a new trajectory ID is assigned to the object j.
FIG. 4 is a schematic diagram of determining tracks in the i-th frame. As shown in FIG. 4, for each object j in the i-th frame, the nearest object k is found in its previous n frames; if the distance between k and j is smaller than the preset threshold d, the track ID of k is assigned to j; otherwise, a new track ID is assigned to j.
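A minimal Python sketch of this track-ID assignment (FIGS. 2-4) is given below. It is an illustration under assumed data structures, not the patent's own implementation: each detection is taken to be a dict with a "pos" tuple holding (x, y) radar coordinates or (u, v) pixel coordinates, and receives a "track_id" field; the function name and the next_id counter are likewise hypothetical.

import math

def assign_track_ids(current_frame, previous_frames, d_threshold, next_id):
    """Assign a track ID to every object j in the current frame (FIG. 4)."""
    for obj_j in current_frame:
        # Search the previous n frames for the object k nearest to object j.
        nearest_k, nearest_dist = None, float("inf")
        for frame in previous_frames:
            for obj_k in frame:
                dist = math.dist(obj_j["pos"], obj_k["pos"])
                if dist < nearest_dist:
                    nearest_k, nearest_dist = obj_k, dist
        # If k is close enough, j inherits k's track ID (likely the same
        # real-world object); otherwise j starts a new track.
        if nearest_k is not None and nearest_dist < d_threshold:
            obj_j["track_id"] = nearest_k["track_id"]
        else:
            obj_j["track_id"] = next_id
            next_id += 1
    return next_id

The same routine serves both the radar tracks of FIG. 2 and the video tracks of FIG. 3, since only the coordinate system behind "pos" differs.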
As shown in fig. 1, in 102, the related information of the track may be the direction and the number of the track, or the direction, the number, and the speed of the track, which is not limited herein.
Fig. 5 is a schematic diagram of an example of matching the trajectory of an object according to the direction and number of the trajectory, as shown in fig. 5, the method including:
501: determining the direction and the number of the tracks of the objects in the radar detection result and the direction and the number of the tracks of the objects in the video detection result;
502: and matching the track of the unmatched object in the radar detection result with the track of the unmatched object in the video detection result according to the direction and the number of the tracks.
As shown in fig. 5, in 501, the direction of the track of an object in the radar detection result (referred to as a radar track) is either toward the radar or away from the radar, and may be determined from the average velocity direction, for example, from the average velocity direction of the object within a predetermined time window tw. The specific determination method is not limited in the present application; the current frame may be the last frame in the predetermined time window tw, that is, a period of time tw is traced back from the current frame, and the direction of the track of the object in the radar detection result of the current frame is determined from the average velocity direction of the object within that period.
Similarly, in 501, the direction of the track of an object in the video detection result (referred to as a video track) is either toward the camera or away from the camera, and may be determined from the positions of the object in the first and last frames of the predetermined time window tw. The present application does not limit the specific determination method; the current frame may likewise be the last frame in the predetermined time window tw, that is, a period of time tw is traced back from the current frame, and the direction of the track of the object in the video detection result of the current frame is determined from the positions of the object in the first frame and the last frame (the current frame) of that period.
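As a sketch of these two direction rules, the code below assumes each radar track stores per-frame velocity vectors (vx, vy) and each video track stores per-frame pixel positions (u, v) over the window tw; the sign conventions (vy < 0 meaning motion toward the radar, v increasing meaning motion toward the camera) are illustrative assumptions, since the patent fixes no particular convention.

def radar_track_direction(velocities):
    """Direction of a radar track from its average velocity over tw."""
    avg_vy = sum(vy for _, vy in velocities) / len(velocities)
    return "toward" if avg_vy < 0 else "away"

def video_track_direction(positions):
    """Direction of a video track from its first and last positions in tw."""
    (_, v_start), (_, v_end) = positions[0], positions[-1]
    return "toward" if v_end > v_start else "away"

Normalizing both results to "toward"/"away" (relative to the respective sensor) lets radar and video directions be compared directly in the matching step.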
As shown in fig. 5, at 502, radar tracks and video tracks having the same track direction may be matched in a sorted order of the positions of the tracks. Fig. 6 is a schematic diagram of matching a radar track and a video track with the same track direction according to the sequence of the positions of the tracks, as shown in fig. 6, the method includes:
601: sorting the tracks (called urt) of unmatched objects in the radar detection result and the tracks (called uvt) of unmatched objects in the video detection result, respectively, according to the positions of the tracks;
602: if the numbers of urt and uvt with the same track direction are equal, matching the urt and uvt with the same track direction in sorted order;
603: if the numbers of urt and uvt with the same track direction are not equal, considering that the urt and uvt with that track direction cannot be matched.
As shown in fig. 6, in 601, the radar tracks and the video tracks may be sorted based on the position of each track in the respective coordinate system, and may be sorted according to a left-to-right sequence or a far-to-near sequence, which is not limited in this application.
As shown in fig. 6, if the number of unmatched radar tracks (urt) and unmatched video tracks (uvt) with the same track direction are equal in 602, each track may be matched in a sorted order. For example, the leftmost radar track in the radar coordinate system matches the leftmost video track in the video coordinate system, the second left radar track in the radar coordinate system matches the second left video track in the video coordinate system, and so on.
As shown in fig. 6, in 603, if the numbers of unmatched radar tracks (urt) and unmatched video tracks (uvt) with the same track direction are not equal, the urt and uvt with that track direction are considered unmatched.
In some embodiments, as shown in fig. 6, the method may further include:
604: and storing the matched tracks into the matched track group, and storing the unmatched tracks into the unmatched track group.
As shown in fig. 6, in 604, the tracks are stored by the matched track group and the unmatched track group, so that the matching condition of the tracks can be confirmed, and the matching operation of the tracks of the next frame can be performed accordingly.
Fig. 7 is a schematic diagram of cases in which the number of radar tracks is the same as the number of video tracks. As shown in fig. 7, in the first scenario, the number of radar tracks and the number of video tracks are both 1 and their directions are the same, so the two tracks are matched with each other and stored in the matched track group. In the second scenario, the number of radar tracks and the number of video tracks are both 2 and, after sorting, the numbers of radar tracks and video tracks with the same track direction are also equal; the left radar track a is matched with the left video track a, the right radar track b is matched with the right video track b, and tracks a and b are stored in the matched track group. In the third scenario, the number of radar tracks and the number of video tracks are both 2, but after sorting, the numbers of radar tracks and video tracks with the same track direction differ, so the radar tracks and video tracks cannot be matched, and tracks a and b are stored in the unmatched track group.
Fig. 8 is a schematic diagram of cases in which the number of radar tracks differs from the number of video tracks. As shown in fig. 8, in the first scenario, the number of radar tracks is 2 and the number of video tracks is 1; after sorting, the tracks labeled a have the same direction and equal counts, so radar track a and video track a are matched with each other and stored in the matched track group, while track b is unmatched and stored in the unmatched track group. In the second scenario, the number of radar tracks is 2 and the number of video tracks is 1; after sorting, the numbers of radar tracks and video tracks with the same track direction differ, so the radar tracks and video tracks cannot be matched, and tracks a and b are stored in the unmatched track group.
In the embodiment of the present application, after a track ID is assigned to each track according to 101, it is known which tracks were matched and which were not in the previous frame. According to the method of the embodiment of the present application, the unmatched radar tracks (urt) and unmatched video tracks (uvt) are found in each frame and sorted by track position. If the numbers of urt and uvt with the same track direction are equal, the tracks are matched in sorted order.
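The sketch below illustrates this matching rule (FIGS. 5-8). It is parameterized by the sort key so that the same routine also covers the speed-ordered variant described later; the track dicts with "direction" (normalized to "toward"/"away" as in the earlier sketch), "position" and "speed" fields are assumptions for illustration.

def match_tracks(urt, uvt, key):
    """Match unmatched radar tracks (urt) to unmatched video tracks (uvt)."""
    matched, unmatched = [], []
    for direction in ("toward", "away"):
        r = sorted((t for t in urt if t["direction"] == direction),
                   key=lambda t: t[key])
        v = sorted((t for t in uvt if t["direction"] == direction),
                   key=lambda t: t[key])
        if len(r) == len(v):
            # Equal counts: pair the tracks in sorted order.
            matched.extend(zip(r, v))
        else:
            # Unequal counts: tracks in this direction cannot be matched.
            unmatched.extend(r + v)
    return matched, unmatched

For the position-ordered matching of FIG. 6, this would be called as match_tracks(urt, uvt, key="position").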
Fig. 9 is a schematic diagram of an example of matching the trajectory of an object according to the direction, number and speed of the trajectory, as shown in fig. 9, the method comprising:
901: determining the direction, the quantity and the speed of the track of the object in the radar detection result and the direction, the quantity and the speed of the track of the object in the video detection result;
902: and matching the track of the unmatched object in the radar detection result with the track of the unmatched object in the video detection result according to the direction, the number and the speed of the tracks.
As shown in fig. 9, in 901, similar to the example of fig. 5, the direction of the trajectory of the object in the radar detection result may be a direction toward the radar or a direction away from the radar, and for example, the direction of the trajectory of the object in the radar detection result may be determined according to the average speed direction of the object within a predetermined time window tw; the direction of the trajectory of the object in the video detection result may be a direction toward the camera or a direction away from the camera, and the direction of the trajectory of the object in the video detection result may be determined, for example, according to the positions of the object in the last frame and the first frame within the predetermined time window tw. Here, the current frame may be the last frame within the predetermined time window tw.
As shown in fig. 9, in 901, the speed of the track of an object in the radar detection result may be determined, for example, from the average speed of the object within a predetermined time window tw. The present application does not limit the specific determination method; the current frame may be the last frame in the predetermined time window tw, that is, a period of time tw is traced back from the current frame. Within that period, the radar measures the speed of the object in each frame, and the average of these speeds is taken as the speed of the track; that is, the speed of the track of the object in the radar detection result of the current frame is determined from the average speed of the object within that period.
As shown in fig. 9, in 901, the speed of the track of an object in the video detection result may be determined from the positions of the object in the first and last frames of the predetermined time window tw. The present application does not limit the specific determination method; the current frame may be the last frame in the predetermined time window tw, that is, a period of time tw is traced back from the current frame. Within that period, let the position of the object in the first frame be (u_start, v_start) and the position of the object in the last frame be (u_end, v_end); if the pixel distance between these two positions is D, the speed of the track of the object in the current frame can be represented as D/tw, where D can be calculated by the following formula:

D = \sqrt{(u_{end} - u_{start})^2 + (v_{end} - v_{start})^2}
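A small sketch of these two speed definitions follows, assuming the radar supplies a per-frame speed reading and the video track stores its (u, v) positions over the window; the names and structures are illustrative only.

import math

def radar_track_speed(per_frame_speeds):
    """Average of the speeds measured by the radar in each frame of tw."""
    return sum(per_frame_speeds) / len(per_frame_speeds)

def video_track_speed(positions, tw):
    """Pixel distance between the first and last frame of tw, divided by tw."""
    D = math.dist(positions[0], positions[-1])
    return D / tw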
as shown in fig. 9, at 902, radar tracks and video tracks having the same track direction may be matched in a sorted order of the speed of the tracks. Fig. 10 is a schematic diagram of matching a radar track and a video track having the same track direction in a sorted order of the speed of the track, as shown in fig. 10, the method comprising:
1001: sorting the tracks (urt) of unmatched objects in the radar detection result and the tracks (uvt) of unmatched objects in the video detection result, respectively, according to the speed of the tracks;
1002: if the numbers of urt and uvt with the same track direction are equal, matching the urt and uvt with the same track direction in sorted order;
1003: if the numbers of urt and uvt with the same track direction are not equal, considering that the urt and uvt with that track direction cannot be matched.
As shown in fig. 10, in 1001, the radar tracks and the video tracks may each be sorted by track speed, for example from fastest to slowest or from slowest to fastest, which is not limited in the present application.
As shown in fig. 10, in 1002, if the numbers of unmatched radar tracks (urt) and unmatched video tracks (uvt) with the same track direction are equal, the tracks may be matched in the sorted order of speed. For example, the fastest radar track in the radar coordinate system is matched with the fastest video track in the video coordinate system, the second-fastest radar track with the second-fastest video track, and so on.
As shown in fig. 10, in 1003, if the numbers of unmatched radar tracks (urt) and unmatched video tracks (uvt) with the same track direction are not equal, the urt and uvt with that track direction are considered unmatched.
In some embodiments, as shown in fig. 10, the method may further comprise:
1004: and storing the matched tracks into the matched track group, and storing the unmatched tracks into the unmatched track group.
As shown in fig. 10, in 1004, the tracks are stored by the matched track group and the unmatched track group, so that the matching condition of the tracks can be confirmed, and the matching operation of the tracks of the next frame can be performed accordingly.
In the embodiment of the present application, after a track ID is assigned to each track according to 101, it is known which tracks were matched and which were not in the previous frame. According to the method of the embodiment of the present application, the unmatched radar tracks (urt) and unmatched video tracks (uvt) are found in each frame and sorted by track speed. If the numbers of urt and uvt with the same track direction are equal, the tracks are matched in sorted order.
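Reusing the hypothetical match_tracks sketch shown for the position-ordered variant, the speed-ordered matching of FIGS. 9-10 changes only the sort key (assuming urt and uvt hold the current frame's unmatched tracks, each with a "speed" field):

# Sort by track speed instead of track position.
matched, unmatched = match_tracks(urt, uvt, key="speed")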
As shown in fig. 1, after the radar track and the video track are matched, the data of the same object may be stored in association at 103. In some embodiments, the data of the same object may be recorded as [(x, y), (u, v)], where (x, y) is the position of the object detected in the radar coordinate system and (u, v) is the pixel coordinates of the object in the video coordinate system.
The present application does not limit the recording manner of the data, and for example, the data of the same object may be stored in the manner as shown in table 1 below, or may be stored in the manner as shown in table 2 below.
Table 1:

frame id | object id | radar coordinates | pixel coordinates
n        | m         | (x_mn, y_mn)      | (u_mn, v_mn)

In Table 1, the data is divided into four columns: the first column represents the id of the video frame in which the object appears; the second column represents the object id; the third column represents the radar coordinates of the object, denoted (x_mn, y_mn); and the fourth column represents the pixel coordinates of the object, denoted (u_mn, v_mn). Here m denotes the id of the object and n denotes the video frame id.
Table 2:

frame id | matched object data
n        | [(x_mn, y_mn), (u_mn, v_mn)], ...

In Table 2, the data is divided into two columns: the first column represents the frame id of the video, and the second column represents the object data after the radar is matched with the image, expressed as [(x_mn, y_mn), (u_mn, v_mn)], where (x_mn, y_mn) is the radar coordinates and (u_mn, v_mn) is the pixel coordinates; m denotes the id of the object and n denotes the video frame id.
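As a sketch of how the fused records of Table 2 might be assembled, the code below assumes matched_pairs holds (radar_track, video_track) pairs whose per-frame data are aligned; the field names are illustrative, not the patent's.

def build_fused_records(matched_pairs):
    """Group [(x, y), (u, v)] records of matched objects by video frame id."""
    fused = {}  # frame id -> list of [(x, y), (u, v)] records
    for radar_trk, video_trk in matched_pairs:
        per_frame = zip(radar_trk["frames"], radar_trk["coords"],
                        video_trk["pixels"])
        for frame_id, (x, y), (u, v) in per_frame:
            fused.setdefault(frame_id, []).append([(x, y), (u, v)])
    return fused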
It should be noted that fig. 1 to 3, fig. 5 to 6, and fig. 9 to 10 only schematically illustrate embodiments of the present application, but the present application is not limited thereto. For example, the order of execution of various operations may be appropriately adjusted, and other operations may be added or some of the operations may be subtracted. Those skilled in the art can make appropriate modifications in light of the above disclosure, without being limited to the descriptions of fig. 1 to 3, fig. 5 to 6, and fig. 9 to 10 described above.
It should be noted that the above description only describes each operation or process related to the present application, but the present application is not limited thereto. The method may also comprise other operations or procedures, reference being made to the prior art with regard to specific content of these operations or procedures.
According to the embodiments of the present application, the matching of radar tracks and video tracks can be performed automatically. In addition, whereas the traditional method needs extra time to match radar tracks and video tracks before the sensors are deployed, the method provided by the embodiments of the present application can be used immediately after the sensors are deployed on a road, which improves processing efficiency.
Second aspect of the embodiments
A second aspect of the embodiments of the present application provides a data matching apparatus, where the data matching apparatus corresponds to the data matching method of the first aspect of the embodiments, and a description of the same contents is not repeated.
Fig. 11 is a schematic diagram of a data matching apparatus according to an embodiment of the present application. As shown in fig. 11, the data matching apparatus 1100 according to the embodiment of the present application includes: a determination unit 1101, a matching unit 1102 and a recording unit 1103.
In this embodiment of the application, the determining unit 1101 is configured to determine a trajectory of an object in a radar detection result in a radar coordinate system of a current frame and a trajectory of an object in a video detection result in a video coordinate system of the current frame; the matching unit 1102 is configured to match the track of the object that is not matched in the radar detection result of the current frame with the track of the object that is not matched in the video detection result of the current frame according to the relevant information of the tracks; the recording unit 1103 is configured to record data of the radar object and data of the video object corresponding to each matching track as data of the same object.
In some embodiments, as shown in fig. 11, the matching unit 1102 includes a first determining unit 11021 and a first matching unit 11022, where the first determining unit 11021 is configured to determine the direction and the number of the trajectories of the objects in the radar detection result and the direction and the number of the trajectories of the objects in the video detection result; the first matching unit 11022 is configured to match the track of the object that is not matched in the radar detection result with the track of the object that is not matched in the video detection result according to the direction and the number of the tracks.
In some embodiments, the first matching unit 11022 matches the track of the object that is not matched in the radar detection result with the track of the object that is not matched in the video detection result, including:
sorting the tracks (urt) of unmatched objects in the radar detection result and the tracks (uvt) of unmatched objects in the video detection result, respectively, according to the positions of the tracks;
if the numbers of urt and uvt with the same track direction are equal, matching the urt and uvt with the same track direction in sorted order;
if the numbers of urt and uvt with the same track direction are not equal, considering that the urt and uvt with that track direction cannot be matched.
In some embodiments, the first matching unit 11022 also stores matched tracks into matched track groups and unmatched tracks into unmatched track groups.
In the embodiment of the application, the track direction of the object in the radar detection result is a direction towards the radar or a direction away from the radar; the direction of the track of the object in the video detection result is a direction towards the camera or a direction away from the camera.
In some embodiments, the first determination unit 11021 determines the direction of the trajectory of the object in the radar detection result from the average velocity direction within the predetermined time window tw. Also, in some embodiments, the first determining unit 11021 determines the direction of the trajectory of the object in the video detection result according to the positions of the object in the last frame and the first frame within the predetermined time window tw. The current frame is the last frame in the predetermined time window tw.
In some embodiments, as shown in fig. 11, the matching unit 1102 includes a second determining unit 11023 and a second matching unit 11024, where the second determining unit 11023 is configured to determine the direction, the number, and the speed of the trajectory of the object in the radar detection result and the direction, the number, and the speed of the trajectory of the object in the video detection result; the second matching unit 11024 is configured to match the track of the object that is not matched in the radar detection result with the track of the object that is not matched in the video detection result according to the direction, number, and speed of the tracks.
In some embodiments, the second matching unit 11024 matches the track of the object that is not matched in the radar detection result with the track of the object that is not matched in the video detection result, including:
sorting the tracks (urt) of unmatched objects in the radar detection result and the tracks (uvt) of unmatched objects in the video detection result, respectively, according to the speed of the tracks;
if the numbers of urt and uvt with the same track direction are equal, matching the urt and uvt with the same track direction in sorted order;
if the numbers of urt and uvt with the same track direction are not equal, considering that the urt and uvt with that track direction cannot be matched.
In some embodiments, the second matching unit 11024 also stores matched tracks into matched track groups and unmatched tracks into unmatched track groups.
In the embodiment of the application, the track direction of the object in the radar detection result is a direction towards the radar or a direction away from the radar; the direction of the track of the object in the video detection result is a direction towards the camera or a direction away from the camera.
In some embodiments, the second determination unit 11023 determines the direction of the trajectory of the object in the radar detection result from the average speed direction of the object within the predetermined time window tw. Also, in some embodiments, the second determining unit 11023 determines the direction of the trajectory of the object in the video detection result according to the positions of the object in the last frame and the first frame within the predetermined time window tw. The current frame is the last frame in the predetermined time window tw.
In some embodiments, the second determination unit 11023 determines the velocity of the trajectory of the object in the radar detection result from the average velocity of the object within the predetermined time window tw. Also, in some embodiments, the second determining unit 11023 determines the velocity of the trajectory of the object in the video detection result according to the position of the object in the last frame and the first frame within the predetermined time window tw. The current frame is the last frame in the predetermined time window tw.
In some embodiments, the determining unit 1101 determines the trajectory of the object in the radar detection result in the radar coordinate system of the current frame, including:
for each object j in the current frame, searching an object k with the minimum distance to the object j in at least one frame before the current frame according to the radar coordinates of the object;
comparing the distance with a preset threshold value;
if the distance is smaller than the threshold, assigning the track ID of object k to object j; otherwise, assigning a new track ID to object j.
In some embodiments, the determining unit 1101 determines the trajectory of the object in the video detection result in the video coordinate system of the current frame, including:
for each object j in the current frame, according to the video coordinates of the object, searching an object k with the minimum distance from the object j in at least one frame before the current frame;
comparing the distance with a preset threshold value;
if the distance is smaller than the threshold, assigning the track ID of object k to object j; otherwise, assigning a new track ID to object j.
In some embodiments, the recording unit 1103 records the data of the same object as [ (x, y), (u, v) ], where (x, y) is the position of the object detected in the radar coordinate system and (u, v) is the pixel coordinates of the object in the video coordinate system.
It should be noted that the above description only describes the components or modules related to the present application, but the present application is not limited thereto. The data matching apparatus 1100 may also include other components or modules, and reference may be made to the related art regarding the specific contents of the components or modules.
According to the embodiments of the present application, the matching of radar tracks and video tracks can be performed automatically. In addition, whereas the traditional method needs extra time to match radar tracks and video tracks before the sensors are deployed, the apparatus provided by the embodiments of the present application can be used immediately after the sensors are deployed on a road, which improves processing efficiency.
Third aspect of the embodiments
A third aspect of embodiments of the present application provides a data processing device, which may be, for example, a computer, a server, a workstation, a laptop, a smartphone, or the like; the embodiments of the present application are not limited thereto.
Fig. 12 is a schematic diagram of a data processing device according to an embodiment of the present application. As shown in fig. 12, the data processing device 1200 may include at least one interface (not shown), a processor (e.g., a central processing unit (CPU)) 1201, and a memory 1202 coupled to the processor 1201. The memory 1202 may store various data, such as preset values and predetermined conditions, and also stores a data matching program 1203, which is executed under the control of the processor 1201.
In an embodiment, the functions of the data matching apparatus 1100 described in the second aspect of the embodiment may be integrated into the processor 1201, so as to implement the data matching method described in the first aspect of the embodiment. For example, the processor 1201 may be configured to:
determining the track of an object in the radar detection result of the current frame under the radar coordinate system and the track of the object in the video detection result of the current frame under the video coordinate system;
matching the track of the unmatched object in the radar detection result of the current frame with the track of the unmatched object in the video detection result of the current frame according to the relevant information of the tracks;
and recording the data of the radar object and the data of the video object corresponding to each matching track as the data of the same object.
In another embodiment, the data matching apparatus 1100 according to the second aspect of the embodiment may be configured separately from the processor 1201, for example, the data matching apparatus 1100 may be configured as a chip connected to the processor 1201, and the function of determining the data matching apparatus 1100 may be implemented by the control of the processor 1201.
It is noted that the data processing device 1200 may also include a display 1205 and an I/O device 1204, or may not necessarily include all of the components shown in fig. 12, such as a camera and radar (not shown) for acquiring input image frames; the data processing device 1200 may also comprise components not shown in fig. 12, which can be referred to in the prior art.
In the present embodiment, the processor 1201, which is sometimes referred to as a controller or operational control, may include a microprocessor or other processor device and/or logic device, and the processor 1201 receives input and controls the operation of the various components of the data processing apparatus 1200.
In the present embodiment, the memory 1202 may be, for example, one or more of a buffer, a flash memory, a hard drive, a removable media, a volatile memory, a non-volatile memory, or other suitable device. Various information may be stored, and programs for executing the information may be stored. And the processor 1201 may execute the program stored in the memory 1202 to realize information storage or processing or the like. The functions of other parts are similar to the prior art and are not described in detail here. The components of the data processing apparatus 1200 may be implemented in dedicated hardware, firmware, software, or combinations thereof, without departing from the scope of the present application.
The data processing equipment of the embodiment of the application can automatically process the matching process of the radar track and the video track, and improves the processing efficiency.
Embodiments of the present application also provide a computer readable program, where the program, when executed in a data processing apparatus, causes the data processing apparatus to perform the method of the first aspect of the embodiments.
Embodiments of the present application further provide a storage medium storing a computer-readable program, where the computer-readable program enables a data processing apparatus to execute the method of the first aspect of the embodiments.
The above apparatus and method of the present application may be implemented by hardware, or may be implemented by hardware in combination with software. The present application relates to a computer-readable program which, when executed by a logic component, enables the logic component to implement the above-described apparatus or constituent components, or to implement various methods or steps described above. The present application also relates to a storage medium such as a hard disk, a magnetic disk, an optical disk, a DVD, a flash memory, or the like, for storing the above program.
The methods/apparatus described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. One or more of the functional block diagrams and/or one or more combinations of the functional block diagrams illustrated in the figures may correspond to individual software modules of the computer program flow or may correspond to individual hardware modules. These software modules may correspond to various steps shown in the figures, respectively. These hardware modules may be implemented, for example, by solidifying these software modules using a Field Programmable Gate Array (FPGA).
A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. A storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium; or the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The software module may be stored in the memory of the mobile terminal or in a memory card that is insertable into the mobile terminal. For example, if the device (e.g., mobile terminal) employs a relatively large capacity MEGA-SIM card or a large capacity flash memory device, the software module may be stored in the MEGA-SIM card or the large capacity flash memory device.
One or more of the functional blocks and/or one or more combinations of the functional blocks described in the figures can be implemented as a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof designed to perform the functions described herein. One or more of the functional blocks and/or one or more combinations of the functional blocks described in connection with the figures may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP communication, or any other such configuration.
The present application has been described in conjunction with specific embodiments, but it should be understood by those skilled in the art that these descriptions are intended to be illustrative, and not limiting. Various modifications and adaptations of the present application may occur to those skilled in the art based on the spirit and principles of the application and are within the scope of the application.
Regarding the above-described embodiments disclosed in the embodiments of the present application, the following remarks are also disclosed:
1. a data matching method, wherein the method comprises:
determining the track of an object in the radar detection result of the current frame under the radar coordinate system and the track of the object in the video detection result of the current frame under the video coordinate system;
matching the track of the unmatched object in the radar detection result of the current frame with the track of the unmatched object in the video detection result of the current frame according to the relevant information of the tracks;
and recording the data of the radar object and the data of the video object corresponding to each matching track as the data of the same object.
2. The method according to supplementary note 1, wherein matching a track of an object that is not matched in a radar detection result of a current frame with a track of an object that is not matched in a video detection result of the current frame according to the relevant information of the tracks, comprises:
determining the direction and the number of the tracks of the objects in the radar detection result and the direction and the number of the tracks of the objects in the video detection result;
and matching the track of the unmatched object in the radar detection result with the track of the unmatched object in the video detection result according to the direction and the number of the tracks.
3. The method according to supplementary note 2, wherein matching a trajectory of an object that is not matched in the radar detection result with a trajectory of an object that is not matched in the video detection result includes:
sorting the tracks (urt) of unmatched objects in the radar detection result and the tracks (uvt) of unmatched objects in the video detection result, respectively, according to the positions of the tracks;
if the numbers of urt and uvt with the same track direction are equal, matching the urt and uvt with the same track direction in sorted order;
if the numbers of urt and uvt with the same track direction are not equal, considering that the urt and uvt with that track direction cannot be matched.
4. The method according to supplementary note 3, wherein matching a trajectory of an object that is not matched in the radar detection result with a trajectory of an object that is not matched in the video detection result, further comprises:
and storing the matched tracks into the matched track group, and storing the unmatched tracks into the unmatched track group.
5. The method according to supplementary note 2, wherein,
the direction of the track of an object in the radar detection result is a direction toward the radar or a direction away from the radar;
and the direction of the track of an object in the video detection result is a direction toward the camera or a direction away from the camera.
6. The method according to supplementary note 2 or 5, wherein,
determining the direction of the track of an object in the radar detection result comprises: determining the direction of the track of the object in the radar detection result according to the direction of the average speed of the object within a predetermined time window tw;
and determining the direction of the track of an object in the video detection result comprises: determining the direction of the track of the object in the video detection result according to the positions of the object in the first frame and the last frame within the predetermined time window tw,
wherein the current frame is the last frame within the predetermined time window tw.
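A sketch of the two direction tests of this note. The sign convention for radar radial speed and the image-coordinate convention for the camera are assumptions that depend on the actual sensor installation; they are not fixed by the disclosure.

import statistics

def radar_track_direction(radial_speeds):
    # Radar side of note 6: the sign of the average speed within the window tw
    # decides the direction; negative is assumed to mean "toward the radar".
    return "approaching" if statistics.mean(radial_speeds) < 0 else "receding"

def video_track_direction(points):
    # Video side of note 6: compare the object positions in the first and last
    # frames of tw; a growing image v-coordinate is assumed to mean
    # "toward the camera".
    (_, v_first), (_, v_last) = points[0], points[-1]
    return "approaching" if v_last > v_first else "receding"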
7. The method according to supplementary note 1, wherein the matching of the track of an unmatched object in the radar detection result of the current frame with the track of an unmatched object in the video detection result of the current frame according to the relevant information of the tracks comprises:
determining the direction, number, and speed of tracks of objects in the radar detection result and the direction, number, and speed of tracks of objects in the video detection result;
and matching the track of an unmatched object in the radar detection result with the track of an unmatched object in the video detection result according to the directions, numbers, and speeds of the tracks.
8. The method according to supplementary note 7, wherein the matching of the track of an unmatched object in the radar detection result with the track of an unmatched object in the video detection result comprises:
sorting the tracks (urt) of unmatched objects in the radar detection result and the tracks (uvt) of unmatched objects in the video detection result, respectively, according to the speeds of the tracks;
if the numbers of urt and uvt having the same track direction are equal, matching the urt and uvt having the same track direction in the sorted order;
and if the numbers of urt and uvt having the same track direction are not equal, regarding the urt and uvt having that track direction as unable to be matched.
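Under the same assumptions, the speed-sorted variant of this note can reuse the match_by_direction sketch from note 3, changing only the sorting key:

# urt, uvt: lists of unmatched radar / video Track objects (see the note 1 sketch).
# abs() assumes signed speeds; the sign itself is already handled by the
# direction grouping inside match_by_direction.
matched, unmatched = match_by_direction(urt, uvt, key=lambda t: abs(t.speed))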
9. The method according to supplementary note 8, wherein the matching of the track of an unmatched object in the radar detection result with the track of an unmatched object in the video detection result further comprises:
storing matched tracks into a matched track group, and storing unmatched tracks into an unmatched track group.
10. The method according to supplementary note 7, wherein,
the direction of the track of an object in the radar detection result is a direction toward the radar or a direction away from the radar;
and the direction of the track of an object in the video detection result is a direction toward the camera or a direction away from the camera.
11. The method according to supplementary note 7 or 10, wherein,
determining the direction of the track of an object in the radar detection result comprises: determining the direction of the track of the object in the radar detection result according to the direction of the average speed of the object within a predetermined time window tw;
and determining the direction of the track of an object in the video detection result comprises: determining the direction of the track of the object in the video detection result according to the positions of the object in the first frame and the last frame within the predetermined time window tw,
wherein the current frame is the last frame within the predetermined time window tw.
12. The method according to supplementary note 7, wherein,
determining the speed of the track of an object in the radar detection result comprises: determining the speed of the track of the object in the radar detection result according to the average speed of the object within a predetermined time window tw;
and determining the speed of the track of an object in the video detection result comprises: determining the speed of the track of the object in the video detection result according to the positions of the object in the first frame and the last frame within the predetermined time window tw,
wherein the current frame is the last frame within the predetermined time window tw.
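A sketch of the two speed estimates of this note; the frame rate fps is an assumed known constant, and the video-side speed comes out in pixels per second rather than in physical units.

import math
import statistics

def radar_track_speed(speeds):
    # Radar side of note 12: average of the per-frame radar speeds within tw.
    return statistics.mean(speeds)

def video_track_speed(points, fps):
    # Video side of note 12: displacement between the object positions in the
    # first and last frames of tw, divided by the elapsed time.
    (u0, v0), (u1, v1) = points[0], points[-1]
    frames_elapsed = len(points) - 1
    return math.hypot(u1 - u0, v1 - v0) * fps / frames_elapsed if frames_elapsed else 0.0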
13. The method according to supplementary note 1, wherein determining, in the radar coordinate system, the track of an object in the radar detection result of the current frame comprises:
for each object j in the current frame, searching, according to the radar coordinates of the objects, for an object k having the minimum distance to object j in at least one frame preceding the current frame;
comparing the distance with a predetermined threshold;
and if the distance is smaller than the threshold, marking the track identifier (ID) of object j as the track ID of object k; otherwise, allocating a new track ID to object j.
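A sketch of this nearest-neighbour assignment; with video coordinates in place of radar coordinates the same routine serves note 14. The ID generator and the threshold value are illustrative.

import math
from itertools import count

_next_track_id = count(1)  # hypothetical source of fresh track IDs

def assign_track_id(position, previous_objects, threshold):
    # previous_objects: (track_id, position) pairs taken from at least one
    # frame preceding the current frame.
    best_id, best_dist = None, math.inf
    for track_id, previous_position in previous_objects:
        d = math.dist(position, previous_position)
        if d < best_dist:
            best_id, best_dist = track_id, d
    # Inherit the nearest object's track ID if it is close enough;
    # otherwise allocate a new track ID.
    return best_id if best_dist < threshold else next(_next_track_id)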
14. The method according to supplementary note 1, wherein determining, in the video coordinate system, the track of an object in the video detection result of the current frame comprises:
for each object j in the current frame, searching, according to the video coordinates of the objects, for an object k having the minimum distance to object j in at least one frame preceding the current frame;
comparing the distance with a predetermined threshold;
and if the distance is smaller than the threshold, marking the track ID of object j as the track ID of object k; otherwise, allocating a new track ID to object j.
15. The method according to supplementary note 1, wherein the data of the same object are recorded as [(x, y), (u, v)], where (x, y) is the position of the object detected in a radar coordinate system, and (u, v) are the pixel coordinates of the object in a video coordinate system.
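For example, assuming a radar position in metres and pixel coordinates in a 1280x720 video frame (all values illustrative), one fused record could look as follows.

# One object's fused data: radar position (x, y) and video pixel position (u, v).
same_object = [(12.4, 3.1), (640, 360)]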
16. A data processing apparatus, wherein the data processing apparatus comprises a data matching device configured to:
determine, in a radar coordinate system, the track of an object in the radar detection result of a current frame, and, in a video coordinate system, the track of an object in the video detection result of the current frame;
match, according to relevant information of the tracks, the track of an unmatched object in the radar detection result of the current frame with the track of an unmatched object in the video detection result of the current frame;
and record the data of the radar object and the data of the video object corresponding to each pair of matched tracks as data of the same object.

Claims (10)

1. A data matching apparatus, characterized in that the apparatus comprises:
a determining unit configured to determine, in a radar coordinate system, the track of an object in the radar detection result of a current frame and, in a video coordinate system, the track of an object in the video detection result of the current frame;
a matching unit configured to match, according to relevant information of the tracks, the track of an unmatched object in the radar detection result of the current frame with the track of an unmatched object in the video detection result of the current frame;
and a recording unit configured to record the data of the radar object and the data of the video object corresponding to each pair of matched tracks as data of the same object.
2. The apparatus of claim 1, wherein the matching unit comprises:
a first determining unit configured to determine the direction and number of tracks of objects in the radar detection result and the direction and number of tracks of objects in the video detection result;
and a first matching unit configured to match the track of an unmatched object in the radar detection result with the track of an unmatched object in the video detection result according to the directions and numbers of the tracks.
3. The apparatus according to claim 2, wherein the matching, by the first matching unit, of the track of an unmatched object in the radar detection result with the track of an unmatched object in the video detection result according to the directions and numbers of the tracks comprises:
sorting first tracks of unmatched objects in the radar detection result and second tracks of unmatched objects in the video detection result, respectively, according to the positions of the tracks;
if the numbers of first tracks and second tracks having the same track direction are equal, matching the first tracks and second tracks having the same track direction in the sorted order;
and if the numbers of first tracks and second tracks having the same track direction are not equal, regarding the first tracks and second tracks having that track direction as unable to be matched.
4. The apparatus of claim 3, wherein the first matching unit further stores matched tracks into a matched track group and unmatched tracks into an unmatched track group.
5. The apparatus according to claim 2, wherein the direction of the track of an object in the radar detection result is a direction toward the radar or a direction away from the radar, and the direction of the track of an object in the video detection result is a direction toward the camera or a direction away from the camera.
6. The apparatus of claim 2, wherein,
the first determining unit determines the direction of the track of an object in the radar detection result according to the direction of the average speed of the object within a predetermined time window;
and the first determining unit determines the direction of the track of an object in the video detection result according to the positions of the object in the first frame and the last frame within the predetermined time window,
wherein the current frame is the last frame within the predetermined time window.
7. The apparatus of claim 1, wherein the matching unit comprises:
a second determining unit configured to determine the direction, number, and speed of tracks of objects in the radar detection result and the direction, number, and speed of tracks of objects in the video detection result;
and a second matching unit configured to match the track of an unmatched object in the radar detection result with the track of an unmatched object in the video detection result according to the directions, numbers, and speeds of the tracks.
8. The apparatus of claim 1, wherein,
the determining unit determining, in the radar coordinate system, the track of the object in the radar detection result of the current frame comprises:
for each object j in the current frame, searching, according to the radar coordinates of the objects, for an object k having the minimum distance to object j in at least one frame preceding the current frame;
comparing the distance with a predetermined threshold;
and if the distance is smaller than the threshold, marking the track identifier (ID) of object j as the track ID of object k; otherwise, allocating a new track ID to object j;
and the determining unit determining, in the video coordinate system, the track of the object in the video detection result of the current frame comprises:
for each object j in the current frame, searching, according to the video coordinates of the objects, for an object k having the minimum distance to object j in at least one frame preceding the current frame;
comparing the distance with a predetermined threshold;
and if the distance is smaller than the threshold, marking the track ID of object j as the track ID of object k; otherwise, allocating a new track ID to object j.
9. The apparatus of claim 1, wherein the data of the same object are recorded as [(x, y), (u, v)], wherein (x, y) is the position of the object detected in a radar coordinate system and (u, v) are the pixel coordinates of the object in a video coordinate system.
10. A method of data matching, the method comprising:
determining, in a radar coordinate system, the track of an object in the radar detection result of a current frame, and, in a video coordinate system, the track of an object in the video detection result of the current frame;
matching, according to relevant information of the tracks, the track of an unmatched object in the radar detection result of the current frame with the track of an unmatched object in the video detection result of the current frame;
and recording the data of the radar object and the data of the video object corresponding to each pair of matched tracks as data of the same object.
CN202010017517.7A 2020-01-08 2020-01-08 Data matching method and device and data processing equipment Pending CN113095345A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010017517.7A CN113095345A (en) 2020-01-08 2020-01-08 Data matching method and device and data processing equipment
JP2020200956A JP2021111364A (en) 2020-01-08 2020-12-03 Data matching method and apparatus, and data processing apparatus

Publications (1)

Publication Number Publication Date
CN113095345A (en) 2021-07-09

Family

ID=76663381

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115144843A (en) * 2022-06-28 2022-10-04 海信集团控股股份有限公司 Fusion method and device for object positions
CN115220005B (en) * 2022-07-20 2023-04-28 中国科学院长春光学精密机械与物理研究所 Automatic target recognition method for photoelectric tracking equipment based on multi-source information fusion

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109343050A (en) * 2018-11-05 2019-02-15 浙江大华技术股份有限公司 A kind of radar video monitoring method and device
CN109459750A (en) * 2018-10-19 2019-03-12 吉林大学 A kind of more wireless vehicle trackings in front that millimetre-wave radar is merged with deep learning vision
CN109583505A (en) * 2018-12-05 2019-04-05 百度在线网络技术(北京)有限公司 A kind of object correlating method, device, equipment and the medium of multisensor
CN110515073A (en) * 2019-08-19 2019-11-29 南京慧尔视智能科技有限公司 The trans-regional networking multiple target tracking recognition methods of more radars and device
US20190391254A1 (en) * 2018-06-20 2019-12-26 Rapsodo Pte. Ltd. Radar and camera-based data fusion

Also Published As

Publication number Publication date
JP2021111364A (en) 2021-08-02

Similar Documents

Publication Publication Date Title
EP3806064B1 (en) Method and apparatus for detecting parking space usage condition, electronic device, and storage medium
CN110298300B (en) Method for detecting vehicle illegal line pressing
CN110929655B (en) Lane line identification method in driving process, terminal device and storage medium
CN110135377B (en) Method and device for detecting motion state of object in vehicle-road cooperation and server
CN110032947B (en) Method and device for monitoring occurrence of event
CN111145555A (en) Method and device for detecting vehicle violation
CN111860219B (en) High-speed channel occupation judging method and device and electronic equipment
CN112753038A (en) Method and device for identifying lane change trend of vehicle
CN113095345A (en) Data matching method and device and data processing equipment
CN113359125A (en) Data fusion method and device and data processing equipment
CN112115939A (en) Vehicle license plate recognition method and device
CN113469075A (en) Method, device and equipment for determining traffic flow index and storage medium
CN112862856A (en) Method, device and equipment for identifying illegal vehicle and computer readable storage medium
CN110889388A (en) Violation identification method, device, equipment and storage medium
CN107506753B (en) Multi-vehicle tracking method for dynamic video monitoring
CN109948436B (en) Method and device for monitoring vehicles on road
CN112447060A (en) Method and device for recognizing lane and computing equipment
CN110444026B (en) Triggering snapshot method and system for vehicle
CN112700653A (en) Method, device and equipment for judging illegal lane change of vehicle and storage medium
CN114693722B (en) Vehicle driving behavior detection method, detection device and detection equipment
CN115762153A (en) Method and device for detecting backing up
CN115965636A (en) Vehicle side view generating method and device and terminal equipment
CN109740518B (en) Method and device for determining object in video
CN112597924B (en) Electric bicycle track tracking method, camera device and server
CN111081027B (en) License plate recognition method and device, computer device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination