WO2023040137A1 - Data processing (Traitement de données) - Google Patents

Data processing

Info

Publication number
WO2023040137A1
WO2023040137A1 · PCT/CN2022/070559 · CN2022070559W
Authority
WO
WIPO (PCT)
Prior art keywords
radar
point cloud
target
transfer matrix
radars
Prior art date
Application number
PCT/CN2022/070559
Other languages
English (en)
Chinese (zh)
Inventor
黄超 (Huang Chao)
张浩 (Zhang Hao)
Original Assignee
上海仙途智能科技有限公司 (Shanghai Xiantu Intelligent Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海仙途智能科技有限公司 (Shanghai Xiantu Intelligent Technology Co., Ltd.)
Publication of WO2023040137A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • This description relates to the technical field of automatic driving, and in particular to data processing methods, devices, terminals and media.
  • Autonomous driving technology can reduce traffic accidents and improve driving safety.
  • In the related art, a high-line-count (high-beam) lidar is generally used as the main sensor of the perception system of an autonomous vehicle: laser beams are emitted in multiple directions through the high-line-count lidar, and the distribution of obstacles in the environment where the autonomous vehicle is located is then determined according to the times at which the reflected laser beams are received.
  • In view of this, this specification provides the following data processing methods, devices, terminals and media.
  • A data processing method is provided, comprising: acquiring a point cloud map of a first target scene and point cloud data obtained by scanning the first target scene with multiple radars; determining, based on the point cloud data corresponding to each radar and the point cloud map of the first target scene, a first transfer matrix corresponding to each radar, where the first transfer matrix is the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the first target scene; taking any one of the multiple radars as a reference radar and determining, based on the first transfer matrix corresponding to the reference radar, a second transfer matrix corresponding to each of the other radars, where the second transfer matrix is the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the reference radar; and performing data fusion, based on the second transfer matrices corresponding to the reference radar and the other radars, on the point cloud data obtained by scanning a second target scene with the multiple radars.
  • In some embodiments, obtaining the point cloud map of the first target scene includes: determining, based on point cloud data collected by a preset radar located in the first target scene, a motion equation and an observation equation for a target object equipped with the preset radar, where the motion equation indicates the relationship between the position of the target object and the motion data of the target object, and the observation equation indicates the relationship between the position of a preset point in the first target scene and the position of the target object; and determining, based on the motion equation and the observation equation, the position of the target object and the positions of multiple preset points in the first target scene, to obtain the point cloud map of the first target scene.
  • In some embodiments, determining the first transfer matrix corresponding to each radar includes: for any radar among the multiple radars, obtaining a target parameter value at which the radar's target parameters satisfy a set condition, the set condition being that the matching degree between the positions of the points in the radar's point cloud data and the positions of the points in the point cloud map of the first target scene is maximized; and determining, based on the target parameter value and the point cloud map of the first target scene, the first transfer matrix corresponding to the radar.
  • In some embodiments, obtaining the target parameter value at which the radar's target parameters satisfy the set condition includes: in response to a parameter value adjustment operation on a target parameter, displaying, based on the adjusted parameter value, the point cloud map corresponding to the radar's point cloud data; and, in response to a submission operation on the parameter values of the target parameters, obtaining the current parameter value of each target parameter as its target parameter value.
  • In some embodiments, the first transfer matrix includes a first rotation matrix and a first translation matrix. Determining the first transfer matrix corresponding to each radar based on the obtained point cloud data and the point cloud map of the first target scene includes: for any radar among the multiple radars, using the radar's point cloud data transformed according to the target parameter values as the intermediate point cloud data corresponding to the radar; and determining, based on the intermediate point cloud data, the point cloud map of the first target scene and a target error function, the first rotation matrix and first translation matrix that minimize the function value of the target error function, to obtain the first transfer matrix.
  • the target parameter includes at least one of roll angle, yaw angle, pitch angle, abscissa, ordinate and altitude.
  • In some embodiments, determining the second transfer matrix corresponding to the radars other than the reference radar includes: for a target radar among the radars other than the reference radar, determining the second transfer matrix corresponding to the target radar based on the first transfer matrix corresponding to the reference radar and the inverse matrix of the first transfer matrix corresponding to the target radar.
  • A data processing device is provided, comprising: an acquisition unit, configured to acquire a point cloud map of the first target scene and point cloud data obtained by scanning the first target scene with multiple radars; a first determination unit, configured to determine, based on the obtained point cloud data corresponding to each radar and the point cloud map of the first target scene, a first transfer matrix corresponding to each radar, where the first transfer matrix is the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the first target scene; a second determination unit, configured to take any radar among the multiple radars as a reference radar and determine, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to each of the other radars, where the second transfer matrix is the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the reference radar; and a data fusion unit, configured to perform, based on the second transfer matrices corresponding to the reference radar and the other radars, data fusion on the point cloud data obtained by scanning the second target scene with the multiple radars.
  • In some embodiments, when used to acquire the point cloud map of the first target scene, the acquisition unit is specifically configured to: determine, based on point cloud data collected by the preset radar located in the first target scene, the motion equation and observation equation for the target object equipped with the preset radar, where the motion equation indicates the relationship between the position of the target object and the motion data of the target object, and the observation equation indicates the relationship between the position of a preset point in the first target scene and the position of the target object; and determine, based on the motion equation and the observation equation, the position of the target object and the positions of multiple preset points in the first target scene, to obtain the point cloud map of the first target scene.
  • In some embodiments, when determining the first transfer matrix corresponding to each radar based on the acquired point cloud data and the point cloud map of the first target scene, the first determination unit includes an acquisition subunit and a determination subunit. The acquisition subunit is used to acquire, for any radar among the multiple radars, a target parameter value at which the radar's target parameters satisfy a set condition, the set condition being that the matching degree between the positions of the points in the radar's point cloud data and the positions of the points in the point cloud map of the first target scene is maximized. The determination subunit is used to determine, based on the target parameter value and the point cloud map of the first target scene, the first transfer matrix corresponding to the radar.
  • In some embodiments, when used to obtain the target parameter value at which the radar's target parameters satisfy the set condition, the acquisition subunit is specifically used to: in response to a parameter value adjustment operation on a target parameter, display, based on the adjusted parameter value, the point cloud map corresponding to the radar's point cloud data; and, in response to a submission operation on the parameter values of the target parameters, obtain the current parameter value of each target parameter as its target parameter value.
  • In some embodiments, the first transfer matrix includes a first rotation matrix and a first translation matrix. When determining the first transfer matrix corresponding to each radar based on the obtained point cloud data and the point cloud map of the first target scene, the first determination unit is specifically used to: for any radar among the multiple radars, use the radar's point cloud data transformed according to the target parameter values as the intermediate point cloud data corresponding to the radar; and determine, based on the intermediate point cloud data, the point cloud map of the first target scene and the target error function, the first rotation matrix and first translation matrix that minimize the function value of the target error function, to obtain the first transfer matrix.
  • the target parameter includes at least one of roll angle, yaw angle, pitch angle, abscissa, ordinate and altitude.
  • In some embodiments, when used to take any radar among the multiple radars as the reference radar and determine, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to the other radars, the second determination unit is specifically used to: for a target radar among the radars other than the reference radar, determine the second transfer matrix corresponding to the target radar based on the first transfer matrix corresponding to the reference radar and the inverse matrix of the first transfer matrix corresponding to the target radar.
  • A terminal is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the operations performed by the above data processing method.
  • a computer-readable storage medium is provided.
  • A program is stored on the computer-readable storage medium, and the program, when executed by a processor, performs the operations of the above data processing method.
  • a computer program product including a computer program, and when the program is executed by a processor, operations performed by the above data processing method are implemented.
  • The technical solutions provided by the embodiments of this specification may have the following beneficial effects: a point cloud map of the first target scene and point cloud data obtained by scanning the first target scene with multiple radars are acquired; based on the point cloud data corresponding to each radar and the point cloud map of the first target scene, the first transfer matrix corresponding to each radar is determined, the first transfer matrix being the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the first target scene; any one of the multiple radars is taken as a reference radar, and, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to each of the other radars is determined, the second transfer matrix being the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the reference radar, so that the data collected by each radar can be mapped into the coordinate system of the reference radar; then, based on the second transfer matrices corresponding to the reference radar and the other radars, data fusion is performed on the point cloud data obtained by scanning the second target scene with the multiple radars.
  • Fig. 1 is a flow chart of a data processing method shown in this specification according to an exemplary embodiment.
  • Fig. 2 is a schematic diagram showing a visual display result of point cloud data according to an exemplary embodiment in this specification.
  • Fig. 3 is a block diagram of a data processing device shown in this specification according to an exemplary embodiment.
  • Fig. 4 is a schematic structural diagram of a terminal shown in this specification according to an exemplary embodiment.
  • Although the terms first, second, third, etc. may be used in this specification to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this specification, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
  • the present application provides a data processing method for processing data collected by multiple radars of an automatic driving vehicle.
  • The data processing method can be performed by a terminal, and the terminal can be a vehicle-mounted or portable terminal, for example, a mobile phone (such as a high-performance mobile phone), a tablet computer, a game console, a portable computer, or a vehicle-mounted computer installed on an autonomous vehicle (such as a vehicle-mounted terminal).
  • the present application does not limit the specific type of the terminal.
  • The multiple radars are, for example, low-line-count (low-beam) lidars.
  • The detection radius of each radar can be the same or different, but the detection directions of the radars are not the same, so that the detection ranges of the radars all differ.
  • The self-driving vehicle collects point cloud data within the detection range of each radar through these multiple radars and transmits the collected point cloud data to the terminal. The terminal performs fusion processing on the point cloud data collected by the radars, so that the fused point cloud data corresponds to the same coordinate origin and positive direction. Since the point cloud data collected by each radar corresponds to a different detection range, point cloud data within a larger overall detection range can be obtained without using a high-line-count lidar, which reduces cost while still ensuring the detection range of the radars during automatic driving.
  • Figure 1 is a flow chart of a data processing method shown in this specification according to an exemplary embodiment, including the following steps:
  • Step 101: acquire a point cloud map of the first target scene and point cloud data obtained by scanning the first target scene with multiple radars.
  • In some embodiments, the first target scene is a factory scene used for factory testing of the self-driving vehicle, or the first target scene is another scene, which is not limited in this application.
  • the point cloud map of the first target scene can be established in advance, and then the established point cloud map can be stored, so that the stored point cloud map of the first target scene can be directly acquired when performing radar calibration.
  • the first target scene is scanned by a preset radar in advance to obtain point cloud data of the first target scene, and then a point cloud map of the first target scene is constructed based on the point cloud data of the first target scene.
  • the preset radar in the first target scene is a calibrated radar, that is, the coordinate origin and positive direction corresponding to the preset radar are known.
  • The point cloud map of the first target scene can be obtained through the preset radar in the following two ways. In a possible implementation, the preset radar is placed at any position in the first target scene, and the first target scene is scanned (for example, laser scanned) by the preset radar placed at that position to collect point cloud data of the first target scene.
  • In another possible implementation, the preset radar is installed on a movable object (such as a cart or trolley) that moves through the first target scene, and during the movement the preset radar installed on the movable object scans the first target scene (for example, performs laser scanning), so that point cloud data of the first target scene is collected in real time.
  • Step 102: based on the obtained point cloud data corresponding to each radar and the point cloud map of the first target scene, determine the first transfer matrix corresponding to each radar, where the first transfer matrix is the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the first target scene.
  • The first transfer matrix corresponding to any radar represents the transformation needed to convert that radar's point cloud data into the corresponding part of the point cloud map of the first target scene.
  • In some embodiments, the first transfer matrix is a 4*3 matrix, or the first transfer matrix is a 4*4 matrix, which is not limited in this application.
  • When the first transfer matrix is a 4*3 matrix, the part in rows 1-3, columns 1-3 is the rotation matrix, and the part in row 4, columns 1-3 is the translation matrix.
  • When the first transfer matrix is a 4*4 matrix, the part in rows 1-3, columns 1-3 is the rotation matrix, the part in row 4, columns 1-3 is the translation matrix, and the part in rows 1-4 of column 4 is the sequence (0, 0, 0, 1), so that the first transfer matrix is a homogeneous matrix, which is convenient for subsequent matrix transformations and improves transformation efficiency.
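  • As an illustration of the homogeneous layout described above (rotation block in rows 1-3, translation in row 4, column 4 equal to (0, 0, 0, 1)), the following is a minimal sketch in Python with NumPy; the function name and the row-vector usage are illustrative, not taken from the patent:

```python
import numpy as np

def make_transfer_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transfer matrix in the row-vector layout
    described above: rows 1-3 / columns 1-3 hold the rotation, row 4 /
    columns 1-3 the translation, and column 4 is (0, 0, 0, 1)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[3, :3] = translation
    return T

# A point written as a row vector [x, y, z, 1] is mapped with p @ T.
R = np.eye(3)                       # identity rotation for the sketch
t = np.array([1.0, 2.0, 0.5])       # translation of the radar origin
T = make_transfer_matrix(R, t)
p = np.array([0.0, 0.0, 0.0, 1.0])  # radar origin in homogeneous form
print(p @ T)                        # the radar origin lands at (1, 2, 0.5)
```

Keeping the matrix homogeneous means chained transformations reduce to matrix products, which is the efficiency benefit mentioned above.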
  • Step 103: taking any one of the multiple radars as a reference radar, determine, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to each of the other radars, where the second transfer matrix is the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the reference radar.
  • Since the first target scene may not be available during actual driving of the self-driving vehicle, after the first transfer matrix corresponding to each radar is obtained, one of the multiple radars is selected as the reference radar, and the first transfer matrices of the other radars are converted into second transfer matrices relative to the reference radar, so that the point cloud data acquired by the radars can be fused.
  • In some embodiments, the second transfer matrix is a 4*3 matrix, or the second transfer matrix is a 4*4 matrix, which is not limited in this application; the dimensions of the first transfer matrix and the second transfer matrix may be the same or different.
  • Taking the second transfer matrix being a 4*3 matrix as an example: as with the first transfer matrix, the part in rows 1-3, columns 1-3 is the rotation matrix and the part in row 4, columns 1-3 is the translation matrix. When the second transfer matrix is a 4*4 matrix, the part in rows 1-4 of column 4 is the sequence (0, 0, 0, 1).
  • For the reference radar itself, the rotation and translation described by its second transfer matrix are zero, since its point cloud data is already expressed in the reference coordinate system.
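  • Following the relation stated in the claims (the second transfer matrix of a target radar is derived from the reference radar's first transfer matrix and the inverse of the target radar's first transfer matrix), a sketch with 4*4 homogeneous matrices is given below. The column-vector convention, with the first transfer matrix taken as mapping scene coordinates into radar coordinates, is an assumption here; the variable names are illustrative:

```python
import numpy as np

def second_transfer_matrix(T1_ref: np.ndarray, T1_target: np.ndarray) -> np.ndarray:
    """Combine the reference radar's first transfer matrix with the inverse
    of the target radar's first transfer matrix, as the claims describe.
    Assumes 4x4 homogeneous matrices in the column-vector convention."""
    return T1_ref @ np.linalg.inv(T1_target)

# Hypothetical first transfer matrices for a reference radar and a target radar.
T1_ref = np.eye(4)
T1_target = np.eye(4)
T1_target[:3, 3] = [1.0, 0.0, 0.0]   # target radar frame offset 1 m in x
T2 = second_transfer_matrix(T1_ref, T1_target)

# A point at the target radar's origin, expressed in the reference radar's frame:
p_target = np.array([0.0, 0.0, 0.0, 1.0])
print(T2 @ p_target)
```

Note that applying the function to the reference radar itself yields the identity matrix, consistent with the reference radar needing no rotation or translation.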
  • Step 104: based on the second transfer matrices corresponding to the reference radar and the other radars, perform data fusion on the point cloud data obtained by scanning the second target scene with the multiple radars.
  • In some embodiments, the second target scene is any driving scene, for example, the road on which the self-driving vehicle is driving, or another scene, which is not limited in this application.
  • During data fusion, the coordinates of each point in the point cloud data collected by each radar are transformed so that all points are expressed relative to the same coordinate system, yielding point cloud data for multiple points in a common frame and thereby realizing the fusion of the point cloud data.
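  • A minimal sketch of this fusion step, assuming each radar's points form an N x 3 array and each radar has a 4x4 second transfer matrix in the column-vector convention (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def fuse_point_clouds(clouds, transfer_matrices):
    """Map every radar's points into the reference radar's frame and
    concatenate them.  `clouds` is a list of (N_i, 3) arrays; the matching
    entry of `transfer_matrices` is that radar's 4x4 second transfer matrix."""
    fused = []
    for points, T2 in zip(clouds, transfer_matrices):
        homo = np.hstack([points, np.ones((len(points), 1))])  # (N, 4)
        fused.append((homo @ T2.T)[:, :3])                     # back to (N, 3)
    return np.vstack(fused)

# Two hypothetical radars: the reference radar (identity) and one shifted 1 m in x.
T_ref = np.eye(4)
T_other = np.eye(4)
T_other[:3, 3] = [1.0, 0.0, 0.0]
cloud_ref = np.array([[0.0, 0.0, 0.0]])
cloud_other = np.array([[0.0, 0.0, 0.0]])
fused = fuse_point_clouds([cloud_ref, cloud_other], [T_ref, T_other])
print(fused)  # the second radar's origin lands at (1, 0, 0) in the reference frame
```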
  • In summary, the first transfer matrix corresponding to each radar is determined from the point cloud map of the target scene; any one of the multiple radars is then taken as the reference radar, and the second transfer matrices of the other radars are determined from the first transfer matrix of the reference radar, so that the data collected by each radar can be mapped into the reference radar's coordinate system. Based on the second transfer matrices corresponding to the reference radar and the other radars, the point cloud data obtained by scanning the second target scene with the multiple radars is fused, realizing the fusion of data obtained by multiple radars.
  • In addition, the method provided by this application enables factory calibration of the radars of self-driving vehicles, so that after the vehicles are put into use, the point cloud data of the multiple radars can be fused directly based on the determined second transfer matrices.
  • the point cloud map of the first target scene can be acquired in the following manner.
  • Step 1: based on the point cloud data collected by the preset radar located in the first target scene, determine the motion equation and observation equation for the target object equipped with the preset radar.
  • the motion equation is used to indicate the relationship between the position of the target object and the motion data of the target object
  • the observation equation is used to indicate the relationship between the position of the preset point in the first target scene and the position of the target object.
  • the target object carries a motion sensor for acquiring motion data of the target object.
  • the motion sensor is an acceleration sensor, which is used to obtain the motion acceleration of the target object, and the motion acceleration is the motion data of the target object.
  • Based on the motion data, the equation of motion of the target object is determined, for example: x_k = f(x_{k-1}, u_k, w_k) (1), where x_k represents the coordinates of the target object at the second moment, f represents an abstract function, x_{k-1} represents the coordinates of the target object at the first moment, u_k represents the motion data (such as the motion acceleration) of the target object, and w_k represents the noise present during the movement of the target object.
  • The determination process of the observation equation is as follows: a plurality of landmark preset points are set in advance in the first target scene; for example, a preset point is the position of a signboard placed in the first target scene, or the preset points are of other types, which are not limited in this application.
  • When any preset point is detected, the observation equation of the target object is determined based on the current position of the target object and the position of the detected preset point, for example: z_{k,j} = h(y_j, x_k, v_{k,j}) (2), where z_{k,j} represents the observation of the detected preset point, h represents an abstract function, y_j represents the coordinates of the position of the detected preset point, x_k represents the coordinates of the target object, and v_{k,j} represents the noise present in the observation process.
  • The equations of motion and observation involved in the above process are only two exemplary expressions; in other possible implementations, they can also be expressed by other formulas, which are not limited in this application.
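  • Purely as an illustration of the shape of these motion and observation equations (the patent leaves the functions f and h abstract; the concrete forms below are an assumption for a toy 2-D example):

```python
import numpy as np

rng = np.random.default_rng(0)

def motion(x_prev, u_k, w_k):
    """Toy instance of the motion equation: the new position x_k is the
    previous position plus the commanded displacement u_k plus noise w_k."""
    return x_prev + u_k + w_k

def observe(y_j, x_k, v_kj):
    """Toy instance of the observation equation: the landmark y_j is seen
    as its offset from the target object's position x_k, plus noise v_kj."""
    return y_j - x_k + v_kj

x = np.zeros(2)                 # target object starts at the origin
landmark = np.array([5.0, 3.0])
for _ in range(4):              # four motion steps of roughly 1 m in x
    x = motion(x, np.array([1.0, 0.0]), rng.normal(0.0, 0.01, 2))
z = observe(landmark, x, rng.normal(0.0, 0.01, 2))
print(x, z)                     # x near (4, 0); observation near (1, 3)
```

Jointly estimating the trajectory x and the landmark positions y_j from such equations is the standard SLAM formulation that the following step relies on.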
  • Step 2: based on the motion equation and observation equation of the target object, determine the position of the target object and the positions of the multiple preset points in the first target scene, to obtain a point cloud map of the first target scene.
  • In the mapping process, there are two main problems to be solved: the positioning problem and the mapping problem. Determining the equation of motion realizes the positioning of the target object, and determining the observation equation realizes the detection of the multiple preset points in the first target scene. The map is, in effect, the collection of all preset points, so by determining the position of the target object and the positions of the multiple preset points in the first target scene, the point cloud map of the first target scene can be constructed.
  • In some embodiments, determining the first transfer matrix corresponding to each radar includes: Step 1, for any radar among the multiple radars, obtaining a target parameter value at which the radar's target parameters satisfy a set condition, the set condition being that the matching degree between the positions of the points in the radar's point cloud data and the positions of the points in the point cloud map of the first target scene is maximized.
  • a parameter value adjustment control for adjusting the parameter value of the target parameter is provided on the visual interface, so that relevant technical personnel can adjust the parameter value of the target parameter by operating the parameter value adjustment control.
  • the parameter adjustment control is a slide bar, or the parameter adjustment control is an input box, etc., which are not limited in this application.
  • the target parameter includes at least one of roll angle (Roll), yaw angle (Yaw), pitch angle (Pitch), abscissa (x), ordinate (y) and height (z).
  • Each target parameter corresponds to one parameter value adjustment control.
  • Taking the target parameters including roll angle, yaw angle, pitch angle, abscissa, ordinate and height as an example, six parameter value adjustment controls are provided in the visual interface, so that the parameter values of the six parameters can be adjusted individually. In response to a parameter value adjustment operation on a target parameter, the point cloud map corresponding to the radar's point cloud data is displayed based on the adjusted parameter value, so that the result of each adjustment is shown in real time and the relevant technical personnel can see immediately whether the adjusted parameter values meet the requirements.
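  • The six target parameters fully determine a rigid transform. A sketch of turning them into a 4*4 matrix follows; the intrinsic rotation order Z-Y-X (yaw, then pitch, then roll) and the column-vector layout are assumptions here, since the patent does not fix a convention:

```python
import numpy as np

def pose_to_matrix(roll, pitch, yaw, x, y, z):
    """Build a 4x4 homogeneous transform from the six target parameters.
    Rotation order Z (yaw) * Y (pitch) * X (roll) is assumed; angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# A 90-degree yaw plus a 1 m offset in x: the point (1, 0, 0) maps to (1, 1, 0).
T = pose_to_matrix(0.0, 0.0, np.pi / 2, 1.0, 0.0, 0.0)
p = np.array([1.0, 0.0, 0.0, 1.0])
print(T @ p)
```

Each slider in the interface would adjust one of the six arguments, and the displayed point cloud would be re-rendered with the resulting matrix.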
  • the parameter value adjustment operation on the target parameter is a trigger operation on the parameter value adjustment control.
  • the trigger operation may be a click operation or a drag operation, or the trigger operation may be another type of operation, which is not limited in this application.
  • When the trigger operation is a click operation, the relevant technical personnel can directly click anywhere on the slide bar, and the terminal, in response to the click operation on the slide bar, determines the parameter value corresponding to the clicked position as the adjusted parameter value.
  • When the trigger operation is a drag operation, the relevant technical personnel can drag the slider along the slide bar, and the terminal, in response to the drag operation, determines the parameter value corresponding to the position where the drag ends as the adjusted parameter value.
  • FIG. 2 is a schematic diagram showing a visual display result of point cloud data according to an exemplary embodiment in this specification.
  • As shown in Figure 2, the point cloud map of the first target scene is displayed together with the visual display result corresponding to the point cloud data of any radar; the visualization result for the radar's point cloud data is the part shown in the rectangular box in Figure 2.
  • After the relevant technicians have adjusted the parameter values of each target parameter to meet the requirements, they trigger a submission operation in the visual interface, and the terminal, in response to the submission operation on the parameter values of the target parameters, obtains the current parameter value of each target parameter as the target parameter value of that target parameter.
  • a submit control is provided in the visual interface, for example, a GICP button shown in FIG. 2 , and a person skilled in the art may trigger the submit control to trigger a submit operation in the visual interface.
  • the terminal acquires the current parameter value of the target parameter as the target parameter value of the target parameter.
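The six target parameters (roll angle, yaw angle, pitch angle, abscissa, ordinate, height) jointly define a rigid transform. As an illustrative sketch only — the function name and the yaw-pitch-roll (Z-Y-X) rotation order are assumptions, not taken from the embodiment — the adjusted values could be assembled into a 4*4 homogeneous matrix like this:

```python
import numpy as np

def pose_to_matrix(roll, yaw, pitch, x, y, z):
    """Assemble a 4x4 homogeneous transform from the six target
    parameters (angles in radians, applied in Z-Y-X, i.e. yaw-pitch-roll, order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T
```

Each time a slider is moved, the radar's point cloud would then be re-rendered after multiplying its points (in homogeneous coordinates) by this matrix.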
  • Step 2: Determine a first transfer matrix corresponding to any radar based on the target parameter value and the point cloud map of the first target scene.
  • the first transfer matrix includes a first rotation matrix and a first translation matrix.
  • For any radar among the plurality of radars, the point cloud data corresponding to that radar and the point cloud data corresponding to the target parameters are used as the intermediate point cloud data corresponding to that radar; based on the intermediate point cloud data, the point cloud map of the first target scene, and the target error function, the first rotation matrix and the first translation matrix corresponding to the minimum function value of the target error function are determined, thereby obtaining the first transfer matrix.
  • The target error function represents the error, under the first transfer matrix, between the intermediate point cloud data of any radar and the point cloud data corresponding to the target parameters.
  • The first rotation matrix and the first translation matrix corresponding to the minimum function value of the target error function are determined as follows: nearest-neighbour point pairs (p i , q i ) are selected from the intermediate point cloud data according to the target constraint condition, and the first rotation matrix and the first translation matrix are then determined based on the nearest-neighbour point pairs (p i , q i ) and formula (3).
  • Formula (3) is as follows:

    f(R, t) = (1/n) · Σ_{i=1}^{n} ||q_i − (R·p_i + t)||²  (3)
  • R represents the first rotation matrix
  • t represents the first translation matrix
  • f(R, t) represents the target error function
  • n represents the number of nearest neighbor point pairs
  • p i represents a point in the intermediate point cloud data of any radar
  • q i represents a point in the point cloud data corresponding to the target parameter.
  • The target constraint condition involved in the above process may be that the distance between a point p i in the intermediate point cloud data of any radar and the point cloud data corresponding to the target parameters is the smallest; that is, the target constraint condition can be written as the following formula (4):

    d(p_i, Q) = min_{q_i ∈ Q} ||p_i − q_i||  (4)
  • p i represents a point in the intermediate point cloud data of any radar
  • q i represents a point in the point cloud data corresponding to the target parameters
  • Q represents the point cloud data corresponding to the target parameters
  • d(p i , Q) represents the distance between the point p i in the intermediate point cloud data of any radar and the point cloud data Q corresponding to the target parameters.
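The minimization described by formulas (3) and (4) can be sketched as one iteration of the classical ICP step: nearest-neighbour association per formula (4), followed by the standard closed-form (SVD) solution of formula (3). This is an illustrative sketch of the textbook algorithm, not necessarily the exact implementation of the embodiment:

```python
import numpy as np

def icp_step(P, Q):
    """One alignment step: for each p_i in P find its nearest q_i in Q
    (formula (4)), then minimize f(R, t) = (1/n) * sum ||q_i - (R p_i + t)||^2
    (formula (3)) in closed form via SVD."""
    # Nearest-neighbour association (brute force for clarity).
    d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
    q = Q[d2.argmin(axis=1)]
    # Closed-form solution: centre both sets, then SVD of the cross-covariance.
    mu_p, mu_q = P.mean(0), q.mean(0)
    H = (P - mu_p).T @ (q - mu_q)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_q - R @ mu_p
    return R, t
```

Iterating this step until the value of f(R, t) stops decreasing yields the first rotation matrix and the first translation matrix.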
  • Because the point cloud map corresponding to the point cloud data of each radar has been manually adjusted and already basically matches the corresponding part of the point cloud map of the first target scene, determining the first transfer matrix on the basis of the manually adjusted result reduces the processing load of the calculation, thereby increasing the calculation speed and reducing the calculation time.
  • Taking any radar among the plurality of radars as a reference radar, and determining, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to each of the other radars among the plurality of radars, includes: for a target radar among the other radars besides the reference radar, determining the second transfer matrix corresponding to the target radar based on the first transfer matrix corresponding to the reference radar and the inverse matrix of the first transfer matrix corresponding to the target radar.
  • A matrix obtained by performing dot product processing on the first transfer matrix corresponding to the reference radar and the inverse matrix of the first transfer matrix corresponding to the target radar is used as the second transfer matrix corresponding to the target radar.
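Assuming the "dot product processing" refers to the ordinary product of 4*4 homogeneous matrices, and taking the first transfer matrix as mapping scene coordinates into the corresponding radar's own coordinates (both are assumptions for illustration), the composition can be sketched as:

```python
import numpy as np

def second_transfer_matrix(T_ref, T_target):
    """T_ref, T_target: 4x4 first transfer matrices of the reference radar
    and of the target radar. Returns the second transfer matrix, which maps
    the target radar's coordinate system into the reference radar's."""
    return T_ref @ np.linalg.inv(T_target)
```

Under these conventions, inv(T_target) takes a point from the target radar's frame back to scene coordinates, and T_ref then takes it into the reference radar's frame, so the product maps each radar's data onto the reference radar's coordinate system.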
  • If the first transfer matrix is a 4*3 matrix, it is first extended into a 4*4 homogeneous matrix before subsequent matrix calculations are performed; if the first transfer matrix is a 4*4 matrix, no other processing is required, and subsequent matrix calculations can be performed directly based on the first transfer matrix.
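Where a 4*3 (or 3*4) rotation-and-translation block needs to be extended to a full 4*4 homogeneous matrix, the usual approach is to append the row [0, 0, 0, 1]; a small hedged sketch (the helper name and the transposed-input handling are assumptions):

```python
import numpy as np

def to_homogeneous(Rt):
    """Extend a 3x4 [R | t] block (or its 4x3 transpose, as the 4*3 shape
    mentioned in the text suggests) to a 4x4 homogeneous transfer matrix
    by appending the row [0, 0, 0, 1]."""
    if Rt.shape == (4, 3):        # transposed convention
        Rt = Rt.T
    assert Rt.shape == (3, 4)
    return np.vstack([Rt, [0.0, 0.0, 0.0, 1.0]])
```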
  • this specification also provides embodiments of a device and a terminal to which it is applied.
  • Fig. 3 is a block diagram of a data processing device shown in this specification according to an exemplary embodiment.
  • The data processing device includes: an acquisition unit 301, configured to acquire a point cloud map of a first target scene and point cloud data obtained by a plurality of radars scanning the first target scene; a first determination unit 302, configured to determine, based on the acquired point cloud data corresponding to each radar and the point cloud map of the first target scene, the first transfer matrix corresponding to each radar, the first transfer matrix being the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the first target scene; a second determination unit 303, configured to take any radar among the plurality of radars as a reference radar and determine, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to each of the other radars among the plurality of radars besides the reference radar, the second transfer matrix being the transfer matrix of the coordinate system of the corresponding radar relative to the coordinate system of the reference radar; and a data fusion unit 304, configured to perform, based on the second transfer matrices corresponding to the other radars, data fusion on point cloud data obtained by the plurality of radars scanning a second target scene.
  • The acquiring unit 301, when used to acquire the point cloud map of the first target scene, is specifically configured to: determine, based on point cloud data collected by a preset radar located in the first target scene, the motion equation and the observation equation of a target object equipped with the preset radar, the motion equation being used to indicate the relationship between the position of the target object and the motion data of the target object, and the observation equation being used to indicate the relationship between the positions of preset points in the first target scene and the position of the target object; and determine, based on the motion equation and the observation equation of the target object, the position of the target object and the positions of a plurality of preset points in the first target scene, to obtain the point cloud map of the first target scene.
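For concreteness, a common way to write such motion and observation equations — an assumed, standard SLAM-style formulation, not a claim about the specific equations used in this embodiment — is:

    x_k = f(x_{k-1}, u_k) + w_k        (motion equation)
    z_{k,j} = h(x_k, y_j) + v_k        (observation equation)

where x_k is the pose of the target object at time k, u_k its motion data, y_j the position of the j-th preset point, z_{k,j} the observation of that preset point from pose x_k, and w_k, v_k the motion and observation noise. Jointly solving these equations yields the poses of the target object and the positions of the preset points that make up the point cloud map.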
  • The first determining unit 302, when used to determine the first transfer matrix corresponding to each radar based on the acquired point cloud data corresponding to each radar and the point cloud map of the first target scene, includes an acquisition subunit and a determination subunit. The acquisition subunit is used to acquire, for any radar among the plurality of radars, a target parameter value for which the target parameter of that radar satisfies a setting condition, the setting condition being that the positions of the points corresponding to the point cloud data of that radar have the greatest matching degree with the positions of the points in the point cloud map of the first target scene. The determination subunit is used to determine, based on the target parameter value and the point cloud map of the first target scene, the first transfer matrix corresponding to that radar.
  • The obtaining subunit, when used to obtain, for any radar among the plurality of radars, a target parameter value for which the target parameter of that radar satisfies the setting condition, is specifically used to: in response to the parameter value adjustment operation on the target parameter, display, based on the adjusted parameter value, the point cloud map corresponding to the point cloud data of that radar; and, in response to the submission operation on the parameter value of the target parameter, obtain the current parameter value of the target parameter as the target parameter value of the target parameter.
  • The first transfer matrix includes a first rotation matrix and a first translation matrix. The first determination unit 302, when used to determine the first transfer matrix corresponding to each radar based on the acquired point cloud data corresponding to each radar and the point cloud map of the first target scene, is specifically used to: for any radar among the plurality of radars, use the point cloud data corresponding to that radar and the point cloud data corresponding to the target parameters as the intermediate point cloud data corresponding to that radar; and, based on the intermediate point cloud data, the point cloud map of the first target scene, and the target error function, determine the first rotation matrix and the first translation matrix corresponding to the minimum function value of the target error function, to obtain the first transfer matrix.
  • the target parameter includes at least one of roll angle, yaw angle, pitch angle, abscissa, ordinate and altitude.
  • The second determining unit 303, when used to take any radar among the plurality of radars as a reference radar and determine, based on the first transfer matrix corresponding to the reference radar, the second transfer matrix corresponding to each of the other radars among the plurality of radars besides the reference radar, is specifically used to: for a target radar among the other radars besides the reference radar, determine the second transfer matrix corresponding to the target radar based on the first transfer matrix corresponding to the reference radar and the inverse matrix of the first transfer matrix corresponding to the target radar.
  • Since the device embodiment basically corresponds to the method embodiment, for related parts, please refer to the description of the method embodiment.
  • The device embodiments described above are only illustrative; the modules described as separate components may or may not be physically separated, and the components shown as modules may or may not be physical modules, that is, they may be located in one place or distributed across multiple network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution in this specification, and this can be understood and implemented by those skilled in the art without creative effort.
  • FIG. 4 is a schematic structural diagram of a terminal shown in this specification according to an exemplary embodiment.
  • the terminal includes a processor 410, a memory 420, and a network interface 430.
  • the memory 420 is used to store computer instructions that can be run on the processor 410.
  • the processor 410 is used to implement the data processing method provided by any embodiment of the present application when executing the computer instructions.
  • the network interface 430 is used to implement input and output functions.
  • the terminal may further include other hardware, which is not limited in this application.
  • the present application also provides a computer-readable storage medium.
  • the computer-readable storage medium can be in various forms.
  • The computer-readable storage medium can be: RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (such as a hard disk drive), a solid-state drive, a storage disc of any type (such as a compact disc or DVD), a similar storage medium, or a combination thereof.
  • The computer-readable medium may also be paper or another suitable medium on which the program can be printed.
  • a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the data processing method provided by any embodiment of the present application is implemented.
  • The present application also provides a computer program product, including a computer program; when the computer program is executed by a processor, the data processing method provided by any embodiment of the present application is implemented.
  • One or more embodiments of this specification may be provided as a method, device, terminal, computer-readable storage medium, or computer program product. Accordingly, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, one or more embodiments of this specification may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • each embodiment in this specification is described in a progressive manner, the same and similar parts of each embodiment can be referred to each other, and each embodiment focuses on the differences from other embodiments.
  • For the device and terminal embodiments, the description is relatively simple, and for relevant parts, refer to the description of the method embodiment.
  • Embodiments of the subject matter and functional operations described in this specification can be implemented in digital electronic circuitry, in tangibly embodied computer software or firmware, in computer hardware including the structures disclosed in this specification and their structural equivalents, or in a combination of one or more of them.
  • Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, that is, as one or more modules of computer program instructions encoded on a tangible, non-transitory program carrier for execution by, or to control the operation of, a data processing apparatus. Alternatively or additionally, the program instructions may be encoded on an artificially generated propagated signal, such as a machine-generated electrical, optical, or electromagnetic signal, which is generated to encode information for transmission to a suitable receiver apparatus for execution by the data processing apparatus.
  • a computer storage medium may be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
  • the processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform corresponding functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
  • Computers suitable for the execution of a computer program include, for example, general and/or special purpose microprocessors, or any other type of central processing unit.
  • a central processing unit will receive instructions and data from a read only memory and/or a random access memory.
  • the basic components of a computer include a central processing unit for implementing or executing instructions and one or more memory devices for storing instructions and data.
  • Generally, a computer will also include, or be operatively coupled to, one or more mass storage devices for storing data, such as magnetic disks, magneto-optical disks, or optical discs, to receive data from them, transfer data to them, or both.
  • a computer is not required to have such a device.
  • However, a computer may be embedded in another device, such as a mobile phone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
  • Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media, and memory devices, including, for example, semiconductor memory devices (such as EPROM, EEPROM, and flash memory devices), magnetic disks (such as internal hard disks or removable disks), magneto-optical disks, and CD ROM and DVD-ROM disks.
  • the processor and memory can be supplemented by, or incorporated in, special purpose logic circuitry.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Disclosed are a data processing method, a radar calibration apparatus, a terminal, and a computer-readable storage medium. The method comprises: acquiring a point cloud map of a first target scene and point cloud data obtained by a plurality of radars scanning the first target scene (101); determining, on the basis of the acquired point cloud data corresponding to each radar and the point cloud map of the first target scene, a first transfer matrix corresponding to each radar (102); taking any radar among the plurality of radars as a reference radar and determining, on the basis of the first transfer matrix corresponding to the reference radar, a second transfer matrix corresponding to each radar other than the reference radar among the plurality of radars, so that the data collected by each radar can be mapped into the coordinate system of the reference radar (103); and performing, on the basis of the reference radar and the second transfer matrices corresponding to the other radars, data fusion on point cloud data obtained by the plurality of radars scanning a second target scene (104), so that fusion of the data acquired by a plurality of radars can be implemented.
PCT/CN2022/070559 2021-09-16 2022-01-06 Traitement de données WO2023040137A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111087747.1 2021-09-16
CN202111087747.1A CN115248441A (zh) 2021-09-16 2021-09-16 数据处理方法、装置、终端及介质

Publications (1)

Publication Number Publication Date
WO2023040137A1 true WO2023040137A1 (fr) 2023-03-23

Family

ID=83697050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/070559 WO2023040137A1 (fr) 2021-09-16 2022-01-06 Traitement de données

Country Status (2)

Country Link
CN (1) CN115248441A (fr)
WO (1) WO2023040137A1 (fr)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1180794A (fr) * 1981-01-19 1985-01-08 Lawrence F. Anderson Systeme de combinaison de signaux radar multiples
CN110596653A (zh) * 2019-09-24 2019-12-20 江苏集萃智能传感技术研究所有限公司 一种多雷达数据融合方法及装置
CN111507928A (zh) * 2020-03-27 2020-08-07 福建汇川物联网技术科技股份有限公司 点云数据融合方法、装置设备及存储介质
CN112230241A (zh) * 2020-10-23 2021-01-15 湖北亿咖通科技有限公司 基于随机扫描型雷达的标定方法
CN112462350A (zh) * 2020-12-10 2021-03-09 苏州一径科技有限公司 雷达标定方法及装置、电子设备及存储介质
CN112558043A (zh) * 2020-11-17 2021-03-26 浙江众合科技股份有限公司 一种激光雷达的标定方法及电子设备
CN113052761A (zh) * 2019-12-26 2021-06-29 炬星科技(深圳)有限公司 激光点云地图融合方法、设备以及计算机可读存储介质


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116299367A (zh) * 2023-05-18 2023-06-23 中国测绘科学研究院 一种多激光空间标定方法
CN116299367B (zh) * 2023-05-18 2024-01-26 中国测绘科学研究院 一种多激光空间标定方法

Also Published As

Publication number Publication date
CN115248441A (zh) 2022-10-28

Similar Documents

Publication Publication Date Title
CN112567201B (zh) 距离测量方法以及设备
CN108419446B (zh) 用于激光深度图取样的***及方法
CN111436216B (zh) 用于彩色点云生成的方法和***
CN111160561B (zh) 使用支柱进行对象检测的深度学习
US10240934B2 (en) Method and system for determining a position relative to a digital map
CN112991454A (zh) 照相机到LiDAR的标定和验证
US20170102699A1 (en) Drone control through imagery
US11204610B2 (en) Information processing apparatus, vehicle, and information processing method using correlation between attributes
US20200012756A1 (en) Vision simulation system for simulating operations of a movable platform
WO2018133727A1 (fr) Procédé et appareil de génération de carte orthophotographique
JP6326641B2 (ja) 画像処理装置および画像処理方法
CN112414417B (zh) 自动驾驶地图生成方法、装置、电子设备及可读存储介质
US11544868B2 (en) Object location coordinate determination
US20210263533A1 (en) Mobile object and method for controlling mobile object
US11460855B1 (en) Systems and methods for sensor calibration
WO2023040137A1 (fr) Traitement de données
CN115457354A (zh) 融合方法、3d目标检测方法、车载设备及存储介质
CN111833443A (zh) 自主机器应用中的地标位置重建
US20210156710A1 (en) Map processing method, device, and computer-readable storage medium
CN112288821B (zh) 一种相机外参标定的方法及装置
CN113312403B (zh) 地图获取方法、装置、电子设备及存储介质
WO2022133986A1 (fr) Procédé et système d'estimation de précision
US20200106958A1 (en) Method and system for operating a movable platform using ray-casting mapping
JP2021103482A (ja) 自己位置推定方法
WO2022160101A1 (fr) Procédé et appareil d'estimation d'orientation, plateforme mobile et support de stockage lisible

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22868518

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE