CN114119465B - Point cloud data processing method and device


Info

Publication number: CN114119465B
Application number: CN202111174740.3A
Authority: CN (China)
Prior art keywords: point cloud, cloud data, target, acquisition time, laser
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN114119465A
Inventors: 雷绳光, 剧学铭, 李肖含, 马嗣昆, 郝哲
Current Assignee: Beijing Liangdao Intelligent Vehicle Technology Co ltd
Original Assignee: Beijing Liangdao Intelligent Vehicle Technology Co ltd
Application filed by Beijing Liangdao Intelligent Vehicle Technology Co ltd; priority to CN202111174740.3A
Publication of CN114119465A; application granted; publication of CN114119465B

Classifications

    • G06T7/0004 — Image analysis; inspection of images, e.g. flaw detection; industrial image inspection (G Physics; G06 Computing; G06T Image data processing or generation, in general)
    • G06T2207/10028 — Image acquisition modality: range image; depth image; 3D point clouds
    • G06T2207/10044 — Image acquisition modality: satellite or aerial image; remote sensing; radar image
    • G06T2207/20228 — Special algorithmic details: disparity calculation for image-based rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention provides a point cloud data processing method, which relates to the technical field of data processing and comprises the following steps: obtaining the relative speed of a target object relative to a traveling object at a first acquisition time; determining a target laser point corresponding to the target object in target point cloud data; and adjusting the position information of the target laser point in the direction opposite to the direction of the relative speed based on the acquisition time difference and the relative speed. By applying the point cloud data processing scheme provided by the embodiment of the invention, the error among the point cloud data acquired by each laser radar can be reduced.

Description

Point cloud data processing method and device
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method and an apparatus for processing point cloud data.
Background
In the prior art, a laser radar can be installed on a traveling object, and the laser radar can collect point cloud data of the surrounding environment of the traveling object while the traveling object travels. After the data processing device acquires the point cloud data, it may process the point cloud data to obtain environmental information of the environment around the traveling object. The environmental information may include object information of other objects, such as persons, animals, and buildings, in the environment around the traveling object, for example, position information of those objects, and the traveling of the traveling object may be controlled based on the environmental information.
The point cloud data may be collected jointly by a plurality of laser radars installed on the same traveling object, and the data processing device may combine the point cloud data collected by the plurality of laser radars at the same collection time to obtain the environmental information of the surrounding environment at that collection time. However, since the frequencies at which different laser radars collect point cloud data are often difficult to keep synchronized, the collection times of the point cloud data collected by the laser radars also often differ. While the traveling object travels, the relative positions of the traveling object and other objects often change between different collection times, so the surrounding environment of the traveling object can change considerably from one collection time to another. In this case, the error between the point cloud data of the surrounding environment collected by different laser radars is large, and the environmental information obtained based on point cloud data with a large error between them is inaccurate. In order to improve the accuracy of the obtained environmental information, it is necessary to reduce the error between the point cloud data acquired by the laser radars.
Disclosure of Invention
The embodiment of the invention aims to provide a point cloud data processing method and device, so as to reduce errors among point cloud data acquired by all laser radars. The specific technical scheme is as follows:
In a first aspect, an embodiment of the present invention provides a method for processing point cloud data, where the method includes:
Obtaining a relative speed of a target object relative to a traveling object at a first acquisition time, wherein the first acquisition time is the time at which first point cloud data is acquired, the first point cloud data is point cloud data collected by a first laser radar installed on the traveling object, and the target object is an object in a traveling environment in which the traveling object is located;
Determining a target laser point corresponding to the target object in target point cloud data, wherein the target point cloud data is second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is point cloud data acquired by a second laser radar, other than the first laser radar, installed on the traveling object;
Adjusting position information of the target laser point in a direction opposite to the direction of the relative speed based on an acquisition time difference and the relative speed, wherein the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data.
In a second aspect, an embodiment of the present invention provides a point cloud data processing apparatus, including:
The speed obtaining module is used for obtaining the relative speed of a target object relative to a traveling object at a first acquisition time, wherein the first acquisition time is the time at which first point cloud data is acquired, the first point cloud data is point cloud data collected by a first laser radar installed on the traveling object, and the target object is an object in a traveling environment in which the traveling object is located;
the target determining module is configured to determine a target laser point corresponding to the target object in target point cloud data, wherein the target point cloud data is second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is point cloud data acquired by a second laser radar, other than the first laser radar, installed on the traveling object;
the information adjustment module is used for adjusting the position information of the target laser point in the direction opposite to the direction of the relative speed based on an acquisition time difference and the relative speed, wherein the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
A memory for storing a computer program;
And the processor is used for realizing the steps of the point cloud data processing method in any one of the first aspect when executing the program stored in the memory.
In a fourth aspect, an embodiment of the present invention provides a computer readable storage medium in which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the point cloud data processing method according to any one of the first aspect.
The embodiment of the invention also provides a computer program product containing instructions, which when run on a computer, cause the computer to execute the point cloud data processing method.
The embodiment of the invention has the beneficial effects that:
In the point cloud data processing scheme provided by the embodiment of the invention, the relative speed of the target object relative to the running object at the first acquisition time is firstly obtained, then the target laser point of the corresponding target object is determined in the target point cloud data, and finally the position information of the target laser point is adjusted in the direction opposite to the direction of the relative speed based on the acquisition time difference and the relative speed.
As can be seen from the above, since the relative speed is the speed of the target object with respect to the traveling object, and the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data, the relative displacement of the target object with respect to the traveling object within the acquisition time difference can be determined based on the acquisition time difference and the relative speed. The position information of the target laser point can then be adjusted in the direction opposite to the direction of the relative speed according to this relative displacement, so that the error between the target point cloud data and the first point cloud data caused by the relative displacement of the target object is cancelled, and the adjusted target point cloud data approaches the point cloud data that the second laser radar corresponding to the target point cloud data would have acquired at the first acquisition time. Therefore, by the scheme provided by the embodiment of the invention, the error between the point cloud data acquired by the laser radars can be reduced.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or in the description of the prior art are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from these drawings.
Fig. 1 is a flow chart of a first point cloud data processing method according to an embodiment of the present invention;
fig. 2 is a flow chart of a second method for processing point cloud data according to an embodiment of the present invention;
fig. 3 is a flow chart of a third method for processing point cloud data according to an embodiment of the present invention;
Fig. 4 is a flow chart of a fourth method for processing point cloud data according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a first point cloud data processing device according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a second point cloud data processing device according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The following describes the embodiments of the present invention clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, but not all, embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention fall within the protection scope of the present invention.
Because point cloud data acquired by different laser radars in the prior art may have a large error between them, in order to solve this problem, embodiments of the present invention provide a point cloud data processing method and apparatus, an electronic device, and a storage medium.
In one embodiment of the present invention, a method for processing point cloud data is provided, where the method includes:
obtaining a relative speed of a target object relative to a traveling object at a first acquisition time, wherein the first acquisition time is the time at which first point cloud data is acquired, the first point cloud data is point cloud data collected by a first laser radar installed on the traveling object, and the target object is an object in a traveling environment in which the traveling object is located;
determining a target laser point corresponding to the target object in target point cloud data, wherein the target point cloud data is second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is point cloud data collected by a second laser radar, other than the first laser radar, installed on the traveling object;
adjusting the position information of the target laser point in a direction opposite to the direction of the relative speed based on an acquisition time difference and the relative speed, wherein the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data.
As can be seen from the above, since the relative speed is the speed of the target object with respect to the traveling object, and the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data, the relative displacement of the target object with respect to the traveling object within the acquisition time difference can be determined based on the acquisition time difference and the relative speed. The position information of the target laser point can then be adjusted in the direction opposite to the direction of the relative speed according to this relative displacement, so that the error between the target point cloud data and the first point cloud data caused by the relative displacement of the target object is cancelled, and the adjusted target point cloud data approaches the point cloud data that the second laser radar corresponding to the target point cloud data would have acquired at the first acquisition time. Therefore, by the scheme provided by the embodiment of the invention, the error between the point cloud data acquired by the laser radars can be reduced.
First, an execution body of the scheme provided by the embodiment of the present invention is described.
In one case, the execution subject of the scheme provided by the embodiment of the present invention may be a data processing device installed in the traveling object. In this case, the laser radars and the data processing device may communicate by wireless communication or by wired communication. After a laser radar collects point cloud data, it sends the point cloud data to the data processing device. After receiving the point cloud data, the data processing device may process the point cloud data according to the scheme provided by the embodiment of the present invention, and then obtain the environmental information of the traveling environment in which the traveling object is located based on the processed point cloud data, so that the traveling of the traveling object can be controlled based on the environmental information.
In another case, the execution subject of the scheme provided by the embodiment of the present invention may also be a remote server. In this case, the laser radars and the remote server may communicate by wireless communication. After a laser radar collects point cloud data, it sends the point cloud data to the remote server by wireless communication. After receiving the point cloud data, the remote server may process the point cloud data according to the scheme provided by the embodiment of the present invention, and then obtain the environmental information of the traveling environment in which the traveling object is located based on the processed point cloud data, so that the traveling process of the traveling object can be analyzed based on the environmental information.
The method, the device, the electronic equipment and the storage medium for processing the point cloud data provided by the embodiment of the invention are explained in detail through specific embodiments.
Referring to fig. 1, a flow chart of a first method for processing point cloud data is provided, and the method includes the following steps S101 to S103.
Step S101: a relative speed of the target object with respect to the traveling object at the first acquisition time is obtained.
Wherein the first acquisition time is the time at which the first point cloud data is acquired, the first point cloud data is point cloud data collected by a first laser radar installed on the traveling object, and the target object is an object in the traveling environment in which the traveling object is located.
The traveling object may be an object having a traveling function, such as a vehicle, a robot, or the like.
The target object is an object in a traveling environment in which the traveling object is located, that is, an object included in a surrounding environment of the traveling object. For example, if the traveling object is a vehicle, the target object may be another vehicle around the vehicle, or may be a pedestrian, a building, or the like around the vehicle.
The relative speed of the target object with respect to the traveling object is the moving speed of the target object measured with the traveling object as the reference.
In addition, the time of acquiring the point cloud data by the laser radar is the acquisition time of the frame of the point cloud data, and the time of acquiring the first point cloud data by the first laser radar is the first acquisition time of the first point cloud data.
All laser points acquired by a laser radar at the same acquisition time can be called one frame of point cloud data.
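For illustration only (not part of the claimed method), the following minimal Python sketch shows one possible way to represent a frame of point cloud data together with its acquisition time; the class name, field names, and use of NumPy are assumptions of this sketch, not requirements of the embodiment.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PointCloudFrame:
    """One frame of point cloud data: all laser points acquired by one
    laser radar at the same acquisition time (hypothetical structure)."""
    lidar_id: str       # which laser radar acquired this frame
    timestamp: float    # acquisition time of the frame, in seconds
    points: np.ndarray  # (N, 3) array of laser point coordinates (x, y, z)
```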
In particular, there are various implementations for obtaining the relative speed of the target object with respect to the traveling object at the first acquisition time.
In one implementation, a relative movement distance of the target object with respect to the traveling object in a first time period may be obtained, and then the obtained relative movement distance is divided by a duration of the first time period to obtain a relative speed of the target object with respect to the traveling object at a first acquisition time.
The duration of the first period may be 100 ms, 50 ms, or other durations.
The first acquisition time may be a start time of the first period, or may be an end time of the first period, or may be any other time in the first period.
For example, if the first acquisition time is 8:00, the relative movement distance of the target object with respect to the traveling object in the period from 50 milliseconds before 8:00 to 50 milliseconds after 8:00 may be obtained; in this case, the duration of the first period is 100 milliseconds, that is, 0.1 seconds. If the relative movement distance is 0.5 meters, dividing the relative movement distance by the duration of the first period gives 0.5/0.1 = 5 m/s, which after conversion to km/h is 18 km/h; that is, when the first acquisition time is 8:00, the relative speed of the target object with respect to the traveling object is 18 km/h.
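As a minimal sketch of the implementation described above (dividing the relative movement distance by the duration of the first period), the following hypothetical Python function reproduces the worked example; the function name and units are illustrative assumptions.

```python
def relative_speed_from_distance(relative_distance_m: float,
                                 period_duration_s: float) -> float:
    """Relative speed (m/s) of the target object with respect to the traveling
    object: relative movement distance divided by the duration of the period."""
    return relative_distance_m / period_duration_s

# Worked example from the text: 0.5 m of relative movement over a 100 ms period.
speed_mps = relative_speed_from_distance(0.5, 0.1)  # 5.0 m/s
speed_kmh = speed_mps * 3.6                         # 18.0 km/h
```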
In addition, the above-mentioned relative speed can also be obtained through steps S101A-S101D in the embodiment shown in fig. 3, which will not be described in detail here.
In another implementation manner, the target speed of the target object and the running speed of the running object at the first acquisition time may be obtained respectively, and then the relative speed of the target object relative to the running object at the first acquisition time may be obtained by subtracting the running speed of the running object from the target speed of the target object.
For example, if the traveling object is the vehicle M, the target object is the vehicle N, the traveling speed of the vehicle M at the first acquisition time is 80km/h, and the traveling speed of the vehicle N at the first acquisition time is 100km/h, the relative speed of the target object with respect to the traveling object at the first acquisition time is 100-80=20 km/h.
In one embodiment of the present invention, a speed measuring device for detecting a speed may be provided in a driving environment where a target object and a driving object are located, and after the speed measuring device obtains the target speed of the target object and the driving speed of the driving object at the first acquisition time, the target speed and the driving speed may be transmitted to an execution body of the embodiment of the present invention, and the execution body may calculate the relative speed based on the obtained target speed and driving speed.
The speed measuring device may calculate the relative speed based on the target speed and the running speed after obtaining the target speed of the target object and the running speed of the running object, and then send the calculated relative speed to the execution body according to the embodiment of the present invention, so that the execution body obtains the relative speed.
The speed measuring device may further obtain a target speed of the target object, and then send the target speed to the execution body of the embodiment of the present invention, and the running object may be equipped with a sensor for measuring a speed, and after obtaining the running speed of the running object, the sensor sends the running speed to the execution body of the embodiment of the present invention, and the execution body calculates the relative speed based on the obtained target speed and the running speed.
In another embodiment of the present invention, in the case where the target object is a traveling object such as a vehicle or a robot, the data processing device may be mounted on the target object. Among them, the data processing apparatus mounted on the traveling object may be referred to as a first data processing apparatus, and the data processing apparatus mounted on the target object may be referred to as a second data processing apparatus. Communication can be established between the first data processing device and the second data processing device, the second data processing device sends the target speed of the target object at the first acquisition time to the first data processing device, the first data processing device obtains the target speed sent by the target object, and the relative speed of the target object relative to the running object at the first acquisition time is calculated based on the target speed of the target object at the first acquisition time and the running speed of the running object.
Step S102: and determining a target laser point of the corresponding target object in the target point cloud data.
Wherein the target point cloud data is second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is point cloud data collected by a second laser radar, other than the first laser radar, installed on the traveling object.
The second laser radar can acquire multi-frame point cloud data, each frame of point cloud data corresponds to respective acquisition time, and among the multi-frame point cloud data acquired by the second laser radar, point cloud data with a time difference between the acquisition time and the acquisition time of the first point cloud data being smaller than a first preset time difference is the target point cloud data.
The first preset time difference may be 25 milliseconds, 50 milliseconds, or another time difference.
In one case, among the point cloud data collected by the second laser radar, there may be a plurality of frames of point cloud data whose acquisition times differ from the acquisition time of the first point cloud data by less than the first preset time difference. In this case, each of these frames may be used as target point cloud data, and for each frame of target point cloud data, the target laser point corresponding to the target object in that frame may be determined.
In another case, one point cloud data among the plurality of point cloud data may be selected as the target point cloud data. For example, point cloud data having the closest acquisition time to the acquisition time of the first point cloud data may be selected from the plurality of point cloud data as the target point cloud data; the point cloud data with the earliest acquisition time can be selected from the plurality of point cloud data to serve as target point cloud data; one point cloud data may also be randomly selected from among the plurality of point cloud data as the target point cloud data.
For example, suppose the second laser radar collects three frames of point cloud data x, y and z, whose collection times are 25 milliseconds after 8:00, 50 milliseconds after 8:00, and 75 milliseconds after 8:00, respectively, and the first collection time is 30 milliseconds after 8:00. If the first preset time difference is 10 milliseconds, only the collection time of point cloud data x differs from the first collection time by 5 milliseconds, which satisfies the condition of being smaller than the first preset time difference, so point cloud data x is determined to be the target point cloud data. If the first preset time difference is 30 milliseconds, the collection times of point cloud data x and y differ from the first collection time by 5 milliseconds and 20 milliseconds respectively, both satisfying the condition, so point cloud data x and y may both be determined to be target point cloud data, or one of them may be selected as the target point cloud data.
In one embodiment of the present invention, the target point cloud data may also be second point cloud data whose acquisition time is closest to the first acquisition time.
In this case, among the point cloud data collected by the second laser radar, the point cloud data with the smallest time difference between the collection time and the first collection time is the target point cloud data.
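A minimal sketch of how the target point cloud data might be selected from the frames collected by the second laser radar, either by the first preset time difference or by nearest acquisition time; the frame representation and function names are assumptions of this sketch.

```python
from typing import List, Optional, Tuple

# Illustrative frame representation: (acquisition_time_s, frame_points).
Frame = Tuple[float, object]

def select_target_frames(second_frames: List[Frame],
                         first_acquisition_time: float,
                         first_preset_diff_s: float) -> List[Frame]:
    """Frames of second point cloud data whose acquisition times differ from the
    first acquisition time by less than the first preset time difference."""
    return [f for f in second_frames
            if abs(f[0] - first_acquisition_time) < first_preset_diff_s]

def select_nearest_frame(second_frames: List[Frame],
                         first_acquisition_time: float) -> Optional[Frame]:
    """The frame of second point cloud data whose acquisition time is closest
    to the first acquisition time."""
    if not second_frames:
        return None
    return min(second_frames, key=lambda f: abs(f[0] - first_acquisition_time))
```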
Since a laser radar generally collects point cloud data of its surrounding environment by emitting laser light outward and receiving the laser light reflected back, each laser beam received by the laser radar corresponds to a laser point in the point cloud data. Therefore, the target laser point corresponding to the target object is the laser point formed in the point cloud data by the laser light that is reflected back by the target object after the laser light emitted by the laser radar reaches the target object.
In one implementation, feature extraction may be performed on the target point cloud data, and a laser point in the target point cloud data that matches an object feature of the target object may be determined as the target laser point based on the extracted feature. Feature extraction of the target point cloud data can be achieved based on a feature extraction algorithm, and can also be achieved based on a deep learning network for extracting features. The method of extracting the characteristics of the target point cloud data and determining the laser point matched with the object characteristics of the target object in the target point cloud data based on the extracted characteristics may be implemented by using the prior art, which is not described in detail herein.
In another implementation, the target laser point may also be determined by performing point cloud clustering processing on the target point cloud data. Determining the target laser point by performing point cloud clustering processing on the target point cloud data is specifically described in step S102A of the embodiment shown in fig. 4, and is not described in detail here.
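As one possible illustration of the clustering-based determination of the target laser point, the following sketch clusters the target point cloud with DBSCAN and keeps the cluster whose centroid is closest to an expected position of the target object; the use of scikit-learn and the centroid-distance selection criterion are assumptions of this sketch, not the specific processing prescribed in step S102A.

```python
import numpy as np
from sklearn.cluster import DBSCAN  # assumption: any point cloud clustering method could be used

def target_laser_points_by_clustering(target_cloud: np.ndarray,
                                      expected_position: np.ndarray,
                                      eps: float = 0.5,
                                      min_samples: int = 5) -> np.ndarray:
    """Cluster the (N, 3) target point cloud and return the laser points of the
    cluster whose centroid lies closest to an expected position of the target
    object (the selection criterion is an illustrative assumption)."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(target_cloud)
    best_label, best_dist = None, np.inf
    for label in set(labels):
        if label == -1:  # noise points, not assigned to any cluster
            continue
        cluster = target_cloud[labels == label]
        dist = np.linalg.norm(cluster.mean(axis=0) - expected_position)
        if dist < best_dist:
            best_label, best_dist = label, dist
    if best_label is None:
        return np.empty((0, 3))
    return target_cloud[labels == best_label]
```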
In addition, the number of the second laser radars may be one or more, one or more target point cloud data may be determined from the point cloud data collected by one second laser radar, a plurality of target point cloud data may be determined from the point cloud data collected by a plurality of second laser radars, and for each target point cloud data, a target laser point of a corresponding target object in the target point cloud data needs to be determined.
In addition, the number of target objects corresponding to the target laser points in one frame of target point cloud data may be one or more, and for each target object, the target laser point corresponding to the target object may be determined in the target point cloud data through the step S102.
Step S103: based on the acquisition time difference and the relative velocity, the position information of the target laser spot is adjusted in a direction opposite to the direction of the relative velocity.
Wherein, the collection time difference is: time difference between the first acquisition time and the acquisition time of the target point cloud data.
Specifically, the relative speed is a vector having a magnitude and a direction, and the direction opposite to the direction of the relative speed can be determined from the direction of the relative speed. For example, if the direction of the relative speed is due east, the direction opposite to the direction of the relative speed is due west; if the direction of the relative speed is southeast, the direction opposite to the direction of the relative speed is northwest.
In addition, the position information of the target laser point may be used to characterize the relative position of the target object with respect to the traveling object, and the relative position of the target object with respect to the traveling object may be determined based on the position information of the target laser point. For example, the position information of the target laser point may be expressed as three-dimensional coordinates in a spatial coordinate system; when the origin of the coordinate system is located at the position of the traveling object, the coordinate on each axis represents the distance between the target object and the traveling object in the direction of that axis, so the relative position between the target object and the traveling object in the same space can be known from the three-dimensional coordinates.
Specifically, in the case where the position information of the target laser point is expressed as three-dimensional coordinates in the spatial coordinate system, the spatial coordinate system generally includes an X-axis, a Y-axis, and a Z-axis. If the direction of the X-axis is defined as due east, the direction of the Y-axis as due south, and the direction of the Z-axis as perpendicular to the ground and pointing upward, and the three-dimensional coordinates are (100, 20, 5), it can be known that the target object is located 100 meters due east of the traveling object, 20 meters due south of it, and 5 meters above it in the direction perpendicular to the ground.
In one embodiment of the present invention, the acquisition time difference and the relative speed may be multiplied to obtain a relative displacement of the target object with respect to the traveling object between the first acquisition time and the acquisition time of the target point cloud data.
The relative displacement may be expressed as a three-dimensional coordinate in the spatial coordinate system.
For example, if the three-dimensional coordinates of the relative displacement in the spatial coordinate system are (10, 4, 2), the X-axis direction is defined as due east, the Y-axis direction as due south, and the Z-axis direction as perpendicular to the ground and pointing upward, this indicates that, within the acquisition time difference, the target object moves 10 meters due east, 4 meters due south, and 2 meters upward perpendicular to the ground relative to the traveling object.
Specifically, the position information of the target laser point can be adjusted in the direction opposite to the direction of the relative speed by subtracting, on each coordinate axis, the coordinate of the relative displacement from the coordinate of the target laser point.
For example, if the three-dimensional coordinates of the target laser point are (100, 20, 5) and the three-dimensional coordinates of the relative displacement are (10, 4, 2), then for the X-axis, the Y-axis, and the Z-axis respectively, the coordinate of the relative displacement on that axis is subtracted from the coordinate of the target laser point on that axis: the X-axis coordinate is 100 - 10 = 90, the Y-axis coordinate is 20 - 4 = 16, and the Z-axis coordinate is 5 - 2 = 3, so the three-dimensional coordinates of the adjusted target laser point are (90, 16, 3).
Because the acquisition time of the first point cloud data differs from the acquisition time of the target point cloud data, and the difference between the two is the acquisition time difference, the target object and the traveling object may move relative to each other within the acquisition time difference and thus produce a relative displacement. Adjusting the position information of the target laser point in the direction opposite to the direction of the relative speed can cancel the influence of this relative displacement on the position information of the target laser point, so that the adjusted position information of the target laser point corresponds to the position the target object would occupy if no relative displacement had occurred.
If the acquisition time of the first point cloud data were the same as the acquisition time of the target point cloud data, there would be no acquisition time difference and no relative motion between the target object and the traveling object. Since the adjusted position information of the target laser point corresponds to the position of the target object when no relative displacement occurs, the adjusted target laser point can be regarded as approaching the laser point corresponding to the target object in the point cloud data that the second laser radar would have acquired at the first acquisition time.
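The adjustment described above can be summarized in a short sketch: the relative displacement is the product of the acquisition time difference and the relative velocity, and it is subtracted coordinate-wise from the position of each target laser point. The function name and NumPy representation are assumptions of this sketch.

```python
import numpy as np

def adjust_target_laser_points(target_points: np.ndarray,
                               relative_velocity: np.ndarray,
                               acquisition_time_diff_s: float) -> np.ndarray:
    """Shift the (N, 3) target laser points opposite to the direction of the
    relative velocity by the relative displacement accumulated over the
    acquisition time difference."""
    relative_displacement = relative_velocity * acquisition_time_diff_s  # (3,) vector
    return target_points - relative_displacement

# Reproducing the worked example from the text: a relative displacement of
# (10, 4, 2) applied to a target laser point at (100, 20, 5) yields (90, 16, 3).
points = np.array([[100.0, 20.0, 5.0]])
adjusted = adjust_target_laser_points(points,
                                      relative_velocity=np.array([10.0, 4.0, 2.0]),
                                      acquisition_time_diff_s=1.0)  # values chosen only to match the example
```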
In addition, the position information of the target laser spot may be adjusted based on steps S103A-S103C in the embodiment shown in fig. 2, which will not be described in detail herein.
In addition, the number of target point cloud data may be one or more, and for each target point cloud data, the position information of the target laser point in the target point cloud data may be adjusted in the manner shown in steps S101 to S103.
And, for each target point cloud data, the target point cloud data may include a set of target laser points corresponding to one target object, or may include a plurality of sets of target laser points corresponding to different target objects. For different target objects, the position information of the target laser point corresponding to the target object may be adjusted in the manner shown in step S103.
From the above, in the solution provided by the embodiment of the present invention, the relative speed of the target object with respect to the traveling object at the first acquisition time is obtained first, then the target laser point corresponding to the target object is determined in the target point cloud data, and finally the position information of the target laser point is adjusted in the direction opposite to the direction of the relative speed based on the acquisition time difference and the relative speed. Since the relative speed is the speed of the target object relative to the traveling object, and the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data, the relative displacement of the target object relative to the traveling object can be determined based on the acquisition time difference and the relative speed. The position information of the target laser point can then be adjusted in the direction opposite to the direction of the relative speed according to the relative displacement, so that the error between the target point cloud data and the first point cloud data caused by the relative displacement of the target object is offset, and the adjusted target point cloud data approaches the point cloud data that the second laser radar corresponding to the target point cloud data would have acquired at the first acquisition time. Therefore, by the scheme provided by the embodiment of the invention, the error between the point cloud data acquired by the laser radars can be reduced.
In one embodiment of the present invention, the first lidar may be: the laser radar with the highest resolution among the laser radars attached to the traveling object.
According to the scheme provided by the embodiment of the invention, the error between the point cloud data acquired by the laser radars is reduced by adjusting the position information of the target laser point in the target point cloud data, that is, the adjusted target point cloud data can be close to the point cloud data that the second laser radar corresponding to the target point cloud data would have acquired at the first acquisition time. Since the first acquisition time is the acquisition time of the first point cloud data, the position information of the target laser point in the target point cloud data is adjusted with the first point cloud data as a reference. In the embodiment of the invention, the first laser radar is the laser radar with the highest resolution among the laser radars installed on the traveling object; compared with using another laser radar as the first laser radar, the first point cloud data acquired by the laser radar with the highest resolution is more accurate, so using this first point cloud data as a reference can improve the accuracy of adjusting the position information of the target laser point in the target point cloud data.
In addition, the first laser radar may be a laser radar attached to a preset attachment position on the traveling object; the preset attachment position may be, for example, a top position of the traveling object. Alternatively, any laser radar attached to the traveling object may be used as the first laser radar.
In an embodiment of the present invention, referring to fig. 2, a flowchart of a second method for processing point cloud data is provided, and in this embodiment, compared with the embodiment shown in fig. 1, the following steps S103A-S103C may be used to implement the step S103.
Step S103A: the relative velocity is decomposed into a longitudinal velocity and a transverse velocity.
The direction of the longitudinal speed is the same as the running direction of the running object, and the direction of the transverse speed is perpendicular to the running direction of the running object.
Since the direction of the longitudinal velocity is the same as the traveling direction of the traveling object, and the direction of the lateral velocity is perpendicular to the traveling direction of the traveling object, the direction of the longitudinal velocity is perpendicular to the direction of the lateral velocity.
Specifically, the above-mentioned relative velocity is decomposed into a longitudinal velocity and a transverse velocity, that is, the relative velocity is orthogonally decomposed, and the velocity in one direction is decomposed into two velocities perpendicular to each other.
In addition, the directions of the longitudinal speed and the lateral speed may be determined according to the traveling direction of the target object. Alternatively, the directions of the longitudinal speed and the lateral speed may be determined based on geographic directions; for example, the direction of the longitudinal speed may be due east, and the direction of the lateral speed may be due south.
Step S103B: based on the acquisition time difference and the longitudinal velocity, a longitudinal compensation amount is obtained, and positional information of the target laser spot is adjusted based on the longitudinal compensation amount in a direction opposite to the direction of the longitudinal velocity.
Specifically, the longitudinal compensation amount may be obtained based on the acquisition time difference and the longitudinal speed in such a manner that the acquisition time difference and the longitudinal speed are multiplied.
For example, if the acquisition time difference is 50 ms, that is, 0.05 seconds, and the longitudinal velocity is 10m/s, the longitudinal compensation amount is 0.05×10=0.5 m.
Similar to the above step S103, the position information of the target laser spot may be expressed in the form of two-dimensional coordinates in which an X-axis coordinate represents a distance between the target object and the traveling object in the direction of the longitudinal velocity and a Y-axis coordinate represents a distance between the target object and the traveling object in the direction of the lateral velocity, and subtracting the longitudinal compensation amount from the X-axis coordinate in the two-dimensional coordinates of the target laser spot may realize that the position information of the target laser spot is adjusted based on the longitudinal compensation amount in the direction opposite to the direction of the longitudinal velocity.
Step S103C: based on the acquisition time difference and the lateral velocity, a lateral compensation amount is obtained, and positional information of the target laser spot is adjusted based on the lateral compensation amount in a direction opposite to the direction of the lateral velocity.
Similar to the above step S103B, the lateral compensation amount can be obtained by multiplying the acquisition time difference by the lateral velocity, and subtracting the above lateral compensation amount from the Y-axis coordinate in the two-dimensional coordinates of the target laser spot, the positional information of the target laser spot can be adjusted based on the lateral compensation amount in the direction opposite to the direction of the lateral velocity.
In addition, in the solution provided in this embodiment, step S103B may be performed first, and then step S103C may be performed, that is, in the process of adjusting the position information of the target laser spot, the position information of the target laser spot may be adjusted in the direction opposite to the longitudinal speed first, and then the position information of the target laser spot may be adjusted in the direction opposite to the lateral speed.
In addition to this, step S103C may be performed first, and then step S103B may be performed, that is, in the process of adjusting the position information of the target laser spot, the position information of the target laser spot may be adjusted in the direction opposite to the lateral velocity first, and then the position information of the target laser spot may be adjusted in the direction opposite to the longitudinal velocity.
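The following sketch illustrates steps S103A to S103C in two dimensions, assuming the traveling direction is given as a unit vector; the decomposition by dot product and the function names are assumptions of this sketch.

```python
import numpy as np

def adjust_with_longitudinal_lateral_compensation(target_points_xy: np.ndarray,
                                                  relative_velocity_xy: np.ndarray,
                                                  heading_unit_xy: np.ndarray,
                                                  acquisition_time_diff_s: float) -> np.ndarray:
    """Adjust (N, 2) target laser point coordinates using a longitudinal and a
    lateral compensation amount, as in steps S103A-S103C."""
    # S103A: decompose the relative velocity into a longitudinal component
    # (along the traveling direction) and a lateral component (perpendicular to it).
    longitudinal_speed = float(np.dot(relative_velocity_xy, heading_unit_xy))
    lateral_unit = np.array([-heading_unit_xy[1], heading_unit_xy[0]])
    lateral_speed = float(np.dot(relative_velocity_xy, lateral_unit))

    # S103B: longitudinal compensation = acquisition time difference * longitudinal speed.
    longitudinal_compensation = acquisition_time_diff_s * longitudinal_speed
    # S103C: lateral compensation = acquisition time difference * lateral speed.
    lateral_compensation = acquisition_time_diff_s * lateral_speed

    # Adjust opposite to each component's direction; the order of the two
    # adjustments does not affect the result.
    return (target_points_xy
            - longitudinal_compensation * heading_unit_xy
            - lateral_compensation * lateral_unit)
```

When the X-axis is aligned with the traveling direction, this reduces to subtracting the longitudinal compensation from the X coordinate and the lateral compensation from the Y coordinate, as described above.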
As can be seen from the above, in the solution provided by the embodiment of the present invention, the relative velocity is decomposed into the longitudinal velocity and the transverse velocity, and the position information of the target laser point is adjusted twice in the direction opposite to the direction of the longitudinal velocity and the direction opposite to the direction of the transverse velocity, so that the errors between the target point cloud data and the first point cloud data caused by the relative displacement of the target object generated in the acquisition time difference are offset in the two directions, thereby reducing the errors between the target point cloud data and the first point cloud data after the position information of the target laser point is adjusted.
In an embodiment of the present invention, referring to fig. 3, a flowchart of a third method for processing point cloud data is provided, and in this embodiment, compared with the embodiment shown in fig. 1, the step S101 may be implemented by the following steps S101A-S101D.
Step S101A: a first position of the target object at a first acquisition instant is obtained.
The first position of the target object is an absolute position of the target object, that is, a position of the target object in a running environment where the target object is located.
This step may be implemented according to any one of the following modes (one) to (four).
Mode (one): the execution body of the embodiment of the invention can obtain the relative position information of the target object relative to the running object at the first acquisition time and the position information of the running object, and determine the first position of the target object at the first acquisition time.
For example, if, at the first acquisition time, the target object is located 100 meters due east of the traveling object, and the position information of the traveling object itself indicates that the traveling object is located 200 meters due west of a certain building, it can be determined that the target object is located 100 meters due west of that building.
In one embodiment of the present invention, the position information of the traveling object at the first acquisition time may be position information of the traveling object acquired by a positioning sensor for determining position information, which is attached to the traveling object.
In another embodiment of the present invention, the position information of the traveling object at the first collection time may be position information of the traveling object at the first collection time, which is collected by a positioning sensor installed in a traveling environment where the traveling object is located.
For example, the positioning sensor may be a laser radar, a camera, or the like.
Mode (ii): if the positioning sensor is installed in the driving environment where the target object and the driving object are located, the positioning sensor may collect a first position of the target object at a first collection time and send the collected first position to the execution subject of the embodiment of the present invention.
Mode (III): the data processing device in the target object, that is, the second data processing device may establish communication with the execution subject in the embodiment of the present invention, and after the second data processing device obtains the first position of the target object at the first acquisition time, the second data processing device may send the first position to the execution subject.
Specifically, if the target object is provided with a positioning sensor, the first position obtained by the second data processing apparatus may be acquired by the positioning sensor.
Mode (four): the first position of the target object at the first acquisition time may also be obtained by step S101A1 in the subsequent embodiment, which is not described in detail here.
Step S101B: and obtaining a second position of the target object at the acquisition time of the third point cloud data.
Wherein the third point cloud data is point cloud data, collected by the first laser radar, whose acquisition time differs from the first acquisition time by less than a second preset time difference.
The second preset time difference may be 30 milliseconds, 50 milliseconds, or another set duration.
The third point cloud data may be point cloud data whose collection time is earlier than the first collection time, or point cloud data whose collection time is later than the first collection time. If, among the point cloud data collected by the first laser radar, there are multiple frames whose collection times differ from the first collection time by less than the second preset time difference, a plurality of third point cloud data may be determined, and the collection time of each third point cloud data may be earlier or later than the first collection time. For each third point cloud data, a second position of the target object at the collection time of that third point cloud data may be obtained.
Specifically, the step is similar to the step S101A, and the difference is only that the obtained time corresponding to the position of the target object is different, and the manner of obtaining the position of the target object is the same, which is not described herein.
Step S101C: and calculating the target speed of the target object at the first acquisition moment based on the first position and the second position.
Because the first position corresponds to the first acquisition time, the second position corresponds to the acquisition time of the third point cloud data, and the displacement generated by the target object between the first acquisition time and the acquisition time of the third point cloud data can be determined based on the first position and the second position, wherein the displacement is generated by the self motion of the target object between the two acquisition times.
In this case, since the second preset time difference that is usually set is often small, the time difference between the two times is small, so it can be considered that the target speed of the target object is kept unchanged between the first acquisition time and the third point cloud data acquisition time, and therefore the displacement generated by the target object can be divided by the time difference between the first acquisition time and the third point cloud data acquisition time, and the obtained speed is the target speed of the target object at the first acquisition time.
In another case, there may be a plurality of third point cloud data. For the plurality of third point cloud data, all of them may be collected before the first acquisition time, all of them may be collected after the first acquisition time, or some of them may be collected before and the rest after the first acquisition time.
In this case, the second positions of the target object at the acquisition timing of the third point cloud data can be obtained for each acquisition timing of the third point cloud data, and therefore, the number of the second positions may be plural. Based on the first position and the plurality of second positions, a target speed of the target object at the first acquisition instant may be calculated.
Since the second preset time difference is generally smaller, the motion of the target object between the first acquisition time and the acquisition time of the third point cloud data can be approximately regarded as uniform motion or uniform speed change motion. Based on the first location and the plurality of second locations, a plurality of speeds of the target object between the first acquisition time and the acquisition time of the plurality of different third point cloud data may be calculated. If the calculated speeds are the same, the motion of the target object can be considered to be uniform motion, and at the moment, the calculated speed is determined to be the target speed of the target object at the first acquisition moment; if the calculated speeds are different, the acceleration of the target object can be further calculated based on the calculated speeds, and then the target speed of the target object at the first acquisition time can be calculated based on the acceleration of the target object and the calculated speeds.
The speed or acceleration may be calculated using prior art techniques and will not be described in detail herein.
Step S101D: based on the travel speed of the travel object at the first acquisition time and the target speed, a relative speed of the target object with respect to the travel object at the first acquisition time is determined.
The running speed of the running object at the first collecting time may be a running speed of the running object measured by a speed sensor for measuring the speed, which is mounted on the running object; the speed measuring device may be further configured to measure a running speed of the running object at the first acquisition time, and send the measured running speed to the execution body according to the embodiment of the present invention.
The relative speed of the target object with respect to the traveling object is obtained by subtracting the traveling speed of the traveling object from the target speed.
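A minimal sketch of steps S101A to S101D under the assumption that the target object moves at a constant speed between the two acquisition times; positions and speeds are represented as NumPy vectors, and the function names are illustrative.

```python
import numpy as np

def target_speed_from_positions(first_position: np.ndarray,
                                second_position: np.ndarray,
                                first_acquisition_time_s: float,
                                third_acquisition_time_s: float) -> np.ndarray:
    """Target speed (velocity vector) at the first acquisition time, assuming
    uniform motion between the acquisition time of the third point cloud data
    and the first acquisition time (the sign of the time difference does not
    matter, since displacement and time difference flip together)."""
    dt = first_acquisition_time_s - third_acquisition_time_s
    return (first_position - second_position) / dt

def relative_speed(target_speed_vec: np.ndarray,
                   traveling_speed_vec: np.ndarray) -> np.ndarray:
    """S101D: relative speed of the target object with respect to the traveling
    object, obtained by subtracting the traveling speed from the target speed."""
    return target_speed_vec - traveling_speed_vec
```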
In one embodiment of the present invention, the relative speed of the target object with respect to the traveling object at the first acquisition time may also be obtained by target tracking: for each frame of first point cloud data, the target laser point corresponding to the target object is determined in that frame; target tracking is then performed on the target object based on the position information of these target laser points, and the movement track of the target object is determined; the target speed of the target object is determined based on the movement track; and the relative speed of the target object with respect to the traveling object at the first acquisition time is then determined from the target speed.
Target tracking may be implemented with a Kalman filtering method, a particle filtering method, or another prior-art target tracking method.
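As one possible illustration only, the sketch below shows a minimal constant-velocity Kalman filter of the kind such a tracker might use (Python/numpy, with hypothetical noise parameters); it is merely one of the prior-art options mentioned above, not the method prescribed by this embodiment:

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 2-D constant-velocity Kalman filter; state is [x, y, vx, vy]."""

    def __init__(self, initial_pos, pos_noise=0.5, accel_noise=1.0):
        self.x = np.array([initial_pos[0], initial_pos[1], 0.0, 0.0])
        self.P = np.diag([1.0, 1.0, 10.0, 10.0])        # initial state uncertainty
        self.R = np.eye(2) * pos_noise ** 2             # measurement noise (position)
        self.accel_noise = accel_noise                  # process noise scale
        self.H = np.array([[1.0, 0.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0, 0.0]])       # only position is observed

    def predict(self, dt):
        F = np.array([[1.0, 0.0, dt, 0.0],
                      [0.0, 1.0, 0.0, dt],
                      [0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
        q = self.accel_noise ** 2                       # simplified diagonal process noise
        Q = q * np.diag([dt ** 4 / 4, dt ** 4 / 4, dt ** 2, dt ** 2])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, measured_pos):
        z = np.asarray(measured_pos, dtype=float)
        y = z - self.H @ self.x                         # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

    @property
    def velocity(self):
        return self.x[2:]                               # estimated [vx, vy]
```

Feeding the position of the target object determined from each successive first point cloud frame into predict/update then yields a smoothed estimate of the target speed along the movement track.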
From the foregoing, in the solution provided by the embodiment of the present invention, the first position of the target object at the first acquisition time and its second position at the acquisition time of the third point cloud data are obtained. Because the difference between the first position and the second position is the displacement of the target object between the first acquisition time and the acquisition time of the third point cloud data, the target speed of the target object at the first acquisition time can be accurately calculated from the first position and the second position. Based on the running speed of the running object at the first acquisition time and the calculated target speed, the relative speed of the target object with respect to the running object at the first acquisition time can then be accurately determined, so that the position information of the target laser point in the target point cloud data can be accurately adjusted.
In one embodiment of the present invention, the above step S101A may be implemented by the following step S101A1.
Step S101A1: and performing point cloud fusion processing on the first point cloud data and the target point cloud data to obtain first fusion point cloud data, determining a first laser point corresponding to the target object in the first fusion point cloud data, and determining a first position of the target object based on the position information of the first laser point.
Specifically, performing the point cloud fusion processing on the first point cloud data and the target point cloud data means superimposing all laser points included in the first point cloud data and all laser points included in the target point cloud data to obtain the first fusion point cloud data; that is, the first fusion point cloud data is the set of all laser points in the first point cloud data and the target point cloud data.
For example, if the first point cloud data includes 100 laser points and the target point cloud data includes 120 laser points, the first fusion point cloud data obtained by the point cloud fusion processing includes 100+120=220 laser points.
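In code, such a fusion is simply the concatenation of the two point sets; a minimal sketch (Python/numpy, with hypothetical array contents matching the example above):

```python
import numpy as np

# Hypothetical inputs: each point cloud is an (N, 3) array of x/y/z coordinates
# already expressed in the same coordinate frame.
first_point_cloud = np.random.rand(100, 3)    # e.g. 100 laser points
target_point_cloud = np.random.rand(120, 3)   # e.g. 120 laser points

# Point cloud fusion here is the union (concatenation) of all laser points.
first_fusion_point_cloud = np.vstack([first_point_cloud, target_point_cloud])
print(first_fusion_point_cloud.shape)         # -> (220, 3)
```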
Similar to the above step S102, in the first fusion point cloud data, the first laser point corresponding to the target object may be determined by performing feature extraction on the first fusion point cloud data based on the object feature of the target object.
In the first fusion point cloud data, the first laser point corresponding to the target object may also be determined by performing point cloud clustering processing on the first fusion point cloud data. For the specific steps of the point cloud clustering processing, reference may be made to the embodiment shown in step S102A of fig. 4, which are not repeated here.
The position information of a first laser point characterizes, relative to the traveling object, the position of the point at which the laser beam emitted by the laser radar is reflected on the target object, so the relative position of the target object with respect to the traveling object can be determined based on the position information of each first laser point. Once the relative position of the target object with respect to the traveling object is determined, the first position of the target object can be determined by combining it with the absolute position of the traveling object.
Specifically, the geometric shape of the target object may be determined according to the position information of each first laser point, the geometric center of the geometric shape is obtained through calculation, and the relative position of the target object with respect to the driving object is obtained according to the position information represented by the geometric center.
In one embodiment of the present invention, the position information of a first laser point may be represented as three-dimensional coordinates in a spatial coordinate system whose origin is located at the position of the traveling object. The coordinate on each axis then represents the distance, along the direction of that axis, between the traveling object and the point on the target object corresponding to the first laser point. From the three-dimensional coordinates of the first laser points, an average coordinate, that is, the three-dimensional coordinate of the geometric center, can be calculated. This average coordinate can be regarded as the three-dimensional coordinate of the target object in the spatial coordinate system, and the position information it reflects is the relative position of the target object with respect to the traveling object.
For example, if the three-dimensional coordinates of first laser point a are (15, 48, 42), those of first laser point b are (36, 78, 3), and those of first laser point c are (6, 90, 33), the coordinates of the geometric center on the three coordinate axes are (15+36+6)/3=19, (48+78+90)/3=72 and (42+3+33)/3=26, so the three-dimensional coordinates of the geometric center are (19, 72, 26).
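A minimal sketch of this geometric-center calculation, using the coordinates of the example above (Python/numpy; illustrative only):

```python
import numpy as np

# The three first laser points a, b and c from the example above.
first_laser_points = np.array([
    [15.0, 48.0, 42.0],
    [36.0, 78.0,  3.0],
    [ 6.0, 90.0, 33.0],
])

# The geometric center is the per-axis mean of the laser point coordinates and is
# taken as the position of the target object relative to the traveling object.
geometric_center = first_laser_points.mean(axis=0)
print(geometric_center)  # -> [19. 72. 26.]
```

The alternatives mentioned in the next paragraph (weighted average, maximum, minimum, median) simply replace the per-axis mean with the corresponding per-axis statistic.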
In another embodiment of the present invention, in the case where the position information is represented in the form of coordinates, for each coordinate axis, a weighted average, a maximum, a minimum, a median, or the like of the coordinates of the respective first laser points on the coordinate axis may be determined as the coordinates of the target object on the coordinate axis, thereby obtaining coordinates representing the relative position between the target object and the traveling object.
In another embodiment of the present invention, the above step S101B may be implemented by the following step S101B1.
Step S101B1: and carrying out point cloud fusion processing on the third point cloud data and the fourth point cloud data to obtain second fusion point cloud data, determining a second laser point corresponding to the target object in the second fusion point cloud data, and determining a second position of the target object based on the position information of the second laser point.
The fourth point cloud data is: point cloud data that is acquired by the second laser radar and whose acquisition time differs from the acquisition time of the third point cloud data by less than a third preset time difference. The third preset time difference may be the same as or different from the first preset time difference.
The difference between this step and step S101A1 is that step S101A1 performs the point cloud fusion processing on the first point cloud data and the target point cloud data, whereas this step performs it on the third point cloud data and the fourth point cloud data; the remaining details are not repeated here.
In one embodiment of the present invention, step S101A1 and step S101B1 may both be applied, to obtain the first position and the second position respectively; alternatively, only step S101A1 may be used to obtain the first position, with the second position obtained in some manner other than step S101B1; or only step S101B1 may be used to obtain the second position, with the first position obtained in some manner other than step S101A1.
In the above scheme provided by the embodiment of the present invention, a plurality of point cloud data are fused into one fusion point cloud data, and the position information of the target object is then determined based on the fusion point cloud data. Because the fusion point cloud data includes the laser points of a plurality of point cloud data, it contains laser points of the corresponding target object collected by different laser radars from multiple angles; compared with determining the position information of the target object based on a single point cloud data, determining it based on the fusion point cloud data can therefore improve the accuracy of the determined position information of the target object.
In an embodiment of the present invention, referring to fig. 4, a flowchart of a fourth point cloud data processing method is provided. Compared with the embodiment shown in fig. 1, in this embodiment step S102, of determining the target laser point corresponding to the target object in the target point cloud data, includes the following step S102A.
Step S102A: and performing point cloud clustering processing on the target point cloud data, and determining laser points belonging to the same cluster obtained by clustering as target laser points of a corresponding target object.
The purpose of the point cloud clustering is to assign laser points corresponding to the same object in the point cloud data to the same cluster, with laser points corresponding to different objects belonging to different clusters. The point cloud clustering processing may be implemented in various prior-art ways, for example Euclidean clustering, density clustering, or supervoxel clustering.
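As an illustration of one such prior-art option, the sketch below implements a naive Euclidean clustering by region growing over a fixed distance threshold (Python/numpy; the threshold and minimum cluster size are hypothetical values, and a practical implementation would accelerate the neighbour search with a k-d tree):

```python
import numpy as np

def euclidean_cluster(points, distance_threshold=0.5, min_cluster_size=3):
    """Group laser points so that points closer than distance_threshold end up in
    the same cluster. Returns a list of index arrays, one per cluster. O(N^2)."""
    points = np.asarray(points, dtype=float)
    n = len(points)
    unassigned = np.ones(n, dtype=bool)
    clusters = []
    for seed in range(n):
        if not unassigned[seed]:
            continue
        unassigned[seed] = False
        cluster, frontier = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            # all still-unassigned points within the distance threshold of point idx
            dists = np.linalg.norm(points - points[idx], axis=1)
            neighbours = np.where(unassigned & (dists < distance_threshold))[0]
            unassigned[neighbours] = False
            cluster.extend(neighbours.tolist())
            frontier.extend(neighbours.tolist())
        if len(cluster) >= min_cluster_size:
            clusters.append(np.array(cluster))
    return clusters
```

Each returned cluster is then treated as the laser points of one candidate object; the cluster corresponding to the target object gives the target laser points.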
Since the time difference between the first acquisition time and the acquisition time of the target point cloud data is generally small, the relative displacement of the target object with respect to the traveling object is also small, so for the target laser points corresponding to the same target object, the position information of those laser points does not differ greatly between the first point cloud data and the target point cloud data. According to this rule, the laser points belonging to one cluster, corresponding to one target object, can first be determined in the first point cloud data; in the target point cloud data, the laser points belonging to the cluster whose position information is most similar to that of the determined laser points are then determined as the target laser points corresponding to the same target object.
In addition, by performing point cloud clustering processing on the target point cloud data, the laser points belonging to the same cluster in the target point cloud data can be determined and taken directly as the target laser points corresponding to the target object. Operations such as calculating the relative speed and adjusting the position information can then be performed on these target laser points without further identifying the target object from them; as long as laser points belonging to the same cluster are obtained, they can be determined as the target laser points corresponding to a target object.
In the above scheme provided by the embodiment of the present invention, point cloud clustering processing is performed on the target point cloud data, and the clustered laser points belonging to the same cluster are determined as the target laser points of the corresponding target object. When point cloud data of the target object is collected, the laser points reflected by the target object and received by the laser radar are usually compact, that is, the distances between the target laser points are small, and the point cloud clustering processing groups laser points into the same cluster precisely according to the distances between them. Therefore, determining the laser points belonging to the same cluster, obtained by performing point cloud clustering processing on the target point cloud data, as the target laser points of the corresponding target object allows the target laser points to be determined more accurately.
Corresponding to the point cloud data processing method, the embodiment of the invention also provides a point cloud data processing device.
Referring to fig. 5, there is provided a schematic structural diagram of a first point cloud data processing apparatus, the apparatus including:
A speed obtaining module 501, configured to obtain a relative speed of a target object with respect to a driving object at a first acquisition time, where the first acquisition time is: the acquisition time of first point cloud data; the first point cloud data is: point cloud data collected by a first laser radar installed on the driving object; and the target object is: an object in a running environment in which the running object is located;
The target determining module 502 is configured to determine a target laser point corresponding to the target object in target point cloud data, where the target point cloud data is: second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is: point cloud data acquired by a second lidar, other than the first lidar, mounted to the traveling object;
An information adjustment module 503, configured to adjust the position information of the target laser point in a direction opposite to the direction of the relative speed based on an acquisition time difference and the relative speed, where the acquisition time difference is: the time difference between the first acquisition time and the acquisition time of the target point cloud data.
From the above, in the solution provided by the embodiment of the present invention, the relative speed of the target object with respect to the traveling object at the first acquisition time is obtained first, the target laser point of the corresponding target object is then determined in the target point cloud data, and finally the position information of the target laser point is adjusted in the direction opposite to the direction of the relative speed based on the acquisition time difference and the relative speed. Since the relative speed is the speed of the target object relative to the traveling object, and the acquisition time difference is the time difference between the first acquisition time and the acquisition time of the target point cloud data, the relative displacement of the target object with respect to the traveling object can be determined from the acquisition time difference and the relative speed. Adjusting the position information of the target laser point in the direction opposite to the direction of the relative speed according to this relative displacement offsets the error between the target point cloud data and the first point cloud data caused by the relative displacement of the target object, so that the adjusted target point cloud data approaches the point cloud data that the second laser radar corresponding to the target point cloud data could have acquired at the first acquisition time. Therefore, the scheme provided by the embodiment of the present invention reduces the error between the point cloud data acquired by the laser radars.
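Purely as an illustration of this compensation, a minimal Python sketch is given below; the function and argument names are hypothetical, and the sign convention assumes that the target point cloud was acquired acquisition_time_diff seconds after the first acquisition time and that all points and velocities are expressed in the traveling object's coordinate frame:

```python
import numpy as np

def compensate_target_points(target_laser_points, relative_velocity, acquisition_time_diff):
    """Shift the target laser points opposite to the relative velocity so that the
    target point cloud approximates what the second laser radar would have
    observed at the first acquisition time.

    target_laser_points   -- (N, 3) array of the target object's laser points
    relative_velocity     -- (3,) relative velocity of the target object with
                             respect to the traveling object, m/s
    acquisition_time_diff -- acquisition time of the target point cloud minus
                             the first acquisition time, in seconds
    """
    relative_displacement = np.asarray(relative_velocity, dtype=float) * acquisition_time_diff
    # subtracting the displacement cancels the motion of the target object that
    # occurred between the first acquisition time and the target point cloud's
    # acquisition time
    return np.asarray(target_laser_points, dtype=float) - relative_displacement
```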
In one embodiment of the present invention, the information adjustment module 503 is specifically configured to:
Decomposing the relative speed into a longitudinal speed and a transverse speed, wherein the direction of the longitudinal speed is the same as the running direction of the running object, and the direction of the transverse speed is perpendicular to the running direction of the running object;
Obtaining a longitudinal compensation amount based on the acquisition time difference and the longitudinal speed, and adjusting the position information of the target laser point based on the longitudinal compensation amount in a direction opposite to the direction of the longitudinal speed;
Based on the acquisition time difference and the lateral velocity, a lateral compensation amount is obtained, and the position information of the target laser spot is adjusted based on the lateral compensation amount in a direction opposite to the direction of the lateral velocity.
As can be seen from the above, in the solution provided by the embodiment of the present invention, the relative speed is decomposed into the longitudinal speed and the lateral speed, and the position information of the target laser point is adjusted twice, in the direction opposite to the direction of the longitudinal speed and in the direction opposite to the direction of the lateral speed. The errors between the target point cloud data and the first point cloud data caused by the relative displacement of the target object during the acquisition time difference are thus offset in both directions, which reduces the error between the target point cloud data and the first point cloud data after the position information of the target laser point is adjusted.
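One way such a decomposition could be realised is by projecting the relative velocity onto the traveling object's driving direction and onto the perpendicular of that direction; the sketch below (Python/numpy, hypothetical names, planar case) is illustrative only:

```python
import numpy as np

def decompose_relative_velocity(relative_velocity, heading_direction):
    """Split a planar relative velocity into a longitudinal component along the
    traveling object's driving direction and a lateral component perpendicular to it."""
    v = np.asarray(relative_velocity, dtype=float)[:2]
    h = np.asarray(heading_direction, dtype=float)[:2]
    h = h / np.linalg.norm(h)          # unit vector along the driving direction
    longitudinal = np.dot(v, h) * h    # component along the driving direction
    lateral = v - longitudinal         # component perpendicular to the driving direction
    return longitudinal, lateral

# The longitudinal and lateral compensation amounts are then the respective
# components multiplied by the acquisition time difference, applied to the
# target laser points in the directions opposite to the two components.
```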
In an embodiment of the present invention, referring to fig. 6, a schematic structural diagram of a second point cloud data processing apparatus is provided, and in this embodiment, compared to the embodiment shown in fig. 5, the speed obtaining module 501 includes:
A first location submodule 501A configured to obtain a first location of the target object at a first acquisition time;
A second location sub-module 501B, configured to obtain a second location of the target object at the acquisition time of third point cloud data, where the third point cloud data is: point cloud data that is acquired by the first laser radar and whose acquisition time differs from the first acquisition time by less than a second preset time difference;
a target speed submodule 501C, configured to calculate a target speed of the target object at the first acquisition time based on the first position and the second position;
a relative speed submodule 501D is configured to determine a relative speed of the target object with respect to the driving object at the first acquisition time based on the driving speed of the driving object at the first acquisition time and the target speed.
From the foregoing, in the solution provided by the embodiment of the present invention, the first position of the target object at the first acquisition time and its second position at the acquisition time of the third point cloud data are obtained. Because the difference between the first position and the second position is the displacement of the target object between the first acquisition time and the acquisition time of the third point cloud data, the target speed of the target object at the first acquisition time can be accurately calculated from the first position and the second position. Based on the running speed of the running object at the first acquisition time and the calculated target speed, the relative speed of the target object with respect to the running object at the first acquisition time can then be accurately determined, so that the position information of the target laser point in the target point cloud data can be accurately adjusted.
In one embodiment of the present invention, the first location submodule 501A is specifically configured to:
Performing point cloud fusion processing on the first point cloud data and the target point cloud data to obtain first fusion point cloud data, determining a first laser point corresponding to the target object in the first fusion point cloud data, and determining a first position of the target object based on position information of the first laser point;
in another embodiment of the present invention, the second location sub-module 501B is specifically configured to:
Performing point cloud fusion processing on the third point cloud data and fourth point cloud data to obtain second fusion point cloud data, determining a second laser point corresponding to the target object in the second fusion point cloud data, and determining a second position of the target object based on position information of the second laser point, wherein the fourth point cloud data is: point cloud data that is acquired by the second laser radar and whose acquisition time differs from the acquisition time of the third point cloud data by less than a third preset time difference.
In the above scheme provided by the embodiment of the present invention, a plurality of point cloud data are fused into one fusion point cloud data, and the position information of the target object is then determined based on the fusion point cloud data. Because the fusion point cloud data includes the laser points of a plurality of point cloud data, it contains laser points of the corresponding target object collected by different laser radars from multiple angles; compared with determining the position information of the target object based on a single point cloud data, determining it based on the fusion point cloud data can therefore improve the accuracy of the determined position information of the target object.
In one embodiment of the present invention, the objective determining module 502 is specifically configured to:
And performing point cloud clustering processing on the target point cloud data, and determining laser points belonging to the same cluster obtained by clustering as target laser points corresponding to the target object.
In the above scheme provided by the embodiment of the present invention, point cloud clustering processing is performed on the target point cloud data, and the clustered laser points belonging to the same cluster are determined as the target laser points of the corresponding target object. When point cloud data of the target object is collected, the laser points reflected by the target object and received by the laser radar are usually compact, that is, the distances between the target laser points are small, and the point cloud clustering processing groups laser points into the same cluster precisely according to the distances between them. Therefore, determining the laser points belonging to the same cluster, obtained by performing point cloud clustering processing on the target point cloud data, as the target laser points of the corresponding target object allows the target laser points to be determined more accurately.
In one embodiment of the present invention, the first lidar is: and a laser radar with highest resolution among the laser radars mounted on the traveling object.
According to the scheme provided by the embodiment of the present invention, the error between the point cloud data acquired by the laser radars is reduced by adjusting the position information of the target laser point in the target point cloud data, that is, the adjusted target point cloud data approaches the point cloud data that the second laser radar corresponding to the target point cloud data could have acquired at the first acquisition time. Since the first acquisition time is the acquisition time of the first point cloud data, the position information of the target laser point in the target point cloud data is adjusted with the first point cloud data as the reference. In the embodiment of the present invention, the first laser radar is the laser radar with the highest resolution among the laser radars installed on the traveling object, so the first point cloud data it acquires is more detailed than the point cloud data that would be obtained if another laser radar served as the first laser radar, and using this first point cloud data as the reference can improve the accuracy of adjusting the position information of the target laser point in the target point cloud data.
The embodiment of the present invention further provides an electronic device, as shown in fig. 7, including a processor 701, a communication interface 702, a memory 703 and a communication bus 704, where the processor 701, the communication interface 702, and the memory 703 perform communication with each other through the communication bus 704,
A memory 703 for storing a computer program;
the processor 701 is configured to execute the program stored in the memory 703, and implement the following steps:
obtaining a relative speed of a target object relative to a driving object at a first acquisition time, wherein the first acquisition time is: the acquisition time of first point cloud data, and the first point cloud data is: point cloud data collected by a first laser radar installed on the driving object;
Determining a target laser point corresponding to the target object in target point cloud data, wherein the target point cloud data is: the second point cloud data whose acquisition time is closest to the first acquisition time, and the second point cloud data is: point cloud data acquired by a second lidar mounted to the traveling object in addition to the first lidar;
Adjusting position information of the target laser point in a direction opposite to a direction of the relative speed based on an acquisition time difference and the relative speed, wherein the acquisition time difference is: the time difference between the first acquisition time and the acquisition time of the target point cloud data.
In addition, the electronic device may implement the other point cloud data processing methods described in the foregoing method embodiments, which are not described in detail here.
The communication bus mentioned for the above electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a random access memory (RAM) or a non-volatile memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In yet another embodiment of the present invention, there is also provided a computer readable storage medium having stored therein a computer program which when executed by a processor implements the steps of any of the point cloud data processing methods described above.
In yet another embodiment of the present invention, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the point cloud data processing methods of the above embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center containing an integration of one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the descriptions of the apparatus, electronic device, computer-readable storage medium, and computer program product embodiments are relatively brief, since for the relevant details reference may be made to the description of the method embodiments.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (14)

1. A method for processing point cloud data, the method comprising:
Obtaining a relative speed of a target object relative to a driving object at a first acquisition time, wherein the first acquisition time is: the acquisition time of first point cloud data; the first point cloud data is: point cloud data collected by a first laser radar installed on the driving object; and the target object is: an object in a running environment in which the running object is located;
Determining a target laser point corresponding to the target object in target point cloud data, wherein the target point cloud data is: second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is: point cloud data acquired by a second lidar mounted to the traveling object in addition to the first lidar;
Adjusting position information of the target laser point in a direction opposite to a direction of the relative speed based on an acquisition time difference and the relative speed, wherein the acquisition time difference is: a time difference between the first acquisition time and the acquisition time of the target point cloud data;
The adjusting the position information of the target laser point in a direction opposite to the direction of the relative speed based on the acquisition time difference and the relative speed includes:
And multiplying the acquisition time difference by the relative speed to obtain the relative displacement of the target object relative to the running object in the acquisition time difference, and adjusting the position information of the target laser point in the direction opposite to the direction of the relative speed according to the relative displacement, so that errors between target point cloud data and first point cloud data caused by the relative displacement of the target object can be offset, and the adjusted target point cloud data approaches to the point cloud data which can be acquired by the second laser radar corresponding to the target point cloud data at the first acquisition time.
2. The method of claim 1, wherein adjusting the position information of the target laser spot in a direction opposite to the direction of the relative velocity based on the acquisition time difference and the relative velocity comprises:
Decomposing the relative speed into a longitudinal speed and a transverse speed, wherein the direction of the longitudinal speed is the same as the running direction of the running object, and the direction of the transverse speed is perpendicular to the running direction of the running object;
Obtaining a longitudinal compensation amount based on the acquisition time difference and the longitudinal speed, and adjusting the position information of the target laser point based on the longitudinal compensation amount in a direction opposite to the direction of the longitudinal speed;
Based on the acquisition time difference and the lateral velocity, a lateral compensation amount is obtained, and the position information of the target laser spot is adjusted based on the lateral compensation amount in a direction opposite to the direction of the lateral velocity.
3. The method according to claim 1 or 2, wherein said obtaining the relative speed of the target object with respect to the driving object at the first acquisition instant comprises:
Obtaining a first position of the target object at a first acquisition time;
Obtaining a second position of the target object at a collection time of third point cloud data, wherein the third point cloud data is: point cloud data that is acquired by the first laser radar and whose acquisition time differs from the first acquisition time by less than a second preset time difference;
calculating a target speed of the target object at the first acquisition moment based on the first position and the second position;
And determining the relative speed of the target object relative to the running object at the first acquisition time based on the running speed of the running object at the first acquisition time and the target speed.
4. A method according to claim 3, wherein said obtaining a first position of said target object at a first acquisition instant comprises:
Performing point cloud fusion processing on the first point cloud data and the target point cloud data to obtain first fusion point cloud data, determining a first laser point corresponding to the target object in the first fusion point cloud data, and determining a first position of the target object based on position information of the first laser point;
And/or
The obtaining the second position of the target object at the time of the collection of the third point cloud data includes:
Performing point cloud fusion processing on the third point cloud data and fourth point cloud data to obtain second fusion point cloud data, determining a second laser point corresponding to the target object in the second fusion point cloud data, and determining a second position of the target object based on position information of the second laser point, wherein the fourth point cloud data is: point cloud data that is acquired by the second laser radar and whose acquisition time differs from the acquisition time of the third point cloud data by less than a third preset time difference.
5. The method according to claim 1 or 2, wherein the determining a target laser point corresponding to the target object in target point cloud data comprises:
And performing point cloud clustering processing on the target point cloud data, and determining laser points belonging to the same cluster obtained by clustering as target laser points corresponding to the target object.
6. A method according to claim 1 or 2, characterized in that,
The first laser radar is: and a laser radar with highest resolution among the laser radars mounted on the traveling object.
7. A point cloud data processing apparatus, the apparatus comprising:
The speed obtaining module is used for obtaining the relative speed of the target object relative to the running object at a first acquisition time, wherein the first acquisition time is: the acquisition time of first point cloud data; the first point cloud data is: point cloud data collected by a first laser radar installed on the running object; and the target object is: an object in a running environment in which the running object is located;
the target determining module is configured to determine a target laser point corresponding to the target object in target point cloud data, wherein the target point cloud data is: second point cloud data whose acquisition time differs from the first acquisition time by less than a first preset time difference, and the second point cloud data is: point cloud data acquired by a second lidar mounted to the traveling object in addition to the first lidar;
the information adjustment module is used for adjusting the position information of the target laser point in the direction opposite to the direction of the relative speed based on the acquisition time difference and the relative speed, wherein the acquisition time difference is as follows: a time difference between the first acquisition time and the acquisition time of the target point cloud data;
The information adjustment module is specifically configured to multiply the acquisition time difference with the relative speed to obtain a relative displacement of the target object relative to the traveling object within the acquisition time difference, and adjust the position information of the target laser point in a direction opposite to the direction of the relative speed according to the relative displacement, so that an error between the target point cloud data and the first point cloud data caused by the relative displacement of the target object can be offset, and the adjusted target point cloud data approaches to the point cloud data which can be acquired by the second laser radar corresponding to the target point cloud data at the first acquisition time.
8. The apparatus of claim 7, wherein the information adjustment module is specifically configured to:
Decomposing the relative speed into a longitudinal speed and a transverse speed, wherein the direction of the longitudinal speed is the same as the running direction of the running object, and the direction of the transverse speed is perpendicular to the running direction of the running object;
Obtaining a longitudinal compensation amount based on the acquisition time difference and the longitudinal speed, and adjusting the position information of the target laser point based on the longitudinal compensation amount in a direction opposite to the direction of the longitudinal speed;
Based on the acquisition time difference and the lateral velocity, a lateral compensation amount is obtained, and the position information of the target laser spot is adjusted based on the lateral compensation amount in a direction opposite to the direction of the lateral velocity.
9. The apparatus according to claim 7 or 8, wherein the speed obtaining module comprises:
The first position sub-module is used for obtaining a first position of the target object at a first acquisition time;
A second location sub-module, configured to obtain a second location of the target object at a time of collection of third point cloud data, wherein the third point cloud data is: point cloud data that is acquired by the first laser radar and whose acquisition time differs from the first acquisition time by less than a second preset time difference;
The target speed sub-module is used for calculating the target speed of the target object at the first acquisition moment based on the first position and the second position;
And the relative speed sub-module is used for determining the relative speed of the target object relative to the running object at the first acquisition time based on the running speed of the running object at the first acquisition time and the target speed.
10. The apparatus of claim 9, wherein the first location sub-module is specifically configured to:
Performing point cloud fusion processing on the first point cloud data and the target point cloud data to obtain first fusion point cloud data, determining a first laser point corresponding to the target object in the first fusion point cloud data, and determining a first position of the target object based on position information of the first laser point;
And/or
The second position sub-module is specifically configured to:
Performing point cloud fusion processing on the third point cloud data and fourth point cloud data to obtain second fusion point cloud data, determining a second laser point corresponding to the target object in the second fusion point cloud data, and determining a second position of the target object based on position information of the second laser point, wherein the fourth point cloud data is: point cloud data that is acquired by the second laser radar and whose acquisition time differs from the acquisition time of the third point cloud data by less than a third preset time difference.
11. The apparatus according to claim 7 or 8, wherein the targeting module is specifically configured to:
And performing point cloud clustering processing on the target point cloud data, and determining laser points belonging to the same cluster obtained by clustering as target laser points corresponding to the target object.
12. The apparatus according to claim 7 or 8, wherein,
The first laser radar is: and a laser radar with highest resolution among the laser radars mounted on the traveling object.
13. An electronic device, comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with each other through the communication bus;
A memory for storing a computer program;
a processor for carrying out the method steps of any one of claims 1-6 when executing a program stored on a memory.
14. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored therein a computer program which, when executed by a processor, implements the method steps of any of claims 1-6.
