US20210208261A1 - Method for identification of a noise point used for lidar, and lidar system


Info

Publication number
US20210208261A1
Authority
US
United States
Prior art keywords
point
noise
factor
weight
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/146,177
Inventor
Xiaotong Zhou
Shaoqing Xiang
Zhaoming Zeng
Zhenlei Shao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Velodyne Lidar USA Inc
Original Assignee
Velodyne Lidar USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Velodyne Lidar USA Inc filed Critical Velodyne Lidar USA Inc
Priority to US17/146,177 priority Critical patent/US20210208261A1/en
Publication of US20210208261A1 publication Critical patent/US20210208261A1/en
Assigned to HERCULES CAPITAL, INC., AS AGENT reassignment HERCULES CAPITAL, INC., AS AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: VELODYNE LIDAR USA, INC.
Assigned to VELODYNE LIDAR USA, INC. reassignment VELODYNE LIDAR USA, INC. RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT RECORDED AT REEL/FRAME NO. 063593/0463 Assignors: HERCULES CAPITAL, INC.
Assigned to VELODYNE LIDAR USA, INC. reassignment VELODYNE LIDAR USA, INC. MERGER AND CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: VELODYNE LIDAR USA, INC., VELODYNE LIDAR, INC., VL MERGER SUB INC.
Assigned to VELODYNE LIDAR, INC. reassignment VELODYNE LIDAR, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HESAI PHOTONICS TECHNOLOGY CO., LTD.
Assigned to HESAI PHOTONICS TECHNOLOGY CO., LTD. reassignment HESAI PHOTONICS TECHNOLOGY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHAO, Zhenlei, XIANG, SHAOQING, ZENG, ZHAOMING, ZHOU, Xiaotong

Classifications

    • G01S 7/495: Counter-measures or counter-counter-measures using electronic or electro-optical means
    • G01S 7/487: Extracting wanted echo signals, e.g. pulse detection (under G01S 7/483 Details of pulse systems and G01S 7/486 Receivers)
    • G01N 21/55: Specular reflectivity (systems in which incident light is modified in accordance with the properties of the material investigated)
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S 7/4802: Using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S 7/4815: Constructional features, e.g. arrangements of optical elements, of transmitters alone, using multiple transmitters
    • G01S 7/497: Means for monitoring or calibrating
    • G01N 2201/06113: Coherent sources; lasers
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • the present disclosure relates to the field of LiDAR, and, more specifically, to a method for identification of a noise point used for a LiDAR, and a LiDAR system.
  • LiDAR light detection and ranging
  • the working principle of LiDAR is as follows: a LiDAR emitter emits a laser beam; the beam strikes an object and returns to a laser receiver by diffuse reflection; the radar module then multiplies the time interval between emission and reception of the signal by the velocity of light and divides by 2, so as to calculate the distance between the emitter and the object.
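  • As a minimal illustration of the time-of-flight calculation described above (a sketch with illustrative names and values, not taken from the patent):

```python
# Minimal sketch of the time-of-flight range calculation described above.
# Function and variable names are illustrative assumptions.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_round_trip(time_of_emission_s: float, time_of_reception_s: float) -> float:
    """Distance = (time between emission and reception) * speed of light / 2."""
    return (time_of_reception_s - time_of_emission_s) * SPEED_OF_LIGHT / 2.0

# Example: an echo received about 66.7 ns after emission corresponds to roughly 10 m.
print(round(range_from_round_trip(0.0, 66.7e-9), 2))
```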
  • LiDARs are usually categorized by the number of laser beams they include, for example, single-line LiDAR, 4-line LiDAR, 8-/16-/32-/64-line LiDAR, etc.
  • One or more laser beams are emitted at various angles in the vertical direction, achieving the detection of the three-dimensional profile of a target region through scanning in the horizontal direction. Since a plurality of measuring channels (lines) are equivalent to a plurality of scanning planes of different inclination angles, the more the laser beams in the vertical field of view, the higher the angular resolution in the vertical direction and the greater the density of a laser point cloud.
  • Taking a mechanical LiDAR as an example, it will generate more noise points and thus degrade the quality of the point cloud when used for detection under rainy, snowy or foggy weather. For example, since rain, snow and fog put a high density of water drops, large and small, into the air, a laser beam that strikes a water drop produces reflection echoes and forms corresponding points in the point cloud. As a point of this type does not lie on the detected target object, it should be regarded as a noise point.
  • In addition to the mechanical LiDAR mentioned above, other types of LiDAR, such as galvanometer scanning LiDAR, rotating mirror scanning LiDAR, or pure solid-state LiDAR including Flash LiDAR and phased array LiDAR, will also encounter the same noise point problem when used for detection in rain, snow and fog.
  • the present disclosure provides a method for identification of a noise point used for a LiDAR, comprising:
  • step S 201 receiving a point cloud generated by the LiDAR
  • step S 202 obtaining a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point;
  • step S 203 determining whether the point is a noise point at least based on the distance and the at least one of the reflectivity and the continuity parameter.
  • the step S 202 comprises: obtaining the reflectivity of the point in the point cloud and the distance between the point and the LiDAR, wherein the step S 203 comprises determining whether the point is a noise point based on the reflectivity and the distance.
  • the step S 202 comprises: obtaining the continuity parameter of the point in the point cloud and the distance between the point and the LiDAR, wherein the step S 203 comprises: determining whether the point is a noise point based on the continuity parameter and the distance.
  • the step S 203 comprises: determining that the point is a noise point when the distance is within a predefined range of distance and the reflectivity is less than or equal to a predefined threshold for reflectivity.
  • the step S 203 comprises: calculating a noise point confidence level of the point, and determining that the point is a noise point when the noise point confidence level is beyond a normal range of confidence level, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; and the noise point confidence level equals to the sum of the distance factor and the reflectivity factor, wherein the first weight is greater than the second weight.
  • the step S 202 comprises: obtaining the reflectivity and the continuity parameter of the point and the distance between the point and the LiDAR; wherein the step S 203 comprises: determining whether the point is a noise point based on the distance, the reflectivity and the continuity parameter.
  • the method further comprises: obtaining noise of the point and the number of echo pulses of the point; wherein the step S 203 comprises: determining whether the point is a noise point based on the distance, the reflectivity, the continuity parameter, the noise and the number of echo pulses.
  • the step of determining whether the point is a noise point comprises: calculating a noise point confidence level of the point, and determining that the point is a noise point when the noise point confidence level is beyond a normal range of confidence level, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within a predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight; a noise factor is set to zero when the noise is greater than a threshold for noise, otherwise the noise factor is set as a fourth weight; an echo pulse number factor is set to zero when the number of echo pulses is greater than a threshold for pulse number, otherwise the echo pulse number factor is set as a fifth weight; and the noise point confidence level equals to the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the echo pulse number factor.
  • the first, second, third, fourth and fifth weights satisfy one or more of the following conditions: the first weight equals to the sum of the second and third weights; the second weight equals to the third weight; the first weight equals to the threshold for confidence level; the sum of the fourth and fifth weights equals to the second and/or third weights; and the first weight is greater than the second, third, fourth and fifth weight.
  • the method further comprises: dynamically adjusting the first, second, third, fourth and fifth weights based on the detected state of weather.
  • the step of dynamically adjusting comprises: reducing the second weight and increasing the fourth weight when snowy weather is detected; and/or increasing the fifth weight when foggy weather is detected.
  • the step S 203 comprises: determining that the point is a noise point when the distance is within a predefined range of distance, and the continuity parameter is beyond a normal range of continuity parameter.
  • the step of determining whether the point is a noise point comprises: calculating a noise point confidence level of the point, and determining that the point is a noise point when the noise point confidence level is beyond a normal range of confidence level, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within a predefined range of distance, otherwise the distance factor is set as a first weight; a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight; and the noise point confidence level equals to the sum of the distance factor and the continuity factor, wherein the first weight is greater than the third weight.
  • the present disclosure further discloses a LiDAR system, comprising:
  • a LiDAR configured to scan its surroundings to generate a point cloud
  • a denoising unit coupled to the LiDAR to receive the point cloud, and configured to perform the method for identification of a noise point described above to determine whether a point in the point cloud is a noise point, and filter out noise points in the point cloud;
  • an output unit coupled to the denoising unit, and configured to output the point cloud.
  • the LiDAR system further comprises a control unit coupled to the LiDAR, the denoising unit and the output unit, and capable of enabling or disabling the denoising unit, wherein in an enabled state, the denoising unit filters out noise points in the point cloud, and the output unit outputs the point cloud with the noise points filtered out; and in a disabled mode, the denoising unit is disabled, and the output unit outputs the point cloud with the noise points not being filtered out.
  • the control unit enables the denoising unit when rainy, snowy or foggy weather is detected.
  • the control unit determines that rainy, snowy or foggy weather is detected when the number of noise points goes beyond a predefined threshold.
  • the LiDAR system further comprising an input unit for receiving input from a user, wherein the control unit is capable of enabling or disabling the denoising unit based on the input from the user.
  • the present disclosure further discloses a LiDAR system, comprising:
  • a LiDAR configured to scan its surroundings to generate a point cloud
  • a confidence level calculating unit coupled to the LiDAR to receive the point cloud, and configured to calculate a noise point confidence level of a point in the point cloud at least based on a distance between the point and the LiDAR, and at least one of reflectivity and continuity parameter of the point;
  • an output unit coupled to the LiDAR and the confidence level calculating unit, and configured to output the point cloud and noise point confidence levels of points in the point cloud.
  • the confidence level calculating unit is configured to calculate a noise point confidence level of a point in the point cloud, based on a distance between the point and the LiDAR, reflectivity of the point, continuity parameter of the point, noise of the point and a pulse number factor of the point, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; a continuity factor is set to zero when the continuity parameter is beyond a normal range of continuity parameter, otherwise the continuity factor is set as a third weight; a noise factor is set to zero when the noise is greater than a threshold for noise, otherwise the noise factor is set as a fourth weight; a pulse number factor is set to zero when the number of pulses is greater than a threshold for pulse number, otherwise the pulse number factor is set as a fifth weight; and the noise point confidence level equals to the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the pulse number factor.
  • the LiDAR system further comprises an input unit for receiving input from a user, wherein the input unit is capable of instructing, based on the input from the user, the output unit about whether to filter out noise points in the point cloud.
  • the present disclosure further discloses a device for identification of a noise point used for a LiDAR, comprising:
  • a receiving unit configured to receive a point cloud generated by the LiDAR
  • an obtaining unit configured to obtain a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point;
  • a determining unit configured to determine whether the point is a noise point at least based on the distance and the at least one of the reflectivity and the continuity parameter.
  • the present disclosure further discloses a computer readable storage medium having stored thereon computer executable instructions that, when executed by a processor, perform the method for identification of a noise point described above.
  • FIG. 1 is a schematic illustrating a LiDAR
  • FIG. 2 illustrates a method for identification of noise point used for a LiDAR in accordance with the first aspect of the present disclosure
  • FIG. 3 schematically illustrates the projection of multiple points in a point cloud of a LiDAR onto a plane perpendicular to the laser direction;
  • FIG. 4A illustrates a method for calculating the noise of echoes in accordance with one example of the present disclosure
  • FIG. 4B illustrates a comparison between the waveform noise in the case of snow and the normal waveform noise in accordance with one example of the present disclosure
  • FIG. 5 schematically illustrates a LiDAR system in accordance with the second aspect of the present disclosure
  • FIG. 6 illustrates a LiDAR system in accordance with one preferable example of the present disclosure
  • FIG. 7 illustrates a LiDAR system in accordance with another example of the present disclosure
  • FIG. 8 illustrates a LiDAR system in accordance with another example of the present disclosure
  • FIG. 9 illustrates a noise point identification device for use on a LiDAR in accordance with the third aspect of the present disclosure.
  • FIG. 10 illustrates a computer program product in accordance with the third aspect of the present disclosure.
  • orientation or position relations denoted by such terms as “central” “longitudinal” “latitudinal” “length” “width” “thickness” “above” “below” “front” “rear” “left” “right” “vertical” “horizontal” “top” “bottom” “inside” “outside” “clockwise” “counterclockwise” and the like are based on the orientation or position as shown in the accompanying drawings, and only used for the purpose of facilitating description for the present disclosure and simplification of the description, instead of indicating or suggesting that the denoted devices or elements must be specifically oriented, or configured or operated in some specific orientation. Thus, such terms should not be construed to limit the present disclosure.
  • such terms as “mount”, “link” and “connect” should be understood as generic terms. For example, connection may refer to fixed connection, dismountable connection, or integrated connection; to mechanical connection, electric connection or intercommunication; to direct connection, or connection by an intermediary medium; or even to internal communication between two elements or interaction between two elements.
  • If a first feature is “above” or “below” a second feature, this may cover direct contact between the first and second features, or contact via another feature between them rather than direct contact. Furthermore, if a first feature is “above”, “over” or “on the top of” a second feature, this may mean that the first feature is right above or obliquely above the second feature, or merely that the first feature is at a greater height than the second feature.
  • If a first feature is “below”, “under” or “on the bottom of” a second feature, this may mean that the first feature is right below or obliquely below the second feature, or merely that the first feature is at a lower height than the second feature.
  • FIG. 1 illustrates an instance of a LiDAR 100 .
  • the LiDAR is a 16-line LiDAR, which means it may emit 16 laser beams in total, including L 1 , L 2 , . . . , L 15 and L 16 , in a vertical plane as shown in the figure (each laser beam corresponds to exactly one of the 16 channels of the LiDAR), for detection of the surroundings.
  • the LiDAR 100 may rotate about its vertical axis.
  • each channel of the LiDAR emits a laser beam in turn at a certain time interval (e.g., 1 μs) and carries out detection, so as to complete one line scan of the vertical field of view; the next line scan of the vertical field of view is then performed after stepping by a certain angle (e.g., 0.1 or 0.2 degree) in the horizontal field of view. A point cloud is thus formed from multiple detections during the rotation, enabling detection of the surroundings.
  • FIG. 2 illustrates a method for identification of noise point of a LiDAR in accordance with the first aspect of the present disclosure.
  • the method for identification of noise point comprises:
  • Step S 201 receiving a point cloud generated by the LiDAR.
  • Data of the point cloud generated by the LiDAR may usually include coordinates of every point and reflectivity of the point (the reflectivity is proportional to the strength of the reflected beam and the distance between the target point and the LiDAR).
  • the mounting position of the LiDAR, for example, is taken as the origin, and the offset of a point may be represented either in polar coordinates (i.e., distance and angle) or in x/y/z three-dimensional rectangular coordinates.
  • Step S 202 obtaining a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point.
  • the distance between the point and the LiDAR may be calculated directly from the coordinates of the point.
  • the reflectivity of the point may be obtained directly from the data of the point cloud.
  • the “continuity parameter” of the point may be defined as a characterization parameter that indicates the continuity between the point and one or more of its adjacent points in the point cloud, such as the distance to one or more of the surrounding points in the point cloud.
  • FIG. 3 schematically illustrates the projection of a point cloud generated by a LiDAR on a plane perpendicular to the laser direction (for simplicity, the depth information is not shown in FIG. 3 ).
  • point 5 is the point currently to be determined, and is an echo detected at time t n by the laser beam L 2 in FIG. 1 ;
  • point 4 is an echo detected at the previous time t n−1 by the same channel, namely by the laser beam L 2 ;
  • point 6 is an echo detected at the next time t n+1 by the same channel, namely by the laser beam L 2 .
  • point 1 , point 2 and point 3 are echoes detected at the times t n−1 , t n , and t n+1 by the laser beam L 3 , respectively; and point 7 , point 8 and point 9 are echoes detected at the times t n−1 , t n , and t n+1 by the laser beam L 1 , respectively.
  • Although FIG. 3 shows equal spacing between the points, the drawing is merely exemplary and does not imply equidistant spacing between the echo points formed at successive times (t n−1 , t n , and t n+1 ) by the adjacent beams in the point cloud data.
  • the continuity parameter of point 5 may refer either to the absolute value of the difference between the distance from point 5 to the LiDAR and the distance from any one of the eight surrounding points (above, below, to the left, to the right, to the upper left, to the upper right, to the lower left and to the lower right of point 5 ) to the LiDAR, or to the weighted average of the differences between the distance from point 5 to the LiDAR and the distances from multiple of those eight points to the LiDAR. Both options are within the scope of the present disclosure. Under the inspiration of the conception herein, those skilled in the art may make specific adaptations as needed.
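  • A minimal sketch of this continuity parameter, assuming a simple weighted average of the absolute range differences to whichever neighbouring points are available (names are illustrative, not taken from the patent):

```python
# Illustrative sketch of the continuity parameter described above: the weighted
# average of the absolute differences between the range of the point under test
# and the ranges of its neighbouring points.
from typing import Optional, Sequence

def continuity_parameter(range_m: float,
                         neighbour_ranges_m: Sequence[float],
                         weights: Optional[Sequence[float]] = None) -> float:
    """Weighted average of |range(point) - range(neighbour)| over the available
    neighbours (up, down, left, right and the four diagonals)."""
    if weights is None:
        weights = [1.0] * len(neighbour_ranges_m)
    diffs = [abs(range_m - r) for r in neighbour_ranges_m]
    return sum(w * d for w, d in zip(weights, diffs)) / sum(weights)

# Neighbours on the same surface give a small value (good continuity):
print(continuity_parameter(10.02, [10.00, 10.01, 10.03, 10.05]))  # about 0.02 m
# A stray droplet return well in front of the surface gives a large value:
print(continuity_parameter(6.0, [10.00, 10.01, 10.03, 10.05]))    # about 4.0 m
```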
  • When a normal object is detected by a LiDAR, the generated points usually show relatively good continuity. Good continuity of points here means a low absolute value of the difference between the distances from adjacent points to the LiDAR, for example, within the range-detection error of the LiDAR.
  • One of the main application scenarios of LiDAR is detecting a variety of objects in road traffic, and the speed of a moving object in road traffic is far lower than the velocity of light. In addition, LiDAR has a high angular resolution in the horizontal direction, e.g., 0.1 or 0.2 degree.
  • LiDAR has a very high scanning frequency.
  • the time interval between performing line scanning in the same vertical field of view by two adjacent channels of the LiDAR is about 1 μs, and the cycle for one line scan of the vertical field of view is also only some ten or several tens of μs, so the distance moved by a moving object in road traffic during such a short time period may be ignored.
  • the distances from adjacent points of a LiDAR point cloud to the LiDAR should be equal theoretically, and the difference value between the distances actually measured should be relatively tiny.
  • When the difference (or the weighted average of multiple differences) between the distance from a certain point to the LiDAR and the distance from any one or more of the eight points (above, below, to the left, to the right, to the upper left, to the upper right, to the lower left and to the lower right of that point) to the LiDAR is greater than a threshold (such as 1 meter), the point may be determined to have poor continuity and thus may be a noise point.
  • Alternatively, the correlation between a test point and its surrounding points may be calculated and used as the continuity parameter, in a manner similar to a median filter: an eigenvalue characterizing the degree of discreteness between the current point and its surrounding points can be calculated by convolution; when the degree of discreteness is higher than a certain threshold, the point is deemed discontinuous with its surrounding points.
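  • A minimal sketch of such a convolution-based discreteness measure on a range image (rows corresponding to channels, columns to firing times); the kernel and threshold are illustrative assumptions, not values from the patent:

```python
# Illustrative convolution-style "discreteness" measure: subtract the mean of the
# eight neighbours from each pixel of a range image; a large magnitude means the
# point departs strongly from its surroundings.
import numpy as np

def discreteness_map(range_image: np.ndarray) -> np.ndarray:
    kernel = np.full((3, 3), -1.0 / 8.0)
    kernel[1, 1] = 1.0
    padded = np.pad(range_image, 1, mode="edge")
    out = np.zeros_like(range_image)
    rows, cols = range_image.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return np.abs(out)

ranges = np.full((5, 5), 10.0)
ranges[2, 2] = 6.0  # an isolated droplet return well in front of the surface behind it
print(discreteness_map(ranges)[2, 2] > 1.0)  # True: flagged as discontinuous
```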
  • the threshold for the continuity parameter is also variable.
  • the threshold for the continuity parameter may be adjusted according to the distance between a test point and the LiDAR; as the distance increases, the measuring error of the LiDAR also becomes correspondingly greater, and the threshold for the continuity parameter may be set to a greater value.
  • the difference value between the respective distances from two points to the LiDAR can be calculated approximately as the distance between these two points. This is because of the very small angular spacing between two points in the horizontal direction, e.g., 0.1 or 0.2 degree, in the process of scanning by the LiDAR.
  • the distance between points 4 and 5 approximately equals to the difference value between the distance of point 4 to the LiDAR and the distance of point 5 to the LiDAR.
  • Step S 203 determining whether the point is a noise point at least based on at least one of the reflectivity and the continuity parameter, and the distance.
  • a noise point may be identified and determined by means of distance together with reflectivity.
  • the inventors discover that noise points usually concentrate within a range of 5-10 meters away from the LiDAR, or more prominently within a range of 5-7 meters. Meanwhile, the reflectivity of a noise point is usually less than or equal to 2%.
  • a noise point may be identified according to a combination of distance and reflectivity. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the point is determined to be a noise point.
  • a noise point may be identified and determined by means of distance together with the continuity parameter. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters) and the continuity parameter is beyond the normal range of continuity parameter (where the continuity parameter is defined as the difference between the distance from the point to the LiDAR and the distances from the surrounding points to the LiDAR, and the normal range of continuity is, for example, less than or equal to 1 meter, or less than or equal to 0.5 meter), the point is determined to be a noise point.
  • The continuity parameter is, for example, the difference between the distance from the current point to the LiDAR and the distance from a previous point (a point to the left in FIG. 3 ) detected by the same channel to the LiDAR. When the continuity parameter is beyond the normal range of continuity, it indicates that the point is very likely a noise point rather than a point truly located on a surrounding object.
  • a noise point may be identified and determined by means of a combination of distance, reflectivity and continuity. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), and the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the point is determined to be a noise point.
  • the inventors of the present application discover that the noise of the echo corresponding to a point to be determined is also another factor that may be used for determining a noise point, especially in the case of snow.
  • the laser beams emitted by the emitter of the LiDAR run into an object, and form echoes upon diffuse reflection, which are then received by the laser receiver to generate analog data signals (i.e., echo waveform signals); the analog data signals are sampled by the ADC at a certain frequency, and quantization-coded, thereby obtaining a series of sampling points, as shown in FIG. 4A .
  • Waveform noise may be characterized using RMS, i.e., characterizing the degree of fluctuation of the waveform; the specific calculation may be conducted by obtaining the sum of the squares of the differences between each sampling point and the echo base line and then calculating the square root of the sum.
  • the echo base line herein refers to a mean value of signals generated by the laser receiver without laser beam reflection, which may be construed as a background noise signal value.
  • However, the operation of calculating RMS is relatively complicated, the computing resources of an FPGA are so limited that it is rather difficult to use the FPGA to perform such a calculation, and a waveform may further contain normal echo signals besides noise, which makes the statistics difficult. Therefore, other numerical values that can be calculated more easily may also be used to characterize the noise of the waveform. For example, in accordance with one preferable example described herein, as shown in FIG. 4A , the echo base line is at an ADC value of 85 LSB (Least Significant Bit); within the range of 10 LSB above the echo base line, 200 sampling points (a number based on experience, which could likewise be 100 or 300) are counted from left to right, and the magnitude of the waveform noise may be characterized by the sum or the average of the differences between the ordinate of each sampling point and the echo base line, or by the area of the region enclosed by the curve connecting the 200 sampling points, a straight line through the abscissa of the first sampling point parallel to the Y-axis, a straight line through the abscissa of the 200th sampling point parallel to the Y-axis, and the echo base line.
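  • A hedged sketch of this simplified noise measure; the base line of 85 LSB, the 10 LSB band and the count of 200 sampling points follow the example above, while the function and variable names are illustrative assumptions:

```python
# Simplified waveform-noise measure: sum the deviation above the echo base line
# over the first 200 sampling points whose values stay within 10 LSB of the base
# line (samples above the band are treated as echo pulses and skipped).
from typing import Sequence

def waveform_noise(samples_lsb: Sequence[int],
                   baseline_lsb: int = 85,
                   band_lsb: int = 10,
                   n_points: int = 200) -> float:
    deviations = []
    for s in samples_lsb:
        if len(deviations) == n_points:
            break
        if baseline_lsb <= s <= baseline_lsb + band_lsb:
            deviations.append(s - baseline_lsb)
    return float(sum(deviations))

# Per FIG. 4B, a measure of this kind comes out above about 200 for noise points
# in snow, while normal points stay around 100.
```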
  • the LSB value is the unit to characterize the magnitude of the signal after ADC sampling in the circuit, and is relevant to the performance of the adopted ADC.
  • the 10 LSB above the echo base line selected in this preferable example is related to an echo signal detection threshold predefined for the LiDAR: the system determines that a signal is a valid echo signal when its strength is greater than or equal to the echo signal detection threshold, and determines that the signal is noise when its strength is less than the echo signal detection threshold.
  • the method is used to calculate the noise of the echo of a normal point under snowy weather and the noise of the echo of a noise point caused by the snowy weather.
  • the echo noise of each noise point caused by the snowy weather is above 200 (unitless), while the noise of the waveform of a normal point is about 100 only. They may be easily distinguished from each other.
  • Although lines 41 and 42 in FIG. 4B are presented as continuous lines, both are actually formed by connecting discrete points one by one.
  • the abscissa stands for the number of measured points, and the abscissa 200 in the figure represents 200 times of detection (200 points) conducted for the same one scenario, thereby obtaining 200 results of noise detection correspondingly, for example, obtaining 200 waveforms of the echo signals similar to that in FIG. 4A .
  • For example, at one measured point the ordinate of line 42 indicates that the echo noise of a normal point is 99 and the ordinate of line 41 indicates that the echo noise of the noise points caused by the snowy weather is 302; at another measured point the corresponding values are 102 and 298, respectively.
  • a noise point may be identified and determined by means of a combination of distance, reflectivity, continuity and noise. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the noise of the point is greater than a noise threshold (such as 200), the point is determined to be a noise point.
  • the inventors of the present application discover that the number of echo pulses is also another factor available for determining a noise point.
  • In the ideal case, a laser beam containing one laser pulse is emitted toward an object and reflected back, forming an echo beam that is detected by the LiDAR, and the number of detected echo pulses should be only one.
  • In practice, however, the number of echo pulses detected above the measuring threshold is often more than one, for various reasons. For example, a laser beam gradually diverges as it travels forward, so it may strike two different objects in succession. The inventors find that the number of echo pulses may increase under rainy, snowy or foggy weather.
  • the laser beams emitted by LiDARs are usually pulse-encoded.
  • For example, a detecting laser beam may include two laser pulses, the time interval of which is encoded, i.e., dual-pulse laser detection. During dual-pulse laser detection, the pulse number threshold may therefore be set at 7: when the number of detected echo pulses is more than the pulse number threshold (e.g., 7), it indicates that the point may be a noise point.
  • a noise point may be identified and determined by means of a combination of distance, reflectivity, continuity and the number of echo pulses. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
  • a noise point may be identified and determined by means of a combination of distance, reflectivity, continuity, noise and the number of echo pulses. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the noise of the point is greater than a noise threshold (such as 200), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
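  • A hedged sketch of such a combined rule, using the example thresholds given above (5-10 meters, 2%, 1 meter, 200 and 7); the dataclass and field names are illustrative assumptions:

```python
# Combined noise-point rule: a point is flagged when all five conditions hold.
from dataclasses import dataclass

@dataclass
class PointFeatures:
    distance_m: float        # range from the point to the LiDAR
    reflectivity: float      # e.g. 0.02 means 2%
    continuity_m: float      # range difference to neighbouring points
    echo_noise: float        # waveform noise measure (unitless)
    echo_pulse_count: int    # number of detected echo pulses

def is_noise_point(p: PointFeatures) -> bool:
    return (5.0 <= p.distance_m <= 10.0      # within the predefined distance range
            and p.reflectivity <= 0.02       # reflectivity <= 2%
            and p.continuity_m > 1.0         # beyond the normal continuity range
            and p.echo_noise > 200.0         # noise above the noise threshold
            and p.echo_pulse_count > 7)      # more pulses than the pulse number threshold

print(is_noise_point(PointFeatures(6.2, 0.01, 2.4, 310.0, 9)))   # True
print(is_noise_point(PointFeatures(25.0, 0.35, 0.1, 95.0, 1)))   # False
```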
  • a noise point may be identified and determined by means of a combination of continuity and reflectivity. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the point is determined to be a noise point.
  • a noise point may be identified and determined by means of a combination of continuity, reflectivity and noise. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), and the noise of the point is greater than a noise threshold (such as 200), the point is determined to be a noise point.
  • a noise point may be identified and determined by means of a combination of continuity, reflectivity and the number of echo pulses. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
  • a noise point may be identified and determined by means of a combination of continuity, reflectivity, noise and the number of echo pulses. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the noise of the point is greater than a noise threshold (such as 200), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
  • Noise may be used to determine whether the state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), it is determined that the state of weather is snowy, or that the point is a noise point caused by the snowy state of weather. Since the echo noise becomes larger under snowy weather and, as shown in FIG. 4B , the echo noise of almost every point under snowy weather is above 200, this factor, i.e. echo noise, may to a large extent be used to determine a noise point under snowy weather.
  • a combination of noise and distance may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), and the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • a combination of noise and continuity may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), and the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • a combination of noise, distance and continuity may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • a combination of noise, distance and the number of echo pulses may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the number of echo pulses is more than a pulse number threshold (such as 7), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • a combination of noise, distance, continuity and the number of echo pulses may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the number of echo pulses is more than a pulse number threshold (such as 7), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • a combination of the number of echo pulses and distance can be used to determine whether a state of weather is rainy or foggy. For example, when the number of echo pulses is more than a pulse number threshold (such as 7) and the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), it is determined that the state of weather is rainy or foggy, or the point is a noise point caused by the rainy or foggy state of weather.
  • parameters including distance, reflectivity, continuity, noise and pulse number, or various combinations thereof can be used to determine a noise point and a specific state of weather.
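  • For illustration, a hedged sketch of one such combination for inferring the state of weather from per-point cues; the threshold values follow the examples above, while the function name and return labels are assumptions:

```python
# Weather inference from per-point cues (noise, distance, continuity, pulse count).
def infer_weather(echo_noise: float, distance_m: float,
                  continuity_m: float, echo_pulse_count: int) -> str:
    in_band = 5.0 <= distance_m <= 10.0              # predefined distance range
    if (echo_noise > 200.0 and in_band
            and continuity_m > 1.0 and echo_pulse_count > 7):
        return "snowy"                               # high echo noise dominates in snow
    if echo_pulse_count > 7 and in_band:
        return "rainy_or_foggy"                      # extra echo pulses dominate in rain/fog
    return "clear_or_unknown"

print(infer_weather(310.0, 6.5, 2.0, 9))   # snowy
print(infer_weather(120.0, 6.5, 0.3, 9))   # rainy_or_foggy
print(infer_weather(100.0, 20.0, 0.1, 1))  # clear_or_unknown
```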
  • by means of calculating a noise point confidence level of a point, the point may likewise be determined to be a noise point when the noise point confidence level is beyond the normal range of confidence level.
  • for each point, a corresponding noise point confidence level may be calculated, and a calculation method is provided below only for the purpose of illustration.
  • when a characteristic parameter of the point satisfies its determination condition for a noise point, the contribution coefficient or factor it brings to the confidence level is 0; otherwise, the contribution coefficient or factor it brings to the confidence level is the weight of that characteristic parameter.
  • when the noise point confidence level is beyond the normal range of confidence level, a noise point is determined. For example, if a certain point m has a distance of 12 m to the LiDAR, then the determination condition of distance is not satisfied, and the distance factor will be 10; and if the reflectivity is 3%, thereby not satisfying the determination condition of the reflectivity, then the reflectivity factor is set as 5.
  • the normal range of confidence level may be set according to experience and/or specific weather or environment conditions. In accordance with one example, the normal range of confidence level is greater than or equal to 10. When a confidence level is greater than or equal to 10, it indicates the point is a normal point; and when a confidence level is beyond the range, namely, less than 10, it indicates the point is a noise point.
  • the weight of distance (a first weight) is greater than that of reflectivity (a second weight), as shown in Table 1.
  • the weight for each of the characteristic parameters above and the normal range of confidence level set at 10 are just examples of the present disclosure, and the weight for each characteristic parameter and the normal range of confidence level may be changed as needed.
  • the endpoints of the normal range of confidence level may be referred to as confidence level threshold, for example, 10.
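  • A minimal sketch of this two-factor confidence calculation, with the weights implied by the example above (distance weight 10, reflectivity weight 5) and a confidence threshold of 10; the names are illustrative:

```python
# Two-factor noise point confidence level (distance and reflectivity).
def confidence_level(distance_m: float, reflectivity: float,
                     dist_range=(5.0, 10.0), refl_threshold=0.02,
                     w_distance: float = 10.0, w_reflectivity: float = 5.0) -> float:
    # A factor is 0 when its noise condition is met, otherwise it is the weight.
    distance_factor = 0.0 if dist_range[0] <= distance_m <= dist_range[1] else w_distance
    reflectivity_factor = 0.0 if reflectivity <= refl_threshold else w_reflectivity
    return distance_factor + reflectivity_factor

# The point m of the example: distance 12 m (factor 10) and reflectivity 3% (factor 5)
# give a confidence level of 15 >= 10, so the point is treated as a normal point.
c = confidence_level(12.0, 0.03)
print(c, "noise point" if c < 10.0 else "normal point")
```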
  • Table 2 shows a calculation method to identify and determine a noise point according to a combination of distance, reflectivity and continuity, by means of calculation of noise point confidence level.
  • a distance factor is set to zero when the distance is within the range of the predefined distance, otherwise the distance factor is set as a first weight (i.e. the weight of distance); a reflectivity factor is set to zero when the reflectivity is less than or equal to the predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight (i.e. the weight of reflectivity); and a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight (i.e. the weight of continuity).
  • the noise point confidence level equals to the sum of the distance factor, the reflectivity factor and the continuity factor.
  • Table 3 shows a calculation method to identify and determine a noise point according to a combination of distance, reflectivity, continuity, noise and the number of echo pulses, by means of calculation of a noise point confidence level.
  • a distance factor is set to zero when the distance is within the range of the predefined distance, otherwise the distance factor is set as a first weight (i.e. the weight of distance); a reflectivity factor is set to zero when the reflectivity is less than or equal to the predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight (i.e. the weight of reflectivity); a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight (i.e. the weight of continuity);
  • a noise factor is set to zero when the noise is greater than the threshold for noise, otherwise the noise factor is set as a fourth weight (i.e., the weight of noise); and an echo pulse number factor is set to zero when the number of echo pulses is greater than the threshold for pulse number, otherwise the echo pulse number factor is set as a fifth weight (i.e. the weight of the number of echo pulses).
  • the noise point confidence level equals to the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the echo pulse number factor.
  • the first, second, third, fourth and fifth weights satisfy one or more of the following conditions:
  • the first weight equals to the sum of the second and third weights
  • the second weight equals to the third weight
  • the first weight equals to the threshold for confidence level
  • the sum of the fourth and fifth weights equals to the second and/or third weight
  • the first weight is greater than the second, third, fourth or fifth weight.
  • the first, second, third, fourth and fifth weights may be dynamically adjusted according to the detected state of weather, including adjusting the absolute value of each weight and the relative proportional relation therebetween.
  • For example, a noise point caused by the snowy weather has greater echo noise; as shown in FIG. 4B , almost every noise point caused by the snowy weather in the point cloud has an echo noise greater than 200. Correspondingly, the fourth weight (the weight of echo noise) may be increased under snowy weather.
  • Meanwhile, the reflectivity of a noise point caused by the snowy weather has a greater value, so it may be difficult to determine a noise point caused by the snowy weather according to the reflectivity. Therefore, correspondingly, the second weight (the weight of reflectivity) should be reduced under snowy weather.
  • When foggy weather is detected, the fifth weight may be increased.
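  • A hedged sketch combining the five-factor confidence level of Table 3 with the weather-based weight adjustment just described. The baseline weights (10, 5, 5, 2.5, 2.5) are chosen only so that they satisfy the stated conditions (the first weight equals the sum of the second and third, the second equals the third, the fourth plus the fifth equals the second, and the first equals a confidence threshold of 10); the weights, adjustment steps and names are illustrative assumptions:

```python
# Five-factor confidence level with dynamic weight adjustment by weather.
def noise_confidence(distance_m, reflectivity, continuity_m, echo_noise,
                     echo_pulse_count, weather=None):
    w = {"distance": 10.0, "reflectivity": 5.0, "continuity": 5.0,
         "noise": 2.5, "pulses": 2.5}
    if weather == "snowy":
        w["reflectivity"] -= 2.0   # reflectivity is less informative in snow
        w["noise"] += 2.0          # echo noise is more informative in snow
    elif weather == "foggy":
        w["pulses"] += 2.0         # extra echo pulses are more informative in fog
    factors = [
        0.0 if 5.0 <= distance_m <= 10.0 else w["distance"],   # distance factor
        0.0 if reflectivity <= 0.02 else w["reflectivity"],    # reflectivity factor
        0.0 if continuity_m > 1.0 else w["continuity"],        # continuity factor
        0.0 if echo_noise > 200.0 else w["noise"],              # noise factor
        0.0 if echo_pulse_count > 7 else w["pulses"],           # echo pulse number factor
    ]
    return sum(factors)  # below the threshold (e.g. 10) means a noise point

print(noise_confidence(6.0, 0.01, 2.0, 310.0, 9) < 10.0)           # True: noise point
print(noise_confidence(25.0, 0.30, 0.2, 90.0, 1, "snowy") < 10.0)  # False: normal point
```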
  • the second aspect of the present disclosure involves a LiDAR system 300 , which is described in detail below with reference to FIG. 5 .
  • the LiDAR system 300 comprises: a LiDAR 301 , a denoising unit 302 , and an output unit 303 .
  • the LiDAR 301 may be an existing LiDAR, such as the LiDAR 100 shown in FIG. 1 , and is configured to scan its surroundings to generate point cloud data; its specific structure will not be described in detail here.
  • the denoising unit 302 is coupled to the LiDAR 301 to receive the point cloud, and configured to perform the method 200 for identifying a noise point as described in the first aspect of the present disclosure to determine whether a point in the point cloud is a noise point, and configured to filter out noise points in the point cloud.
  • the output unit 303 is coupled to the denoising unit 302 , and outputs the point cloud. As shown, the information flow from the denoising unit 302 to the output unit 303 is unidirectional, but the present disclosure is not limited to this because a bidirectional information flow between them is also possible.
  • the LiDAR 301, the denoising unit 302 and the output unit 303 are shown as three separate units, but this is merely for exemplary purposes; both the denoising unit 302 and the output unit 303 may be integrated into the LiDAR 301.
  • for example, a denoising unit 302 may be added to an existing LiDAR 301 to receive the point cloud data and to determine and filter out noise points according to the method 200 for identification of a noise point; the point cloud with noise points filtered out is then output and presented to a user through the output unit 303, i.e., an output interface (see the sketch below).
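  • A minimal sketch of such a denoising step, assuming each point carries the attributes used by the identification method, is given below; the function names are illustrative and the predicate stands in for the determination made by the method 200.

```python
# Minimal sketch of the denoising step, assuming each point object carries the
# attributes used by the identification method. `is_noise_point` stands in for
# the determination made by the method 200.

def denoise(point_cloud, is_noise_point):
    """Return the point cloud with the identified noise points filtered out."""
    return [point for point in point_cloud if not is_noise_point(point)]

# Example: filtered = denoise(raw_cloud, lambda p: p.noise_confidence < threshold)
```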
  • the LiDAR 301 , the denoising unit 302 and the output unit 303 also may be separate units.
  • the LiDAR 301 is only responsible for laser detection of the surroundings, and output of original point cloud data.
  • the denoising unit 302 and the output unit 303 may be a computer, workstation, or application specific integrated circuit (ASIC) responsible for data processing, which, after receiving the point cloud data, conducts operations such as noise point identification, filtering and output. These should all fall within the protection scope hereof.
  • FIG. 6 illustrates one preferable example described herein.
  • the LiDAR system 300 further comprises a control unit 304 .
  • the control unit 304 is coupled to the denoising unit 302 , and may enable or disable the denoising unit 302 .
  • in an enabled mode, the denoising unit 302 filters out noise points in the point cloud, and the output unit 303 outputs the point cloud with noise points filtered out; in a disabled mode, the denoising unit 302 is disabled, and the output unit 303 outputs the point cloud without noise points filtered out.
  • the control unit 304 is coupled to the LiDAR 301 and the output unit 303 .
  • the denoising unit 302 is enabled by default.
  • the denoising unit 302 is disabled by default, and is enabled under certain circumstances. For example, when rainy, snowy or foggy weather is detected, the control unit 304 enables the denoising unit 302. This is because, under rainy, snowy or foggy weather, a great amount of water in liquid or solid state exists in the air, which may easily produce abnormal radar echoes and thus cause a great number of noise points in the point cloud.
  • there are various ways in which the control unit 304 may determine that rainy, snowy or foggy weather is detected.
  • a counter may be provided in the control unit 304 or the denoising unit 302 to count the number of noise points in the currently detected point cloud; when the count goes beyond a predefined threshold, it may be determined that the current state of weather is rainy, snowy or foggy, thereby triggering or enabling the denoising unit 302 (see the sketch below).
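  • The following Python sketch illustrates such a counter-based trigger; the class name, attribute names and the example threshold are assumptions.

```python
# Illustrative sketch: a control unit that enables the denoising unit when the
# number of identified noise points in the current point cloud exceeds a threshold.
# The class name, attribute names and threshold value are assumptions.

class ControlUnit:
    def __init__(self, noise_count_threshold=500):
        self.noise_count_threshold = noise_count_threshold
        self.denoising_enabled = False

    def update(self, point_cloud, is_noise_point):
        # A large number of noise points is taken as an indication of rain, snow or fog.
        noise_count = sum(1 for point in point_cloud if is_noise_point(point))
        self.denoising_enabled = noise_count > self.noise_count_threshold
        return self.denoising_enabled
```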
  • the LiDAR system 300 may further comprise a precipitation sensor, such as an optical sensor or a capacitive sensor.
  • An output signal of the precipitation sensor may be used to trigger the enabling of the denoising unit 302.
  • An optical sensor, for example, comprises a light-emitting diode and a transparent window. When no precipitation occurs, almost all of the light emitted by the light-emitting diode is reflected onto a photosensitive element; when precipitation lands on the window, part of the light is refracted or scattered away, and the resulting drop in the reflected signal may be used to indicate precipitation.
  • some of the parameters mentioned previously may also be used to determine rainy, snowy or foggy weather, so as to trigger enabling of the denoising unit 302.
  • the LiDAR system 300 further comprises an input unit 305 for receiving input from a user.
  • the control unit 304 may enable or disable the denoising unit 302 according to the input from the user. This has the advantage that the user may decide whether to enable the denoising unit 302 and whether to perform a noise filtering operation.
  • FIG. 7 illustrates a LiDAR system 400 in accordance with another example described herein.
  • the LiDAR system 400 comprises: a LiDAR 401 , a confidence level calculating unit 402 , and an output unit 403 .
  • the LiDAR 401 is configured to scan surroundings to generate point cloud, while its specific structure will not be described in detail here.
  • the confidence level calculating unit 402 is coupled to the LiDAR 401 to receive the point cloud, and is configured to calculate a noise point confidence level of a point of the point cloud at least according to a distance between the point and the LiDAR and at least one of reflectivity and continuity parameter of the point.
  • the output unit 403 is coupled to the LiDAR 401 and the confidence level calculating unit 402 , and outputs the point cloud and a noise point confidence level of the point in the point cloud.
  • the calculation of the noise point confidence level is similar to that described in the first aspect of the present disclosure, that is, one of the five characteristic parameters (distance, reflectivity, continuity, noise and the number of echo pulses), or any combination of them, is used in the calculation.
  • only one example is provided below, and the remaining combinations will not be described in detail.
  • a noise point confidence level of the point is calculated according to a distance from a point in the point cloud to the LiDAR, reflectivity of the point, continuity parameter of the point, noise of the point and the number of echo pulses of the point.
  • the specific calculation comprises, for example: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight; a noise factor is set to zero when the noise is greater than a noise threshold, otherwise the noise factor is set as a fourth weight; a pulse number factor is set to zero when the pulse number is greater than a threshold for pulse number, otherwise the pulse number factor is set as a fifth weight; and the noise point confidence level equals the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the pulse number factor.
  • the first weight is greater than the second, third, fourth, and fifth weights.
  • the confidence level calculating unit 402 and the output unit 403 may be integrated into the LiDAR 401 , or the three may be implemented as separate units. These should both fall within the protection scope hereof.
  • the noise point confidence level of a point in the point cloud may also be provided to a user.
  • for example, one or more additional bits may be added to the point cloud data of the LiDAR, into which the confidence level information is written; the confidence level information is provided to a user for reference, and the user may decide, according to that information, whether to optimize the radar point cloud graph or filter out noise points (see the sketch below).
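  • A minimal sketch of writing a quantized confidence level into extra bits of a packed point record is shown below; the 4-bit width and the packing layout are assumptions, not a data format defined by the present disclosure.

```python
# Illustrative sketch: appending a quantized noise point confidence level to each
# packed point record. The 4-bit width and the packing layout are assumptions.

def attach_confidence(point_record: int, confidence: int, max_level: int = 255) -> int:
    """Append a 4-bit quantized confidence level to a packed point record."""
    quantized = min(15, confidence * 16 // (max_level + 1))   # map 0..max_level to 0..15
    return (point_record << 4) | quantized

def read_confidence(packed_record: int) -> int:
    """Recover the 4-bit confidence field from a packed point record."""
    return packed_record & 0xF
```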
  • the LiDAR system 400 further comprises an input unit 404 for receiving input from a user, wherein the input unit 404 may instruct, according to the input of the user, the output unit about whether to filter out noise points in the point cloud. For example, when a user finds that there are so many noise points in the output point cloud (for example, in the case of rain, snow or fog) that it is necessary to filter them out, the user may use the input unit 404 to instruct the output unit 403 to filter out noise points and then output the point cloud.
  • After receiving the instruction to filter noise points, the output unit 403 determines, according to the noise point confidence level of a point in the point cloud, that the point is a noise point and filters it out when the noise point confidence level is beyond the normal range of confidence level. Therefore, the output point cloud excludes noise points, or includes as few of them as possible.
  • the third aspect of the present disclosure involves a noise point identification device 500 for use in a LiDAR.
  • the device 500 comprises: a receiving unit 501 configured to receive a point cloud generated by the LiDAR; an obtaining unit 502 configured to obtain a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point; and a determining unit 503 configured to determine whether the point is a noise point at least according to the distance and the at least one of the reflectivity and the continuity parameter.
  • the third aspect of the present disclosure also involves a block diagram of a computer program product 600 , as shown in FIG. 10 .
  • a signal carrying medium 602 may be implemented as, or encompass, a computer readable medium 606, a computer recordable medium 608, a computer communication medium 610, or a combination thereof, which stores programming instructions 604 that, when executed by a processor, may carry out all or some of the processes described previously.
  • These instructions may include, for example, one or more executable instructions for allowing one or more processors to execute the following processing: S201, receiving a point cloud generated by the LiDAR; S202, obtaining at least one of reflectivity and continuity parameter of a point in the point cloud, and a distance between the point and the LiDAR; and S203, determining whether the point is a noise point at least according to at least one of the reflectivity and the continuity parameter, and the distance.
  • any process or method in a flow chart or otherwise set out herein may be understood to represent one or more modules, segments or portions of code of executable instructions for implementing a specific logical function or step of the process, and the scope of the preferable embodiments hereof covers other implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in reverse order, depending on the functions involved. This should be understood by those skilled in the technical field of the examples of the present disclosure.
  • a more particular example, rather than an exhaustive list, of the computer readable medium includes the following: an electrical connection having one or more wires (an electronic device), a portable computer diskette (a magnetic device), a random access memory (RAM), a read only memory (ROM), an electrically programmable read-only memory (EPROM or flash memory), an optical fiber device, and a CD read-only memory (CDROM).
  • the computer readable medium may even be paper or another suitable medium on which the programs are printed, because the programs may be acquired electronically, for example by optically scanning the paper or other medium, and then edited, decoded or otherwise processed in a suitable way if necessary, before being stored in a computer memory. It should be understood that every part of the present disclosure may be achieved by means of hardware, software, firmware or a combination thereof.
  • multiple steps or methods may be achieved using software or firmware stored in a memory and executed by a suitable instruction execution system.
  • as in another embodiment, any of the following technologies commonly known in the art, or a combination thereof, may be used to do so: a discrete logic circuit with logic gates for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
  • the solutions of the examples herein may be applied not only to the mechanical LiDAR mentioned in the examples above, but also to other types of LiDARs, such as galvanometer scanning LiDAR, rotating mirror scanning LiDAR, or pure solid-state LiDAR including Flash LiDAR and phased array LiDAR.
  • the present invention imposes no limitation on the type of applicable LiDARs.
  • the contents described above are just preferable examples of the present disclosure, and are not used to limit the present disclosure.
  • Although the detailed description of the present disclosure has been provided with reference to the foregoing examples, those skilled in the art may still modify the technical solutions recorded in the various examples described above, or make equivalent replacements of some of the technical features therein. Any modification, equivalent replacement or improvement made within the spirit and principles set out herein should be covered by the protection scope of the present disclosure.

Abstract

The present disclosure relates to a method for identification of a noise point used for a LiDAR, comprising: receiving a point cloud generated by the LiDAR; obtaining at least one of reflectivity and continuity parameter of a point in the point cloud, and a distance between the point and the LiDAR; and determining whether the point is a noise point at least based on at least one of the reflectivity and the continuity parameter, and the distance.

Description

    CROSS-REFERENCE
  • This application is a Continuation Application of International Patent Application PCT/CN2019/085765, filed May 7, 2019, which claims the benefit of Chinese Application No. CN201910322910.4, filed on Apr. 22, 2019, each of which is entirely incorporated herein by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to the field of LiDAR, and, more specifically, to a method for identification of a noise point used for a LiDAR, and a LiDAR system.
  • BACKGROUND
  • LiDAR (light detection and ranging) is a general term for active laser detection sensor devices. The working principle of LiDAR is as follows: a LiDAR emitter emits a laser beam; the laser beam strikes an object and returns to a laser receiver upon diffuse reflection; and a radar module multiplies the time interval between emission and reception of the signal by the velocity of light and divides by 2 to calculate the distance between the emitter and the object. According to the number of laser beams, LiDARs usually include, for example, single-line LiDAR, 4-line LiDAR, 8-/16-/32-/64-line LiDAR, etc. One or more laser beams are emitted at various angles in the vertical direction, and the three-dimensional profile of a target region is detected through scanning in the horizontal direction. Since a plurality of measuring channels (lines) are equivalent to a plurality of scanning planes of different inclination angles, the more laser beams in the vertical field of view, the higher the angular resolution in the vertical direction and the greater the density of the laser point cloud.
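  • A minimal numerical illustration of this ranging principle is given below; the function name and the example timing are illustrative only.

```python
# Minimal illustration of the ranging principle described above: the measured
# round-trip time is multiplied by the speed of light and divided by two.
# The function name and the example timing are illustrative only.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_time_of_flight(round_trip_time_s: float) -> float:
    return round_trip_time_s * SPEED_OF_LIGHT / 2.0

# Example: a 200 ns round trip corresponds to roughly 30 m:
# range_from_time_of_flight(200e-9) -> about 29.98
```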
  • Taking a mechanical LiDAR as an example, it will generate more noise points, and thus the quality of the point cloud is degraded, when used for detection under rainy, snowy or foggy weather. For example, since water drops of various sizes are present in the air at high density under rainy, snowy or foggy weather, a laser beam irradiated onto a water drop will produce reflection echoes and form corresponding points in the point cloud. As a point of this type does not actually exist on the detected target object, it should be regarded as a noise point.
  • In addition to the mechanical LiDAR as mentioned above, other types of LiDARs, such as galvanometer scanning LiDAR, rotating mirror scanning LiDAR, or pure solid-state LiDAR including Flash LiDAR and phased array LiDAR, will also encounter the same noise point problem when used for detection in rain, snow and fog.
  • The contents of the Background merely disclose technologies known to the inventors, and do not necessarily represent the prior art in the field.
  • SUMMARY
  • In view of at least one of the defects existing in the prior art, the present disclosure provides a method for identification of a noise point used for a LiDAR, comprising:
  • step S201: receiving a point cloud generated by the LiDAR;
  • step S202: obtaining a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point; and
  • step S203: determining whether the point is a noise point at least based on the distance and the at least one of the reflectivity and the continuity parameter.
  • According to one aspect of the present disclosure, the step S202 comprises: obtaining the reflectivity of the point in the point cloud and the distance between the point and the LiDAR, wherein the step S203 comprises determining whether the point is a noise point based on the reflectivity and the distance.
  • According to one aspect of the present disclosure, the step S202 comprises: obtaining the continuity parameter of the point in the point cloud and the distance between the point and the LiDAR, wherein the step S203 comprises: determining whether the point is a noise point based on the continuity parameter and the distance.
  • According to one aspect of the present disclosure, the step S203 comprises: determining that the point is a noise point when the distance is within a predefined range of distance and the reflectivity is less than or equal to a predefined threshold for reflectivity.
  • According to one aspect of the present disclosure, the step S203 comprises: calculating a noise point confidence level of the point, and determining that the point is a noise point when the noise point confidence level is beyond a normal range of confidence level, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; and the noise point confidence level equals the sum of the distance factor and the reflectivity factor, wherein the first weight is greater than the second weight.
  • According to one aspect of the present disclosure, the step S202 comprises: obtaining the reflectivity and the continuity parameter of the point and the distance between the point and the LiDAR; wherein the step S203 comprises: determining whether the point is a noise point based on the distance, the reflectivity and the continuity parameter.
  • According to one aspect of the present disclosure, the method further comprises: obtaining noise of the point and the number of echo pulses of the point; wherein the step S203 comprises: determining whether the point is a noise point based on the distance, the reflectivity, the continuity parameter, the noise and the number of echo pulses.
  • According to one aspect of the present disclosure, the step of determining whether the point is a noise point comprises: calculating a noise point confidence level of the point, and determining that the point is a noise point when the noise point confidence level is beyond a normal range of confidence level, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within a predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight; a noise factor is set to zero when the noise is greater than a threshold for noise, otherwise the noise factor is set as a fourth weight; an echo pulse number factor is set to zero when the number of echo pulses is greater than a threshold for pulse number, otherwise the echo pulse number factor is set as a fifth weight, and the noise point confidence level equals the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the echo pulse number factor.
  • According to one aspect of the present disclosure, the first, second, third, fourth and fifth weights satisfy one or more of the following conditions: the first weight equals the sum of the second and third weights; the second weight equals the third weight; the first weight equals the threshold for confidence level; the sum of the fourth and fifth weights equals the second and/or third weights; and the first weight is greater than the second, third, fourth and fifth weights.
  • According to one aspect of the present disclosure, the method further comprises: dynamically adjusting the first, second, third, fourth and fifth weights based on the detected state of weather.
  • According to one aspect of the present disclosure, the step of dynamically adjusting comprises: reducing the second weight and increasing the fourth weight when snowy weather is detected; and/or increasing the fifth weight when foggy weather is detected.
  • According to one aspect of the present disclosure, the step S203 comprises: determining that the point is a noise point when the distance is within a predefined range of distance, and the continuity parameter is beyond a normal range of continuity parameter.
  • According to one aspect of the present disclosure, the step of determining whether the point is a noise point comprises: calculating a noise point confidence level of the point, and determining that the point is a noise point when the noise point confidence level is beyond a normal range of confidence level, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within a predefined range of distance, otherwise the distance factor is set as a first weight; a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight; and the noise point confidence level equals the sum of the distance factor and the continuity factor, wherein the first weight is greater than the third weight.
  • The present disclosure further discloses a LiDAR system, comprising:
  • a LiDAR configured to scan its surroundings to generate a point cloud;
  • a denoising unit coupled to the LiDAR to receive the point cloud, and configured to perform the method for identification of a noise point described above to determine whether a point in the point cloud is a noise point, and filter out noise points in the point cloud; and
  • an output unit coupled to the denoising unit, and configured to output the point cloud.
  • According to one aspect of the present disclosure, the LiDAR system further comprises a control unit coupled to the LiDAR, the denoising unit and the output unit, and capable of enabling or disabling the denoising unit, wherein in an enabled mode, the denoising unit filters out noise points in the point cloud, and the output unit outputs the point cloud with the noise points filtered out; and in a disabled mode, the denoising unit is disabled, and the output unit outputs the point cloud without the noise points filtered out.
  • According to one aspect of the present disclosure, the control unit enables the denoising unit when rainy, snowy or foggy weather is detected.
  • According to one aspect of the present disclosure, the control unit determines that rainy, snowy or foggy weather is detected when the number of noise points goes beyond a predefined threshold.
  • According to one aspect of the present disclosure, the LiDAR system further comprises an input unit for receiving input from a user, wherein the control unit is capable of enabling or disabling the denoising unit based on the input from the user.
  • The present disclosure further discloses a LiDAR system, comprising:
  • a LiDAR configured to scan its surroundings to generate a point cloud;
  • a confidence level calculating unit coupled to the LiDAR to receive the point cloud, and configured to calculate a noise point confidence level of a point in the point cloud at least based on a distance between the point and the LiDAR, and at least one of reflectivity and continuity parameter of the point; and
  • an output unit coupled to the LiDAR and the confidence level calculating unit, and configured to output the point cloud and noise point confidence levels of points in the point cloud.
  • According to one aspect of the present disclosure, the confidence level calculating unit is configured to calculate a noise point confidence level of a point in the point cloud, based on a distance between the point and the LiDAR, reflectivity of the point, continuity parameter of the point, noise of the point and the number of echo pulses of the point, wherein calculating the noise point confidence level of the point comprises: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; a continuity factor is set to zero when the continuity parameter is beyond a normal range of continuity parameter, otherwise the continuity factor is set as a third weight; a noise factor is set to zero when the noise is greater than a threshold for noise, otherwise the noise factor is set as a fourth weight; a pulse number factor is set to zero when the number of pulses is greater than a threshold for pulse number, otherwise the pulse number factor is set as a fifth weight; and the noise point confidence level equals the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the pulse number factor, wherein the first weight is greater than the second, third, fourth and fifth weights.
  • According to one aspect of the present disclosure, the LiDAR system further comprises an input unit for receiving input from a user, wherein the input unit is capable of instructing, based on the input from the user, the output unit about whether to filter out noise points in the point cloud.
  • The present disclosure further discloses a device for identification of a noise point used for a LiDAR, comprising:
  • a receiving unit configured to receive a point cloud generated by the LiDAR;
  • an obtaining unit configured to obtain a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point; and
  • a determining unit configured to determine whether the point is a noise point at least based on the distance and the at least one of the reflectivity and the continuity parameter.
  • The present disclosure further discloses a computer readable storage medium having stored thereon computer executable instructions that, when executed by a processor, perform the method for identification of a noise point described above.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, as part of the present disclosure, are provided for the purpose of further understanding of the present disclosure, and the schematic embodiments and description serve to illustrate the present disclosure, both of which should not impose any improper limitation on the present disclosure. In the drawings:
  • FIG. 1 is a schematic illustrating a LiDAR;
  • FIG. 2 illustrates a method for identification of noise point used for a LiDAR in accordance with the first aspect of the present disclosure;
  • FIG. 3 schematically illustrates the projection of multiple points in a point cloud of a LiDAR onto a plane perpendicular to the laser direction;
  • FIG. 4A illustrates a method for calculating the noise of echoes in accordance with one example of the present disclosure;
  • FIG. 4B illustrates a comparison between the waveform noise in the case of snow and the normal waveform noise in accordance with one example of the present disclosure;
  • FIG. 5 schematically illustrates a LiDAR system in accordance with the second aspect of the present disclosure;
  • FIG. 6 illustrates a LiDAR system in accordance with one preferable example of the present disclosure;
  • FIG. 7 illustrates a LiDAR system in accordance with another example of the present disclosure;
  • FIG. 8 illustrates a LiDAR system in accordance with another example of the present disclosure;
  • FIG. 9 illustrates a noise point identification device for use on a LiDAR in accordance with the third aspect of the present disclosure; and
  • FIG. 10 illustrates a computer program product in accordance with the third aspect of the present disclosure.
  • DETAILED DESCRIPTION
  • The following exemplary embodiments will be described only in a brief manner. Just as those skilled in the art will recognize, changes in various ways to the examples described herein can be carried out without departing from the spirit or scope of the present disclosure. Therefore, the drawings and description are deemed substantively exemplary, instead of limitative.
  • In the description of the present disclosure, it need be understood that the orientation or position relations denoted by such terms as “central” “longitudinal” “latitudinal” “length” “width” “thickness” “above” “below” “front” “rear” “left” “right” “vertical” “horizontal” “top” “bottom” “inside” “outside” “clockwise” “counterclockwise” and the like are based on the orientation or position as shown in the accompanying drawings, and only used for the purpose of facilitating description for the present disclosure and simplification of the description, instead of indicating or suggesting that the denoted devices or elements must be specifically oriented, or configured or operated in some specific orientation. Thus, such terms should not be construed to limit the present disclosure. In addition, such terms as “first” and “second” are only used for the purpose of description, rather than indicating or suggesting relative importance or implicitly indicating the number of the designated technical features. Accordingly, features defined with “first” or “second” may, expressly or implicitly, include one or more of such features. In the description of the present disclosure, “more” means two or above, unless otherwise defined explicitly and specifically.
  • In the description of the present disclosure, it should be noted that, unless otherwise specified and defined explicitly, such terms as “mount”, “link” and “connect” should be understood as generic terms. For example, connection may refer to fixed connection, dismountable connection, or integrated connection; also to mechanical connection, electric connection or intercommunication; further to direct connection, or connection by an intermediary medium; or even to internal communication between two elements or interaction between two elements. Those skilled in the art can construe the specific meaning of such terms herein in light of specific circumstances.
  • Herein, unless otherwise specified and defined explicitly, if a first feature is “above” or “below” a second one, it may cover direct contact between the first and second features, and may also cover contact via another feature therebetween rather than direct contact. Furthermore, if a first feature is “above”, “over” or “on the top of” a second one, it may mean that the first feature is right above or obliquely above the second feature, or may just indicate that the first feature is at a greater height than the second feature. If a first feature is “below”, “under” or “on the bottom of” a second feature, it may mean that the first feature is right below or obliquely below the second feature, or may just indicate that the first feature is at a lower height than the second feature.
  • The disclosure below provides many different embodiments and examples for achieving different structures described herein. In order to simplify the disclosure herein, the following will give the description of the parts and arrangements embodied in specific examples. Surely, they are just for the exemplary purpose, not intended to limit the present disclosure. Besides, the present disclosure may repeat a reference number and/or reference letter in different examples, and such repeat is for the purpose of simplification and clarity, and itself denotes none of the relations among various embodiments and/or arrangements as discussed. In addition, the present disclosure provides examples for a variety of specific techniques and materials, but the common skilled persons in the art are aware of an application of other techniques and/or a use of other materials.
  • The following description, along with the accompanying drawings, sets forth the preferable examples herein. It should be understood that the preferable examples described herein are only for the purpose of illustrating and explaining, instead of limiting, the present disclosure.
  • With regard to echo data of radar point clouds under rainy, snowy or foggy weather, the inventors of the present disclosure discover that a noise point in the LiDAR point cloud can be effectively identified by means of a combination of the distance, reflectivity, point continuity parameter and other features of a point in the point cloud. Those skilled in the art may understand that, although a LiDAR point cloud under rainy, snowy or foggy weather is taken as an example for illustration herein, the protection scope of the present disclosure should not be limited to identification of noise points under rainy, snowy or foggy weather, but covers applications for identifying and determining noise points under other states of weather.
  • FIG. 1 illustrates an instance of a LiDAR 100. The LiDAR is a 16-line LiDAR, which means it may emit 16 lines of laser beams in total, including L1, L2, . . . , L15 and L16, in a vertical plane as shown in the figure (each line of laser beams corresponds to one of the 16 channels of the LiDAR) for detection of the surroundings. During the detection, the LiDAR 100 may rotate about its vertical axis. During the rotation, each channel of the LiDAR successively emits a laser beam in turn at a certain time interval (e.g., 1 μs) and carries out detection so as to complete one line scanning in the vertical field of view; the next line scanning in the vertical field of view is then performed after a certain angular spacing (e.g., 0.1 or 0.2 degree) in the horizontal field of view, so that a point cloud is formed from multiple detections during the rotation, allowing the surroundings to be detected.
  • FIG. 2 illustrates a method for identification of noise point of a LiDAR in accordance with the first aspect of the present disclosure. As shown in FIG. 2, the method for identification of noise point comprises:
  • At Step S201, receiving a point cloud generated by the LiDAR.
  • Data of the point cloud generated by the LiDAR may usually include coordinates of every point and reflectivity of the point (the reflectivity is proportional to the strength of the reflected beam and the distance between the target point and the LiDAR). In the coordinates of a point, the mounting position of the LiDAR, for example, is taken as the origin, and the offset of the point may be represented by polar coordinates (i.e., distance and angle), or using x/y/z three-dimensional rectangular coordinates.
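  • A minimal sketch of one point of such a point cloud, assuming the LiDAR mounting position as the origin and illustrative field names, is given below.

```python
import math
from dataclasses import dataclass

# Minimal sketch of one point of the point cloud, assuming the LiDAR mounting
# position is taken as the origin. Field names are illustrative.

@dataclass
class CloudPoint:
    distance: float      # range from the LiDAR to the point, in meters
    azimuth: float       # horizontal angle, in radians
    elevation: float     # vertical angle, in radians
    reflectivity: float  # e.g. 0.0 .. 1.0

    def to_xyz(self):
        """Convert the polar representation to x/y/z rectangular coordinates."""
        x = self.distance * math.cos(self.elevation) * math.cos(self.azimuth)
        y = self.distance * math.cos(self.elevation) * math.sin(self.azimuth)
        z = self.distance * math.sin(self.elevation)
        return x, y, z
```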
  • At Step S202, obtaining a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point.
  • The distance between the point and the LiDAR may be calculated out directly by the coordinates of the point. The reflectivity of the point may be obtained directly from the data of the point cloud. Herein, the “continuity parameter” of the point may be defined as a characterization parameter that indicates the continuity between the point and one or more points of its adjacent points in the point cloud, such as the distance to one or more of the surrounding points in the point cloud. With reference to FIG. 3, the continuity parameter of the present disclosure will be described. FIG. 3 schematically illustrates the projection of a point cloud generated by a LiDAR on a plane perpendicular to the laser direction (for simplicity, the depth information is not shown in FIG. 3). Among others, point 5, for example, is a point currently to be determined, and is an echo detected out at the time tn by the laser beam L2 in FIG. 1, point 4 is an echo detected out at the last time tn−1 by the same channel, namely by the laser beam L2, and point 6 is an echo detected out at the next time tn+1 by the same channel, namely by the laser beam L2. Correspondingly, point 1, point 2 and point 3 are echoes detected out at the time tn−1, tn, and tn+1 by the laser beam L3, respectively; and point 7, point 8 and point 9 are echoes detected out at the time tn−1, tn, and tn+1 by the laser beam L1, respectively. It should be noted that although FIG. 3 shows the equal spacing between the points, this drawing is just exemplary, not meaning equidistant spacing between echo points formed at successive time (tn−1, tn, and tn+1) by the adjacent beams in the point cloud data. In the present disclosure, the continuity parameter of point 5 may either refer to the absolute value of the difference value between the distance from point 5 to the LiDAR, and the distance from any one of the eight points (that are above, below, to the left, to the right, to the upper left, to the upper right, to the lower left and to the lower right of point 5) to the LiDAR, or refer to the weighted average of the difference values between the distance from point 5 to the LiDAR and the distance from multiple points of the eight points (that are above, below, to the left, to the right, to the upper left, to the upper right, to the lower left and to the lower right of point 5) to the LiDAR. Both of these are within the scope of the present disclosure. Under the inspiration of the conception herein, those skilled in the art may make specific adaptation according to needs.
  • When a normal object is detected by a LiDAR, those generated points usually show a relatively good continuity. Good continuity of points herein means the lower absolute value of the difference value between the distances from adjacent points to the LiDAR, for example, within the error of range detection of the LiDAR. One of the main application scenarios of LiDAR is for detecting a variety of objects in road traffic, while the speed of a moving object in road traffic is quite lower than the velocity of light. LiDAR has a higher resolution in horizontal angles, e.g., 0.1 or 0.2 degree. Thus, mostly, especially when an object is closer to a LiDAR, for example, within a range of 5-10 meters in which noise points are relatively concentrated or prominent under rainy, snowy or foggy weather, the points scanned at horizontal angles of two consecutive scanning correspond to the same object (except for just at the edge of the object). Furthermore, LiDAR has a very high scanning frequency. For example, the time interval between performing line scanning in the same vertical field of view by two adjacent channels of the LiDAR is about 1 μs, and the cycle for one line scanning in the vertical field of view is also just more than ten μs or tens of μs, so the moving distance of a moving object in road traffic may be ignored during such a short time period. Therefore, the distances from adjacent points of a LiDAR point cloud to the LiDAR should be equal theoretically, and the difference value between the distances actually measured should be relatively tiny. When there is a great distance difference, it is very likely caused by noise points. For example, when the difference value (or the weighted average of multiple difference values) between the distance from a certain point to the LiDAR and the distance from any one or more of the eight points (that are above, below, to the left, to the right, to the upper left, to the upper right, to the lower left and to the lower right of the certain point) to the LiDAR is greater than a threshold (such as 1 meter), it may be determined that the point has a poor continuity, and thus the point may be a noise point. In addition to the use of the difference value (or the minimum difference, maximum difference, or mean difference) between the distance from a point to the LiDAR and the distance from adjacent points to the LiDAR as the continuity parameter, the correlation between a test point and its surrounding points may be also calculated to be used as the continuity parameter, which is similar to a median filter, that is, an eigenvalue can be calculated out according to convolution and may be used to characterize a discreteness degree between a current point and its surrounding points; when the discreteness degree is higher than a certain threshold, it should be deemed that the point and the surrounding ones are discontinuous.
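  • As a non-limiting sketch, the continuity parameter described above may be computed from a range image (channels by firing times) as the (weighted) average absolute difference between the range of the current point and the ranges of its available eight neighbors; the indexing scheme and the uniform weighting in the following Python fragment are assumptions.

```python
# Illustrative sketch: the continuity parameter of a point, computed from a range
# image in which ranges[ch][t] is the distance from the LiDAR to the point detected
# by channel `ch` at firing time index `t`. Indexing and uniform weighting are assumptions.

def continuity_parameter(ranges, ch, t, weights=None):
    """Return the (weighted) mean absolute difference between the current point's
    range and the ranges of its available eight neighbors."""
    diffs = []
    for d_ch in (-1, 0, 1):
        for d_t in (-1, 0, 1):
            if d_ch == 0 and d_t == 0:
                continue
            n_ch, n_t = ch + d_ch, t + d_t
            if 0 <= n_ch < len(ranges) and 0 <= n_t < len(ranges[n_ch]):
                diffs.append(abs(ranges[ch][t] - ranges[n_ch][n_t]))
    if not diffs:
        return 0.0
    if weights is None:
        return sum(diffs) / len(diffs)
    used = list(zip(diffs, weights))
    return sum(d * w for d, w in used) / sum(w for _, w in used)
```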
  • Moreover, the threshold for the continuity parameter is also variable. For example, it may be adjusted according to the distance between a test point and the LiDAR: as the distance increases, the ranging error of the LiDAR also becomes correspondingly greater, and the threshold for the continuity parameter may be set to a greater value.
  • In addition, in accordance with another example described herein, the difference value between the respective distances from two points to the LiDAR can be used as an approximation of the distance between these two points. This is because of the very small angular spacing between two points in the horizontal direction, e.g., 0.1 or 0.2 degree, in the process of scanning by the LiDAR. Thus, in this case, the distance between points 4 and 5 approximately equals the difference value between the distance of point 4 to the LiDAR and the distance of point 5 to the LiDAR.
  • In Step S203: determining whether the point is a noise point at least based on at least one of the reflectivity and the continuity parameter, and the distance.
  • In the following, the method for determining a noise point according to the distance and at least one of the reflectivity and the continuity parameter will be described in detail.
  • In accordance with one example described herein, a noise point may be identified and determined by means of distance together with reflectivity. The inventors discover that noise points usually concentrate within a range of 5-10 meters away from the LiDAR, or more prominently within a range of 5-7 meters. Meanwhile, the reflectivity of a noise point is usually less than or equal to 2%. Thus, a noise point may be identified according to a combination of distance and reflectivity. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the point is determined to be a noise point.
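  • A minimal sketch of this distance-plus-reflectivity rule, using the example values quoted above, is given below; the function and parameter names are illustrative.

```python
# Sketch of the distance-plus-reflectivity rule, using the example values quoted
# above (a 5-10 m distance window and a 2 % reflectivity threshold).

def is_noise_by_distance_and_reflectivity(distance_m, reflectivity,
                                          near=5.0, far=10.0, refl_threshold=0.02):
    """Return True when the point falls in the predefined distance range and its
    reflectivity does not exceed the predefined reflectivity threshold."""
    return near <= distance_m <= far and reflectivity <= refl_threshold
```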
  • In accordance with one example described herein, a noise point may be identified and determined by means of distance together with continuity parameter. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the continuity parameter is beyond the normal range of continuity (for example, the continuity parameter is defined as the difference values between the distance from the point to the LiDAR and the distances from the surrounding points to the LiDAR, and the normal continuity ranges are, for example, less than or equal to 1 meter, or less than or equal to 0.5 meter), the point is determined to be a noise point. As stated above, continuity parameter is, for example, the difference values between the distance from the current point to the LiDAR and the distances from a previous point (a left point in FIG. 3) and/or a next point (a right point in FIG. 3) by the same channel to the LiDAR, or the difference value between the distance from the current point to the LiDAR and the distances from the points by the adjacent channels above and below (upper and/or lower points in FIG. 3) to the LiDAR, or the distances from the current point to other adjacent points in the point cloud, or the parameters characterizing the discreteness degree of the current point relative to the surrounding points. If the continuity parameter is beyond the normal range of continuity, it indicates that the point is very likely to be a noise point, rather than a point truly positioned on the surrounding object.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of distance, reflectivity and continuity. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), and the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the point is determined to be a noise point.
  • Moreover, the inventors of the present application discover that the noise of the echo corresponding to a point to be determined is also another factor that may be used for determining a noise point, especially in the case of snow. In a LiDAR, the laser beams emitted by the emitter of the LiDAR run into an object, and form echoes upon diffuse reflection, which are then received by the laser receiver to generate analog data signals (i.e., echo waveform signals); the analog data signals are sampled by the ADC at a certain frequency, and quantization-coded, thereby obtaining a series of sampling points, as shown in FIG. 4A. Waveform noises may be characterized using RMS, i.e., characterizing the fluctuation degree of the waveform, and the specific calculation may be conducted by obtaining the sum of the square of the difference values between each sampling point and echo base line and then calculating the square root of the sum. The echo base line herein refers to a mean value of signals generated by the laser receiver without laser beam reflection, which may be construed as a background noise signal value.
  • In addition, the operation of calculating RMS is relatively complicated, the computing resource of FPGA is also so limited that it is rather difficult to use FPGA to perform such a calculation, and a waveform may further contain normal echo signals besides noise, thereby making it difficult to have statistics. Therefore, other numerical values which may be calculated more easily may also be used for characterizing the noise of the waveform. For example, in accordance with one preferable example described herein, as shown in FIG. 4A, the echo base line is 85 LSB (Least Significant Bit) value of ADC; and within the range of 10 LSB value above the echo base line, counting 200 sampling points (a number based on experience, which may be likewise 100 or 300 surely) from left to right, the magnitude of the noise of the waveform may be characterized by using the sum or average value of the difference value between the ordinate value of each sampling point and the echo base line, or using the area of a graph surrounded by a curve formed by connecting 200 sampling points, a straight line through the abscissa of the first sampling point and parallel to the Y-axis, a straight line through the abscissa of the 200th sampling point and parallel to the Y-axis, and the echo base. It need be specified that the LSB value is the unit to characterize the magnitude of the signal after ADC sampling in the circuit, and is relevant to the performance of the adopted ADC. In addition, the 10 LSB value above the echo base line as selected in the preferable example has relation to an echo signal detection threshold as predefined for the LiDAR, in which the echo signal detection threshold is used by the system to determine that a signal is a valid echo signal when the signal has the signal strength greater than or equal to the echo signal detection threshold, and to determine that the signal is noise when it has the signal strength less than the echo signal detection threshold. Those skilled in the art also may select other suitable values according to their mastered prior technologies and knowledge in the field. Of course, in practical implementation, those skilled in the art may also adopt other units besides LSB value to characterize the strength and magnitude of echo signals; and using LSB value as the unit of echo to describe the strength of echo signals herein is just to allow those skilled in the art to better understand and implement the present invention, and it imposes no limits to the protection scope of the present invention.
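  • The following Python sketch illustrates both the RMS characterization and the simplified characterization described above, using the example figures of an 85 LSB base line, a 10 LSB band and 200 sampling points; the function names are assumptions, and the RMS variant follows the formulation in the text (square root of the sum of squared deviations).

```python
# Illustrative sketch of two ways to characterize echo waveform noise, using the
# example figures above (85 LSB base line, 10 LSB band, 200 sampling points).
# Function names are assumptions.

def waveform_rms(samples, baseline=85.0):
    """Square root of the sum of squared deviations of the sampled waveform from
    the echo base line, following the formulation in the text."""
    return sum((s - baseline) ** 2 for s in samples) ** 0.5

def waveform_noise_simple(samples, baseline=85.0, band=10.0, n=200):
    """Simplified noise measure: the sum of the deviations above the base line for
    the first `n` samples that stay within `band` LSB above the base line."""
    window = [s for s in samples if baseline <= s <= baseline + band][:n]
    return sum(s - baseline for s in window)
```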
  • In accordance with one example described herein, the method is used to calculate the noise of the echo of a normal point under snowy weather and the noise of the echo of a noise point caused by the snowy weather. As shown in FIG. 4B, the echo noise of each noise point caused by the snowy weather is above 200 (unitless), while the noise of the waveform of a normal point is about 100 only. They may be easily distinguished from each other. Moreover, lines 41 and 42 in FIG. 4B, though presented as continuous lines, are both actually the lines constituted by connecting discrete points one by one. The abscissa stands for the number of measured points, and the abscissa 200 in the figure represents 200 times of detection (200 points) conducted for the same one scenario, thereby obtaining 200 results of noise detection correspondingly, for example, obtaining 200 waveforms of the echo signals similar to that in FIG. 4A. Subsequently, for example, for the abscissa of 1, the ordinate of the points in line 42 represents that the noise of the echo of a normal point is 99, and the ordinate of the points in line 41 represents that the noise of the echo of the noise points caused by the snowy weather is 302. And for example, for the abscissa of 25, which represents the 25th test, the ordinate of the points in line 42 represents that the noise of the echo of a normal point is 102, while the ordinate of the points in line 41 represents that the noise of the echo of the noise points caused by the snowy weather is 298.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of distance, reflectivity, continuity and noise. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the noise of the point is greater than a noise threshold (such as 200), the point is determined to be a noise point.
  • Moreover, the inventors of the present application discover that the number of echo pulses is another factor available for determining a noise point. A laser beam containing one laser pulse is emitted towards an object and reflected back, forming an echo beam which is detected by the LiDAR. Theoretically, the number of echo pulses detected should be only one. However, during practical detection, the number of echo pulses detected above the measuring threshold is often more than one for various reasons. For example, a laser beam will gradually diffuse as it travels forward, so that it may run into two different objects successively. The inventors find that the number of echo pulses may increase under rainy, snowy or foggy weather. Therefore, during detection of a certain point, when the number of detected echo pulses is more than a pulse number threshold (e.g., 3), it may indicate that the point may be a noise point. In addition, to reduce crosstalk between two different LiDARs, the laser beams emitted by LiDARs are usually pulse-encoded. For example, a detecting laser beam includes two laser pulses, the time interval of which is encoded, i.e., dual-pulse laser detection. Therefore, during the dual-pulse laser detection, the pulse number threshold may be set at 7. For example, when the number of detected echo pulses is more than a pulse number threshold (e.g., 7), it indicates that the point may be a noise point.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of distance, reflectivity, continuity and the number of echo pulses. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of distance, reflectivity, continuity, noise and the number of echo pulses. For example, when the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the noise of the point is greater than a noise threshold (such as 200), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
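  • A minimal sketch of this combined rule, in which a point is treated as a noise point only when all five criteria using the example values above are met, is given below; the function and parameter names are illustrative.

```python
# Sketch of the combined rule from the preceding paragraph: a point is treated as
# a noise point only when all five criteria hold, using the example values quoted
# in the text (5-10 m, 2 %, 1 m, echo noise 200, 7 echo pulses).

def is_noise_all_criteria(distance_m, reflectivity, continuity_m, noise, n_echo_pulses):
    return (5.0 <= distance_m <= 10.0        # within the predefined range of distance
            and reflectivity <= 0.02         # at or below the reflectivity threshold
            and continuity_m > 1.0           # beyond the normal range of continuity
            and noise > 200                  # echo noise above the noise threshold
            and n_echo_pulses > 7)           # more echo pulses than the pulse number threshold
```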
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of continuity and reflectivity. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the point is determined to be a noise point.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of continuity, reflectivity and noise. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), and the noise of the point is greater than a noise threshold (such as 200), the point is determined to be a noise point.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of continuity, reflectivity and the number of echo pulses. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
  • In accordance with one example described herein, a noise point may be identified and determined by means of a combination of continuity, reflectivity, noise and the number of echo pulses. For example, when the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), the reflectivity is less than or equal to a predefined threshold for reflectivity (such as 2%), the noise of the point is greater than a noise threshold (such as 200), and the number of echo pulses is more than a pulse number threshold (such as 7), the point is determined to be a noise point.
  • In accordance with one example described herein, noise may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather. Since the echo noise becomes larger under snowy weather and, as shown in FIG. 4B, the echo noise of almost every point under snowy weather is above 200, this factor, i.e., echo noise, can substantially be used to determine a noise point under snowy weather.
  • In accordance with one example described herein, a combination of noise and distance may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), and the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • In accordance with one example described herein, a combination of noise and continuity may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), and the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • In accordance with one example described herein, a combination of noise, distance and continuity may be used to determine whether a state of weather is snowy. For example, when the noise of the point is greater than a noise threshold (such as 200), the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • In accordance with one example described herein, a combination of noise, distance and the number of echo pulses may be used to determine whether a state of weather is snowy. When the noise of the point is greater than a noise threshold (such as 200), the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), and the number of echo pulses is more than a pulse number threshold (such as 7), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • In accordance with one example described herein, a combination of noise, distance, continuity and the number of echo pulses may be used to determine whether a state of weather is snowy. When the noise of the point is greater than a noise threshold (such as 200), the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), the continuity parameter is beyond the normal range of continuity parameter (such as 1 meter), and the number of echo pulses is more than a pulse number threshold (such as 7), it is determined that the state of weather is snowy, or the point is a noise point caused by the snowy state of weather.
  • In accordance with one example described herein, a combination of the number of echo pulses and distance can be used to determine whether a state of weather is rainy or foggy. When the number of echo pulses is more than a pulse number threshold (such as 7), and the distance is within the predefined range of distance (such as 5-10 meters, or 5-7 meters), it is determined that the state of weather is rainy or foggy, or the point is a noise point caused by the rainy or foggy state of weather. As enumerated in the foregoing various examples, parameters including distance, reflectivity, continuity, noise and pulse number, or various combinations thereof, can be used to determine a noise point and a specific state of weather. In accordance with one example described herein, by means of calculating the noise point confidence level of a point, the point may be likewise determined to be a noise point when the noise point confidence level is beyond the normal range of confidence level.
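  • For illustration only, the full five-parameter combination enumerated above might be sketched as a simple rule check as follows; the function and parameter names are hypothetical, and the thresholds are merely the example values quoted above (5-10 meters, 2%, 1 meter, 200, 7), not a definitive implementation. Dropping conditions yields the smaller combinations (e.g., distance + reflectivity + continuity) described in the preceding examples.

    # Illustrative sketch of a combined rule check; names and thresholds are
    # example assumptions, not part of the disclosure.
    def is_noise_point(distance_m, reflectivity, continuity_m, noise, echo_pulses,
                       dist_range=(5.0, 10.0), refl_max=0.02,
                       cont_max=1.0, noise_max=200, pulse_max=7):
        """Return True when every checked characteristic looks abnormal."""
        in_suspect_distance = dist_range[0] <= distance_m <= dist_range[1]
        low_reflectivity = reflectivity <= refl_max
        discontinuous = continuity_m > cont_max        # beyond the normal range
        noisy_waveform = noise > noise_max
        too_many_pulses = echo_pulses > pulse_max
        return (in_suspect_distance and low_reflectivity and discontinuous
                and noisy_waveform and too_many_pulses)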
  • With regard to the foregoing examples of various combinations, a corresponding noise point confidence level may be calculated, and the calculation methods thereof are provided below only for the purpose of illustration.
  • With regard to the example involving identification and determination of a noise point by using a combination of distance and reflectivity, the calculation method as shown in Table 1, for example, may be adopted.
  • TABLE 1

    Characteristic Parameter    Determination Condition                   Weight
    Distance                    Distance between 5 and 10 meters          10
    Reflectivity                Reflectivity less than or equal to 2%     5
  • For example, with regard to each of the characteristic parameters in Table 1 above, if the determination condition for the characteristic parameter is satisfied, the contribution coefficient or factor of the confidence level brought by that characteristic parameter is 0; otherwise, the contribution coefficient or factor is the weight of that characteristic parameter. Finally, if the resulting confidence level is beyond the normal range of confidence level, the point is determined to be a noise point. For example, if a certain point m has a distance of 12 m to the LiDAR, the determination condition of distance is not satisfied, and the distance factor will be 10; and if the reflectivity is 3%, thereby not satisfying the determination condition of reflectivity, the reflectivity factor is set as 5. The noise point confidence level equals the sum of the distance factor and the reflectivity factor, namely, 10+5=15. The normal range of confidence level may be set according to experience and/or specific weather or environment conditions. In accordance with one example, the normal range of confidence level is greater than or equal to 10. When a confidence level is greater than or equal to 10, it indicates the point is a normal point; and when a confidence level is beyond the range, namely, less than 10, it indicates the point is a noise point. In accordance with one example described herein, the weight of distance (a first weight) is greater than that of reflectivity (a second weight), as shown in Table 1.
  • It should be noted that the weight for each of the characteristic parameters above and the normal range of confidence level set at 10 are just examples of the present disclosure, and the weight for each characteristic parameter and the normal range of confidence level may be changed as needed. An endpoint of the normal range of confidence level may be referred to as the confidence level threshold, for example, 10.
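  • Purely as an illustrative sketch of the Table 1 calculation (using the example weights 10 and 5 and the example confidence level threshold 10; the function name is hypothetical):

    # Two-factor confidence level from Table 1; weights/thresholds are the
    # example values above, not fixed by the disclosure.
    DIST_WEIGHT, REFL_WEIGHT = 10, 5         # first and second weights
    CONFIDENCE_THRESHOLD = 10                # lower endpoint of the normal range

    def confidence_level(distance_m, reflectivity):
        dist_factor = 0 if 5.0 <= distance_m <= 10.0 else DIST_WEIGHT
        refl_factor = 0 if reflectivity <= 0.02 else REFL_WEIGHT
        return dist_factor + refl_factor

    # Worked example from the text: point m at 12 m with 3% reflectivity.
    level = confidence_level(12.0, 0.03)     # 10 + 5 = 15
    is_noise = level < CONFIDENCE_THRESHOLD  # False, so point m is a normal point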
  • Table 2 shows a calculation method to identify and determine a noise point according to a combination of distance, reflectivity and continuity, by means of calculation of noise point confidence level.
  • TABLE 2

    Characteristic Parameter    Determination Condition                   Weight
    Distance                    Distance between 5 and 10 meters          10
    Reflectivity                Reflectivity less than or equal to 2%     5
    Continuity                  Continuity parameter exceeding 1 meter    5
  • Among others, calculation of the noise point confidence level of a point is as follows: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight (i.e., the weight of distance); a reflectivity factor is set to zero when the reflectivity is less than or equal to the predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight (i.e., the weight of reflectivity); and a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight (i.e., the weight of continuity). The noise point confidence level equals the sum of the distance factor, the reflectivity factor and the continuity factor.
  • Table 3 shows a calculation method to identify and determine a noise point according to a combination of distance, reflectivity, continuity, noise and the number of echo pulses, by means of calculation of a noise point confidence level.
  • TABLE 3

    Characteristic Parameter    Determination Condition                   Weight
    Distance                    Distance between 5 and 10 meters          10
    Reflectivity                Reflectivity less than or equal to 2%     5
    Continuity                  Continuity parameter exceeding 1 meter    5
    Noise                       Waveform base noise greater than 200      3
    Pulse Number                Pulse number greater than 7               3
  • Among others, calculation of the noise point confidence level of a point is as follows: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight (i.e., the weight of distance); a reflectivity factor is set to zero when the reflectivity is less than or equal to the predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight (i.e., the weight of reflectivity); a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight (i.e., the weight of continuity); a noise factor is set to zero when the noise is greater than the threshold for noise, otherwise the noise factor is set as a fourth weight (i.e., the weight of noise); and an echo pulse number factor is set to zero when the number of echo pulses is greater than the threshold for pulse number, otherwise the echo pulse number factor is set as a fifth weight (i.e., the weight of the number of echo pulses). The noise point confidence level equals the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the echo pulse number factor.
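  • The five-factor sum of Table 3 might be sketched as follows; again, this is only an illustration using the example weights and thresholds, and the parameter names are assumptions:

    # Five-factor confidence level from Table 3; each factor is 0 when its
    # determination condition is satisfied, otherwise it contributes its weight.
    WEIGHTS = {"distance": 10, "reflectivity": 5, "continuity": 5,
               "noise": 3, "pulse_number": 3}

    def noise_point_confidence(distance_m, reflectivity, continuity_m,
                               noise, echo_pulses):
        factors = {
            "distance":     0 if 5.0 <= distance_m <= 10.0 else WEIGHTS["distance"],
            "reflectivity": 0 if reflectivity <= 0.02 else WEIGHTS["reflectivity"],
            "continuity":   0 if continuity_m > 1.0 else WEIGHTS["continuity"],
            "noise":        0 if noise > 200 else WEIGHTS["noise"],
            "pulse_number": 0 if echo_pulses > 7 else WEIGHTS["pulse_number"],
        }
        return sum(factors.values())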
  • In accordance with one preferable example disclosed herein, the first, second, third, fourth and fifth weights satisfy one or more of the following conditions:
  • The first weight equals the sum of the second and third weights;
  • The second weight equals the third weight;
  • The first weight equals the threshold for confidence level;
  • The sum of the fourth and fifth weights equals the second and/or third weight; and
  • The first weight is greater than the second, third, fourth or fifth weight.
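  • As a quick arithmetic illustration (not part of the disclosure), the example weights of Table 3 can be tested against the conditions listed above; they satisfy every condition except the fourth:

    # Example weights from Table 3 checked against the listed conditions.
    w1, w2, w3, w4, w5 = 10, 5, 5, 3, 3
    confidence_threshold = 10

    conditions = [
        w1 == w2 + w3,               # True:  10 = 5 + 5
        w2 == w3,                    # True:  5 = 5
        w1 == confidence_threshold,  # True:  10 = 10
        w4 + w5 in (w2, w3),         # False: 6 equals neither 5 nor 5
        w1 > max(w2, w3, w4, w5),    # True:  10 > 5
    ]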
  • In accordance with one preferable example described herein, the first, second, third, fourth and fifth weights may be dynamically adjusted according to the detected state of weather, including adjusting the absolute value of each weight and the relative proportional relation therebetween.
  • For example, a noise point caused by snowy weather has greater echo noise, and almost every noise point caused by snowy weather in the point cloud has an echo noise greater than 200. Moreover, the reflectivity of a noise point caused by snowy weather has a relatively large value, so it may be difficult to identify such a noise point according to the reflectivity alone. Therefore, the weight of reflectivity should correspondingly be reduced under snowy weather. Thus, when snow is detected, the second weight (the weight of reflectivity) may be dynamically lowered while the fourth weight (the weight of echo noise) may be increased.
  • Moreover, when fog is detected, the fifth weight may be increased.
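  • A minimal sketch of such dynamic adjustment, assuming illustrative weather labels and adjustment amounts (the specific increments are not given by the disclosure):

    # Adjust the example weights according to the detected weather; the labels
    # and increments here are assumptions for illustration only.
    def adjust_weights(weights, weather):
        w = dict(weights)
        if weather == "snow":
            w["reflectivity"] = max(1, w["reflectivity"] - 2)  # lower second weight
            w["noise"] += 2                                    # raise fourth weight
        elif weather == "fog":
            w["pulse_number"] += 2                             # raise fifth weight
        return w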
  • After the method for identification of a noise point described herein (by means of continuity + reflectivity) was used to denoise a LiDAR point cloud in an FPGA, the results show that the method filters out noise points under rainy weather very effectively. The proportion of normal points lost due to the filtering operation is 1/144, completely within an acceptable range.
  • The thresholds, weights and proportional relations between weights of the various parameters in the examples illustrated above were determined by the inventors based on experience and a large number of experiments; they are presented as preferable examples of the present disclosure and should not serve to limit the protection scope of the present disclosure.
  • Second Aspect
  • The second aspect of the present disclosure involves a LiDAR system 300, which will be described in detail below with reference to FIG. 5.
  • As shown in FIG. 5, the LiDAR system 300 comprises: a LiDAR 301, a denoising unit 302, and an output unit 303. Among others, the LiDAR 301, for example, may be an existing LiDAR, such as the LiDAR 100 shown in FIG. 1, and is configured to scan the surroundings to generate point cloud data; its specific structure will not be described in detail here. The denoising unit 302 is coupled to the LiDAR 301 to receive the point cloud, is configured to perform the method 200 for identifying a noise point as described in the first aspect of the present disclosure to determine whether a point in the point cloud is a noise point, and is configured to filter out noise points in the point cloud. The output unit 303 is coupled to the denoising unit 302 and outputs the point cloud. As shown, the information flow from the denoising unit 302 to the output unit 303 is unidirectional, but the present disclosure is not limited to this, because a bidirectional information flow between them is also possible.
  • As shown in FIG. 5, the LiDAR 301, the denoising unit 302 and the output unit 303 are three separate units, but this is merely exemplary. Both the denoising unit 302 and the output unit 303 may be integrated into the LiDAR 301. For example, a denoising unit 302 may be added to the current LiDAR 301 to receive point cloud data and to determine and filter out noise points according to the method 200 for identification of a noise point; then, by using the output unit 303, i.e., an output interface, the point cloud with the noise points filtered out is output and presented to a user. Of course, the LiDAR 301, the denoising unit 302 and the output unit 303 may also be separate units. For example, the LiDAR 301 is only responsible for laser detection of the surroundings and output of the original point cloud data, while the denoising unit 302 and the output unit 303 may be, for example, a computer, workstation, or application specific integrated circuit (ASIC) responsible for data processing, which, after receiving the point cloud data, conducts operations such as identification and filtering of noise points, and output. These should all fall within the protection scope hereof.
  • FIG. 6 illustrates one preferable example described herein, in which the LiDAR system 300 further comprises a control unit 304. The control unit 304 is coupled to the denoising unit 302 and may enable or disable the denoising unit 302. In an enabled state, the denoising unit 302 filters out noise points in the point cloud, and the output unit 303 outputs the point cloud with the noise points filtered out; in a disabled state, the denoising unit 302 is disabled, and the output unit 303 outputs the point cloud without noise points filtered out. Preferably, the control unit 304 is also coupled to the LiDAR 301 and the output unit 303.
  • In accordance with one preferable example described herein, the denoising unit 302 is enabled by default.
  • In accordance with one preferable example described herein, the denoising unit 302 is disabled by default and becomes enabled under certain circumstances. For example, when rainy, snowy or foggy weather is detected, the control unit 304 enables the denoising unit 302. This is because, in a state of rainy, snowy or foggy weather, a great amount of water in liquid or solid state exists in the air, which may easily produce some abnormal radar echoes and cause a great number of noise points in the point cloud.
  • In accordance with one preferable example described herein, when the number of noise points exceeds a predefined threshold, the control unit 304 determines that rainy, snowy or foggy weather is detected. For example, a counter may be provided in the control unit 304 or the denoising unit 302 to count the number of noise points in the currently detected point cloud; when the count value goes beyond a predefined threshold, it may be determined that the current state of weather is rainy, snowy or foggy, thereby triggering or enabling the denoising unit 302.
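  • A minimal sketch of such a counter-based trigger, assuming a hypothetical per-frame noise count threshold (the value 500 is an assumption, not given by the disclosure):

    # Enable denoising once the per-frame noise point count suggests rain,
    # snow or fog; class, attribute names and threshold are assumptions.
    class DenoisingController:
        def __init__(self, noise_count_threshold=500):
            self.noise_count_threshold = noise_count_threshold
            self.denoising_enabled = False     # disabled by default

        def update(self, noise_point_count):
            if noise_point_count > self.noise_count_threshold:
                self.denoising_enabled = True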
  • Preferably, the LiDAR system 300 may further comprise a precipitation sensor, such as an optical sensor or a capacitive sensor. When a state of rainy, snowy or foggy weather is detected, an output signal of the precipitation sensor may be used to trigger enabling of the denoising unit 302. An optical sensor, for example, comprises a light-emitting diode and a transparent window. When no precipitation occurs, almost all light beams emitted by the light-emitting diode are reflected onto a photosensitive element. When precipitation occurs, for example, in a state of rainy, snowy or foggy weather, water drops or moisture appear on the transparent window, part of the light beams emitted by the light-emitting diode are deflected, and the total amount of light received by the photosensitive element changes, whereby a state of rainy, snowy or foggy weather may be determined and detected, further triggering enabling of the denoising unit 302.
  • In accordance with one preferable example described herein, some of the parameters previously mentioned may also be used to determine rainy, snowy or foggy weather, so as to trigger enabling of the denoising unit 302. For example, when the reflectivity of a point in the point cloud is higher than a certain threshold, snowy weather may be determined, thereby triggering and enabling the denoising unit 302. Additionally, when the number of echo pulses of a point in the point cloud goes beyond a certain threshold, foggy weather may be determined, thereby triggering and enabling the denoising unit 302.
  • As shown in FIG. 6, in accordance with one preferable example described herein, the LiDAR system 300 further comprises an input unit 305 for receiving input from a user. Among others, the control unit 304 may enable or disable the denoising unit 302 according to the input from the user. This has the advantage that the user may decide whether to enable the denoising unit 302 and whether to perform a noise filtering operation.
  • FIG. 7 illustrates a LiDAR system 400 in accordance with another example described herein. As shown in FIG. 7, the LiDAR system 400 comprises: a LiDAR 401, a confidence level calculating unit 402, and an output unit 403. The LiDAR 401 is configured to scan surroundings to generate point cloud, while its specific structure will not be described in detail here. The confidence level calculating unit 402 is coupled to the LiDAR 401 to receive the point cloud, and is configured to calculate a noise point confidence level of a point of the point cloud at least according to a distance between the point and the LiDAR and at least one of reflectivity and continuity parameter of the point. The output unit 403 is coupled to the LiDAR 401 and the confidence level calculating unit 402, and outputs the point cloud and a noise point confidence level of the point in the point cloud.
  • The calculation method of the noise point confidence level is similar to that described in the first aspect of the present disclosure, that is, one of the five characteristic parameters (distance, reflectivity, continuity, noise and the number of echo pulses) or any combination thereof is used to perform the calculation. Only one example will be provided here, and the rest will not be described in detail. For example, a noise point confidence level of a point is calculated according to the distance from the point in the point cloud to the LiDAR, the reflectivity of the point, the continuity parameter of the point, the noise of the point and the pulse number of the point. The specific calculation comprises, for example: a distance factor is set to zero when the distance is within the predefined range of distance, otherwise the distance factor is set as a first weight; a reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight; a continuity factor is set to zero when the continuity parameter is beyond the normal range of continuity parameter, otherwise the continuity factor is set as a third weight; a noise factor is set to zero when the noise is greater than a noise threshold, otherwise the noise factor is set as a fourth weight; and a pulse number factor is set to zero when the pulse number is greater than a threshold for pulse number, otherwise the pulse number factor is set as a fifth weight. The noise point confidence level equals the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the pulse number factor.
  • Preferably, the first weight is greater than the second, third, fourth, and fifth weights.
  • Similar to the LiDAR system 300 as shown in FIGS. 5 and 6, in the LiDAR system 400 in FIG. 7, the confidence level calculating unit 402 and the output unit 403 may be integrated into the LiDAR 401, or the three may be implemented as separate units. These should both fall within the protection scope hereof.
  • By the technical solution of this example, while the point cloud is output, the noise point confidence level of a point in the point cloud may also be provided to a user. In a specific embodiment, one or more bits may be added to the point cloud data of the LiDAR, into which the confidence level information is written; the confidence level information is provided to a user for reference, and the user may decide whether to optimize the radar point cloud graph or filter out noise points according to the confidence level information.
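  • For illustration, appending a confidence field to each point record might look like the following sketch; the byte layout (three float32 coordinates, one reflectivity byte, one confidence byte) is an assumption for illustration only, not the packet format of any particular LiDAR:

    import struct

    # Pack one point with an extra confidence byte appended; layout is assumed.
    def pack_point(x, y, z, reflectivity_pct, confidence_level):
        refl = max(0, min(255, int(reflectivity_pct)))
        conf = max(0, min(255, int(confidence_level)))
        return struct.pack("<fffBB", x, y, z, refl, conf)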
  • As shown in FIG. 8, in accordance with one preferable example described herein, the LiDAR system 400 further comprises an input unit 404 for receiving input from a user, wherein the input unit 404 may instruct, according to the input of the user, the output unit whether to filter out noise points in the point cloud. For example, when a user thinks there are so many noise points in the output point cloud (for example, in the case of rain, snow or fog) that it is necessary to filter them out, the user may use the input unit 404 to instruct the output unit 403 to filter out noise points and then output the point cloud. After receiving the instruction to filter noise points, the output unit 403 determines, according to the noise point confidence level of a point in the point cloud, that the point is a noise point and filters it out when the noise point confidence level is beyond the normal range of confidence level. Therefore, the output point cloud excludes noise points, or includes as few of them as possible.
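  • A minimal sketch of this user-triggered filtering step, assuming points carry a precomputed confidence level and using the example confidence level threshold of 10:

    # Keep only points whose confidence level stays within the normal range;
    # points are assumed to be (x, y, z, confidence) tuples for illustration.
    def filter_point_cloud(points, filter_enabled, confidence_threshold=10):
        if not filter_enabled:
            return points
        return [p for p in points if p[3] >= confidence_threshold]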
  • Third Aspect
  • Further, the third aspect of the present disclosure involves a noise point identification device 500 for use in a LiDAR. As shown in FIG. 9, the device 500 comprises: a receiving unit 501 configured to receive a point cloud generated by the LiDAR; an obtaining unit 502 configured to obtain a distance between a point in the point cloud and the LiDAR, and at least one of reflectivity and continuity parameter of the point; and a determining unit 503 configured to determine whether the point is a noise point at least according to the distance and the at least one of the reflectivity and the continuity parameter.
  • The third aspect of the present disclosure also involves a computer program product 600, a block diagram of which is shown in FIG. 10. A signal carrying medium 602 may be implemented as, or encompass, a computer readable medium 606, a computer recordable medium 608, a computer communication medium 610, or a combination thereof, which stores programming instructions 604 that may configure a processor to execute all or some of the processes described previously. These instructions may include, for example, one or more executable instructions for allowing one or more processors to execute the following processing: S201, receiving a point cloud generated by the LiDAR; S202, obtaining at least one of the reflectivity and the continuity parameter of a point in the point cloud, and a distance between the point and the LiDAR; and S203, determining whether the point is a noise point at least according to the at least one of the reflectivity and the continuity parameter, and the distance.
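  • A hedged sketch of steps S201-S203, assuming a placeholder get_point_cloud() call and per-point distance and reflectivity attributes; neither is an actual LiDAR API, and the thresholds are only the example values used above:

    # S201-S203 as a simple loop over the received point cloud.
    def identify_noise_points(lidar):
        cloud = lidar.get_point_cloud()                    # S201: receive point cloud
        flags = []
        for p in cloud:                                    # S202: obtain parameters
            dist_factor = 0 if 5.0 <= p.distance <= 10.0 else 10
            refl_factor = 0 if p.reflectivity <= 0.02 else 5
            flags.append(dist_factor + refl_factor < 10)   # S203: noise when below threshold
        return flags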
  • The description of any process or method in a flow chart or otherwise set out herein may be understood to represent one or more modules, segments or portions of code of executable instructions for achieving specific logical functions or steps of a process, and the scope of the preferable embodiments hereof covers other implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in reverse order depending on the functions involved. This should be understood by persons skilled in the technical field of the examples of the present disclosure. The logics and/or steps shown in a flow chart or otherwise described herein, for example, an ordered list of executable instructions for achieving a logical function, may be embodied in any computer readable medium for use by, or in combination with, an instruction execution system, apparatus or device (such as a computer-based system, a system comprising a processor, or another system that may read and execute instructions from an instruction execution system, apparatus or device). In this description, a "computer readable medium" may be any apparatus that may contain, store, communicate, propagate or transmit programs for use by, or in combination with, an instruction execution system, apparatus or device. A more particular, non-exhaustive list of the computer readable medium includes the following: an electrical connection having one or more wires (electronic device), a portable computer enclosure (magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a compact disc read-only memory (CD-ROM).
  • Moreover, the computer readable medium may even be paper or another suitable medium on which the programs are printed, because the programs may be acquired electronically, for example, by optically scanning the paper or other medium, then edited, decoded or otherwise processed in a suitable way if necessary, and then stored in a computer storage. It should be understood that each part of the present disclosure may be achieved by means of hardware, software, firmware or a combination thereof.
  • In the embodiments described above, multiple steps or methods may be achieved using software or firmware stored in a storage and executed by a suitable instruction execution system. For example, if achieved with hardware, as in another embodiment, any of the following technologies commonly known in the art, or a combination thereof, may be used: a discrete logic circuit of logic gates for achieving logical functions on data signals, an application-specific integrated circuit with suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
  • Furthermore, the solutions of the examples herein may be applied not only to the mechanical LiDAR mentioned in the examples above, but also to other types of LiDARs, such as galvanometer scanning LiDAR, rotating mirror scanning LiDAR, or pure solid-state LiDAR including Flash LiDAR and phased array LiDAR. The present invention imposes no limitation on the type of applicable LiDARs. Last but not least, the contents described above are just preferable examples of the present disclosure and are not intended to limit the present disclosure. Although the detailed description of the present disclosure has been provided with reference to the foregoing examples, those skilled in the art may still make modifications to the technical solutions recorded in the various examples described above, or conduct equivalent replacement of part of the technical features therein. Any modification, equivalent replacement or improvement made within the spirit and principles set out herein should be covered by the protection scope of the present disclosure.

Claims (19)

What is claimed is:
1. A method for identification of a noise point used for a light detection and ranging (LiDAR) system, comprising:
(a) receiving a point cloud generated by the LiDAR system;
(b) obtaining a distance and a reflectivity corresponding to a point in the point cloud;
(c) calculating a noise point confidence level of the point based at least in part on a combination of a distance factor and a reflectivity factor, wherein (i) the distance factor is set to zero when the distance is within a predetermined range of distance, otherwise the distance factor is set as a first weight, (ii) the reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight, wherein the first weight is greater than the second weight; and
(d) determining that the point is a noise point when the noise point confidence level is above a normal range of confidence level.
2. The method of claim 1, wherein the noise point confidence level is the sum of the distance factor and the reflectivity factor.
3. The method of claim 1, further comprising obtaining a continuity parameter of the point, wherein the noise point confidence level of the point is based on the distance factor, the reflectivity factor and a continuity factor of the continuity parameter.
4. The method of claim 3, wherein the continuity factor is set to zero when the continuity parameter is beyond a normal range of the continuity parameter, otherwise the continuity factor is set as a third weight.
5. The method of claim 4, further comprising obtaining a noise parameter of the point and the number of echo pulses corresponding to the point, wherein the noise point confidence level of the point is based on the distance factor, the reflectivity factor, the continuity factor, a noise factor of the noise parameter and an echo pulse number factor of the number of echo pulses.
6. The method of claim 5, wherein the noise factor is set to zero when the noise parameter is greater than a threshold for noise, otherwise the noise factor is set as a fourth weight, wherein the echo pulse number factor is set to zero when the number of echo pulses is greater than a threshold for pulse number, otherwise the echo pulse number factor is set as a fifth weight, and wherein the noise point confidence level is the sum of the distance factor, the reflectivity factor, the continuity factor, the noise factor and the echo pulse number factor.
7. The method of claim 6, wherein the first, second, third, fourth and fifth weights satisfy one or more of the following conditions:
the first weight equals the sum of the second and third weights;
the second weight equals the third weight;
the first weight equals the threshold for confidence level;
the sum of the fourth and fifth weights equals the second and/or third weights; and
the first weight is greater than the second, third, fourth and fifth weight.
8. The method of claim 6, further comprising dynamically adjusting the first weight, the second weight, the third weight, the fourth weight or the fifth weight based on a detected weather condition.
9. The method of claim 8, further comprising reducing the second weight and increasing the fourth weight when the weather condition indicates snowy, or increasing the fifth weight when the weather condition indicates foggy.
10. A system performing the method of claim 1 comprising:
a denoising unit coupled to the LiDAR system to receive the point cloud, and configured to perform the method to determine whether a point in the point cloud is a noise point, and filter out noise points in the point cloud; and
an output unit coupled to the denoising unit and configured to output the point cloud.
11. The system of claim 10, further comprising a control unit coupled to the denoising unit, and capable of enabling or disabling the denoising unit, wherein in an enabled mode, the denoising unit filters out noise points in the point cloud, and the output unit outputs the point cloud with the noise points filtered out; and in a disabled mode, the denoising unit is disabled, and the output unit outputs the point cloud with the noise points not being filtered out.
12. The system of claim 11, wherein the control unit enables the denoising unit when a rainy, snowy or foggy weather condition is detected.
13. The system of claim 12, wherein the control unit is configured to detect a rainy, snowy or foggy weather by detecting the number of noise points in the point cloud greater than a predefined threshold.
14. The system of claim 11, wherein the control unit is configured to enable or disable the denoising unit based on an input from a user.
15. A light detection and ranging (LiDAR) system comprising:
a LiDAR device configured to scan its surroundings to generate a point cloud;
a computing unit coupled to the LiDAR device to receive the point cloud, and configured to calculate a noise point confidence level of a point in the point cloud based at least in part on a combination of a distance factor and a reflectivity factor, wherein (i) the distance factor is set to zero when the distance is within a predetermined range of distance, otherwise the distance factor is set as a first weight, (ii) the reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight, wherein the first weight is greater than the second weight; and
an output unit coupled to the LiDAR device and the computing unit and configured to output the point cloud and noise point confidence levels of the points in the point cloud.
16. The LiDAR system of claim 15, wherein the computing unit is configured to calculate a noise point confidence level of a point in the point cloud, based on the distance factor of the point, the reflectivity factor of the point, a continuity factor of a continuity parameter of the point, a noise factor of the point and a pulse number factor of the point, wherein the continuity factor is set to zero when the continuity parameter is beyond a normal range of the continuity parameter, otherwise the continuity factor is set as a third weight, wherein the noise factor is set to zero when the noise parameter is greater than a threshold for noise, otherwise the noise factor is set as a fourth weight, wherein the echo pulse number factor is set to zero when the number of echo pulses is greater than a threshold for pulse number, otherwise the echo pulse number factor is set as a fifth weight,
and wherein the first weight is greater than the second, third, fourth and fifth weight.
17. The LiDAR system of claim 15, further comprising an input unit configured to receive a user input including an instruction to the output unit indicating whether to filter out noise points in the point cloud.
18. A device for identification of a noise point used for a light detection and ranging (LiDAR) system, the device comprising:
a receiving unit configured to receive a point cloud generated by the LiDAR system;
an obtaining unit configured to obtain a distance and a reflectivity of a point in the point cloud; and
a determination unit for determining whether the point is a noise point based at least in part on the distance and the reflectivity, wherein the determination unit is configured to:
(a) calculate a noise point confidence level of the point based at least in part on a distance factor and a reflectivity factor, wherein (i) the distance factor is set to zero when the distance is within a predetermined range of distance, otherwise the distance factor is set as a first weight, (ii) the reflectivity factor is set to zero when the reflectivity is less than or equal to a predefined threshold for reflectivity, otherwise the reflectivity factor is set as a second weight, wherein the first weight is greater than the second weight, and
(b) determine that the point is a noise point when the noise point confidence level is above a normal range of confidence level.
19. A non-transitory computer readable medium storing instructions that, when executed by a processor, causes the processor to perform the method of claim 1 for identification of a noise point.
US17/146,177 2019-04-22 2021-01-11 Method for identification of a noise point used for lidar, and lidar system Pending US20210208261A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/146,177 US20210208261A1 (en) 2019-04-22 2021-01-11 Method for identification of a noise point used for lidar, and lidar system

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201910322910.4A CN110031823B (en) 2019-04-22 2019-04-22 Noise point identification method for laser radar and laser radar system
CN201910322910.4 2019-04-22
PCT/CN2019/085765 WO2020215368A1 (en) 2019-04-22 2019-05-07 Noisy point identification method for laser radar, and laser radar system
US16/827,182 US10908268B2 (en) 2019-04-22 2020-03-23 Method for identification of a noise point used for LiDAR, and LiDAR system
US17/146,177 US20210208261A1 (en) 2019-04-22 2021-01-11 Method for identification of a noise point used for lidar, and lidar system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/827,182 Continuation US10908268B2 (en) 2019-04-22 2020-03-23 Method for identification of a noise point used for LiDAR, and LiDAR system

Publications (1)

Publication Number Publication Date
US20210208261A1 true US20210208261A1 (en) 2021-07-08

Family

ID=67239379

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/827,182 Active US10908268B2 (en) 2019-04-22 2020-03-23 Method for identification of a noise point used for LiDAR, and LiDAR system
US17/146,177 Pending US20210208261A1 (en) 2019-04-22 2021-01-11 Method for identification of a noise point used for lidar, and lidar system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/827,182 Active US10908268B2 (en) 2019-04-22 2020-03-23 Method for identification of a noise point used for LiDAR, and LiDAR system

Country Status (8)

Country Link
US (2) US10908268B2 (en)
EP (1) EP3961251A4 (en)
JP (1) JP7466570B2 (en)
KR (1) KR20220003570A (en)
CN (2) CN110988846B (en)
IL (1) IL287425A (en)
MX (1) MX2021012864A (en)
WO (1) WO2020215368A1 (en)

Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11609336B1 (en) 2018-08-21 2023-03-21 Innovusion, Inc. Refraction compensation for use in LiDAR systems
KR102580275B1 (en) 2016-12-30 2023-09-18 이노뷰전, 인크. Multi-wavelength lidar design
US10942257B2 (en) 2016-12-31 2021-03-09 Innovusion Ireland Limited 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices
US11009605B2 (en) 2017-01-05 2021-05-18 Innovusion Ireland Limited MEMS beam steering and fisheye receiving lens for LiDAR system
WO2018129408A1 (en) 2017-01-05 2018-07-12 Innovusion Ireland Limited Method and system for encoding and decoding lidar
WO2019139895A1 (en) 2018-01-09 2019-07-18 Innovusion Ireland Limited Lidar detection systems and methods that use multi-plane mirrors
US11675050B2 (en) 2018-01-09 2023-06-13 Innovusion, Inc. LiDAR detection systems and methods
WO2019164961A1 (en) 2018-02-21 2019-08-29 Innovusion Ireland Limited Lidar systems with fiber optic coupling
US11391823B2 (en) 2018-02-21 2022-07-19 Innovusion, Inc. LiDAR detection systems and methods with high repetition rate to observe far objects
WO2019165294A1 (en) 2018-02-23 2019-08-29 Innovusion Ireland Limited 2-dimensional steering system for lidar systems
US11808888B2 (en) 2018-02-23 2023-11-07 Innovusion, Inc. Multi-wavelength pulse steering in LiDAR systems
WO2019199775A1 (en) 2018-04-09 2019-10-17 Innovusion Ireland Limited Lidar systems and methods for exercising precise control of a fiber laser
CN112585492A (en) 2018-06-15 2021-03-30 图达通爱尔兰有限公司 LIDAR system and method for focusing a range of interest
US11579300B1 (en) 2018-08-21 2023-02-14 Innovusion, Inc. Dual lens receive path for LiDAR system
US11614526B1 (en) 2018-08-24 2023-03-28 Innovusion, Inc. Virtual windows for LIDAR safety systems and methods
US11796645B1 (en) 2018-08-24 2023-10-24 Innovusion, Inc. Systems and methods for tuning filters for use in lidar systems
US11579258B1 (en) 2018-08-30 2023-02-14 Innovusion, Inc. Solid state pulse steering in lidar systems
WO2020102406A1 (en) 2018-11-14 2020-05-22 Innovusion Ireland Limited Lidar systems and methods that use a multi-facet mirror
WO2020146493A1 (en) 2019-01-10 2020-07-16 Innovusion Ireland Limited Lidar systems and methods with beam steering and wide angle signal detection
US11486970B1 (en) 2019-02-11 2022-11-01 Innovusion, Inc. Multiple beam generation from a single source beam for use with a LiDAR system
US11977185B1 (en) 2019-04-04 2024-05-07 Seyond, Inc. Variable angle polygon for use with a LiDAR system
CN110988846B (en) * 2019-04-22 2023-07-18 威力登激光雷达美国有限公司 Noise point identification method for laser radar and laser radar system
US11556000B1 (en) 2019-08-22 2023-01-17 Red Creamery Llc Distally-actuated scanning mirror
CN110515054B (en) * 2019-08-23 2021-07-23 斯坦德机器人(深圳)有限公司 Filtering method and device, electronic equipment and computer storage medium
CN112912756A (en) * 2019-09-17 2021-06-04 深圳市大疆创新科技有限公司 Point cloud noise filtering method, distance measuring device, system, storage medium and mobile platform
CN115643809A (en) * 2019-09-26 2023-01-24 深圳市大疆创新科技有限公司 Signal processing method of point cloud detection system and point cloud detection system
US11150348B2 (en) 2019-10-02 2021-10-19 Cepton Technologies, Inc. Techniques for detecting cross-talk interferences in lidar imaging sensors
USD959305S1 (en) * 2019-10-31 2022-08-02 Hesai Technology Co., Ltd. Lidar
CN113196092A (en) * 2019-11-29 2021-07-30 深圳市大疆创新科技有限公司 Noise filtering method and device and laser radar
CN112639516B (en) * 2019-12-10 2023-08-04 深圳市速腾聚创科技有限公司 Laser radar ranging method, laser radar ranging device, computer equipment and storage medium
US11288876B2 (en) * 2019-12-13 2022-03-29 Magic Leap, Inc. Enhanced techniques for volumetric stage mapping based on calibration object
CN111007484B (en) * 2019-12-27 2023-08-25 联合微电子中心有限责任公司 Single-line laser radar
EP4130798A4 (en) * 2020-04-15 2023-05-31 Huawei Technologies Co., Ltd. Target identification method and device
USD1000978S1 (en) * 2020-04-23 2023-10-10 Hesai Technology Co., Ltd. Lidar
WO2021258246A1 (en) * 2020-06-22 2021-12-30 华为技术有限公司 Radar system, mobile device and radar detection method
CN112505704B (en) * 2020-11-10 2024-06-07 北京埃福瑞科技有限公司 Method for improving safety of autonomous intelligent perception system of train and train
CN112419360B (en) * 2020-11-16 2023-02-21 北京理工大学 Background removing and target image segmenting method based on stereo imaging
US11158120B1 (en) * 2020-11-18 2021-10-26 Motional Ad Llc Ghost point filtering
CN112700387A (en) * 2021-01-08 2021-04-23 瓴道(上海)机器人科技有限公司 Laser data processing method, device and equipment and storage medium
US11422267B1 (en) 2021-02-18 2022-08-23 Innovusion, Inc. Dual shaft axial flux motor for optical scanners
US11789128B2 (en) 2021-03-01 2023-10-17 Innovusion, Inc. Fiber-based transmitter and receiver channels of light detection and ranging systems
CN112907480B (en) * 2021-03-11 2023-05-09 北京格灵深瞳信息技术股份有限公司 Point cloud surface ripple removing method and device, terminal and storage medium
CN112834981B (en) * 2021-03-15 2022-07-15 哈尔滨工程大学 Null array direction-of-arrival estimation method under impulse noise background
WO2022198638A1 (en) * 2021-03-26 2022-09-29 深圳市大疆创新科技有限公司 Laser ranging method, laser ranging device, and movable platform
CN116547562A (en) * 2021-03-26 2023-08-04 深圳市大疆创新科技有限公司 Point cloud noise filtering method, system and movable platform
US11555895B2 (en) 2021-04-20 2023-01-17 Innovusion, Inc. Dynamic compensation to polygon and motor tolerance using galvo control profile
US11614521B2 (en) 2021-04-21 2023-03-28 Innovusion, Inc. LiDAR scanner with pivot prism and mirror
EP4305450A1 (en) 2021-04-22 2024-01-17 Innovusion, Inc. A compact lidar design with high resolution and ultra-wide field of view
CN113204027B (en) * 2021-05-06 2024-06-11 武汉海达数云技术有限公司 Pulse type laser radar cross-period ranging method capable of precisely selecting ranging period
EP4314884A1 (en) 2021-05-21 2024-02-07 Innovusion, Inc. Movement profiles for smart scanning using galvonometer mirror inside lidar scanner
CA3160169A1 (en) * 2021-05-24 2022-11-24 Avidbots Corp System and method of software and pitch control of a disinfection module for a semi-autonomous cleaning and disinfection device
CN117616307A (en) * 2021-07-05 2024-02-27 深圳市速腾聚创科技有限公司 Point cloud processing method and device of laser radar, storage medium and terminal equipment
KR102655213B1 (en) * 2021-07-08 2024-04-05 한국과학기술원 Noise filtering method for point cloud of 4 dimensional radar and data processing apparatus
US11768294B2 (en) 2021-07-09 2023-09-26 Innovusion, Inc. Compact lidar systems for vehicle contour fitting
CN113763263A (en) * 2021-07-27 2021-12-07 华能伊敏煤电有限责任公司 Water mist tail gas noise treatment method based on point cloud tail gas filtering technology
CN114012783B (en) * 2021-10-12 2023-04-07 深圳优地科技有限公司 Robot fault detection method, robot and storage medium
CN113640779B (en) * 2021-10-15 2022-05-03 北京一径科技有限公司 Radar failure determination method and device, and storage medium
CN216356147U (en) 2021-11-24 2022-04-19 图达通智能科技(苏州)有限公司 Vehicle-mounted laser radar motor, vehicle-mounted laser radar and vehicle
CN114355381B (en) * 2021-12-31 2022-09-09 安徽海博智能科技有限责任公司 Laser radar point cloud quality detection and improvement method
US11871130B2 (en) 2022-03-25 2024-01-09 Innovusion, Inc. Compact perception device
CN114926356B (en) * 2022-05-10 2024-06-18 大连理工大学 LiDAR point cloud unsupervised denoising method aiming at snowfall influence
CN117630866A (en) * 2022-08-10 2024-03-01 上海禾赛科技有限公司 Laser radar, resource allocation method for laser radar, and computer-readable storage medium
CN118131179A (en) * 2022-12-02 2024-06-04 上海禾赛科技有限公司 Data processing method for laser radar point cloud, laser radar and system thereof
CN116520298A (en) * 2023-06-12 2023-08-01 北京百度网讯科技有限公司 Laser radar performance test method and device, electronic equipment and readable storage medium

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3881779B2 (en) * 1998-06-24 2007-02-14 本田技研工業株式会社 Object detection device
JP2000356677A (en) * 1999-06-15 2000-12-26 Topcon Corp Apparatus and method for measuring distance
US9097800B1 (en) * 2012-10-11 2015-08-04 Google Inc. Solid object detection system using laser and radar sensor fusion
CN104574282B (en) * 2013-10-22 2019-06-07 鸿富锦精密工业(深圳)有限公司 Point cloud noise spot removes system and method
JP2015219120A (en) * 2014-05-19 2015-12-07 パナソニックIpマネジメント株式会社 Distance measuring apparatus
CN105404844B (en) * 2014-09-12 2019-05-31 广州汽车集团股份有限公司 A kind of Method for Road Boundary Detection based on multi-line laser radar
JP6418961B2 (en) * 2015-01-28 2018-11-07 シャープ株式会社 Obstacle detection device, moving object, obstacle detection method, and obstacle detection program
JP6505470B2 (en) * 2015-02-27 2019-04-24 株式会社デンソー Noise removal method and object recognition apparatus
CN106845321B (en) * 2015-12-03 2020-03-27 高德软件有限公司 Method and device for processing pavement marking information
JP6725982B2 (en) * 2015-12-15 2020-07-22 シャープ株式会社 Obstacle determination device
JP2018036102A (en) * 2016-08-30 2018-03-08 ソニーセミコンダクタソリューションズ株式会社 Distance measurement device and method of controlling distance measurement device
CN107167788B (en) * 2017-03-21 2020-01-21 深圳市速腾聚创科技有限公司 Method and system for obtaining laser radar calibration parameters and laser radar calibration
US10241198B2 (en) * 2017-03-30 2019-03-26 Luminar Technologies, Inc. Lidar receiver calibration
CN107818550B (en) * 2017-10-27 2021-05-28 广东电网有限责任公司机巡作业中心 Point cloud top noise point removing method based on LiDAR
CN108303690B (en) * 2018-01-17 2019-12-27 深圳煜炜光学科技有限公司 Ranging method and ranging system for eliminating laser radar blind area
CN108267746A (en) * 2018-01-17 2018-07-10 上海禾赛光电科技有限公司 Laser radar system, the processing method of laser radar point cloud data, readable medium
CN108761461B (en) * 2018-05-29 2022-02-18 南京信息工程大学 Rainfall forecasting method based on weather radar echo time sequence image
WO2019227975A1 (en) * 2018-05-30 2019-12-05 Oppo广东移动通信有限公司 Control system of laser projector, terminal and control method of laser projector
CN110988846B (en) * 2019-04-22 2023-07-18 威力登激光雷达美国有限公司 Noise point identification method for laser radar and laser radar system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140247968A1 (en) * 2011-11-16 2014-09-04 Bayerische Motoren Werke Aktiengesellschaft Method for Fog Detection
US20160103208A1 (en) * 2014-10-14 2016-04-14 Hyundai Motor Company System for filtering lidar data in vehicle and method thereof
US20170220875A1 (en) * 2016-01-29 2017-08-03 Faraday&Future Inc. System and method for determining a visibility state
US20190391270A1 (en) * 2018-06-25 2019-12-26 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for mitigating effects of high-reflectivity objects in lidar data

Also Published As

Publication number Publication date
US10908268B2 (en) 2021-02-02
KR20220003570A (en) 2022-01-10
WO2020215368A1 (en) 2020-10-29
EP3961251A4 (en) 2023-01-18
IL287425A (en) 2021-12-01
MX2021012864A (en) 2022-02-22
JP7466570B2 (en) 2024-04-12
JP2022537480A (en) 2022-08-26
US20200379096A1 (en) 2020-12-03
CN110988846A (en) 2020-04-10
CN110988846B (en) 2023-07-18
CN110031823A (en) 2019-07-19
EP3961251A1 (en) 2022-03-02
CN110031823B (en) 2020-03-24

Similar Documents

Publication Publication Date Title
US10908268B2 (en) Method for identification of a noise point used for LiDAR, and LiDAR system
US11982771B2 (en) Method for identification of a noise point used for LiDAR, and LiDAR system
US11255728B2 (en) Systems and methods for efficient multi-return light detectors
US10830880B2 (en) Selecting LIDAR pulse detector depending on pulse type
US11415681B2 (en) LIDAR based distance measurements with tiered power control
US20200081124A1 (en) Method and device for filtering out non-ground points from point cloud, and storage medium
US10768281B2 (en) Detecting a laser pulse edge for real time detection
US20240192338A1 (en) Method for signal processing for lidar, detection method of lidar, and lidar
US11513197B2 (en) Multiple-pulses-in-air laser scanning system with ambiguity resolution based on range probing and 3D point analysis
US20210255289A1 (en) Light detection method, light detection device, and mobile platform
CN114779211A (en) Laser pulse radar equipment and point cloud density improving method and equipment
CN114527469A (en) Object detection device, object detection method, and storage medium
KR20210153563A (en) System and method for histogram binning for depth detectiion
KR102284197B1 (en) Indoor map generation method and apparatus using lidar data
KR102284196B1 (en) Indoor map generation method and apparatus using lidar data
CN110308460B (en) Parameter determination method and system of sensor
WO2024131976A1 (en) Obstruction detection methods and obstruction detection devices for lidars, and lidars
CN117310659A (en) Method for judging light window shielding state of laser radar and related products
CN117665773A (en) Laser radar testing method and system
CN117572458A (en) Dirt shielding detection method for laser radar window and related equipment thereof
CN118348505A (en) Inter-channel crosstalk filtering method for laser radar, laser radar and data processing system
CN117572383A (en) Noise filtering method and device for radar, upper computer and laser radar
CN116953656A (en) Detection result processing method and device and computer readable storage medium

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: HERCULES CAPITAL, INC., AS AGENT, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:VELODYNE LIDAR USA, INC.;REEL/FRAME:063593/0463

Effective date: 20230509

AS Assignment

Owner name: VELODYNE LIDAR USA, INC., CALIFORNIA

Free format text: RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT RECORDED AT REEL/FRAME NO. 063593/0463;ASSIGNOR:HERCULES CAPITAL, INC.;REEL/FRAME:065350/0801

Effective date: 20231025

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: VELODYNE LIDAR, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HESAI PHOTONICS TECHNOLOGY CO., LTD.;REEL/FRAME:067300/0445

Effective date: 20200827

Owner name: HESAI PHOTONICS TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHOU, XIAOTONG;XIANG, SHAOQING;ZENG, ZHAOMING;AND OTHERS;REEL/FRAME:067300/0085

Effective date: 20200427

Owner name: VELODYNE LIDAR USA, INC., CALIFORNIA

Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:VL MERGER SUB INC.;VELODYNE LIDAR, INC.;VELODYNE LIDAR USA, INC.;REEL/FRAME:067300/0532

Effective date: 20200929