WO2021097983A1 - 定位的方法、装置、设备及存储介质 - Google Patents

定位的方法、装置、设备及存储介质 Download PDF

Info

Publication number
WO2021097983A1
WO2021097983A1 · PCT/CN2019/126326 · CN2019126326W
Authority
WO
WIPO (PCT)
Prior art keywords
pose
point cloud
cloud data
actual
original point
Prior art date
Application number
PCT/CN2019/126326
Other languages
English (en)
French (fr)
Inventor
黄赓
左之远
韩旭
Original Assignee
广州文远知行科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州文远知行科技有限公司 filed Critical 广州文远知行科技有限公司
Publication of WO2021097983A1 publication Critical patent/WO2021097983A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Definitions

  • This application relates to autonomous driving technology, and in particular to a positioning method, apparatus, device, and storage medium.
  • Positioning an unmanned vehicle is an important part of autonomous driving technology. Specifically, while an unmanned vehicle is traveling, point cloud data around the vehicle can be obtained through lidar and matched against a pre-built map to determine the vehicle's location on the map, enabling precise navigation of the unmanned vehicle and thus automated driving.
  • However, lidar is easily affected by the environment, such as rain or traffic congestion, so the acquired point cloud data can contain large amounts of noise and become unsmooth, which makes matching this point cloud data against the map more difficult and reduces positioning accuracy.
  • Generally, to address this problem, a large set of calibration parameters can be configured, and the point cloud data can be filtered and smoothed by adjusting these parameters so as to reduce its noise and improve positioning accuracy.
  • However, positioning results based on such filtering and smoothing are very sensitive to the calibration parameters, which places a heavy burden on the accuracy and frequency of calibration.
  • This application provides a positioning method, device, equipment, and storage medium to improve the efficiency and accuracy of positioning.
  • In a first aspect, this application provides a positioning method, which includes:
  • receiving at least two measured poses of a target object detected at a first frequency; receiving at least two frames of original point cloud data detected by the target object at a second frequency; determining, based on the original point cloud data, offset data at the same moments as the measured poses; and correcting the measured poses using the offset data to obtain the actual pose of the target object at each moment.
  • In one embodiment, receiving the at least two frames of original point cloud data detected by the target object at the second frequency includes:
  • collecting point cloud data in different directions while the target object is traveling; dividing the point cloud data into multiple time periods according to the second frequency; and normalizing the timestamps of the point cloud data belonging to the same time period to obtain the original point cloud data corresponding to each time period.
  • In one embodiment, determining, based on the original point cloud data, the offset data at the same moments as the measured poses includes:
  • determining the first moment corresponding to the original point cloud data; taking the measured pose at the first moment as a starting point, searching a pre-built map, based on the original point cloud data at the first moment, for the actual pose of the target object at the first moment; determining the difference between the actual pose and the measured pose as the offset data corresponding to the measured pose at the first moment; and interpolating between the offset data of two adjacent first moments to obtain the offset data corresponding to each measured pose between those two first moments.
  • In one embodiment, searching the pre-built map for the actual pose of the target object at the first moment includes:
  • taking the measured pose at the first moment as a starting point, initializing a current transformation applied to the measured pose, the transformation being used to translate and/or rotate the measured pose into an estimated pose; adjusting, according to the current transformation, the pose of the original point cloud data at the first moment to the estimated pose; matching the adjusted original point cloud data against the pre-built map; and, when the matching result satisfies a convergence condition, setting the estimated pose as the actual pose of the target object at the first moment.
  • In one embodiment, after the matching against the pre-built map, the method further includes: when the matching result does not satisfy the convergence condition, adjusting the transformation according to the matching result; and using the adjusted transformation as the current transformation and continuing to adjust the pose of the original point cloud data to the estimated pose until the matching result satisfies the convergence condition.
  • In one embodiment, matching the adjusted original point cloud data against the pre-built map includes:
  • rasterizing the reference point cloud data in the pre-built map to obtain multiple grid cells; determining the probability distribution of each grid cell according to the amount of reference point cloud data it contains; determining the grid cell into which each original point cloud datum falls; calculating the likelihood value of each original point cloud datum according to that cell's probability distribution; and taking the product of the likelihood values of all the original point cloud data as the matching result.
  • In one embodiment, correcting the measured poses using the offset data to obtain the actual pose of the target object at each moment includes:
  • determining the sum of the offset data and the measured pose at the same moment as the actual pose of the target object at that moment.
  • In another embodiment, correcting the measured poses using the offset data to obtain the actual pose of the target object at each moment includes:
  • determining a preset time difference; and, for the measured pose at each moment, adding the offset data separated from that measured pose by the time difference to the measured pose to obtain the actual pose of the target object at that moment.
  • this application also provides a positioning device, which includes:
  • the measured pose receiving module is used to receive at least two measured poses of the target object detected according to the first frequency
  • An original point cloud receiving module configured to receive at least two original point cloud data detected by the target object according to the second frequency
  • An offset data determination module configured to determine offset data at the same time as the actual measured pose based on the original point cloud data
  • the actual pose determination module is used to correct the measured pose using the offset data to obtain the actual pose of the target object at each moment.
  • the present application also provides a positioning device, which includes: a memory and one or more processors;
  • the memory is used to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the positioning method according to any one of the first aspect.
  • In a fourth aspect, the present application also provides a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, are used to perform the positioning method according to any one of the first aspect.
  • In the present application, at least two measured poses of a target object detected at a first frequency are received; at least two frames of original point cloud data detected by the target object at a second frequency are received; offset data at the same moments as the measured poses is determined based on the original point cloud data; and the measured poses are corrected using the offset data to obtain the actual pose of the target object at each moment. This solves the problem that the collection of original point cloud data is easily affected by environmental noise, which reduces positioning accuracy, makes the trajectory of the determined actual poses of the target object smoother, and improves the accuracy and efficiency of the determined actual poses.
  • FIG. 1A is a flowchart of a positioning method provided in Embodiment 1 of this application;
  • FIG. 1B is a schematic diagram of a moment of obtaining actual measured pose and original point cloud data according to Embodiment 1 of the application;
  • FIG. 1C is a schematic diagram of the relationship between the actual pose and the measured pose provided in Embodiment 1 of this application;
  • FIG. 2A is a flowchart of a positioning method provided in Embodiment 2 of this application.
  • FIG. 2B is a schematic diagram of an interpolation method for offset data provided in Embodiment 2 of this application;
  • FIG. 3 is a schematic structural diagram of a positioning device provided in Embodiment 3 of this application.
  • FIG. 4 is a schematic structural diagram of a positioning device provided in Embodiment 4 of this application.
  • FIG. 1A is a flowchart of a positioning method provided in Embodiment 1 of this application, FIG. 1B is a schematic diagram of the moments at which measured poses and original point cloud data are acquired as provided in Embodiment 1 of this application, and FIG. 1C is a schematic diagram of the relationship between the actual pose and the measured pose provided in Embodiment 1 of this application.
  • This embodiment is applicable to the situation of positioning a target object, and the method can be executed by a positioning device.
  • the device can be implemented by software and/or hardware, and is generally integrated in the controller of the target object.
  • The target object in this embodiment may specifically be an electronic terminal device that can perform related tasks on its own, such as a vehicle, a robot, a smart furniture device, or a smart service device. It is understandable that, during unmanned automatic travel, such a target object can quickly and accurately determine its pose based on the positioning method provided in this embodiment, so as to realize accurate navigation and obstacle avoidance.
  • To explain the method clearly, this embodiment is described in detail with a vehicle as the target object. Referring to FIG. 1A, the method specifically includes the following steps:
  • S110 Receive at least two actual measured poses of the target object detected according to the first frequency.
  • the pose may include position and orientation.
  • a reference coordinate system can be determined, and the coordinates of the target object in the reference coordinate system can be determined as the position of the target object; the rotation angle of the target object in the reference coordinate system can be determined as the orientation of the target object.
  • the measured pose may be the pose detected by the hardware preset in the target object.
  • the hardware used to obtain the measured pose is not limited, and this embodiment will be described by using an example.
  • a hardware module related to satellite positioning may be preset in the target object, and the position may be determined in the manner of satellite positioning.
  • satellite positioning is a technology that uses satellites to accurately locate something.
  • A satellite positioning system is an interconnected assembly of devices (components) built for the purpose of determining spatial position. Such a system ensures that, at any time, at least four satellites can be observed from any point on Earth, so that the longitude, latitude, and altitude of the observation point can be collected, thereby realizing navigation, positioning, and timing functions.
  • This technology can be used to guide airplanes, ships, vehicles and individuals to safely and accurately follow the selected route and arrive at the destination on time.
  • Mainstream positioning systems include, in addition to the United States' Global Positioning System (GPS), China's BeiDou navigation satellite system, the EU's Galileo satellite navigation system, and Russia's GLONASS global navigation satellite system.
  • Further, taking as an example the case where the position of the target object is determined from GPS results, when the reference coordinate system is expressed in latitude and longitude, the positioning result may include the latitude and longitude coordinates of the target object's location.
  • The GPS positioning result can also be converted into coordinates in a plane coordinate system through methods such as the Gauss-Krüger projection, Mercator projection, Lambert projection, or Universal Transverse Mercator (UTM) projection.
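  • As a rough illustration of the projection step described above, the following sketch converts a GPS fix into planar UTM coordinates using the pyproj library; the EPSG zone code and the sample coordinates are illustrative assumptions, not values from this application.

```python
# Minimal sketch: WGS84 longitude/latitude -> planar UTM coordinates (metres).
# EPSG:32650 (UTM zone 50N) is only an example zone and must be chosen for the
# actual operating area.
from pyproj import Transformer

wgs84_to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32650", always_xy=True)

lon, lat = 113.3245, 23.1066          # hypothetical GPS fix
easting, northing = wgs84_to_utm.transform(lon, lat)
print(f"x = {easting:.2f} m, y = {northing:.2f} m")
```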
  • Exemplarily, in this embodiment, an inertial measurement unit (IMU) may be preset in the target object, and the orientation, expressed as a rotation angle, is determined by measuring the three-axis attitude angles (or angular rates) and the acceleration of the target object.
  • Generally, an IMU contains a three-axis gyroscope and accelerometers in three directions to measure the angular velocity and acceleration of an object in three-dimensional space and to solve for the object's attitude. To improve reliability, more sensors can be equipped for each axis. Generally speaking, the IMU should be installed at the center of gravity of the measured object.
  • The hardware preset in the target object can be set to detect the measured pose at the first frequency, forming a first sequence of time-stamped measured poses.
  • the measured pose can be obtained at each time determined by the first frequency, that is, the first sequence is smooth in time.
  • However, due to the influence of the hardware's operating environment, the determined position may still be inaccurate.
  • For example, with satellite-based positioning, when the satellite signal is poor it may not be possible to provide stable, accurate positioning results for automated driving.
  • In addition, to improve the accuracy of satellite positioning results, more satellite-positioning hardware can generally be added, but this also brings high hardware costs.
  • S120 Receive at least two original point cloud data detected by the target object according to the second frequency.
  • Point cloud data is a storage form for data obtained through a 3D scanner. Generally, the scan data is recorded in the form of points; each point contains three-dimensional coordinates, and some points may also contain color information (RGB) or reflection intensity information (Intensity).
  • the three-dimensional scanner can be a lidar, an infrared scanner, etc.
  • In this embodiment, a lidar is used as an example for description.
  • Lidar is a radar system that emits a laser beam to detect characteristic quantities of a target, such as its position and speed. Its working principle is to transmit a detection signal (a laser beam) toward the target and then compare the received signal reflected from the target (the target echo) with the transmitted signal; after appropriate processing, information about the target can be obtained, such as its distance, azimuth, height, speed, attitude, and even shape.
  • the lidar can obtain the distance between the environmental object around the target object and the target object to generate point cloud data.
  • Generally, in application scenarios where the target object travels automatically, such as autonomous driving, the lidar can be installed on the roof of the vehicle, with a mechanical lidar sweeping about its axis.
  • the lidar can emit high-frequency laser beams to continuously scan the external environment while rapidly rotating through multiple laser emitting components, so as to collect point cloud data in different directions during the travel of the target object.
  • the point cloud data may be divided into multiple time periods according to the second frequency, where the second frequency may be the frequency of one revolution of the lidar or a multiple of the frequency of one revolution of the lidar.
  • The point cloud data for each time period includes at least the point cloud data scanned during one full revolution of the lidar.
  • The timestamps of the point cloud data belonging to the same time period can be normalized by means of motion compensation to obtain the original point cloud data corresponding to each time period, eliminating the data deviation caused by vehicle motion and lidar rotation.
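  • The motion compensation mentioned above can be sketched as follows: every point of one sweep is re-expressed in the sensor frame at the sweep's final timestamp, using poses interpolated at each point's own timestamp. This is a simplified 2D sketch under the assumption that poses [x, y, yaw] around the sweep are available; it is not the exact procedure of this application.

```python
import numpy as np

def rot(yaw):
    """2D rotation matrix for a heading angle."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s], [s, c]])

def compensate_sweep(points, point_times, pose_times, poses):
    """De-skew one lidar sweep: re-express every sensor-frame point in the sensor
    frame at the sweep's last timestamp, using linearly interpolated 2D poses.
    points: (N, 2); point_times: (N,); pose_times: (M,); poses: (M, 3) as [x, y, yaw]."""
    t_ref = point_times.max()
    x = np.interp(point_times, pose_times, poses[:, 0])
    y = np.interp(point_times, pose_times, poses[:, 1])
    yaw = np.interp(point_times, pose_times, np.unwrap(poses[:, 2]))
    x_r = np.interp(t_ref, pose_times, poses[:, 0])
    y_r = np.interp(t_ref, pose_times, poses[:, 1])
    yaw_r = np.interp(t_ref, pose_times, np.unwrap(poses[:, 2]))

    R_ref_T = rot(yaw_r).T
    out = np.empty_like(points, dtype=float)
    for i, p in enumerate(points):
        world = rot(yaw[i]) @ p + np.array([x[i], y[i]])   # point in the world frame
        out[i] = R_ref_T @ (world - np.array([x_r, y_r]))  # back into the reference frame
    return out
```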
  • In other words, the target object can be preset to scan original point cloud data at the second frequency, forming a second sequence of time-stamped original point cloud data.
  • In this embodiment, the first frequency can be set to be greater than the second frequency, and can also be set to be a multiple of the second frequency. Exemplarily, the second frequency may be set to 50 Hz and the first frequency to 500 Hz; in that case, for each moment at which original point cloud data is obtained, a measured pose at that moment is also available.
  • Exemplarily, in the example where the second frequency is 50 Hz and the first frequency is 500 Hz, referring to the time axis shown in FIG. 1B, suppose that ten measured poses are obtained at 500 Hz from time 1 to time 10.
  • The first sequence can then be expressed as {measured pose 1, measured pose 2, measured pose 3, ..., measured pose n}, where 1, 2, ..., n are timestamps; at the same time, at 50 Hz, two frames of original point cloud data are acquired, at time 1 and time 10.
  • the second sequence can be expressed as ⁇ original point cloud data 1, original point cloud data 10..., original point cloud data m ⁇ , where 1, 10,..., m are timestamps.
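  • A small sketch of the timing relationship in this example (assumed sampling rates of 500 Hz for measured poses and 50 Hz for point clouds): only every tenth measured-pose moment also has an original point cloud.

```python
import numpy as np

first_freq, second_freq = 500.0, 50.0                                 # Hz, as in the example above
pose_times = np.round(np.arange(0.0, 0.021, 1.0 / first_freq), 6)     # 11 measured-pose moments
cloud_times = np.round(np.arange(0.0, 0.021, 1.0 / second_freq), 6)   # 2 point-cloud moments

shared = np.intersect1d(pose_times, cloud_times)
print(shared)   # [0.0, 0.02]: only these moments have both a measured pose and a cloud
```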
  • S130 Determine, based on the original point cloud data, the offset data at the same moments as the measured poses. In this embodiment, the offset data is the difference between the measured pose and the actual pose of the target object.
  • the third sequence of offset data can be determined based on the original point cloud data.
  • The third sequence can be expressed as {offset data 1, offset data 2, offset data 3, ..., offset data t}, where 1, 2, ..., t are timestamps. The offset data is used to correct the measured poses to determine the actual pose of the target object.
  • the measured pose and original point cloud data with the same time stamp it can be considered that the measured pose and the original point cloud data are located at the same time.
  • a method of constructing a map can be used to determine the estimated pose of the target object in the map by matching the original point cloud data with the point cloud data in a pre-built map. The difference between the estimated pose and the measured pose can be used as offset data.
  • When building the map, Simultaneous Localization and Mapping (SLAM) techniques can be used. SLAM, first proposed in the field of robotics, refers to a robot starting from an unknown location in an unknown environment, localizing its own position and attitude from repeatedly observed environmental features during motion, and incrementally building a map of the surroundings based on its own position, thereby achieving simultaneous localization and mapping.
  • Of course, in this embodiment, the measured poses are detected at the first frequency and the original point cloud data is collected at the second frequency; that is to say, not every moment with a measured pose has corresponding original point cloud data.
  • For example, referring to the time axis in FIG. 1B, time 1 and time 10 have both original point cloud data and measured poses, so the offset data at time 1 and time 10 can be determined directly; times 2 through 9 have measured poses but no corresponding original point cloud data, so their offset data cannot be obtained directly.
  • In one embodiment, the offset data at the current moment may be determined from the offset data at adjacent moments; for example, the offset data at times 2 through 9 may be obtained by linear interpolation of the offset data at time 1 and time 10.
  • the offset data at the current moment may be determined according to the offset data at the previous moment. For example, the offset data from time 2 to time 9 is the same as time 1, and the offset data from time 11 to time 12... is the same as the offset data at time 10.
  • the original point cloud data is matched with the point cloud data in the pre-built map to determine the pose of the target object in the map.
  • On the one hand, the acquisition of original point cloud data is susceptible to environmental factors; in rainy weather or traffic congestion, the original point cloud data can easily contain large amounts of noise, which affects positioning accuracy. On the other hand, matching point cloud data is generally a time-consuming process that also consumes hardware computing power, which tends to introduce a lag into positioning, and such lag is fatal in some application scenarios, such as the navigation of unmanned vehicles.
  • In this embodiment, the measured poses are combined with the offset data determined from the original point cloud data to determine the actual pose of the target object. On the one hand, a positioning approach that relies mainly on the measured poses reduces the inaccuracy caused by environmental interference with the point cloud data; on the other hand, because the original point cloud data does not need to be matched against the map at every moment, the time consumed by matching is reduced and the lag in the positioning results is lowered.
  • S140 Perform correction processing on the actual measured pose using the offset data to obtain the actual pose of the target object at each moment.
  • the sum of the offset data at the same time and the measured pose is determined as the actual pose of the target object at each time.
  • For example, referring to the first sequence and the third sequence, actual pose 1 is the sum of offset data 1 and measured pose 1.
  • Exemplarily, referring to FIG. 1C, the horizontal axis of the coordinate system is the time axis and the vertical axis represents the magnitude of the pose; the solid black line is the curve of the measured pose over time t, and the dashed black line is the curve of the actual pose over time t. When t = k, actual pose k is the sum of offset data k and measured pose k.
  • In another embodiment, a preset time difference can be determined; for the measured pose at each moment, the offset data separated from that measured pose by the time difference is added to the measured pose to obtain the actual pose of the target object at that moment.
  • For example, when the time difference is Δ, referring to the first sequence and the third sequence, actual pose 1 is the sum of offset data 1+Δ and measured pose 1; when Δ equals one time step, actual pose 1 is the sum of offset data 2 and measured pose 1. Setting a time difference effectively spreads the noise produced when determining the offset data from the original point cloud data evenly over the period Δ, smoothing that noise while preserving the overall trend of the offset data.
  • The noise produced when determining the offset data from the original point cloud data may be caused by environmental interference with the original point cloud data, or it may be introduced by matching the original point cloud data against the point cloud data in the map.
  • The time difference Δ can be obtained through experiments; it can be set to 0.8 s, or to a larger value as long as the accuracy of the actual pose is not affected.
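  • A minimal sketch of the correction step, assuming poses and offsets are represented as [x, y, yaw] arrays indexed by timestamp. Setting delta = 0.0 gives the same-moment correction; a positive delta pairs the measured pose at time t with the offset at time t + delta, as in the example above (0.8 s is the value quoted in the text).

```python
import numpy as np

def correct_poses(pose_times, measured_poses, offset_times, offsets, delta=0.0):
    """Actual pose at each moment = measured pose + offset taken `delta` seconds later.
    Offsets are linearly interpolated and held constant beyond the last first moment."""
    corrected = np.empty_like(measured_poses, dtype=float)
    for i, t in enumerate(pose_times):
        off = np.array([np.interp(t + delta, offset_times, offsets[:, d])
                        for d in range(offsets.shape[1])])
        corrected[i] = measured_poses[i] + off
    return corrected
```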
  • The technical solution of this embodiment receives at least two measured poses of a target object detected at a first frequency; receives at least two frames of original point cloud data detected by the target object at a second frequency; determines, based on the original point cloud data, offset data at the same moments as the measured poses; and corrects the measured poses using the offset data to obtain the actual pose of the target object at each moment. This solves the problem that the collection of original point cloud data is easily affected by environmental noise, which reduces positioning accuracy, and achieves the effect of making the trajectory of the determined actual poses of the target object smoother while improving the accuracy and efficiency of the determined actual poses.
  • FIG. 2A is a flowchart of a positioning method provided in Embodiment 2 of this application
  • FIG. 2B is a schematic diagram of an offset data interpolation method provided in Embodiment 2 of this application.
  • This embodiment further refines how the offset data is determined, on the basis of the above embodiment. Referring to FIG. 2A, the method may include:
  • S210 Receive at least two actual measured poses of the target object detected according to the first frequency.
  • the first sequence can be expressed as ⁇ measured pose 1, measured pose 2, measured pose 3..., measured pose n ⁇ , where 1, 2, ..., n are timestamps.
  • S220 Receive at least two original point cloud data detected by the target object according to the second frequency.
  • the second sequence can be expressed as ⁇ original point cloud data 1, original point cloud data 10..., original point cloud data m ⁇ , where 1, 10,..., m are timestamps.
  • the measured pose and original point cloud data with the same time stamp it can be considered that the measured pose and the original point cloud data are located at the same time.
  • S230 Determine the first moment corresponding to the original point cloud data.
  • In this embodiment, the first moment is the moment corresponding to the original point cloud data; for example, in FIG. 1B, the timestamp of original point cloud data 1 is time 1, and the timestamp of original point cloud data 10 is time 10.
  • S240 Taking the measured pose at the first moment as a starting point, search the pre-built map, based on the original point cloud data at the first moment, for the actual pose of the target object at the first moment.
  • Generally, algorithms such as Iterative Closest Point (ICP) or the Normal Distributions Transform (NDT) can be used to match the original point cloud data against the point cloud data in the pre-built map to determine the actual pose of the target object in the map.
  • Exemplarily, taking the NDT algorithm as an example, step S240 can be further refined into the following steps:
  • S11 Taking the measured pose at the first moment as a starting point, initialize the current transformation applied to the measured pose, the transformation being used to translate and/or rotate the measured pose into an estimated pose. When the measured pose is represented as a vector of position and orientation coordinates, the transformation may be a translation matrix and/or a rotation matrix.
  • S12 According to the current transformation, adjust the pose of the original point cloud data at the first moment to the estimated pose. Multiplying the vector of the measured pose by the translation matrix and/or rotation matrix translates and/or rotates the measured pose to obtain the estimated pose.
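  • The two operations in steps S11-S12 can be sketched as follows for a 2D pose [x, y, yaw]: applying the current transformation (a translation plus a rotation) to the measured pose, and placing the sensor-frame point cloud at a given pose in the map frame. The function names are illustrative assumptions.

```python
import numpy as np

def apply_transform(pose, d_xy, d_yaw):
    """Translate and/or rotate a 2D pose [x, y, yaw] into the estimated pose."""
    x, y, yaw = pose
    return np.array([x + d_xy[0], y + d_xy[1], yaw + d_yaw])

def cloud_in_map(points, pose):
    """Express sensor-frame points (N, 2) in the map frame at the given pose."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.array([x, y])
```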
  • In this embodiment, multiple estimated poses can be obtained by applying the transformation, so that when the convergence condition is satisfied, one estimated pose can be selected from them as the actual pose of the target object at the first moment.
  • S13 Use the adjusted original point cloud data to perform matching against the pre-built map. The reference point cloud data in the pre-built map is rasterized to obtain multiple grid cells.
  • the space occupied by the reference point cloud data in the pre-built map is divided into grids or voxels of a specified size (CellSize).
  • the probability distribution corresponding to each grid can be determined according to the number of reference point cloud data included in the grid.
  • the normal distribution is used to represent the probability distribution corresponding to each grid
  • the multi-dimensional normal distribution parameters of each grid can be calculated, such as the mean q, the covariance matrix ⁇ , to determine the probability distribution of each grid.
  • The grid cell into which each original point cloud datum falls can be determined; the likelihood value of each original point cloud datum is calculated according to the probability distribution of that cell; and the product of the likelihood values of all the original point cloud data is taken as the matching result. The larger the value of the matching result, the greater the probability that the estimated pose is the actual pose.
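  • The NDT-style matching result described above (grid cells, per-cell normal distributions, product of likelihoods) can be sketched as follows; summing log-likelihoods is used instead of multiplying likelihoods for numerical stability, and the cell size and minimum point count are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def build_ndt_grid(ref_points, cell_size=1.0, min_points=3):
    """Rasterize the map's reference points (N, 2) into cells and fit a mean and
    covariance (a normal distribution) to the points of each cell."""
    cells = {}
    for p in ref_points:
        key = tuple(np.floor(p / cell_size).astype(int))
        cells.setdefault(key, []).append(p)
    grid = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) >= min_points:
            grid[key] = (pts.mean(axis=0), np.cov(pts.T) + 1e-6 * np.eye(pts.shape[1]))
    return grid

def ndt_score(scan_points, grid, cell_size=1.0):
    """Matching result: sum of log-likelihoods of the (already transformed) scan
    points under the normal distribution of the cell each point falls into."""
    score = 0.0
    for p in scan_points:
        key = tuple(np.floor(p / cell_size).astype(int))
        if key in grid:
            mean, cov = grid[key]
            score += multivariate_normal(mean, cov, allow_singular=True).logpdf(p)
    return score
```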
  • S14 Determine whether the matching result satisfies the convergence condition. If yes, go to step S15; if not, go to step S16.
  • S15 Set the estimated pose as the actual pose of the target object at the first moment.
  • S16 Adjust the transformation according to the matching result. In this embodiment, the matching result can be used to adjust the transformation based on a Newton optimization algorithm; when the adjustment to the transformation is smaller than a preset value, the matching result can be considered to satisfy the convergence condition, and the estimated pose corresponding to that transformation can be used as the actual pose.
  • step S17 Use the adjusted transformation relationship as the current transformation relationship, and continue to execute step S12 until the matching result meets the convergence condition.
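  • Steps S14-S17 form a loop that adjusts the transformation until the matching result converges. The application describes a Newton-based adjustment; the sketch below simply hands the negative matching score to a general-purpose optimizer, reusing apply_transform, cloud_in_map, and ndt_score from the sketches above — a hedged stand-in, not the exact optimization used in the application.

```python
import numpy as np
from scipy.optimize import minimize

def refine_pose(measured_pose, scan_points, grid, cell_size=1.0):
    """Search for the estimated pose that maximises the matching score, starting
    from the measured pose (transformation initialised to zero)."""
    def negative_score(delta):
        est = apply_transform(measured_pose, delta[:2], delta[2])
        return -ndt_score(cloud_in_map(scan_points, est), grid, cell_size)

    res = minimize(negative_score, x0=np.zeros(3), method="Nelder-Mead",
                   options={"xatol": 1e-3, "fatol": 1e-3})
    return apply_transform(measured_pose, res.x[:2], res.x[2])
```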
  • S250 Determine the difference between the actual pose and the actual measured pose as offset data corresponding to the actual measured pose at the first moment.
  • the offset data is the difference between the measured pose and the actual pose of the target object.
  • the third sequence of offset data can be determined based on the original point cloud data.
  • The third sequence can be expressed as {offset data 1, offset data 2, offset data 3, ..., offset data t}, where 1, 2, ..., t are timestamps. The offset data is used to correct the measured poses to determine the actual pose of the target object.
  • In one embodiment, to reduce the amount of computation involved in calculating the offset data, the offset data of the previous moment may be taken into account when performing step S240 to search for the actual pose. For example, the measured pose at the current moment is first corrected with the offset data of the previous moment, and the corrected measured pose is used as the starting point for searching the pre-built map, based on the original point cloud data at the first moment, for the actual pose of the target object at the first moment; step S250 is then executed. Generally, the offset data of adjacent moments is continuous or differs only slightly, so correcting the current measured pose with the previous moment's offset reduces the gap between the corrected measured pose and the actual pose, which narrows the range that must be searched in the pre-built map. Exemplarily, when the NDT algorithm is used to search for the actual pose, this reduces the number of times the transformation must be adjusted, that is, the number of times the original point cloud data is matched against the map, thereby shortening the time needed to find the actual pose.
  • It should be noted that, when computing the offset data, the difference is still taken between the actual pose and the original measured pose, not between the actual pose and the corrected measured pose. Two cases can be distinguished.
  • First, when no previously computed offset data exists (for example, at time t = 1), original point cloud data 1 and measured pose 1 at that moment are used to search the pre-built map with the NDT algorithm for actual pose 1; subtracting measured pose 1 from actual pose 1 then gives offset data 1 at time t = 1.
  • Second, when offset data k has already been computed for the previous moment k, the measured pose k+1 detected at time k+1 is corrected with offset data k (for example, measured pose k+1 is added to offset data k) to obtain a corrected measured pose k+1, which is used as the starting point of the NDT search in the pre-built map to obtain actual pose k+1; subtracting measured pose k+1 from actual pose k+1 then gives offset data k+1 at time t = k+1.
  • By analogy, the offset data at each first moment can be obtained.
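  • The warm-start scheme just described (correct the current measured pose with the previous moment's offset, search from there, but still define the offset against the original measured pose) might look like the following sketch, reusing refine_pose from above; measured_at is an assumed lookup returning the measured pose at a given timestamp.

```python
import numpy as np

def offsets_at_first_moments(cloud_times, clouds, measured_at, grid, cell_size=1.0):
    """Offset data at each first moment, chaining the previous offset as the search start."""
    offsets = {}
    prev_offset = np.zeros(3)
    for t, cloud in zip(cloud_times, clouds):
        measured = measured_at(t)
        start = measured + prev_offset                   # corrected measured pose (search start)
        actual = refine_pose(start, cloud, grid, cell_size)
        offsets[t] = actual - measured                   # difference from the ORIGINAL measured pose
        prev_offset = offsets[t]
    return offsets
```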
  • S260 Perform an interpolation operation according to the offset data of two adjacent first moments to obtain offset data corresponding to each of the actual measured poses between two adjacent first moments.
  • In this embodiment, the offset data at the current moment can be determined from the offset data at the two adjacent first moments; for example, the offset data at times 2 through 9 (offset data 2 to offset data 9) can be a linear interpolation of the offset data at time 1 and time 10 (offset data 1 and offset data 10).
  • the interpolation operation can be linear interpolation or nonlinear interpolation.
  • the linear interpolation is taken as an example for illustration.
  • Referring to FIG. 2B, the horizontal axis of the coordinate system is the time axis, and the moments corresponding to the dashed lines are the first moments; the offset data at each moment between two adjacent first moments can be obtained by linear interpolation of the offset data of those two first moments. For example, if time t = k+x lies between first moment t = k and first moment t = k+1, then x is a number greater than 0 and less than 1.
  • correction(k) can be used to denote the offset data corresponding to time k, and t(k) to denote time k. The offset data at time k+x can then be expressed as:
  • correction(k+x) = correction(k) + [correction(k+1) − correction(k)] × [t(k+x) − t(k)] / [t(k+1) − t(k)]
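  • The linear interpolation above reduces to the following sketch when offsets are stored as [x, y, yaw] rows indexed by the first moments (values outside the covered range are simply held at the nearest known offset).

```python
import numpy as np

def interpolate_offsets(first_moments, offsets, query_times):
    """Offset data for every measured-pose timestamp, linearly interpolated between
    adjacent first moments, per the correction(k + x) formula above."""
    first_moments = np.asarray(first_moments, dtype=float)
    offsets = np.asarray(offsets, dtype=float)           # shape (K, D), e.g. D = 3 for [x, y, yaw]
    return np.column_stack([np.interp(query_times, first_moments, offsets[:, d])
                            for d in range(offsets.shape[1])])
```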
  • S270 Perform correction processing on the actual measured pose using the offset data to obtain the actual pose of the target object at each time.
  • the sum of the offset data and the measured pose at the same moment may be determined as the actual pose of the target object at each moment.
  • the actual pose 1 is the sum of the offset data 1 and the measured pose 1.
  • FIG. 3 is a schematic structural diagram of a positioning device provided in Embodiment 3 of this application.
  • the device specifically includes the following structures: a measured pose receiving module 310, an original point cloud receiving module 320, an offset data determining module 330, and an actual pose determining module 340.
  • the measured pose receiving module 310 is configured to receive at least two measured poses of the target object detected according to the first frequency.
  • the original point cloud receiving module 320 is configured to receive at least two original point cloud data detected by the target object according to the second frequency.
  • the offset data determining module 330 is configured to determine offset data at the same time as the actual measured pose based on the original point cloud data.
  • the actual pose determination module 340 is configured to use the offset data to perform correction processing on the actual measured pose to obtain the actual pose of the target object at each moment.
  • The technical solution of this embodiment receives at least two measured poses of a target object detected at a first frequency; receives at least two frames of original point cloud data detected by the target object at a second frequency; determines, based on the original point cloud data, offset data at the same moments as the measured poses; and corrects the measured poses using the offset data to obtain the actual pose of the target object at each moment. This solves the problem that the collection of original point cloud data is easily affected by environmental noise, which reduces positioning accuracy, and achieves the effect of making the trajectory of the determined actual poses of the target object smoother while improving the accuracy and efficiency of the determined actual poses.
  • the original point cloud receiving module 320 includes:
  • the point cloud data collection unit is used to collect point cloud data in different directions during the travel of the target object.
  • the time period dividing unit is configured to divide the point cloud data into multiple time periods according to the second frequency.
  • the original point cloud data determining unit is configured to normalize the time stamps of the point cloud data belonging to the same time period to obtain the original point cloud data corresponding to each time period.
  • the offset data determining module 330 includes:
  • the first moment determining unit is configured to determine the first moment corresponding to the original point cloud data
  • the search unit is configured to use the actual measured pose corresponding to the first moment as a starting point, and search for the target object in a pre-built map based on the original point cloud data corresponding to the first moment.
  • An offset data determining unit configured to determine the difference between the actual pose and the actual measured pose as the offset data corresponding to the actual measured pose at the first moment;
  • the interpolation operation unit is configured to perform an interpolation operation according to the offset data of two adjacent first moments to obtain the offset data corresponding to each of the actual measured poses between two adjacent first moments.
  • In one embodiment, the search unit is specifically configured to: take the measured pose corresponding to the first moment as a starting point to initialize the current transformation applied to the measured pose, the transformation being used to translate and/or rotate the measured pose into an estimated pose; adjust, according to the current transformation, the pose of the original point cloud data corresponding to the first moment to the estimated pose; use the adjusted original point cloud data to perform matching against a pre-built map; and, when the matching result satisfies the convergence condition, set the estimated pose as the actual pose of the target object at the first moment.
  • the search unit is further configured to: when the matching result does not meet the convergence condition, adjust the transformation relationship according to the matching result; use the adjusted transformation relationship , As the current transformation relationship, and continue to perform the adjusting the pose of the original point cloud data corresponding to the first moment to the estimated pose until the matching result meets the convergence condition.
  • In one embodiment, the search unit is also configured, when using the adjusted original point cloud data to perform matching against the pre-built map, to: rasterize the reference point cloud data in the pre-built map to obtain multiple grid cells; determine the probability distribution of each grid cell according to the amount of reference point cloud data it contains; determine the grid cell into which each original point cloud datum falls; calculate the likelihood value of each original point cloud datum according to that cell's probability distribution; and take the product of the likelihood values of all the original point cloud data as the matching result.
  • the actual pose determination module 340 includes:
  • the first correction unit is configured to determine the sum of the offset data and the measured pose at the same moment as the actual pose of the target object at each moment.
  • the actual pose determination module 340 includes:
  • the time difference determining unit is used to determine a preset time difference.
  • The second correction unit is configured, for the measured pose at each moment, to add the offset data separated from that measured pose by the time difference to the measured pose, to obtain the actual pose of the target object at each moment.
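  • The four modules 310-340 can be pictured as a thin wrapper like the sketch below; the class and method names are illustrative assumptions, with the offset estimation and correction steps injected as callables.

```python
class PositioningApparatus:
    """Sketch of the apparatus of this embodiment: two receiving modules buffer the
    time-stamped inputs, the other two modules compute offsets and corrected poses."""

    def __init__(self, offset_estimator, corrector):
        self.measured_poses = []                  # measured pose receiving module 310
        self.point_clouds = []                    # original point cloud receiving module 320
        self.offset_estimator = offset_estimator  # offset data determining module 330
        self.corrector = corrector                # actual pose determining module 340

    def receive_measured_pose(self, timestamp, pose):
        self.measured_poses.append((timestamp, pose))

    def receive_point_cloud(self, timestamp, cloud):
        self.point_clouds.append((timestamp, cloud))

    def actual_poses(self):
        offsets = self.offset_estimator(self.measured_poses, self.point_clouds)
        return self.corrector(self.measured_poses, offsets)
```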
  • The above apparatus can execute the method provided in any embodiment of the present application and has the functional modules and beneficial effects corresponding to the executed method.
  • FIG. 4 is a schematic structural diagram of a positioning device provided in Embodiment 4 of this application.
  • the positioning device includes: a processor 40, a memory 41, an input device 42 and an output device 43.
  • the number of processors 40 in the positioning device may be one or more.
  • One processor 40 is taken as an example in FIG. 4.
  • the number of memories 41 in the positioning device may be one or more, and one memory 41 is taken as an example in FIG. 4.
  • the processor 40, the memory 41, the input device 42, and the output device 43 of the positioning device may be connected by a bus or other methods. In FIG. 4, the connection by a bus is taken as an example.
  • the positioning device can be a computer, a server, etc. In this embodiment, the positioning device is used as a server for detailed description.
  • the server may be an independent server or a cluster server.
  • As a computer-readable storage medium, the memory 41 can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the positioning method described in any embodiment of the present application (for example, the measured pose receiving module 310, the original point cloud receiving module 320, the offset data determining module 330, and the actual pose determining module 340 in the positioning apparatus).
  • the memory 41 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created according to the use of the device, and the like.
  • the memory 41 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 41 may further include a memory remotely provided with respect to the processor 40, and these remote memories may be connected to the device through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • The input device 42 can be used to receive input digital or character information and to generate key signal input related to user settings and function control of the positioning device; it can also be a camera for acquiring images or a sound pickup device for acquiring audio data.
  • the output device 43 may include audio equipment such as a speaker. It should be noted that the specific composition of the input device 42 and the output device 43 can be set according to actual conditions.
  • the processor 40 executes various functional applications and data processing of the device by running the software programs, instructions, and modules stored in the memory 41, that is, realizes the above-mentioned positioning method.
  • the fifth embodiment of the present application also provides a storage medium containing computer-executable instructions.
  • the computer-executable instructions, when executed by a computer processor, are used to perform a positioning method, including:
  • receiving at least two measured poses of a target object detected at a first frequency; receiving at least two frames of original point cloud data detected by the target object at a second frequency; determining, based on the original point cloud data, offset data at the same moments as the measured poses; and correcting the measured poses using the offset data to obtain the actual pose of the target object at each moment.
  • Of course, for the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the positioning method operations described above; they can also perform related operations in the positioning method provided by any embodiment of the present application, with corresponding functions and beneficial effects.
  • each part of this application can be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods can be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, as in another embodiment, any one or a combination of the following technologies known in the art can be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and so on.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

A positioning method, apparatus, device, and storage medium, relating to autonomous driving technology. The method receives at least two measured poses of a target object detected at a first frequency; receives at least two frames of original point cloud data detected by the target object at a second frequency; determines, based on the original point cloud data, offset data at the same moments as the measured poses; and corrects the measured poses using the offset data to obtain the actual pose of the target object at each moment.

Description

定位的方法、装置、设备及存储介质
本申请要求在2019年11月21日提交中国专利局、申请号为201911149095.2的中国专利申请的优先权,以上申请的全部内容通过引用结合在本申请中。
技术领域
本申请涉及自动驾驶技术,例如涉及一种定位的方法、装置、设备及存储介质。
背景技术
对无人汽车进行定位是自动驾驶技术中的重要组成部分。具体的,在无人汽车在行进过程中,可以通过激光雷达获取无人汽车周围的点云数据,并依据该点云数据在预先构建的地图进行匹配,以确定无人汽车在该地图的位置。并实现对该无人汽车的精确导航,实现自动驾驶。
需要注意的是,由激光雷达容易受到环境的影响,如下雨、交通拥堵等,而使得所获取的点云数据出现较大的噪声、变得不平滑,增加了使用该点云数据与地图进行匹配的难度,降低了定位的准确性。
一般的,为了解决上述的问题,可以设置有大量的校准参数,并通过调整该校准参数,来对点云数据进行过滤和平滑的处理,以减少点云数据的噪声,便于提高定位的准确性。但是,基于过滤和平滑技术的定位结果对校准参数非常敏感,这就给校准的精度和频率带来了沉重负担。
发明内容
本申请提供一种定位的方法、装置、设备及存储介质,以提高定位的效率和准确性。
第一方面,本申请提供了一种定位方法,该方法包括:
接收目标对象按照第一频率所检测到的至少两个实测位姿;
接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;
基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据;
使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿。
所述接收所述目标对象按照第二频率所检测到的至少两个原始点云数据,包括:
在目标对象的行进过程中,采集不同方位的点云数据;
按照第二频率,将所述点云数据划分为多个时间段;
将属于同一所述时间段内的所述点云数据的时间戳进行归一化,得到各所述时间段所对应的原始点云数据。
所述基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据,包括:
确定所述原始点云数据所对应的第一时刻;
以所述第一时刻对应的所述实测位姿为起点,基于所述第一时刻对应的所述原始点云数据在预先构建的地图中,搜索所述目标对象在第一时刻的实际位姿;
将所述实际位姿与实测位姿之间的差距,确定为在所述第一时刻时,所述实测位姿对应的偏移数据;
依相邻两个所述第一时刻的偏移数据进行插值运算,得到相邻两个所述第一时刻之间每个所述实测位姿对应的偏移数据。
所述以所述第一时刻对应的所述实测位姿为起点,基于所述第一时刻对应的所述原始点云数据在预先构建的地图中,搜索所述目标对象在第一时刻的实际位姿,包括:
以所述第一时刻对应的所述实测位姿为起点,初始化当前的、应用于所述实测位姿的变换关系,所述变换关系用于将所述实测位姿移动和/或旋转为估计位姿;
依据当前的所述变换关系,将所述第一时刻对应的原始点云数据的位姿调整为所述估计位姿;
使用调整后的所述原始点云数据,在预先构建的地图中进行匹配;
当所述匹配的结果满足收敛条件时,将所述估计位姿设置为所述目标对象在第一时刻的实际位姿。
在所述使用调整后的所述原始点云数据,在预先构建的地图中进行匹配之后,还包括:
当所述匹配的结果不满足收敛条件时,根据所述匹配的结果,调整所述变换关系;
使用调整后的所述变换关系,作为当前的变换关系,并继续执行所述将所述第一时刻对应的原始点云数据的位姿调整为所述估计位姿直到所述匹配的结果满足所述收敛条件。
所述使用调整后的所述原始点云数据,在预先构建的地图中进行匹配,包括:
对预先构建的地图中的参考点云数据进行栅格化,得到多个栅格;
依据所述栅格中所包括的参考点云数据的数量,确定每个所述栅格对应的概率分布;
确定每个所述原始点云数据所落入的栅格;
依据所述栅格对应的概率分布,计算每个所述原始点云数据的似然值;
将所有所述原始点云数据的似然值的乘积,作为所述匹配的结果。
所述使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿,包括:
将处于同一时刻下的所述偏移数据和所述实测位姿之和,确定为各个时刻下所述目标对象的实际位姿。
所述使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿,包括:
确定一预置的时间差;
针对各时刻下的实测位姿,将与所述实测位姿具有所述时间差的所述偏移数据与所述实测位姿相加,得到各个时刻下所述目标对象的实际位姿。
第二方面,本申请还提供了一种定位装置,该装置包括:
实测位姿接收模块,用于接收目标对象按照第一频率所检测到的至少两个实测位姿;
原始点云接收模块,用于接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;
偏移数据确定模块,用于基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据;
实际位姿确定模块,用于使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿。
第三方面,本申请还提供了一种定位设备,该设备包括:存储器以及一个或多个处理器;
所述存储器,用于存储一个或多个程序;
当所述一个或多个程序被所述一个或多个处理器执行,使得所述一个或多个处理器实现如第一方面中任一所述的定位方法。
第四方面,本申请还提供了一种包含计算机可执行指令的存储介质,其特征在于,所述计算机可执行指令在由计算机处理器执行时用于执行如第一方面中任一所述的定位方法。
本申请通过接收目标对象按照第一频率所检测到的至少两个实测位姿;接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据;使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿,解决了因原始点云数据的采集过程容易受环境噪声的影响,而带来定位准确性降低的问题,使得所确定的目标对象的实际位姿的轨迹更加顺滑、且提高所确定的实际位姿的准确性和效率。
附图说明
图1A为本申请实施例一提供的一种定位方法的流程图;
图1B为本申请实施例一提供的一种获取实测位姿与原始点云数据的时刻的示意图;
图1C为本申请实施例一提供的一种实际位姿与实测位姿的关系示意图;
图2A为本申请实施例二提供的一种定位方法的流程图;
图2B为本申请实施例二提供的一种偏移数据的插值方式的示意图;
图3为本申请实施例三提供的一种定位装置的结构示意图;
图4为本申请实施例四提供的一种定位设备的结构示意图。
具体实施方式
下面结合附图和实施例对本申请作进一步的详细说明。可以理解的是,此处所描述的具体实施例仅仅用于解释本申请,而非对本申请的限定。另外还需要说明的是,为了便于描述,附图中仅示出了与本申请相关的部分而非全部结构。
实施例一
图1A为本申请实施例一提供的一种定位方法的流程图,图1B为本申请实施例一提供的一种获取实测位姿与原始点云数据的时刻的示意图,图1C为本申请实施例一提供的一种实际位姿与实测位姿的关系示意图。本实施例可适用于定位目标对象的情况,该方法可以由定位装置执行。其中,该装置可由软件和/或硬件实现,并一般集成在目标对象的控制器中。需要说明的是,本实施例中的目标对象具体可指车辆、机器人、智能家具设备以及智能服务设备等能够自行执行相关工作的电子终端设备。可以理解的是,上述目标对象在无人控制的自动行进过程中,基于本实施例提供的定位方法,可以快速准确的实现对终端位姿的确定,以实现对该目标对象的精确导航以及避障。
为了清楚说明该方法,本实施例中,将以该目标对象为车辆的例子进行详细说明。参照图1A,该方法具体包括如下步骤:
S110、接收目标对象按照第一频率所检测到的至少两个实测位姿。
本实施例中,位姿可以包括位置和朝向。一般的,可以确定一参考坐标系,并确定目标对象在该参考坐标系中的坐标,作为该目标对象的位置;确定目标对象在该参考坐标系中的旋转角度,作为该目标对象的朝向。
实测位姿可以是通过预置在目标对象的硬件所检测到的位姿。本实施例中对通过何种硬件获取实测位姿不作限定,本实施例中将通过举例进行说明。
1、位置的确定
示例性的,可以是在目标对象中预先设置关于卫星定位的硬件模块,以卫星定位的方式进行位置的确定。
其中,卫星定位是一种使用卫星对某物进行准确定位的技术。卫星定位***是以确定空间位置为目标而构成的相互关联的一个集合体或装置(部件)。这个***可以保证在任意时刻,地球上任意一点都可以同时观测到至少4颗卫星,以保证卫星可以采集到该观测点的经纬度和高度,以便实现导航、定位、授时等功能。这项技术可以用来引导飞机、船舶、车辆以及个人,安全、准确地沿着选定的路线,准时到达目的地。
一般的,主流定位***除美国的全球定位***(Global Positioning System,GPS)外,还有中国的北斗卫星导航***、欧盟的伽利略卫星导航***、俄罗斯全球导航卫星***等。
进一步的,以目标对象的位置为从GPS***获得的定位结果确定为例进行说明,当参考坐标系以经纬度进行表示时,该定位结果可以包括目标对象所在位置的经纬度坐标。还可以通过高斯-克吕格投影、墨卡托投影、兰伯特投影、通用横轴墨卡托(Universal Transverse Mercator,UTM)投影等方法将该GPS的定位结果转换为平面坐标系下的坐标。
需要注意的是,由于卫星定位的精确度有限,一般为10米左右,不足以满足目标对象对高精度定位的需求。特别对于一些机器人、智能家具设备以及智能服务设备等能够自行执行相关工作的电子终端设备,需要设置该电子终端设备具备在其工作空间进行精确定位的功能。
2、朝向的确定
示例性的,本实施例中,可以是在目标对象中预先设置惯性测量单元(Inertial measurement unit,IMU),以测量目标对象的三轴姿态角(或角速率)以及加速度的方式进行朝向的确定,以旋转角度进行表示。
一般的,一个IMU内会装有三轴的陀螺仪和三个方向的加速度计,来测量物体在三维空间中的角速度和加速度,并以此解算出物体的姿态。为了提高可靠性,还可以为每个轴配备更多的传感器。一般而言。IMU要安装在被测物体的重心上。
可以设置预置在目标对象的硬件以第一频率进行实测位姿的检测,形成携带有时间戳的、由实测位姿组成的第一序列。
需要注意的是,一般,通过硬件的方式,在以第一频率所确定的每个时刻均是可以获取到实测位姿的,也就是说,第一序列在时间上是顺滑的。但由于硬件的使用环境的影响,也存在确定的位置不准确的问题。如,针对基于卫星 的定位,当卫星信号差时,可能无法为车辆的自动驾驶提供稳定、准确的定位结果。此外,为了提高卫星的定位结果的准确性,一般的,可以增加关于卫星定位的硬件的配置,但也带来了高昂的硬件成本。
S120、接收所述目标对象按照第二频率所检测到的至少两个原始点云数据。
点云数据是指透过三维扫描器所取得的数据的一种保存形式。一般的,在点云数据中,扫描数据以点的形式记录,每一个点可以包含有三维坐标,有些可能含有颜色信息(RGB)或反射面强度信息(Intensity)等。
三维扫面器可以是激光雷达、红外扫描仪等。本实施例中,可以是以激光雷达为例进行说明。
其中,激光雷达,是以发射激光束探测目标的位置、速度等特征量的雷达***。其工作原理是向目标发射探测信号(激光束),然后将接收到的从目标反射回来的信号(目标回波)与发射信号进行比较,作适当处理后,就可获得目标的有关信息,如目标距离、方位、高度、速度、姿态、甚至形状等参数。在本实施例中,激光雷达可以获取目标对象周围环境物体与目标对象的距离,以生成点云数据。
一般的,在目标对象自动行进的应用场景中,如在车辆自动驾驶的应用场景中,可以将激光雷达安装在车顶上,机械式的激光雷达绕轴扫射。该激光雷达可以通过多个激光发射组件快速旋转的同时,发射高频率激光束对外界环境进行持续性的扫描,以在目标对象的行进过程中,采集不同方位的点云数据。可以按照第二频率,将该点云数据划分为多个时间段,其中,该第二频率可以是激光雷达的旋转一周的频率,也可以是激光雷达的旋转一周的频率的倍数。每个时间段的点云数据,至少包括激光雷达每扫完一圈时,所扫描到的点云数据。可以是以运动补偿的方式,将属于同一时间段内的点云数据的时间戳进行归一化,得到各时间段所对应的原始点云数据,以消除由于车辆行驶和激光雷达旋转,所带来的数据偏差。
也就是说,可以设置预置在目标对象以第二频率进行原始点云数据的扫描,形成携带有时间戳的、由原始点云数据组成的第二序列。
本实施例中,可以设置第一频率大于第二频率。还可以设置第一频率是第二频率的倍数。示例性的,可以将第二频率设置为50赫兹,第一频率设置为500赫兹。也就是说,对于实测位姿对应的每个时刻而言,都可以获取该时刻下对应的原始点云数据。
示例性的,在第二频率为50赫兹,第一频率为500赫兹的例子中,参照如图1B中所示的时间轴,假设按照500赫兹的频率,获取时刻1到时刻10的10 个实测位姿。该第一序列,可以表示为{实测位姿1、实测位姿2、实测位姿3……,实测位姿n},其中,1,2,……,n为时间戳;同时,还可以按照50赫兹的频率,在时刻1和时刻10获取两个原始点云数据。该第二序列,可以表示为{原始点云数据1、原始点云数据10……,原始点云数据m},其中,1,10,……,m为时间戳。
S130、基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据。
本实施例中,偏移数据为实测位姿与目标对象的实际位姿之间的差距。
本实施例中,可以基于原始点云数据确定出偏移数据的第三序列。该第三序列,可以表示为{偏移数据1、偏移数据2、偏移数据3……,实偏移数据t},其中,1,2,……,t为时间戳。使用该偏移数据对实测位姿进行校正处理,以确定目标对象的实际位姿。
对于具有相同时间戳的实测位姿和原始点云数据,可以认为该实测位姿与原始点云数据位于相同的时刻下。
一般的,可以采用构建地图的方式,通过对原始点云数据与预先构建的地图中的点云数据进行匹配,确定目标对象在地图中的估计位姿。可以将该估计位姿与实测位姿的差距,作为偏移数据。
其中,在构建地图时,可以采用同步定位与地图构建(Simultaneous Localization and Mapping,SLAM)的技术。其中,SLAM最早在机器人领域提出,它指的是:机器人从未知环境的未知地点出发,在运动过程中通过重复观测到的环境特征定位自身位置和姿态,再根据自身位置构建周围环境的增量式地图,从而达到同时定位和地图构建的目的。
当然,本实施例中,由于实测位姿是按照第一频率进行检测,原始点云数据是按照第二频率进行采集,也就是说,并非所有实测位姿对应的时刻均有对应的原始点云数据。如参考图1B中的时间轴,其中,时刻1和时刻10下既有原始点云数据,也有实测位姿,可以直接确定时刻1和时刻10下的偏移数据;而时刻2到时刻9下具有实测位姿,但并没有获取对应的原始点云数据,无法直接获取偏移数据。
在一实施例中,当前时刻下的偏移数据可以依据相邻时刻下的偏移数据进行确定,如,时刻2到时刻9下的偏移数据可以是时刻1和时刻10之间的偏移数据的线性插值。
在又一实施例中,当前时刻下的偏移数据可以依据上一时刻下的偏移数据进行确定。如,时刻2到时刻9的偏移数据与时刻1相同,时刻11到时刻12…… 的偏移数据与时刻10的偏移数据相同。
需要说明的是,一般的,基于原始点云数据与预先构建的地图中的点云数据进行匹配,来确定目标对象在地图中的位姿,一方面,原始点云数据的获取容易受到环境因素的干扰,如在下雨天气、车辆拥堵的情况下,容易造成原始点云数据存在较大的噪声,进而影响到定位的准确性;另一方面,一般的,在进行点云数据的匹配是个耗费时间和耗费硬件算力的过程,容易给定位带来一定的滞后性,而滞后性在某些应用场景,如车辆无人驾驶的导航中,是致命的。本实施例中,使用实测位姿与原始点云数据确定的偏移数据之间的结合,来确定目标对象的实际位姿,一方面,以实测位姿为主的定位方式,可以减少由于原始点云数据受环境因素影响所带来的定位不准确的问题;另一方面,由于,无需在每一时刻,均进行原始点云数据与地图中的匹配,减少了匹配所带来的耗时,减少定位结果的滞后性。
S140、使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿。
在一实施例中,将处于同一时刻下的偏移数据和实测位姿之和,确定为各个时刻下目标对象的实际位姿。如,参照第一序列和第三序列,实际位姿1为偏移数据1与实测位姿1之和。示例性的,参照图1C,坐标系的横轴为时间轴,纵轴表示位姿的幅度;黑实线表示实测位姿随时间t的变化曲线,黑虚线表示实际位姿随时间t的变化曲线。则当时间t=k时,实际位姿k为偏移数据k与实测位姿k之和。
在又一实施例中,可以确定一预置的时间差;针对各时刻下的实测位姿,将与实测位姿具有该时间差的偏移数据与所述实测位姿相加,得到各个时刻下目标对象的实际位姿。如,时间差为△时,参照第一序列和第三序列,实际位姿1为偏移数据1+△与实测位姿1之和,当△为一个时刻时,则实际位姿1为偏移数据2与实测位姿1之和。通过时间差的设置,相当于将基于原始点云数据确定偏移数据时所产生的噪声,平均分配到时间差△的时间段内,可以达到在保留使用偏移数据变化的大趋势的同时,将该噪声进行平滑处理的效果。其中,基于原始点云数据确定偏移数据时所产生的噪声,有可能是由于原始点云数据受到环境因素影响所带来的,也有可能是,原始点云数据与地图中的点云数据进行匹配所带来的。该时间差△可以经过试验获得,可以设置为0.8s,在不影响实际位姿的准确度的情况下,也可以设置为更大的时间。
本实施例的技术方案,通过接收目标对象按照第一频率所检测到的至少两个实测位姿;接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据; 使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿,解决了因原始点云数据的采集过程容易受环境噪声的影响,而带来定位准确性降低的问题,实现使得所确定的目标对象的实际位姿的轨迹更加顺滑、且提高所确定的实际位姿的准确性和效率的效果。
实施例二
图2A为本申请实施例二提供的一种定位方法的流程图;图2B为本申请实施例二提供的一种偏移数据的插值方式的示意图。
本实施例在上述实施例的基础上进一步细化偏移数据的确定方式,参照图2A,该方法可以包括:
S210、接收目标对象按照第一频率所检测到的至少两个实测位姿。
参照如图1B中所示的时间轴,假设按照500赫兹的第一频率,获取时刻1到时刻10的10个实测位姿。该第一序列,可以表示为{实测位姿1、实测位姿2、实测位姿3……,实测位姿n},其中,1,2,……,n为时间戳。
S220、接收所述目标对象按照第二频率所检测到的至少两个原始点云数据。
参照如图1B中所示的时间轴,假设按照50赫兹的第二频率,在时刻1和时刻10获取两个原始点云数据。该第二序列,可以表示为{原始点云数据1、原始点云数据10……,原始点云数据m},其中,1,10,……,m为时间戳。
对于具有相同时间戳的实测位姿和原始点云数据,可以认为该实测位姿与原始点云数据位于相同的时刻下。
S230、确定所述原始点云数据所对应的第一时刻。
本实施例中,第一时刻为原始点云数据对应的时刻,如图1B中的原始点云数据1的时间戳为时刻1、原始点云数据1的时间戳为时刻10……
S240、以所述第一时刻对应的所述实测位姿为起点,基于所述第一时刻对应的所述原始点云数据在预先构建的地图中,搜索所述目标对象在第一时刻的实际位姿。
一般的,可以采用最近点迭代(Iterative Closest Point,ICP)、正态分布变换(Normal Distributions Transform,NDT)等算法,对原始点云数据与预先构建的地图中的点云数据中进行匹配,以确定目标对象在地图中的实际位姿。
示例性的,以NDT算法为例,步骤S240可以进一步细化为如下步骤:
S11、以该第一时刻对应的该实测位姿为起点,初始化当前的、应用于该实测位姿的变换关系,该变换关系用于将该实测位姿移动和/或旋转为估计位姿。
其中,当实测位姿使用包括位置和朝向的坐标的向量进行表征时,该变换关系可以是移动矩阵和/或旋转矩阵。
S12、依据当前的该变换关系,将该第一时刻对应的原始点云数据的位姿调整为该估计位姿。
将实测位姿的向量乘以该移动矩阵和/或旋转矩阵,即可实现对该实测位姿的移动和/或旋转,以得到估计位姿。
本实施例中,可以通过使用该变换关系获取多个估计位姿,以在满足收敛条件时,从多个估计位姿中,确定出一估计位姿,作为该目标对象在第一时刻的实际位姿。
S13、使用调整后的该原始点云数据,在预先构建的地图中进行匹配;
对预先构建的地图中的参考点云数据进行栅格化,得到多个栅格。如,将预先构建的地图中的参考点云数据所占的空间划分成指定大小(CellSize)的栅格或体素(Voxel)。
可以依据该栅格中所包括的参考点云数据的数量,确定每个该栅格对应的概率分布。当使用正态分布来表示每个该栅格对应的概率分布,则可以计算每个栅格的多维正态分布参数,如均值q、协方差矩阵Σ,以确定每个栅格概率分布。
可以确定每个该原始点云数据所落入的栅格;依据该栅格对应的概率分布,计算每个该原始点云数据的似然值;并将所有该原始点云数据的似然值的乘积,作为该匹配的结果。该匹配的结果的值越大,则可以表明该估计位姿作为实际位姿的概率越大。
S14、判断该匹配的结果是否满足收敛条件。
若是,则执行步骤S15;若否,则执行步骤S16。
本实施例中,可以使用匹配的结果,基于牛顿优化算法对变换关系进行调整。当变换关系的调整幅度小于预设值时,可以认为匹配的结果满足收敛条件,则该变换关系所对应的估计位姿,可以作为实际位姿。
S15、将该估计位姿设置为该目标对象在第一时刻的实际位姿。
S16、根据该匹配的结果,调整该变换关系。
S17、使用调整后的该变换关系,作为当前的变换关系,并继续执行步骤S12直到所述匹配的结果满足所述收敛条件。
S250、将所述实际位姿与实测位姿之间的差距,确定为在所述第一时刻时,所述实测位姿对应的偏移数据。
本实施例中,偏移数据为实测位姿与目标对象的实际位姿之间的差距。
本实施例中,可以基于原始点云数据确定出偏移数据的第三序列。该第三序列,可以表示为{偏移数据1、偏移数据2、偏移数据3……,实偏移数据t},其中,1,2,……,t为时间戳。使用该偏移数据对实测位姿进行校正处理,以确定目标对象的实际位姿。
在一实施例中,为了减少计算偏移数据时所涉及的计算量,可以在执行上述步骤S240以确定实测位姿时,增加对上一时刻的偏移数据的考虑,如依据上一时刻的偏移数据先对当前时刻的实测位姿进行校正处理,并以校正后的实测位姿为起点,基于所述第一时刻对应的原始点云数据在预先构建的地图中,搜索目标对象在第一时刻的实际位姿;再执行步骤S250。一般的,相邻时刻之间的偏移数据一般存在连续性或者差距较小,先以上一时刻的偏移时刻对当前时刻的实测位姿进行校正,相当于减小了校正后的实测位姿与实际位姿的差距,减小了在预先构建的地图中搜索实际位姿的搜索范围。示例性的,在使用NDT算法进行实际位姿的搜索时,可以减少NDT算法中调整变换关系的次数,也是减少了使用原始点云数据在地图中进行匹配的次数,进而减少了搜索得到实际位姿的时间。
当然需要注意的是,在计算偏移数据时,仍然是计算实际位姿与实测位姿之间的差距,而不是计算实际位姿与校正后的实测位姿之间的差距。
可以分为两种情况进行分析:
1、上一时刻存在偏移数据
示例性的,当时刻t=1时,假设在该时刻之前不存在已计算得到的偏移数据,则可以利用该时刻的原始点云数据1和实测位姿1,在预先构建的地图中,以NDT算法,搜索得到实际位姿1;之后,将得到的实际位姿1减去实测位姿1,得到时刻t=1时的偏移数据1。
2、上一时刻不存在偏移数据
当时刻t=k+1时,假设在该时刻k+1之前的时刻k存在已计算得到的偏移数据k,则可以利用时刻k+1测得的实测位姿k+1,并使用时刻k的偏移数据k对实测位姿k+1进行校正处理,得到校正后的实测位姿k+1(如,实测位姿k+1与偏移数据k相加),并以校正后的实测位姿k+1在预先构建的地图中,使用NDT算法,搜索得到实际位姿k+1;之后,将得到的实际位姿k+1减去实测位姿k+1,得到时刻t=k+1时的偏移数据k+1。
以此类推,则可以得到各个时刻的偏移数据。
S260、依相邻两个所述第一时刻的偏移数据进行插值运算,得到相邻两个 所述第一时刻之间每个所述实测位姿对应的偏移数据。
本实施例中,当前时刻下的偏移数据可以依据相邻两个第一时刻下的偏移数据进行确定,如,时刻2到时刻9下的偏移数据(偏移数据2到偏移数据9)可以是时刻1和时刻10的偏移数据(偏移数据1和偏移数据10)的线性插值。
该插值运算可以是线性插值或者非线性插值。本实施例中，以线性插值为例进行举例说明，参照图2B，坐标系的横轴为时间轴，虚线所对应的时刻为第一时刻，每相邻两个第一时刻之间的时刻所对应的偏移数据，可以依据相邻两个第一时刻的偏移数据进行线性插值运算得到。如时刻t=k+x为第一时刻t=k与第一时刻t=k+1之间的时刻，则x为小于1大于0的数。可以使用correction(k)表示时刻k所对应的偏移数据k；用t(k)表示时刻k。则偏移数据k+x可以表示为：
correction(k+x) = correction(k) + [correction(k+1) − correction(k)] × [t(k+x) − t(k)] / [t(k+1) − t(k)]
S270、使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿。
本实施例中,可以将处于同一时刻下的偏移数据和实测位姿之和,确定为各个时刻下目标对象的实际位姿。如,参照第一序列和第三序列,实际位姿1为偏移数据1与实测位姿1之和。
本实施例的技术方案,通过接收目标对象按照第一频率所检测到的至少两个实测位姿;接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;确定所述原始点云数据所对应的第一时刻;以所述第一时刻对应的所述实测位姿为起点,基于所述第一时刻对应的所述原始点云数据在预先构建的地图中,搜索所述目标对象在第一时刻的实际位姿;将所述实际位姿与实测位姿之间的差距,确定为在所述第一时刻时,所述实测位姿对应的偏移数据;依相邻两个所述第一时刻的偏移数据进行插值运算,得到相邻两个所述第一时刻之间每个所述实测位姿对应的偏移数据;使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿,解决了因原始点云数据的采集过程容易受环境噪声的影响,而带来定位准确性降低的问题,实现使得所确定的目标对象的实际位姿的轨迹更加顺滑、且提高所确定的实际位姿的准确性和效率的效果。
实施例三
图3为本申请实施例三提供的一种定位装置的结构示意图。
参照图3,该装置具体包括如下结构:实测位姿接收模块310、原始点云接 收模块320、偏移数据确定模块330和实际位姿确定模块340。
实测位姿接收模块310,用于接收目标对象按照第一频率所检测到的至少两个实测位姿。
原始点云接收模块320,用于接收所述目标对象按照第二频率所检测到的至少两个原始点云数据。
偏移数据确定模块330,用于基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据。
实际位姿确定模块340,用于使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿。
本实施例的技术方案,通过接收目标对象按照第一频率所检测到的至少两个实测位姿;接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据;使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿,解决了因原始点云数据的采集过程容易受环境噪声的影响,而带来定位准确性降低的问题,实现使得所确定的目标对象的实际位姿的轨迹更加顺滑、且提高所确定的实际位姿的准确性和效率的效果。
在上述技术方案的基础上,所述原始点云接收模块320,包括:
点云数据采集单元,用于在目标对象的行进过程中,采集不同方位的点云数据。
时间段划分单元,用于按照第二频率,将所述点云数据划分为多个时间段。
原始点云数据确定单元,用于将属于同一所述时间段内的所述点云数据的时间戳进行归一化,,得到各所述时间段所对应的原始点云数据。
在上述技术方案的基础上,所述偏移数据确定模块330,包括:
第一时刻确定单元,用于确定所述原始点云数据所对应的第一时刻;
搜索单元,用于以所述第一时刻对应的所述实测位姿为起点,基于所述第一时刻对应的所述原始点云数据在预先构建的地图中,搜索所述目标对象在第一时刻的实际位姿;
偏移数据确定单元,用于将所述实际位姿与实测位姿之间的差距,确定为在所述第一时刻时,所述实测位姿对应的偏移数据;
插值运算单元,用于依相邻两个所述第一时刻的偏移数据进行插值运算,得到相邻两个所述第一时刻之间每个所述实测位姿对应的偏移数据。
在上述技术方案的基础上,所述搜索单元,具体用于:以所述第一时刻对应的所述实测位姿为起点,初始化当前的、应用于所述实测位姿的变换关系,所述变换关系用于将所述实测位姿移动和/或旋转为估计位姿;依据当前的所述变换关系,将所述第一时刻对应的原始点云数据的位姿调整为所述估计位姿;使用调整后的所述原始点云数据,在预先构建的地图中进行匹配;当所述匹配的结果满足收敛条件时,将所述估计位姿设置为所述目标对象在第一时刻的实际位姿。
在上述技术方案的基础上,所述搜索单元,还用于:当所述匹配的结果不满足收敛条件时,根据所述匹配的结果,调整所述变换关系;使用调整后的所述变换关系,作为当前的变换关系,并继续执行所述将所述第一时刻对应的原始点云数据的位姿调整为所述估计位姿直到所述匹配的结果满足所述收敛条件。
在上述技术方案的基础上,所述搜索单元,还用于在使用调整后的所述原始点云数据,在预先构建的地图中进行匹配时,对预先构建的地图中的参考点云数据进行栅格化,得到多个栅格;依据所述栅格中所包括的参考点云数据的数量,确定每个所述栅格对应的概率分布;确定每个所述原始点云数据所落入的栅格;据所述栅格对应的概率分布,计算每个所述原始点云数据的似然值;将所有所述原始点云数据的似然值的乘积,作为所述匹配的结果。
在上述技术方案的基础上,所述实际位姿确定模块340,包括:
第一校正单元,用于将处于同一时刻下的所述偏移数据和所述实测位姿之和,确定为各个时刻下所述目标对象的实际位姿。
在上述技术方案的基础上,所述实际位姿确定模块340,包括:
时间差确定单元,用于确定一预置的时间差。
第二校正单元,用于针对各时刻下的实测位姿,将与所述实测位姿具有所述时间差的所述偏移数据与所述实测位姿相加,得到各个时刻下所述目标对象的实际位姿。
上述产品可执行本申请任意实施例所提供的方法,具备执行方法相应的功能模块和有益效果。
实施例四
图4为本申请实施例四提供的一种定位设备的结构示意图。如图4所示,该定位设备包括:处理器40、存储器41、输入装置42以及输出装置43。该定位设备中处理器40的数量可以是一个或者多个,图4中以一个处理器40为例。该定位设备中存储器41的数量可以是一个或者多个,图4中以一个存储器41为例。该定位设备的处理器40、存储器41、输入装置42以及输出装置43可以 通过总线或者其他方式连接,图4中以通过总线连接为例。该定位设备可以是电脑和服务器等。本实施例以定位设备为服务器进行详细说明,该服务器可以是独立服务器或集群服务器。
存储器41作为一种计算机可读存储介质,可用于存储软件程序、计算机可执行程序以及模块,如本申请任意实施例所述的定位方法对应的程序指令/模块(例如,定位装置中的实测位姿接收模块310、原始点云接收模块320、偏移数据确定模块330和实际位姿确定模块340)。存储器41可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作***、至少一个功能所需的应用程序;存储数据区可存储根据设备的使用所创建的数据等。此外,存储器41可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。在一些实例中,存储器41可进一步包括相对于处理器40远程设置的存储器,这些远程存储器可以通过网络连接至设备。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
输入装置42可用于接收输入的数字或者字符信息,以及产生与定位设备的观众用户设置以及功能控制有关的键信号输入,还可以是用于获取图像的摄像头以及获取音频数据的拾音设备。输出装置43可以包括扬声器等音频设备。需要说明的是,输入装置42和输出装置43的具体组成可以根据实际情况设定。
处理器40通过运行存储在存储器41中的软件程序、指令以及模块,从而执行设备的各种功能应用以及数据处理,即实现上述的定位方法。
实施例五
本申请实施例五还提供一种包含计算机可执行指令的存储介质,所述计算机可执行指令在由计算机处理器执行时用于执行一种定位方法,包括:
接收目标对象按照第一频率所检测到的至少两个实测位姿;
接收所述目标对象按照第二频率所检测到的至少两个原始点云数据;
基于所述原始点云数据,确定与所述实测位姿在相同时刻下的偏移数据;
使用所述偏移数据对所述实测位姿进行校正处理,得到各个时刻下所述目标对象的实际位姿。
当然,本申请实施例所提供的一种包含计算机可执行指令的存储介质,其计算机可执行指令不限于如上所述的定位方法操作,还可以执行本申请任意实施例所提供的定位方法中的相关操作,且具备相应的功能和有益效果。
通过以上关于实施方式的描述,所属领域的技术人员可以清楚地了解到, 本申请可借助软件及必需的通用硬件来实现,当然也可以通过硬件实现,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对相关技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在计算机可读存储介质中,如计算机的软盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、闪存(FLASH)、硬盘或光盘等,包括若干指令用以使得一台计算机设备(可以是机器人,个人计算机,服务器,或者网络设备等)执行本申请任意实施例所述的定位方法。
值得注意的是,上述定位装置中,所包括的各个单元和模块只是按照功能逻辑进行划分的,但并不局限于上述的划分,只要能够实现相应的功能即可;另外,各功能单元的具体名称也只是为了便于相互区分,并不用于限制本申请的保护范围。
应当理解,本申请的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行***执行的软件或固件来实现。例如,如果用硬件来实现,和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。
在本说明书的描述中,参考术语“在一实施例中”、“在又一实施例中”、“示例性的”或“在具体的实施例中”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实施例或示例中以合适的方式结合。

Claims (11)

  1. A positioning method, comprising:
    receiving at least two measured poses of a target object detected at a first frequency;
    receiving at least two frames of original point cloud data detected by the target object at a second frequency;
    determining, based on the original point cloud data, offset data at the same moments as the measured poses; and
    correcting the measured poses using the offset data to obtain an actual pose of the target object at each moment.
  2. The method according to claim 1, wherein receiving the at least two frames of original point cloud data detected by the target object at the second frequency comprises:
    collecting point cloud data in different directions while the target object is traveling;
    dividing the point cloud data into a plurality of time periods according to the second frequency; and
    normalizing the timestamps of the point cloud data belonging to the same time period to obtain the original point cloud data corresponding to each time period.
  3. The method according to claim 1, wherein determining, based on the original point cloud data, the offset data at the same moments as the measured poses comprises:
    determining a first moment corresponding to the original point cloud data;
    taking the measured pose corresponding to the first moment as a starting point, searching a pre-built map, based on the original point cloud data corresponding to the first moment, for an actual pose of the target object at the first moment;
    determining a difference between the actual pose and the measured pose as the offset data corresponding to the measured pose at the first moment; and
    performing an interpolation operation on the offset data of two adjacent first moments to obtain the offset data corresponding to each measured pose between the two adjacent first moments.
  4. The method according to claim 3, wherein taking the measured pose corresponding to the first moment as the starting point and searching the pre-built map, based on the original point cloud data corresponding to the first moment, for the actual pose of the target object at the first moment comprises:
    taking the measured pose corresponding to the first moment as the starting point, initializing a current transformation applied to the measured pose, the transformation being used to translate and/or rotate the measured pose into an estimated pose;
    adjusting, according to the current transformation, the pose of the original point cloud data corresponding to the first moment to the estimated pose;
    matching the adjusted original point cloud data against the pre-built map; and
    when a result of the matching satisfies a convergence condition, setting the estimated pose as the actual pose of the target object at the first moment.
  5. The method according to claim 4, further comprising, after the matching of the adjusted original point cloud data against the pre-built map:
    when the result of the matching does not satisfy the convergence condition, adjusting the transformation according to the result of the matching; and
    using the adjusted transformation as the current transformation, and continuing to adjust the pose of the original point cloud data corresponding to the first moment to the estimated pose until the result of the matching satisfies the convergence condition.
  6. The method according to claim 4, wherein matching the adjusted original point cloud data against the pre-built map comprises:
    rasterizing reference point cloud data in the pre-built map to obtain a plurality of grid cells;
    determining a probability distribution corresponding to each grid cell according to the amount of reference point cloud data contained in the grid cell;
    determining the grid cell into which each original point cloud datum falls;
    calculating a likelihood value of each original point cloud datum according to the probability distribution corresponding to the grid cell; and
    taking the product of the likelihood values of all the original point cloud data as the result of the matching.
  7. The method according to claim 1, wherein correcting the measured poses using the offset data to obtain the actual pose of the target object at each moment comprises:
    determining the sum of the offset data and the measured pose at the same moment as the actual pose of the target object at each moment.
  8. The method according to claim 1, wherein correcting the measured poses using the offset data to obtain the actual pose of the target object at each moment comprises:
    determining a preset time difference; and
    for the measured pose at each moment, adding the offset data separated from the measured pose by the time difference to the measured pose to obtain the actual pose of the target object at each moment.
  9. A positioning apparatus, comprising:
    a measured pose receiving module configured to receive at least two measured poses of a target object detected at a first frequency;
    an original point cloud receiving module configured to receive at least two frames of original point cloud data detected by the target object at a second frequency;
    an offset data determining module configured to determine, based on the original point cloud data, offset data at the same moments as the measured poses; and
    an actual pose determining module configured to correct the measured poses using the offset data to obtain an actual pose of the target object at each moment.
  10. A positioning device, comprising a memory and one or more processors;
    the memory being configured to store one or more programs;
    wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the positioning method according to any one of claims 1-8.
  11. A storage medium containing computer-executable instructions, wherein the computer-executable instructions, when executed by a computer processor, are used to perform the positioning method according to any one of claims 1-8.
PCT/CN2019/126326 2019-11-21 2019-12-18 定位的方法、装置、设备及存储介质 WO2021097983A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911149095.2A CN110889808B (zh) 2019-11-21 2019-11-21 一种定位的方法、装置、设备及存储介质
CN201911149095.2 2019-11-21

Publications (1)

Publication Number Publication Date
WO2021097983A1 true WO2021097983A1 (zh) 2021-05-27

Family

ID=69748281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/126326 WO2021097983A1 (zh) 2019-11-21 2019-12-18 定位的方法、装置、设备及存储介质

Country Status (2)

Country Link
CN (1) CN110889808B (zh)
WO (1) WO2021097983A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113671523A (zh) * 2021-08-18 2021-11-19 Oppo广东移动通信有限公司 机器人定位方法、装置、存储介质及机器人
CN114485607A (zh) * 2021-12-02 2022-05-13 陕西欧卡电子智能科技有限公司 一种运动轨迹的确定方法、作业设备、装置、存储介质
CN116148879A (zh) * 2021-11-22 2023-05-23 珠海一微半导体股份有限公司 一种机器人提升障碍物标注精度的方法

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110646785B (zh) * 2019-09-30 2023-02-21 上海润欣科技股份有限公司 基于阵列调频连续波和传感算法的厂线用定位***
CN112703368B (zh) * 2020-04-16 2022-08-09 华为技术有限公司 车辆定位的方法和装置、定位图层生成的方法和装置
CN111552757B (zh) * 2020-04-30 2022-04-01 上海商汤临港智能科技有限公司 生成电子地图的方法、装置、设备及存储介质
CN111708048B (zh) * 2020-08-19 2021-02-05 深圳市速腾聚创科技有限公司 点云的运动补偿方法、装置和***
CN111966109B (zh) * 2020-09-07 2021-08-17 中国南方电网有限责任公司超高压输电公司天生桥局 基于柔性直流换流站阀厅的巡检机器人定位方法及装置
CN112883134A (zh) * 2021-02-01 2021-06-01 上海三一重机股份有限公司 数据融合建图方法、装置、电子设备及存储介质
CN113008274B (zh) * 2021-03-19 2022-10-04 奥特酷智能科技(南京)有限公司 车辆初始化定位方法、***及计算机可读介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170344018A1 (en) * 2016-05-24 2017-11-30 Baidu Online Network Technology (Beijing) Co., Ltd. Unmanned vehicle, method, apparatus and system for positioning unmanned vehicle
US20180306922A1 (en) * 2017-04-20 2018-10-25 Baidu Online Network Technology (Beijing) Co., Ltd Method and apparatus for positioning vehicle
CN108732584A (zh) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 用于更新地图的方法和装置
CN108732603A (zh) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 用于定位车辆的方法和装置
CN109945856A (zh) * 2019-02-18 2019-06-28 天津大学 基于惯性/雷达的无人机自主定位与建图方法
CN110243358A (zh) * 2019-04-29 2019-09-17 武汉理工大学 多源融合的无人车室内外定位方法及***

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9575184B2 (en) * 2014-07-03 2017-02-21 Continental Advanced Lidar Solutions Us, Inc. LADAR sensor for a dense environment
CN105487535A (zh) * 2014-10-09 2016-04-13 东北大学 一种基于ros的移动机器人室内环境探索***与控制方法
CN108917759A (zh) * 2018-04-19 2018-11-30 电子科技大学 基于多层次地图匹配的移动机器人位姿纠正算法
CN108873001B (zh) * 2018-09-17 2022-09-09 江苏金智科技股份有限公司 一种精准评判机器人定位精度的方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170344018A1 (en) * 2016-05-24 2017-11-30 Baidu Online Network Technology (Beijing) Co., Ltd. Unmanned vehicle, method, apparatus and system for positioning unmanned vehicle
CN108732584A (zh) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 用于更新地图的方法和装置
CN108732603A (zh) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 用于定位车辆的方法和装置
US20180306922A1 (en) * 2017-04-20 2018-10-25 Baidu Online Network Technology (Beijing) Co., Ltd Method and apparatus for positioning vehicle
CN109945856A (zh) * 2019-02-18 2019-06-28 天津大学 基于惯性/雷达的无人机自主定位与建图方法
CN110243358A (zh) * 2019-04-29 2019-09-17 武汉理工大学 多源融合的无人车室内外定位方法及***

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113671523A (zh) * 2021-08-18 2021-11-19 Oppo广东移动通信有限公司 机器人定位方法、装置、存储介质及机器人
CN116148879A (zh) * 2021-11-22 2023-05-23 珠海一微半导体股份有限公司 一种机器人提升障碍物标注精度的方法
CN116148879B (zh) * 2021-11-22 2024-05-03 珠海一微半导体股份有限公司 一种机器人提升障碍物标注精度的方法
CN114485607A (zh) * 2021-12-02 2022-05-13 陕西欧卡电子智能科技有限公司 一种运动轨迹的确定方法、作业设备、装置、存储介质
CN114485607B (zh) * 2021-12-02 2023-11-10 陕西欧卡电子智能科技有限公司 一种运动轨迹的确定方法、作业设备、装置、存储介质

Also Published As

Publication number Publication date
CN110889808A (zh) 2020-03-17
CN110889808B (zh) 2023-02-28

Similar Documents

Publication Publication Date Title
WO2021097983A1 (zh) 定位的方法、装置、设备及存储介质
US10878243B2 (en) Method, device and apparatus for generating electronic map, storage medium, and acquisition entity
Wen et al. GNSS NLOS exclusion based on dynamic object detection using LiDAR point cloud
WO2020093378A1 (en) Vehicle positioning system using lidar
KR20210111180A (ko) 위치 추적 방법, 장치, 컴퓨팅 기기 및 컴퓨터 판독 가능한 저장 매체
AU2018278849A1 (en) Vehicle navigation system using pose estimation based on point cloud
CN111670419A (zh) 用于自主导航的主动补充曝光设置
JP2016109650A (ja) 位置推定装置、位置推定方法、位置推定プログラム
CN112051575A (zh) 一种毫米波雷达与激光雷达的调整方法及相关装置
CN113240813B (zh) 三维点云信息确定方法及装置
US11373328B2 (en) Method, device and storage medium for positioning object
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
US20220390607A1 (en) Collaborative estimation and correction of lidar boresight alignment error and host vehicle localization error
RU2513900C1 (ru) Способ и устройство определения координат объектов
CN110794434B (zh) 一种位姿的确定方法、装置、设备及存储介质
KR102028323B1 (ko) 영상 레이더의 영상 보정 장치 및 시스템
CN116465393A (zh) 基于面阵激光传感器的同步定位和建图方法及装置
Deusch et al. Improving localization in digital maps with grid maps
WO2020244467A1 (zh) 一种运动状态估计方法及装置
CN112578363B (zh) 激光雷达运动轨迹获取方法及装置、介质
JP7462847B1 (ja) 位置推定装置、位置推定方法、及び位置推定プログラム
WO2024127480A1 (ja) 位置推定装置、位置推定方法、及び位置推定プログラム
CN117745779B (zh) 一种光学和sar共孔径一致性成像方法
US20240053487A1 (en) Systems and methods for transforming autonomous aerial vehicle sensor data between platforms
US20240031670A1 (en) System and method to reduce an amount of sunlight and an amount of specular reflection in drone sensing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19953499

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19953499

Country of ref document: EP

Kind code of ref document: A1