CN113494914A - Method for vehicle positioning, device for positioning and vehicle - Google Patents

Method for vehicle positioning, device for positioning and vehicle

Info

Publication number
CN113494914A
CN113494914A
Authority
CN
China
Prior art keywords
data
vehicle
state
coordinate
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010253143.9A
Other languages
Chinese (zh)
Inventor
李千山
王逸平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG
Priority to CN202010253143.9A
Publication of CN113494914A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a method for vehicle localization, comprising: acquiring sensor data; coordinate transforming the sensor data to obtain coordinate transformed data; ascertaining whether a first state is satisfied, and if so: -performing a clustering process on the coordinate transformed data to obtain clustered data; -gaussian modeling the clustered data to obtain gaussian modeled data; -determining position data of the vehicle based on the gaussian modeled data; ascertaining whether a second state is satisfied, and if so: -generating a point cloud map based on the coordinate transformed data; -determining position data of the vehicle based on the point cloud map. Furthermore, the invention relates to a device for vehicle localization, a non-transitory computer-readable storage medium and a vehicle.

Description

Method for vehicle positioning, device for positioning and vehicle
Technical Field
The invention relates to the field of automatic driving, in particular to a method for positioning a vehicle, a device for positioning and a vehicle.
Background
Autonomous driving is a major subject of current vehicle research. The precondition for autonomous driving is the definition of the road network on which the vehicle is traveling and the sensing of the environment surrounding the vehicle, so that the road network concerned and objects and other traffic participants in the surrounding environment can be identified. One approach is to use cameras and lidar to acquire data for the road network and to sense the distance of objects around the vehicle.
Currently, for the classification of automated driving, the SAE (Society of Automotive Engineers) standard, which defines six levels L0-L5, is widely accepted internationally. L0 means fully manual: the system provides at most some auxiliary warnings and signals, such as a radar warning when reversing or a distance warning while driving. At L1, the system has some lateral or longitudinal assistance functions that intervene in the driving task, which can be called assisted driving (e.g., adaptive cruise control or automatic emergency braking); the system begins to exert active control over the vehicle. At L2, the system enables automated driving of the vehicle in both the lateral and longitudinal directions, but the driver must remain constantly attentive and ready to take over driving of the vehicle. L3 enables a high degree of machine operation: the driver can largely relinquish the maneuvering and only needs to take over the vehicle in a few situations. There is a large step from L3 to L4, at which point the steering wheel can be eliminated entirely: an L3 system still has to consider human-machine coordination and the switching between human and machine operation, whereas L4 no longer considers human intervention in the vehicle operation. Full intelligence of the road vehicle is achieved at the highest level, L5.
With the development of automated driving technology, higher requirements are also placed on the accuracy and efficiency of vehicle positioning.
Disclosure of Invention
It is therefore an object of the present invention to provide a method, a device and a vehicle which enable improved automated driving of the vehicle.
According to a first aspect of the invention, a method for vehicle localization is provided. The method comprises the following steps:
-acquiring sensor data;
-coordinate transforming the sensor data to obtain coordinate transformed data;
-ascertaining whether a first state is satisfied, and if so:
-performing a clustering process on the coordinate transformed data to obtain clustered data;
-gaussian modeling the clustered data to obtain gaussian modeled data;
-determining position data of the vehicle based on the gaussian modeled data;
-ascertaining whether a second state is satisfied, and if so:
-generating a point cloud map based on the coordinate transformed data;
-determining position data of the vehicle based on the point cloud map.
The method and the device for vehicle positioning according to the invention can be adapted to different application scenarios and can be flexibly adjusted, for example by adjusting an algorithm and/or a processing pipeline.
In some embodiments, the method further comprises: acquiring characteristic parameters related to the sensor, and configuring, in particular automatically configuring, the coordinate transformation based on the characteristic parameters. Preferably, the characteristic parameters related to the sensor relate to hardware and/or software characteristics of the sensor with respect to data processing; preferably, the characteristic parameters related to the sensor indicate the coordinate system to which the sensor data are referenced.
In some embodiments, "configuring a coordinate transformation based on the characteristic parameter" includes:
if the sensor data is referenced to a sensor coordinate system, the coordinate transformation is configured such that the coordinate transformation comprises: transforming from the sensor coordinate system to a vehicle coordinate system and from the vehicle coordinate system to a terrestrial coordinate system;
if the sensor data is referenced to the vehicle coordinate system, the coordinate transformation is configured such that the coordinate transformation comprises: transforming from a vehicle coordinate system to a terrestrial coordinate system; and
if the sensor data is referenced to the terrestrial coordinate system, no coordinate transformation is required.
In some embodiments, the method further comprises: ascertaining whether a third state is satisfied, and if the third state is satisfied: the coordinate-transformed data is provided to a particle filter without clustering and/or gaussian modeling for determining position data of the vehicle.
In some embodiments, the method further comprises: acquiring vehicle state parameters and/or a control instruction of the user, the control instruction preferably indicating an instruction entered by the user via an input device; and determining whether the first state and/or the second state and/or the third state is satisfied based on the vehicle state parameters and/or the control instruction of the user. Preferably, the vehicle state parameters include: GPS positioning data of the vehicle, the data processing configuration of the vehicle, and/or the vehicle speed.
In some embodiments, determining whether the first state and/or the second state and/or the third state is satisfied is based on GPS positioning data of the vehicle, preferably ascertaining applicable legal provisions in the current geographic area based on the GPS positioning data of the vehicle.
In some embodiments, each state (the first state, the second state, and the third state) may correspond to a particular vehicle state. The second state may be determined to be satisfied, for example, when the vehicle is detected to have a particular data processing configuration (e.g., the vehicle is equipped with an available image processor) and/or a corresponding control instruction is entered by the user. In some embodiments, the second state may be determined to be satisfied when it is detected that the processing capability of the vehicle is greater than a predetermined threshold (e.g., a greater number of available processors or higher-performance processors) and/or a corresponding control instruction is entered by the user.
In some embodiments, when the positioning result obtained in one positioning mode falls below a predetermined accuracy or certainty (e.g., characterized by a statistical variance or covariance), a transition may be made to another positioning mode by determining that the state for that positioning mode is satisfied.
In some embodiments, a transition may be made between different processing pipelines as the vehicle speed changes. The first state may be determined to be satisfied, for example, when the vehicle speed is greater than a predetermined speed threshold and/or a corresponding control instruction is entered by the user.
In some embodiments, the second state or the third state may be determined to be satisfied when the vehicle speed is less than a predetermined speed threshold and/or a corresponding control instruction is entered by the user.
In some embodiments, the third state may be determined to be satisfied when the regulations of the current region indicate that point cloud data may be applied directly and/or a corresponding control instruction is entered by the user.
According to a second aspect of the present invention, there is provided an apparatus for vehicle localization, the apparatus comprising:
a coordinate transformation module configured to coordinate transform the sensor data to obtain coordinate transformed data;
a clustering module configured to perform clustering processing on the coordinate-transformed data to obtain clustered data;
a Gaussian modeling module configured to Gaussian model the clustered data to obtain Gaussian modeled data;
a point cloud map generation module configured to generate a point cloud map based on the coordinate-transformed data;
a positioning module configured to determine position data of a vehicle;
an analysis module configured to ascertain whether the first state or the second state is satisfied,
-if a first state is satisfied, outputting control instructions to cause the respective positioning module to determine position data of the vehicle based on the gaussian modeled data;
-if the second state is fulfilled, outputting control instructions to cause the respective localization module to determine position data of the vehicle based on the point cloud map.
In some embodiments, the analysis module is configured to acquire characteristic parameters related to the sensor and to output control instructions based on the characteristic parameters to cause configuration, in particular automatic configuration, of the coordinate transformation. Preferably, the characteristic parameter related to the sensor indicates the coordinate system to which the sensor data are referenced.
Preferably, if the analysis module ascertains that the sensor data are referenced to a sensor coordinate system, it causes the coordinate transformation module to be configured such that the coordinate transformation comprises: transforming from the sensor coordinate system to a vehicle coordinate system and from the vehicle coordinate system to a terrestrial coordinate system; if the sensor data are referenced to the vehicle coordinate system, it causes the coordinate transformation module to be configured such that the coordinate transformation comprises: transforming from the vehicle coordinate system to the terrestrial coordinate system; and if the sensor data are referenced to the terrestrial coordinate system, no coordinate transformation is required.
In some embodiments, the device comprises a particle filter module as the localization module and the analysis module is configured for ascertaining whether a third state is satisfied, and if the third state is satisfied, outputting control instructions to cause a determination of the position data of the vehicle on the basis of the particle filter module, preferably coordinate-transformed data being provided to the particle filter module without clustering and/or gaussian modeling for determining the position data of the vehicle.
In some embodiments, the analysis module is configured to obtain vehicle state parameters and/or user manipulation instructions; and determining whether the first state and/or the second state and/or the third state is satisfied based on a vehicle state parameter and/or a manipulation instruction of a user, preferably, the vehicle state parameter includes: GPS positioning data of the vehicle, data processing configuration of the vehicle, and/or vehicle speed.
In some embodiments, the analysis module is configured to determine whether the first state and/or the second state and/or the third state is satisfied based on the GPS positioning data of the vehicle, preferably to ascertain applicable legal provisions in the current geographic area based on the GPS positioning data of the vehicle.
According to a third aspect of the present invention, there is provided an apparatus for vehicle localization, the apparatus comprising: one or more processors; and one or more memories configured to store a series of computer-executable instructions and computer-accessible data associated with the series of computer-executable instructions, wherein the series of computer-executable instructions, when executed by the one or more processors, cause the one or more processors to perform the method of any of the above embodiments.
In some embodiments, the processor comprises a GPU and a CPU; at least the method steps "generating a point cloud map based on the coordinate-transformed data" and "determining position data of the vehicle based on the point cloud map" are performed in the GPU, and at least the method steps "clustering the coordinate-transformed data to obtain clustered data", "Gaussian modeling the clustered data to obtain Gaussian modeled data", and "determining position data of the vehicle based on the Gaussian modeled data" are performed in the CPU.
According to a fourth aspect of the present invention, there is provided a non-transitory computer readable storage medium having stored thereon a series of computer executable instructions which, when executed by one or more computing devices, cause the one or more computing devices to perform a method as in any one of the above embodiments.
According to a fifth aspect of the invention, a vehicle is provided, characterized in that the vehicle comprises a device according to one of the above-described embodiments.
Drawings
Some examples of apparatus and/or methods are illustrated below with reference to the accompanying drawings, in which:
FIG. 1 illustrates an exemplary flow chart of a method for vehicle localization;
FIG. 2 illustrates an exemplary block diagram of an apparatus for vehicle localization.
Detailed Description
The present disclosure will now be described with reference to the accompanying drawings, which illustrate several embodiments of the disclosure. It should be understood, however, that the present disclosure may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, the embodiments described below are intended to provide a more complete disclosure and to fully convey the scope of the disclosure to those skilled in the art. It is also to be understood that the embodiments disclosed herein can be combined in various ways to provide further additional embodiments.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. All terms (including technical and scientific terms) used herein have the meaning commonly understood by one of ordinary skill in the art unless otherwise defined. Well-known functions or constructions may not be described in detail for brevity and/or clarity.
Herein, the term "a or B" includes "a and B" and "a or B" rather than exclusively including only "a" or only "B" unless otherwise specifically stated.
The term "exemplary" means "serving as an example, instance, or illustration" herein. Any implementation exemplarily described herein is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, the disclosure is not limited by any expressed or implied theory presented in the preceding technical field, background, brief summary or the detailed description.
In addition, "first," second, "and like terms may also be used herein for reference purposes only, and" first, "" second "may also refer to a plurality of" first, "" second. For example, the terms "first," "second," and other such numerical terms referring to structures or elements do not imply a sequence or order unless clearly indicated by the context.
It will be further understood that the terms "comprises/comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Unless otherwise defined, all terms (including technical and scientific terms) are used herein in their ordinary meaning in the art to which examples pertain.
It should be noted that: the order of the method steps herein may be flexibly configured, with steps labeled with numbers only for convenience of description and not for limitation.
In automated driving, high-precision positioning is an extremely critical technology. As automated driving technology evolves, various forms of positioning technology have been developed, which may be based on different algorithms, different processing pipelines, different hardware, and/or different legal regulations. Each positioning technique may have its own suitable application scenarios and technical effects.
The method and the device for vehicle positioning according to the invention can be adapted to different application scenarios and can be flexibly adjusted, for example by adjusting an algorithm and/or a processing pipeline. For this purpose, the expansion or adaptation of the processing pipeline can be implemented on the basis of a plug-in mechanism.
Methods for vehicle positioning according to some embodiments of the present invention are set forth in further detail below.
FIG. 1 illustrates an exemplary flow chart of a method for vehicle localization. According to the invention, the method may comprise:
step 101: acquiring sensor data;
step 102: coordinate transforming the sensor data to obtain coordinate transformed data;
step 103: ascertaining whether a particular state, e.g., a first state, a second state, and/or a third state, is satisfied;
if the first state is satisfied, then performing:
step 110: clustering the coordinate-transformed data to obtain clustered data;
step 120: performing Gaussian modeling on the clustered data to obtain Gaussian modeled data; and
step 130: determining position data of the vehicle based on the Gaussian modeled data,
additionally or alternatively, if the second state is satisfied, performing:
step 210: generating a point cloud map based on the coordinate-transformed data;
step 220: determining position data of the vehicle based on the point cloud map,
additionally or alternatively, if the third state is satisfied, performing:
step 310: the coordinate-transformed data is provided to a particle filter without clustering and/or gaussian modeling for determining position data of the vehicle.
In the present context, the vehicle may be an autonomous vehicle (also referred to as a host vehicle or ego vehicle), i.e. a mobile conveyance with autonomous driving capability, which may be a car, a passenger car, a truck, a van, a train, a ship, a motorcycle, a tricycle, or another mobile conveyance.
In this context, the sensor may be a camera, a radar and/or a lidar. The camera may be a video camera, a high-speed camera, or a still image camera. The camera data may be the raw output of the camera. Alternatively, the camera data may be pre-processed data of the camera. For example, the camera data may include a plurality of image frames. An image frame of the plurality of image frames may include a plurality of pixel points or a pixel point cloud, e.g., a plurality of pixel points in a two-dimensional arrangement. Furthermore, the camera data may contain image information, e.g. color information of the individual pixels.
A lidar (light detection and ranging) sensor may be configured to acquire lidar data by emitting light (e.g., a pulsed laser) and measuring the portion of the light reflected from objects in the vicinity of the lidar. Lidar data may contain information such as target distance, azimuth, altitude, speed, attitude, and even shape. For example, the lidar data may include a point cloud of range information, which may include, for a plurality of points of the point cloud, information (range values) related to their distance from the lidar.
As described above, a plurality of positioning technologies have been developed, each of which may have application scenarios and technical effects suited to it. However, most current vehicles can only operate according to a predefined processing flow on the basis of a predefined positioning method, and cannot adapt the positioning method flexibly to application scenarios and/or user commands. To update or replace the hardware and/or software used for positioning, the user typically has to bring the vehicle to a garage for maintenance, which is relatively costly.
Furthermore, it should be understood that high-precision positioning for automated driving can be divided into absolute positioning and relative positioning. Common sensors such as lidar, cameras, ultrasonic radar and millimeter-wave radar belong to the relative positioning category; GNSS (global navigation satellite system) positioning fused with an IMU inertial sensor yields the latitude, longitude and current attitude of the vehicle, i.e. it reflects the absolute position of the vehicle in the terrestrial coordinate system and thus belongs to absolute positioning.
For this reason, it is generally necessary to transform the sensor data from a relative coordinate system into the terrestrial coordinate system in order to acquire the absolute position of the vehicle in the terrestrial coordinate system for automated driving. However, as described above, some sensor data may be raw output while other sensor data may be pre-processed output. For example, some cameras output data based on their own coordinate system, while other cameras may, through hardware and/or software processing inside the camera itself, already transform the data into the vehicle coordinate system or even into the terrestrial coordinate system. As a result, the processing pipeline of the positioning algorithm cannot adapt to changes in the sensor configuration in time. In particular, after a sensor is replaced or its hardware or software is updated, the processing pipeline cannot be adapted in time, and the vehicle has to be brought to a garage for maintenance, which is relatively complex.
According to the invention, a processing pipeline based on a plugin mechanism is provided, so that different positioning methods can be flexibly adapted to specific application scenarios and/or user instructions.
According to some embodiments of the present invention, a method for vehicle localization may incorporate multiple localization techniques, which may be based on different algorithms, hardware, and/or regulations. To this end, specific states need to be ascertained in order to execute the corresponding processing pipeline. These specific states are associated with vehicle state parameters and/or control instructions of the user. Accordingly, in some embodiments, when executing step 103, the vehicle state parameters and/or the control instruction of the user may be acquired, and whether a specific state is satisfied may be determined based on the vehicle state parameters and/or the control instruction of the user. These specific states may be the first state, the second state, and the third state as described in some embodiments of the invention. The vehicle state parameters may include: GPS positioning data of the vehicle, the data processing configuration of the vehicle (e.g., the hardware and/or software configuration of the vehicle for the positioning technique, such as processor type, number of processors, etc.), and/or the vehicle speed. The control instruction of the user may be an instruction input by the user via an input device, such as an in-vehicle device or a portable mobile device.
In some embodiments, each state (the first state, the second state, and the third state) may correspond to a respective control instruction of the user. For example, the user can select an available positioning method via the input device, thereby generating a corresponding control instruction. When a control instruction matching a certain state is detected, the corresponding processing pipeline is activated.
In some embodiments, each state (the first state, the second state, and the third state) may correspond to a particular vehicle state. The second state may be determined to be satisfied, for example, when the vehicle is detected to have a particular data processing configuration (e.g., the vehicle is equipped with an available image processor) and/or a corresponding control instruction is entered by the user. In some embodiments, the second state may be determined to be satisfied when it is detected that the processing capability of the vehicle is greater than a predetermined threshold (e.g., a greater number of available processors or higher-performance processors) and/or a corresponding control instruction is entered by the user.
In some embodiments, when the positioning result obtained in one positioning mode falls below a predetermined accuracy or certainty (e.g., characterized by a statistical variance or covariance), a transition may be made to another positioning mode by determining that the state for that positioning mode is satisfied.
In some embodiments, a transition may be made between different processing pipelines as the vehicle speed changes. The first state may be determined to be satisfied, for example, when the vehicle speed is greater than a predetermined speed threshold and/or a corresponding control instruction is entered by the user. Since the load on the vehicle's own processors increases with vehicle speed, a positioning mode with a relatively small processing load needs to be activated in this case.
In some embodiments, the second state or the third state may be determined to be satisfied when the vehicle speed is less than a predetermined speed threshold and/or a corresponding control instruction is entered by the user.
In some embodiments, the third state may be determined to be satisfied when the regulations of the current region indicate that point cloud data may be applied directly and/or a corresponding control instruction is entered by the user.
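The state checks described above amount to a small dispatch layer in front of the processing pipelines. The following Python sketch illustrates one possible way to implement such a selection; the field names (speed_mps, has_gpu, gps_region, user_choice), the speed threshold and the regional rule lookup are illustrative assumptions and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleState:
    """Illustrative vehicle state parameters (field names are assumptions)."""
    speed_mps: float            # current vehicle speed
    has_gpu: bool               # vehicle equipped with an available image processor
    gps_region: str             # region code derived from the GPS positioning data
    user_choice: Optional[str]  # positioning mode explicitly selected by the user, if any

# Hypothetical lookup: regions whose regulations allow direct use of raw point clouds.
REGIONS_ALLOWING_RAW_POINT_CLOUDS = {"REGION_A", "REGION_B"}

SPEED_THRESHOLD_MPS = 20.0  # assumed threshold separating "high" and "low" speed

def select_positioning_state(vs: VehicleState) -> str:
    """Return which state (and hence which processing pipeline) is satisfied."""
    # A control instruction entered by the user via the input device takes priority.
    if vs.user_choice in ("first", "second", "third"):
        return vs.user_choice
    # Third state: regional regulations permit direct application of point cloud data.
    if vs.gps_region in REGIONS_ALLOWING_RAW_POINT_CLOUDS and vs.speed_mps < SPEED_THRESHOLD_MPS:
        return "third"
    # Second state: strong data processing configuration (e.g. available GPU) at low speed.
    if vs.has_gpu and vs.speed_mps < SPEED_THRESHOLD_MPS:
        return "second"
    # First state: default, e.g. at higher speeds where the processing load must stay small.
    return "first"

# Example: a GPU-equipped vehicle driving slowly in a region that allows raw point clouds.
print(select_positioning_state(VehicleState(10.0, True, "REGION_A", None)))  # -> "third"
```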
According to some embodiments of the invention, the coordinate-transformed data enters a first positioning mode corresponding to the first state. In the first positioning mode, the coordinate-transformed data needs to undergo a series of numerical processing such as clustering processing, gaussian modeling, and the like. The gaussian modeled data can then be fed to a positioning module for determining position data of the vehicle, i.e. the position of the vehicle in the terrestrial coordinate system.
In the clustering process, clustering is performed on the point cloud data formed from the sensor data, in particular from the coordinate-transformed sensor data. Each data point can correspond to a feature vector comprising a plurality of feature values of different attributes (such as Euclidean distance, color information, texture, normal vector and the like), and the points are clustered according to these attributes so that points with different attributes are segmented from one another. For this purpose, an appropriate cluster search radius needs to be set: if the search radius is chosen too small, an actual object is split into a plurality of clusters; if it is chosen too large, multiple objects are merged into one cluster, so tests are needed to find the most suitable search radius. By traversing the individual points in the point cloud, the point cloud data can finally be grouped into appropriate clusters for different objects (e.g., trees, buildings, curbs, or signboards).
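As an illustration of such radius-based clustering, the following sketch groups a 3-D point cloud with scikit-learn's DBSCAN, whose eps parameter plays the role of the search radius discussed above; the choice of DBSCAN, the radius value and the min_samples setting are assumptions made for illustration and not the clustering algorithm prescribed by the disclosure.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_point_cloud(points_xyz: np.ndarray, search_radius: float = 0.5) -> dict:
    """Group coordinate-transformed points into clusters of distinct objects.

    points_xyz: (N, 3) array of points, e.g. in the terrestrial coordinate system.
    search_radius: cluster search radius; too small splits one object into several
    clusters, too large merges several objects into one cluster.
    Returns {cluster_id: (M_i, 3) array}; noise points (label -1) are dropped.
    """
    labels = DBSCAN(eps=search_radius, min_samples=5).fit_predict(points_xyz)
    clusters = {}
    for label in set(labels):
        if label == -1:  # DBSCAN marks isolated points as noise
            continue
        clusters[int(label)] = points_xyz[labels == label]
    return clusters

# Example with two well-separated synthetic "objects"
pts = np.vstack([np.random.randn(100, 3) * 0.2,
                 np.random.randn(100, 3) * 0.2 + np.array([10.0, 0.0, 0.0])])
print(len(cluster_point_cloud(pts, search_radius=0.5)))  # expected: 2
```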
In some embodiments, the sensor data may be accompanied by noise points that can interfere with the subsequent method steps such as clustering and Gaussian modeling, thereby reducing the accuracy and reliability of the localization. For this purpose, the sensor data may additionally be subjected to a filtering process. This further reduces the interfering effect of noise on the subsequent method steps and improves the accuracy and reliability of the method.
The Gaussian distribution, i.e. the normal distribution, is the most common probability distribution model and is often used in image processing, pattern recognition and computer vision to characterize random variables such as noise, feature distributions and pixel gray values; in addition, the normal distribution function is often chosen as a window function for localized processing such as smoothing filtering and the Gabor transform. This is because, on the one hand, the normal distribution reflects a statistical law of variation that is ubiquitous in nature and, on the other hand, the normal distribution function has very good mathematical properties: it has continuous derivatives of all orders, it has the same functional form in the time domain and in the frequency domain, and so on, which makes it very easy to analyze. In Gaussian modeling, the clustered data can be Gaussian modeled to obtain Gaussian modeled data. The Gaussian models may include both single Gaussian models (SGMs) and Gaussian mixture models (GMMs).
In the present embodiment, the clustered data may be modeled based on a single Gaussian model. The modeling can be based on a Gaussian distribution of the following form:
N(x; \mu, \Sigma) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}} \exp\left( -\frac{1}{2} (x - \mu)^{T} \Sigma^{-1} (x - \mu) \right)
where μ can be estimated by the training sample mean (i.e., the mean of each point cloud cluster) and Σ by the sample covariance (i.e., the covariance of each cluster). The clustered data set can thus be numerically characterized by the Gaussian distribution described above.
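To make the single-Gaussian modeling concrete, the following sketch estimates μ and Σ for a cluster produced by the clustering step and evaluates the resulting density; it is a minimal numpy illustration of the formula above, assuming 3-D point clusters, and is not the implementation of the disclosure.

```python
import numpy as np

def fit_single_gaussian(cluster_points: np.ndarray):
    """Fit a single Gaussian N(mu, Sigma) to one point cloud cluster of shape (M, d)."""
    mu = cluster_points.mean(axis=0)              # training sample mean of the cluster
    sigma = np.cov(cluster_points, rowvar=False)  # sample covariance of the cluster
    return mu, sigma

def gaussian_density(x: np.ndarray, mu: np.ndarray, sigma: np.ndarray) -> float:
    """Evaluate the multivariate normal density N(x; mu, Sigma)."""
    d = mu.shape[0]
    diff = x - mu
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** d * np.linalg.det(sigma))
    return float(norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff))

# Example: model one synthetic cluster and evaluate the density at its own mean
cluster = np.random.randn(200, 3) * 0.3 + np.array([5.0, 2.0, 0.0])
mu, sigma = fit_single_gaussian(cluster)
print(gaussian_density(mu, mu, sigma))  # density is highest at the cluster mean
```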
In step 130, position data of the vehicle may be determined based on the Gaussian modeled data using a suitable positioning algorithm. In some embodiments, the position of the vehicle in terrestrial coordinates can be ascertained by means of a static map on the basis of the Gaussian modeled data. The static map may be a "high precision (HD) map" or a "planning map". Static maps for autonomous driving are typically electronic digital maps with high accuracy and large data dimensions. In some embodiments, the detection data of IMU inertial sensors may additionally be fused in. It should be understood that the algorithms for localization based on Gaussian modeled data can vary and are not limited to the embodiment described above. In the first positioning mode, the sensor data pass through a number of processing steps, and the accuracy of the final positioning depends crucially on the reliability of these steps.
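As one illustration of how the Gaussian modeled data could be matched against a static map, the sketch below scores candidate 2-D poses by evaluating the transformed cluster means under Gaussians stored in the map (an NDT-like idea); the map representation, the candidate-pose search and all parameters are assumptions and do not represent the positioning algorithm of the disclosure.

```python
import numpy as np

def pose_score(cluster_means, map_gaussians, pose):
    """Score a candidate pose (x, y, yaw) by how well the transformed cluster
    means fit the Gaussians (mu, sigma) stored in the static map."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    score = 0.0
    for m in cluster_means:
        p = R @ m + np.array([x, y])
        # take the best (largest) log-density over all map Gaussians
        best = -np.inf
        for mu, sigma in map_gaussians:
            diff = p - mu
            log_density = -0.5 * (diff @ np.linalg.inv(sigma) @ diff
                                  + np.log(np.linalg.det(sigma)))
            best = max(best, log_density)
        score += best
    return score

def localize(cluster_means, map_gaussians, candidate_poses):
    """Return the candidate pose with the highest score."""
    scores = [pose_score(cluster_means, map_gaussians, p) for p in candidate_poses]
    return candidate_poses[int(np.argmax(scores))]
```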
According to some embodiments of the invention, the coordinate-transformed data enters a second positioning mode corresponding to the second state. In the second localization mode, a point cloud map or real-time map may be generated based on the coordinate-transformed sensor data. Real-time lane information and/or real-time road conditions can be presented on the real-time map. In some embodiments, the coordinate-transformed data may be used to generate the point cloud map without clustering and/or Gaussian modeling. In some embodiments, the sensor data may be accompanied by noise points that can interfere with the generation of the point cloud map, thereby reducing the accuracy and reliability of the localization. For this purpose, the sensor data may additionally be subjected to a filtering process, which further reduces the interfering effect of noise and improves the accuracy and reliability of the method. To obtain an accurate point cloud map, the vehicle requires a powerful data processing configuration. In particular, an accurate point cloud map can be obtained when the vehicle is equipped with a dedicated processor such as a graphics processor (GPU). The GPU reduces the graphics card's dependence on the CPU and can take over at least part or all of the work originally performed by the CPU.
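One common way of building a point cloud map from coordinate-transformed scans is to accumulate them in a voxel grid, which bounds the memory footprint and parallelizes well on a GPU; the following numpy sketch shows such an accumulation on the CPU for illustration. The voxel size and the simple occupancy representation are assumptions and not the map format of the disclosure.

```python
import numpy as np

class VoxelPointCloudMap:
    """Accumulate coordinate-transformed scans into a sparse voxel map."""

    def __init__(self, voxel_size: float = 0.2):
        self.voxel_size = voxel_size
        self.voxels = set()  # indices of occupied voxels in the terrestrial frame

    def insert_scan(self, points_earth: np.ndarray) -> None:
        """points_earth: (N, 3) points already transformed to the terrestrial coordinate system."""
        idx = np.floor(points_earth / self.voxel_size).astype(np.int64)
        self.voxels.update(map(tuple, idx))

    def as_points(self) -> np.ndarray:
        """Return one representative point (the voxel center) per occupied voxel."""
        if not self.voxels:
            return np.empty((0, 3))
        return (np.array(sorted(self.voxels), dtype=float) + 0.5) * self.voxel_size

# Example: insert two overlapping synthetic scans and inspect the map size
point_cloud_map = VoxelPointCloudMap(voxel_size=0.5)
point_cloud_map.insert_scan(np.random.rand(1000, 3) * 10.0)
point_cloud_map.insert_scan(np.random.rand(1000, 3) * 10.0 + 2.0)
print(point_cloud_map.as_points().shape)
```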
Furthermore, as noted above, positioning techniques are also subject to local regulations. In some countries or regions, direct application of point cloud data acquired by, for example, a lidar is not allowed; that is, the user is prohibited from using the acquired point cloud data directly, and the data must first pass through a pre-processing step. In other countries or regions, there are no such regulations. According to some embodiments of the invention, the coordinate-transformed data enters a third positioning mode corresponding to the third state. In the third localization mode, the coordinate-transformed data may be provided to a particle filter without clustering, Gaussian modeling, or point cloud map generation. Particle filtering approximates a probability density function by a set of random samples propagated in the state space, replacing the integral operation by the sample mean in order to obtain a minimum-variance estimate of the state distribution. The samples here are referred to as particles (i.e., the basic units in the point cloud), and any form of probability density distribution can be approximated as the number of samples approaches infinity.
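The following is a minimal particle filter sketch for a 2-D vehicle pose (x, y, yaw) which illustrates the predict/weight/resample cycle described above; the unicycle motion model, the measurement likelihood (distance of transformed scan points to the nearest map point) and all noise parameters are assumptions made for illustration, not the filter of the disclosure.

```python
import numpy as np

class SimpleParticleFilter:
    """Minimal particle filter over the vehicle pose (x, y, yaw)."""

    def __init__(self, n_particles: int = 500, init_pose=(0.0, 0.0, 0.0)):
        self.particles = np.tile(np.asarray(init_pose, dtype=float), (n_particles, 1))
        self.particles += np.random.randn(n_particles, 3) * [1.0, 1.0, 0.1]
        self.weights = np.full(n_particles, 1.0 / n_particles)

    def predict(self, v: float, yaw_rate: float, dt: float) -> None:
        """Propagate the particles with a simple unicycle motion model plus noise."""
        yaw = self.particles[:, 2]
        self.particles[:, 0] += v * dt * np.cos(yaw)
        self.particles[:, 1] += v * dt * np.sin(yaw)
        self.particles[:, 2] += yaw_rate * dt
        self.particles += np.random.randn(*self.particles.shape) * [0.05, 0.05, 0.01]

    def update(self, scan_xy: np.ndarray, map_xy: np.ndarray) -> None:
        """Weight each particle by how well the coordinate-transformed scan fits the map."""
        for i, (x, y, yaw) in enumerate(self.particles):
            c, s = np.cos(yaw), np.sin(yaw)
            pts = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
            # distance of every scan point to its nearest map point
            d = np.min(np.linalg.norm(pts[:, None, :] - map_xy[None, :, :], axis=2), axis=1)
            self.weights[i] = np.exp(-np.mean(d ** 2) / 0.5)
        self.weights = self.weights + 1e-12  # guard against all-zero weights
        self.weights /= self.weights.sum()
        self._resample()

    def _resample(self) -> None:
        idx = np.random.choice(len(self.particles), len(self.particles), p=self.weights)
        self.particles = self.particles[idx]
        self.weights[:] = 1.0 / len(self.particles)

    def estimate(self) -> np.ndarray:
        """Weighted mean pose, i.e. the sample mean replacing the integral operation."""
        return np.average(self.particles, axis=0, weights=self.weights)
```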
In some embodiments, regulatory data associated with GPS positions may be stored in the vehicle. When it is ascertained from the GPS positioning data that the legal provisions applicable in the current geographic area permit a particular positioning mode, the processing pipeline can preferably be adjusted automatically, for example with the user's consent, to transition from one positioning mode to another.
Furthermore, according to the invention, the plugin-based processing pipeline can adaptively add or remove different coordinate transformation algorithms in a plug-in manner, or according to the characteristics of the sensor data.
In some embodiments, sensor-related characteristic parameters relating to the hardware and/or software characteristics of the sensor with respect to data processing may be acquired before entering the coordinate transformation step 102. For example, the characteristic parameter associated with the sensor indicates the coordinate system to which the data from the sensor are referenced. In a memory inside the vehicle, special data records can be stored which record the characteristic parameters of a specific sensor type. The coordinate transformation can then be configured, in particular automatically, based on the characteristic parameters. In some embodiments, if the data from the sensor are referenced to the sensor coordinate system, the coordinate transformation 102 is configured to comprise a transformation 1021 from the sensor coordinate system to the vehicle coordinate system and from the vehicle coordinate system to the terrestrial coordinate system. In some embodiments, if the data from the sensor are referenced to the vehicle coordinate system, the coordinate transformation 102 is configured to comprise a transformation 1022 from the vehicle coordinate system to the terrestrial coordinate system. In some embodiments, if the data from the sensor are referenced to the terrestrial coordinate system, no coordinate transformation is required. These different coordinate transformation algorithms can be flexibly inserted into or removed from the main processing pipeline, so that the coordinate transformation is adapted to the current sensor characteristics without costly workshop maintenance.
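A minimal sketch of how such a plugin-style configuration of the coordinate transformation chain might look is given below; the frame names, the characteristic parameter reference_frame and the fixed example transforms are illustrative assumptions (in practice the transforms would come from calibration data and the GNSS/IMU pose), not the configuration mechanism of the disclosure.

```python
import numpy as np

def make_se3(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Illustrative fixed transforms (assumed values; normally read from calibration / GNSS+IMU).
T_VEHICLE_FROM_SENSOR = make_se3(np.eye(3), np.array([1.5, 0.0, 1.2]))            # sensor mounting offset
T_EARTH_FROM_VEHICLE = make_se3(np.eye(3), np.array([400000.0, 5000000.0, 0.0]))  # current vehicle pose

def build_transform_chain(reference_frame: str) -> list:
    """Select the plug-in transforms based on the sensor's characteristic parameter."""
    if reference_frame == "sensor":
        return [T_VEHICLE_FROM_SENSOR, T_EARTH_FROM_VEHICLE]  # sensor -> vehicle -> earth
    if reference_frame == "vehicle":
        return [T_EARTH_FROM_VEHICLE]                         # vehicle -> earth
    if reference_frame == "earth":
        return []                                             # no coordinate transformation required
    raise ValueError(f"unknown reference frame: {reference_frame}")

def apply_chain(points: np.ndarray, chain: list) -> np.ndarray:
    """Apply the configured transforms to (N, 3) sensor data."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    for T in chain:
        homogeneous = homogeneous @ T.T
    return homogeneous[:, :3]

# Example: sensor data referenced to the sensor coordinate system
points_sensor = np.random.rand(10, 3)
points_earth = apply_chain(points_sensor, build_transform_chain("sensor"))
```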
An apparatus for vehicle localization in accordance with some embodiments of the present invention is set forth in further detail below. FIG. 2 illustrates an exemplary block diagram of the device 14 for vehicle localization. The vehicle 10 includes an onboard sensor 12 and a device 14 for vehicle localization. The in-vehicle sensors 12 may communicate their own sensed real-time vehicle surroundings data to the device 14. As shown in fig. 2, the apparatus 14 includes a storage device 16 and a processing device 18. The processing device 18 may include one or more processors (e.g., CPUs and/or GPUs). The storage device 16 may store detection data of the in-vehicle sensor (vehicle speed, GPS, camera data, laser radar data, etc.), data relating to characteristic parameters of the sensor, legal regulations relating to a geographical position, and the like.
It should be understood that the processing device 18 may be configured as any device having data processing and analysis functionality including a processor. For example, the processing device 18 may be configured as one or more processors, or the processing device 18 may be configured as a computer, server, or even other intelligent handheld device. The processor may be connected to the memory module via an interconnection bus. The memory modules may include main memory, read only memory, and mass storage devices, such as various disk drives, tape drives, and the like. A "processor" or "controller" is not limited to a CPU or GPU, but may include Digital Signal Processor (DSP) hardware, network processors, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), etc. The storage device 16 may be a Read Only Memory (ROM), a Random Access Memory (RAM), and a non-volatile memory for storing software. Other hardware, conventional and/or custom, may also be included.
The processing device 18 may include a plurality of functional modules: a coordinate transformation module 20, which may be configured to coordinate transform the sensor data to obtain coordinate-transformed data; a clustering module 21 configured to perform clustering processing on the coordinate-transformed data to obtain clustered data; a Gaussian modeling module 22 configured to Gaussian model the clustered data to obtain Gaussian modeled data; a point cloud map generation module 23 configured to generate a point cloud map based on the coordinate-transformed data; and an analysis module 24 configured to ascertain whether a determined state (e.g. the first state, the second state or the third state) is satisfied and, if the first state is satisfied, to output a control instruction to cause the respective positioning module 25 to determine position data of the vehicle based on the Gaussian modeled data; if the second state is satisfied, to output a control instruction to cause the respective localization module 25' to determine the position data of the vehicle on the basis of the point cloud map, whereby the coordinate-transformed data can be used without clustering and/or Gaussian modeling for determining the position data of the vehicle; and if the third state is satisfied, to output a control instruction to cause the respective localization module 25″ (particle filter module) to determine the position data of the vehicle, whereby the coordinate-transformed data can be provided to the particle filter module without clustering and/or Gaussian modeling for determining the position data of the vehicle. The localization modules 25, 25', 25″ are each designed to determine the position data of the vehicle in accordance with the state that is satisfied. Finally, the vehicle position derived by the device 14 is used to guide the vehicle in automated driving.
When the first state is satisfied, the coordinate transformation module 20, the clustering module 21, the Gaussian modeling module 22, and the positioning module 25 may constitute the subjects that execute the processing pipeline.
When the second state is satisfied, the coordinate transformation module 20, the point cloud map generation module 23, and the localization module 25' may constitute the subjects that execute the processing pipeline.
When the third state is satisfied, the coordinate transformation module 20 and the positioning module 25″ (particle filter module) may constitute the subjects that execute the processing pipeline.
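The three compositions above can be expressed as a plugin-style pipeline assembly, sketched below; the module names and the process(data) interface are simplified assumptions that merely indicate how the modules could be chained per state.

```python
# Minimal sketch of assembling the processing pipeline from modules per state.
# Module objects are assumed to expose a process(data) method.

def build_pipeline(state: str, modules: dict) -> list:
    """Return the ordered list of modules forming the pipeline for the given state."""
    if state == "first":
        names = ["coordinate_transform", "clustering", "gaussian_modeling", "localization_25"]
    elif state == "second":
        names = ["coordinate_transform", "point_cloud_map", "localization_25_prime"]
    elif state == "third":
        names = ["coordinate_transform", "particle_filter_25_double_prime"]
    else:
        raise ValueError(f"unknown state: {state}")
    return [modules[name] for name in names]

def run_pipeline(pipeline: list, sensor_data):
    """Feed the sensor data through the selected modules in order."""
    data = sensor_data
    for module in pipeline:
        data = module.process(data)
    return data  # position data of the vehicle
```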
It should be understood that: other processing modules, such as filtering modules, etc., may additionally or alternatively be provided. Further, the functional blocks are functionally distinguished from each other, and there is no strict limitation in storage location and physical location. In some cases, a module may be configured as a single processor. In some cases, a module may also be combined with other modules into a processor. For example, the clustering module 21 and the gaussian modeling module 22 may be combined into one processor.
In some embodiments, the analysis module may be configured to acquire characteristic parameters related to the sensor and to output control instructions based on the characteristic parameters to cause configuration, in particular automatic configuration, of the coordinate transformation. The characteristic parameter associated with the sensor indicates the coordinate system to which the sensor data are referenced. If the analysis module ascertains that the sensor data are referenced to a sensor coordinate system, it causes the coordinate transformation module to be configured such that the coordinate transformation comprises: transforming from the sensor coordinate system to a vehicle coordinate system and from the vehicle coordinate system to a terrestrial coordinate system; if the sensor data are referenced to the vehicle coordinate system, it causes the coordinate transformation module to be configured such that the coordinate transformation comprises: transforming from the vehicle coordinate system to the terrestrial coordinate system; and if the sensor data are referenced to the terrestrial coordinate system, no coordinate transformation is required.
In some embodiments, the analysis module may be configured to obtain vehicle state parameters from the on-board sensors and/or control instructions input by the user via the input device, and to determine whether a specific state is satisfied based on the vehicle state parameters and/or the control instruction of the user. The vehicle state parameters include: GPS positioning data of the vehicle, the data processing configuration of the vehicle, and/or the vehicle speed.
In some embodiments, the analysis module may be configured to determine whether a particular state is satisfied based on the GPS positioning data of the vehicle, and preferably to ascertain the legal provisions applicable in the current geographic area based on the GPS positioning data of the vehicle.
In some embodiments, the analysis module may be configured to issue a control instruction based on the result of a consistency evaluation between the static map and the point cloud map, the control instruction causing the automated driving plan of the host vehicle to be adapted. As described above, real-time lane information and/or real-time road conditions may not be presented on the static map. Thus, a poor or even dangerous automated driving plan may result when the physical properties of the road and/or the real-time road conditions change. In some embodiments, a change in the physical properties of the road may include: lane changes, changes in the curvature of the road, changes in the width of the road, and/or changes in the slope of the road. Real-time road conditions on the road may include: a construction section, a congested section, a traffic accident section and/or a closed section. The lane changes include: lane addition, lane reduction, lane separation, and/or lane merging; the change in curvature of the road includes: a change in curvature of one or more lanes of the road; the change in width of the road includes: a change in width of one or more lanes of the road; and the change in slope of the road includes: a change in slope of one or more lanes of the road. The point cloud map can capture such changes in time, prompting an adjustment of the current driving plan and effectively avoiding potential risks.
The functions of the various elements shown in the figures may be implemented in the form of dedicated hardware, such as a "signal provider", "signal processing unit", "processor", "controller", etc., as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a dedicated processor, by a shared processor, or by a plurality of individual processors, some or all of which may be shared. The block diagram may illustrate, for example, a high-level circuit diagram implementing the principles of the present disclosure. Similarly, flowcharts, state transition diagrams, pseudocode, and the like may represent various processes, operations, or steps which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly represented. The methods disclosed in the specification and claims may be implemented by a device having means for performing each respective action of the methods.
It is to be understood that the disclosure of various actions, processes, operations, steps or functions disclosed in the specification or claims are not to be interpreted as being in a particular order unless explicitly or implicitly indicated as such, for example, for technical reasons. Thus, the disclosure of multiple acts or functions does not limit the multiple acts or functions to a particular order unless such acts or functions are not interchangeable for technical reasons. Further, in some examples, a single action, function, process, operation, or step may include or be divided into multiple sub-actions, sub-functions, sub-processes, sub-operations, or sub-steps, respectively. Unless expressly excluded, such sub-actions can be included in, and part of, the disclosure of the individual action.
Although exemplary embodiments of the present disclosure have been described, it will be understood by those skilled in the art that various changes and modifications can be made to the exemplary embodiments of the present disclosure without substantially departing from the spirit and scope of the present disclosure. Accordingly, all changes and modifications are intended to be included within the scope of the present disclosure.

Claims (15)

1. A method for vehicle localization, the method comprising:
acquiring sensor data;
coordinate transforming the sensor data to obtain coordinate transformed data;
ascertaining whether a first state is satisfied, and if so:
-performing a clustering process on the coordinate transformed data to obtain clustered data;
-gaussian modeling the clustered data to obtain gaussian modeled data;
-determining position data of the vehicle based on the gaussian modeled data;
ascertaining whether a second state is satisfied, and if so:
-generating a point cloud map based on the coordinate transformed data;
-determining position data of the vehicle based on the point cloud map.
2. The method of claim 1, further comprising:
acquiring characteristic parameters related to the sensor;
configuring, in particular automatically configuring, a coordinate transformation based on the characteristic parameter,
preferably, the sensor-related characteristic parameter relates to a hardware and/or software characteristic of the sensor with respect to data processing, preferably the sensor-related characteristic parameter indicates a coordinate system to which the sensor data are referenced.
3. The method of claim 2, wherein configuring the coordinate transformation based on the characteristic parameter comprises:
if the sensor data is referenced to a sensor coordinate system, the coordinate transformation is configured such that the coordinate transformation comprises: transforming from the sensor coordinate system to a vehicle coordinate system and from the vehicle coordinate system to a terrestrial coordinate system;
if the sensor data is referenced to the vehicle coordinate system, the coordinate transformation is configured such that the coordinate transformation comprises: transforming from a vehicle coordinate system to a terrestrial coordinate system;
if the sensor data is referenced to the terrestrial coordinate system, no coordinate transformation is required.
4. The method of claim 1, further comprising:
ascertaining whether a third state is satisfied, and if the third state is satisfied:
-providing the coordinate-transformed data to a particle filter without clustering and/or gaussian modeling for determining position data of the vehicle.
5. The method of claim 4, further comprising:
acquiring vehicle state parameters and/or user control instructions, wherein the user control instructions preferably indicate instructions input by a user through an input device;
determining whether the first state, the second state, or the third state is satisfied based on the vehicle state parameter and/or a manipulation instruction of the user,
preferably, the vehicle state parameters include: GPS positioning data of the vehicle, data processing configuration of the vehicle, and/or vehicle speed.
6. Method according to claim 5, characterized in that it is determined whether the first state, the second state or the third state is fulfilled based on the GPS positioning data of the vehicle, preferably that legal provisions applicable in the current geographical area are ascertained based on the GPS positioning data of the vehicle.
7. An apparatus for vehicle localization, the apparatus comprising:
a coordinate transformation module configured to coordinate transform the sensor data to obtain coordinate transformed data;
a clustering module configured to perform clustering processing on the coordinate-transformed data to obtain clustered data;
a Gaussian modeling module configured to Gaussian model the clustered data to obtain Gaussian modeled data;
a point cloud map generation module configured to generate a point cloud map based on the coordinate-transformed data;
a positioning module configured to determine position data of a vehicle;
an analysis module configured to ascertain whether the first state or the second state is satisfied,
-if the first state is satisfied, outputting control instructions to cause the respective positioning module to determine position data of the vehicle based on the Gaussian modeled data;
-if the second state is satisfied, outputting control instructions to cause the respective positioning module to determine position data of the vehicle based on the point cloud map.
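The module structure of claim 7 can be wired together roughly as below; each constructor argument stands in for one of the claimed modules (for instance, the functions sketched after claims 1 and 6 above), and the interfaces are hypothetical.

```python
class PositioningDevice:
    """Minimal wiring of the claimed modules; not the patented implementation."""

    def __init__(self, transform_module, clustering_module, gaussian_module,
                 map_module, positioning_module, analysis_module):
        self.transform = transform_module      # coordinate transformation module
        self.cluster = clustering_module       # clustering module
        self.gaussian = gaussian_module        # Gaussian modeling module
        self.build_map = map_module            # point cloud map generation module
        self.locate = positioning_module       # positioning module
        self.analyse = analysis_module         # analysis module (emits the control instruction)

    def position(self, sensor_data, vehicle_status):
        pts_world = self.transform(sensor_data)
        state = self.analyse(vehicle_status)   # "first" or "second"
        if state == "first":
            return self.locate(self.gaussian(self.cluster(pts_world)))
        if state == "second":
            return self.locate(self.build_map(pts_world))
        raise ValueError(f"unhandled state: {state}")
```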
8. The device according to claim 7, characterized in that the analysis module is configured to acquire a characteristic parameter related to the sensor and to output control instructions for configuring, in particular automatically configuring, the coordinate transformation based on the characteristic parameter,
preferably, the characteristic parameter relating to the sensor indicates the coordinate system to which the sensor data is referenced,
preferably, if the analysis module ascertains:
the sensor data is referenced to a sensor coordinate system, then causing the coordinate transformation module to be configured such that the coordinate transformation comprises: transforming from the sensor coordinate system to a vehicle coordinate system and from the vehicle coordinate system to a terrestrial coordinate system;
the sensor data is referenced to a vehicle coordinate system, then causing the coordinate transformation module to be configured such that the coordinate transformation comprises: transforming from a vehicle coordinate system to a terrestrial coordinate system;
the sensor data is referenced to the terrestrial coordinate system, then no coordinate transformation is required.
9. The device according to claim 7, characterized in that the device comprises a particle filter module as a positioning module, and the analysis module is configured to ascertain whether a third state is satisfied and, if the third state is satisfied, to output control instructions to cause the position data of the vehicle to be determined by means of the particle filter module, wherein preferably the coordinate-transformed data are provided to the particle filter module without clustering and/or Gaussian modeling for determining the position data of the vehicle.
10. The device according to claim 9, characterized in that the analysis module is configured to acquire vehicle state parameters and/or user control instructions, and to determine whether the first state, the second state, or the third state is satisfied based on the vehicle state parameters and/or the user control instructions, preferably, the vehicle state parameters include: GPS positioning data of the vehicle, data processing configuration of the vehicle, and/or vehicle speed.
11. The device of claim 10, wherein the analysis module is configured to determine whether the first state, the second state, or the third state is satisfied based on the GPS positioning data of the vehicle, and preferably to ascertain applicable legal provisions in the current geographic area based on the GPS positioning data of the vehicle.
12. A device for vehicle positioning, the device comprising:
one or more processors; and
one or more memories configured to store a series of computer-executable instructions and computer-accessible data associated with the series of computer-executable instructions,
wherein the series of computer-executable instructions, when executed by the one or more processors, cause the one or more processors to perform the method of any one of claims 1-6.
13. The device of claim 12, wherein the one or more processors comprise a GPU and a CPU, wherein at least the method steps of generating a point cloud map based on the coordinate-transformed data and determining position data of the vehicle based on the point cloud map are performed on the GPU, and wherein at least the method steps of clustering the coordinate-transformed data to obtain clustered data, Gaussian modeling the clustered data to obtain Gaussian modeled data, and determining position data of the vehicle based on the Gaussian modeled data are performed on the CPU.
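A sketch of the GPU/CPU split described in claim 13, assuming PyTorch with a CUDA-capable GPU is available for the point-cloud-map branch, while the Gaussian-modeling branch runs in plain NumPy on the CPU. The voxel downsampling shown is only one possible way to build the map on the GPU.

```python
import numpy as np
import torch  # assumption: PyTorch with a CUDA-capable GPU

def build_point_cloud_map_gpu(points_np, voxel_size=0.2):
    """Point-cloud-map branch on the GPU: voxel-downsample a world-frame scan."""
    pts = torch.as_tensor(points_np, dtype=torch.float32, device="cuda")
    voxels = torch.floor(pts / voxel_size).long()
    uniq, inv = torch.unique(voxels, dim=0, return_inverse=True)
    # One centroid per occupied voxel: accumulate coordinate sums, divide by point counts.
    sums = torch.zeros((uniq.shape[0], 3), device="cuda").index_add_(0, inv, pts)
    counts = torch.zeros(uniq.shape[0], device="cuda").index_add_(
        0, inv, torch.ones(pts.shape[0], device="cuda"))
    return (sums / counts.unsqueeze(1)).cpu().numpy()

def gaussian_model_cpu(clusters):
    """Gaussian-modeling branch on the CPU: fit (mean, covariance) per cluster."""
    return [(c.mean(axis=0), np.cov(c.T)) for c in clusters]
```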
14. A non-transitory computer-readable storage medium having stored thereon a series of computer-executable instructions that, when executed by one or more computing devices, cause the one or more computing devices to perform the method of any of claims 1-6.
15. A vehicle, characterized in that it comprises a device according to one of claims 7 to 13.
CN202010253143.9A 2020-04-02 2020-04-02 Method for vehicle positioning, device for positioning and vehicle Pending CN113494914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010253143.9A CN113494914A (en) 2020-04-02 2020-04-02 Method for vehicle positioning, device for positioning and vehicle

Publications (1)

Publication Number Publication Date
CN113494914A (en) 2021-10-12

Family

ID=77993142

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010253143.9A Pending CN113494914A (en) 2020-04-02 2020-04-02 Method for vehicle positioning, device for positioning and vehicle

Country Status (1)

Country Link
CN (1) CN113494914A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006112994A * 2004-10-18 2006-04-27 Japan Radio Co Ltd Positioning device and positioning method
CN101114018A * 2007-08-01 2008-01-30 上海华龙信息技术开发中心 Virtual positioning method for use when satellite positioning is disabled, and device used therefor
DE102017116360A1 * 2016-07-20 2018-01-25 Harman Becker Automotive Systems Gmbh Clustering of recordings of objects along streams for navigation-related processes
US20180158206A1 * 2016-12-02 2018-06-07 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for testing accuracy of high-precision map
CN109443369A * 2018-08-20 2019-03-08 北京主线科技有限公司 Method for constructing a dynamic and static grid map using lidar and a vision sensor
CN109543715A * 2018-10-23 2019-03-29 武汉理工大学 Method for ship route extraction and track deviation detection
CN109581302A * 2018-12-12 2019-04-05 北京润科通用技术有限公司 Vehicle-mounted radar data tracking method and system
US20190204092A1 * 2017-12-01 2019-07-04 DeepMap Inc. High definition map based localization optimization
CN110009741A * 2019-06-04 2019-07-12 奥特酷智能科技(南京)有限公司 Method for generating an environment point cloud map in Unity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination