CN110843792B - Method and apparatus for outputting information - Google Patents

Method and apparatus for outputting information

Info

Publication number
CN110843792B
Authority
CN
China
Prior art keywords
speed
speed information
current
information
quality
Prior art date
Legal status
Active
Application number
CN201911198018.6A
Other languages
Chinese (zh)
Other versions
CN110843792A (en)
Inventor
林坚
张晔
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201911198018.6A priority Critical patent/CN110843792B/en
Publication of CN110843792A publication Critical patent/CN110843792A/en
Application granted granted Critical
Publication of CN110843792B publication Critical patent/CN110843792B/en


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001 Details of the control system
    • B60W2050/0043 Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2050/0052 Filtering, filters
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2556/00 Input parameters relating to data
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W2556/55 External transmission of data to or from the vehicle using telemetry

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Embodiments of the present disclosure disclose a method and an apparatus for outputting information, which can be used in the field of automatic driving. One embodiment of the method comprises: in response to receiving current speed information of an obstacle observed by a sensor, scoring the quality of the current speed information to obtain a current quality score; acquiring the speed information and quality scores of the obstacle from different sensors within a past predetermined time, as stored in a cache queue; if speed information that meets a predetermined condition and has a quality score higher than the current quality score exists in the cache queue, replacing the current speed information with that speed information; and filtering according to the current speed information to obtain and output fused speed information of the obstacle. This implementation enables time-series verification within a single sensor and cross-verification among different sensors, achieving robustness to noise and complementary use of sensor strengths.

Description

Method and apparatus for outputting information
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to a method and an apparatus for outputting information.
Background
In the perception module of an automatic driving system, multi-sensor fusion typically serves as the final stage of perception: it receives the results from each sensor, fuses them, and outputs the fused result, thereby exploiting the complementary strengths of different sensors. The velocity of an obstacle is one of its most important attributes and directly affects downstream decision making. Only by fully exploiting the speed-estimation strengths of different sensors in different scenes can accurate and stable velocity estimates be output downstream.
Existing multi-sensor speed fusion is generally carried out under a single filtering framework, where the speeds from sensors of different origins are fused by modeling their noise variances; or, under an algorithmic framework similar to a GMM (Gaussian Mixture Model), the speeds of different sensors are modeled by a multi-Gaussian model and the fused speed is output through the mixture.
Under the filtering framework, since the inputs to multi-sensor fusion come from different sensors, the heterogeneous time series means the Markov assumption does not hold: there is no dependency between the current state and the state of the previous frame, so a motion model has difficulty accurately describing the temporal relationship. Heterogeneous observations also complicate observation modeling: the observation-noise dimensions given by different sensors are inconsistent, the motion-expression capabilities of different sensors differ across scenes, and expressing scene differences through changes in observation noise is unnatural and hard to model numerically and explicitly. In addition, when data-association quality is unstable, noise is easily introduced into the filter, and the abnormal observations caused by unstable data association are difficult to capture within a filtering framework.
In terms of complementing sensor capabilities, although a mixture-model framework can model the motion outputs of different sensors, such modeling does not fully exploit the complementary strengths among sensors: the output of each sensor is still viewed in isolation, rather than treating the multi-sensor observations as expressions of the same object's motion from different viewpoints.
In summary, current techniques struggle to model the motion-expression capability of each sensor at the level of individual scenes, tolerate noise poorly, and fail to fully exploit the complementary capabilities of different sensors.
Disclosure of Invention
Embodiments of the present disclosure propose methods and apparatuses for outputting information.
In a first aspect, an embodiment of the present disclosure provides a method for outputting information, including: in response to receiving current speed information of an obstacle observed by a sensor, scoring the quality of the current speed information to obtain a current quality score; acquiring the speed information and quality scores of the obstacle from different sensors within a past predetermined time, as stored in a cache queue; if speed information that meets a predetermined condition and has a quality score higher than the current quality score exists in the cache queue, replacing the current speed information with that speed information; and filtering according to the current speed information to obtain and output fused speed information of the obstacle.
In some embodiments, filtering according to the current speed information to obtain the fused speed information of the obstacle includes: calculating a final quality score according to the current speed information; and performing filtering and noise-variance modeling according to the current speed information and the final quality score to obtain the fused speed information of the obstacle.
In some embodiments, the method further comprises: storing the current speed information and the state of the current speed information into the cache queue, wherein the state comprises: converged, not converged, or abnormal; or storing the current speed information and the current quality score into the cache queue.
In some embodiments, the current speed information includes: a dynamic/static state, a speed magnitude, and a speed direction; and scoring the quality of the current speed information to obtain a current quality score comprises: acquiring scene information observed by the sensor; determining the state of the current speed information; acquiring a quality-score template of the sensor according to the scene information and the state; and looking up, in the quality-score template, the quality scores corresponding to the dynamic/static state, the speed magnitude, and the speed direction.
In some embodiments, determining the state of the current speed information comprises: determining the stability of the speed magnitude and the stability of the speed direction; if both the speed magnitude and the speed direction are stable, the state of the speed information is converged; if only one of them is stable, the state is not converged; and if both are unstable, the state is abnormal.
In some embodiments, if speed information that meets the predetermined condition and has a quality score higher than the current quality score exists in the cache queue, replacing the current speed information with that speed information includes: traversing the quality scores observed by different sensors in the cache queue and comparing them with the current quality score; and if a dynamic/static state, speed magnitude, or speed direction with a higher score is found in the cache queue and the predetermined condition is met, replacing the corresponding value in the current speed information with the higher-scored value, wherein the predetermined condition is that the angle between the speed directions of the two pieces of speed information is within a certain range, and the direction of change of the speed magnitude after replacement is consistent with the current direction of change of the speed magnitude.
In some embodiments, calculating a final quality score from the current speed information comprises: setting weights for the dynamic/static state, the speed magnitude, and the speed direction, respectively, wherein the weight of the dynamic/static state is greater than the weight of the speed magnitude and the weight of the speed direction, and the weight of the speed magnitude equals the weight of the speed direction; and calculating the weighted sum of the dynamic/static-state score, the speed-magnitude score, and the speed-direction score, and normalizing it to obtain the final quality score.
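The weighted normalization above can be sketched as follows. The concrete weight values are illustrative assumptions; the text only requires that the dynamic/static-state weight exceed the other two, which are equal.

```python
def final_quality_score(motion_score, magnitude_score, direction_score,
                        w_motion=0.5, w_magnitude=0.25, w_direction=0.25,
                        max_score=5.0):
    """Weighted, normalized final quality score in [0, 1].

    Weights are illustrative: the dynamic/static (motion-state) weight
    exceeds the equal magnitude and direction weights, as required above.
    """
    weighted = (w_motion * motion_score
                + w_magnitude * magnitude_score
                + w_direction * direction_score)
    # Normalize by the maximum attainable weighted sum.
    return weighted / ((w_motion + w_magnitude + w_direction) * max_score)
```

With all three scores at the template maximum of 5, the normalized result is 1.0; with all scores at 0, it is 0.0.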
In some embodiments, performing filtering and noise-variance modeling according to the updated current speed information and the final quality score includes: setting the noise variance of a filter according to the final quality score; selecting a filtering strategy for the filter according to the scene information observed by the sensor; and inputting the updated current speed information into the filter to obtain the fused speed information of the obstacle.
In a second aspect, an embodiment of the present disclosure provides an apparatus for outputting information, including: a scoring unit configured to, in response to receiving current speed information of an obstacle observed by a sensor, score the quality of the current speed information to obtain a current quality score; an acquisition unit configured to acquire the speed information and quality scores from different sensors within a past predetermined time, as stored in a cache queue; a refining unit configured to, if speed information that meets a predetermined condition and has a quality score higher than the current quality score exists in the cache queue, replace the current speed information with that speed information; and a filtering unit configured to filter according to the current speed information to obtain and output fused speed information of the obstacle.
In some embodiments, the filtering unit is further configured to: calculate a final quality score according to the current speed information; and perform filtering and noise-variance modeling according to the current speed information and the final quality score to obtain the fused speed information of the obstacle.
In some embodiments, the apparatus further comprises a storage unit configured to: store the current speed information and the state of the current speed information into the cache queue, wherein the state comprises: converged, not converged, or abnormal; or store the current speed information and the current quality score into the cache queue.
In some embodiments, the current speed information includes: a dynamic/static state, a speed magnitude, and a speed direction; and the scoring unit is further configured to: acquire scene information observed by the sensor; determine the state of the current speed information; acquire a quality-score template of the sensor according to the scene information and the state; and look up, in the quality-score template, the quality scores corresponding to the dynamic/static state, the speed magnitude, and the speed direction.
In some embodiments, the scoring unit is further configured to: determine the stability of the speed magnitude and the stability of the speed direction; if both the speed magnitude and the speed direction are stable, the state of the speed information is converged; if only one of them is stable, the state is not converged; and if both are unstable, the state is abnormal.
In some embodiments, the refining unit is further configured to: traverse the quality scores observed by different sensors in the cache queue and compare them with the current quality score; and if a dynamic/static state, speed magnitude, or speed direction with a higher score is found in the cache queue and the predetermined condition is met, replace the corresponding value in the current speed information with the higher-scored value, wherein the predetermined condition is that the angle between the speed directions of the two pieces of speed information is within a certain range, and the direction of change of the speed magnitude after replacement is consistent with the current direction of change of the speed magnitude.
In some embodiments, the computing unit is further configured to: set weights for the dynamic/static state, the speed magnitude, and the speed direction, respectively, wherein the weight of the dynamic/static state is greater than the weight of the speed magnitude and the weight of the speed direction, and the weight of the speed magnitude equals the weight of the speed direction; and calculate the weighted sum of the dynamic/static-state score, the speed-magnitude score, and the speed-direction score, and normalize it to obtain the final quality score.
In some embodiments, the filtering unit is further configured to: set the noise variance of a filter according to the final quality score; select a filtering strategy for the filter according to the scene information observed by the sensor; and input the updated current speed information into the filter to obtain the fused speed information of the obstacle.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; a storage device having one or more programs stored thereon which, when executed by one or more processors, cause the one or more processors to implement a method as in any one of the first aspects.
In a fourth aspect, embodiments of the disclosure provide a computer readable medium having a computer program stored thereon, wherein the program when executed by a processor implements a method as in any one of the first aspect.
The method and apparatus for outputting information provided by embodiments of the present disclosure offer a multi-sensor speed-estimation framework that is highly extensible and can flexibly accommodate different sensors; they enable time-series verification within a single sensor and cross-verification among different sensors, achieving robustness to noise and complementary use of sensor strengths.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram for one embodiment of a method for outputting information, according to the present disclosure;
FIG. 3 is a schematic diagram of one application scenario of a method for outputting information according to the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of a method for outputting information in accordance with the present disclosure;
FIG. 5 is a schematic block diagram illustrating one embodiment of an apparatus for outputting information according to the present disclosure;
FIG. 6 is a schematic block diagram of a computer system suitable for use with an electronic device implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein merely illustrate the relevant invention and do not restrict it. It should also be noted that, for ease of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the present method for outputting information or apparatus for outputting information may be applied.
As shown in fig. 1, the system architecture 100 may include an unmanned vehicle 101.
The driverless vehicle 101 may have mounted therein a drive control device 1011, a network 1012, a laser radar 1013, a millimeter wave radar 1014, and a camera 1015. Network 1012 is used to provide a medium for communication links between driving control device 1011, lidar 1013, millimeter-wave radar 1014, and camera 1015. Network 1012 may include various types of connections, such as wire, wireless communication links, or fiber optic cables, to name a few.
A driving control device (also referred to as an on-board brain) 1011 is responsible for the intelligent control of the unmanned vehicle 101. The driving control device 1011 may be a separately provided controller, such as a programmable logic controller (PLC), a single-chip microcomputer, or an industrial controller; a device composed of other electronic components having input/output ports and an operation-control function; or a computer device installed with a vehicle-driving-control application.
It should be noted that, in practice, at least one sensor, such as a camera, a gravity sensor, a wheel speed sensor, etc., may also be installed in the unmanned vehicle 101. In some cases, the unmanned vehicle 101 may further include GNSS (Global Navigation Satellite System) equipment, SINS (Strap-down Inertial Navigation System), and the like.
It should be noted that the method for outputting information provided in the embodiment of the present application is generally performed by the driving control device 1011, and accordingly, the apparatus for outputting information is generally provided in the driving control device 1011.
It should be understood that the number of drive control devices, networks, lidar, millimeter wave radar, and cameras in fig. 1 are merely illustrative. There may be any number of steering control devices, networks, lidar, millimeter wave radar, and cameras, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for outputting information in accordance with the present disclosure is shown. The method for outputting information comprises the following steps:
step 201, in response to receiving the current speed information of the obstacle observed by the sensor, scoring the quality of the current speed information to obtain a current quality score.
In this embodiment, the execution subject of the method for outputting information (e.g., the driving control device shown in fig. 1) may receive the current speed information of an obstacle as analyzed by various sensors. The sensors may include at least one of: a laser radar, a millimeter-wave radar, and a camera, the camera being used for visual motion observation. The speed information may include a dynamic/static state, a speed magnitude, a speed direction, and the like. The current quality score is obtained by scoring the quality of the current speed information. Quality can be evaluated by stability, for example whether the dynamic/static state changes within a predetermined time, whether the mean and variance of the speed magnitude lie within a predetermined range, and the degree of change of the speed direction. The quality can then be scored according to the degree of stability; for example, the variation interval of the speed information is determined empirically, a variation below 5% earns the highest score of 5 points, and a variation above 50% earns 0 points.
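The empirical thresholds above (variation below 5% scores 5, above 50% scores 0) can be sketched as a simple mapping; the linear interpolation between the two thresholds is an assumption for illustration, as the text does not specify intermediate behavior.

```python
def stability_score(change_ratio):
    """Map an observed variation ratio to a 0-5 quality score.

    Thresholds follow the empirical rule in the text: variation under 5%
    earns the top score of 5; variation over 50% earns 0. The linear
    interpolation in between is an illustrative assumption.
    """
    if change_ratio < 0.05:
        return 5.0
    if change_ratio > 0.50:
        return 0.0
    # Linearly interpolate between the two empirical thresholds.
    return 5.0 * (0.50 - change_ratio) / (0.50 - 0.05)
```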
In some optional implementations of this embodiment, quality may be scored by determining whether the state of the speed information converges: determine the stability of the speed magnitude and the stability of the speed direction; if both are stable, the state of the speed information is converged; if only one of them is stable, the state is not converged; and if both are unstable, the state is abnormal. The observed quality is thus split into three discrete states: converged, not converged, and abnormal.
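The three-state classification above reduces to a small decision rule; the state names are the ones used throughout this description.

```python
CONVERGED, NOT_CONVERGED, ABNORMAL = "converged", "not_converged", "abnormal"

def speed_state(magnitude_stable, direction_stable):
    """Classify a sensor's speed observation into one of three states."""
    if magnitude_stable and direction_stable:
        return CONVERGED
    if magnitude_stable or direction_stable:
        return NOT_CONVERGED
    return ABNORMAL
```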
Each state has a corresponding observation score map (a table of quality scores) used to score the dynamic/static state, the speed magnitude, and the speed direction. For example, within one scene, 3 sensors x 3 speed states give a total of 9 quality-score templates (score templates), which need only be stored in a C++ map structure; at query time, the sensor type plus the speed state serves as the key for looking up the corresponding quality scores. For the same sensor, the score maps of different states satisfy converged > not converged > abnormal. For example, for a laser radar at short range in the converged state, the quality score of the detected dynamic/static state is 5, the quality score of the speed magnitude is 5, and the quality score of the speed direction is 5; in the abnormal state, all quality scores are 0. Across different sensors in the same speed state, the relationship between score maps is obtained empirically (the empirical values are derived from benchmarks: by evaluating the motion errors of the sensors in different scenes, the relative motion-expression capability of the sensors in each scene can be obtained). The score maps for the different states of each sensor are written in a configuration file, so that if the motion-estimation capability of an upstream sensor changes, adaptation only requires modifying the score map.
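The text stores the 9 score templates of one scene in a C++ map keyed by sensor type plus speed state; the sketch below uses a Python dict with the same keying. Only the lidar short-range converged entry (5, 5, 5) and the all-zero abnormal entries come from the text; every other number is an illustrative assumption.

```python
# Hypothetical score-map templates for one scene, keyed by
# (sensor, speed state). Each value scores (dynamic/static state,
# speed magnitude, speed direction).
SCORE_MAPS = {
    ("lidar", "converged"):      (5, 5, 5),  # given in the text
    ("lidar", "not_converged"):  (3, 2, 2),  # assumed
    ("lidar", "abnormal"):       (0, 0, 0),
    ("radar", "converged"):      (4, 3, 4),  # assumed
    ("radar", "not_converged"):  (2, 1, 2),  # assumed
    ("radar", "abnormal"):       (0, 0, 0),
    ("camera", "converged"):     (4, 2, 3),  # assumed
    ("camera", "not_converged"): (2, 1, 1),  # assumed
    ("camera", "abnormal"):      (0, 0, 0),
}

def lookup_scores(sensor, state):
    """Query a score map using sensor type + speed state as the key."""
    return SCORE_MAPS[(sensor, state)]
```

Note that for each sensor the entries respect the required ordering converged > not converged > abnormal.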
Alternatively, stability may be judged by track_id, the identity of a detected obstacle, which does not change as long as the obstacle is continuously detected. In general, we expect the speed magnitude not to jump sharply and the speed direction not to jitter violently; a stable track_id (meaning high data-association quality) together with a smooth speed trend usually indicates convergence of the speed.
Alternatively, stability may be determined from the similarity between the trajectory of the unmanned vehicle and the trajectory of the obstacle: if the similarity is greater than a predetermined similarity threshold, stability is high; otherwise, it is low.
In some optional implementations of this embodiment, quality may also be scored in conjunction with scene information (e.g., turning, lane changing): acquire the scene information observed by the sensor; determine the state of the current speed information; acquire the sensor's quality-score template according to the scene information and the state; and look up, in the quality-score template, the quality scores corresponding to the dynamic/static state, the speed magnitude, and the speed direction. Each sensor's speed state can be judged both by the general method above and by scene-specific criteria, since each sensor's capability differs across scenes. For example, the speed direction from a laser radar is often unstable when the point cloud is sparse (common at medium and long range); the speed from a millimeter-wave radar shows a staircase characteristic when the host vehicle moves at low speed, its magnitude is not stable enough, and its accuracy is worse than when the host vehicle moves at constant speed; and vision estimates motion poorly in dense-occlusion scenes. With scene information, the speed state can be set to converged in scenes where a single sensor is verified to be relatively robust, and to other states in scenes where its motion-estimation capability is relatively poor, which both reduces the speed noise of a single sensor and mines its motion-estimation capability to the greatest extent.
Step 202: obtain the speed information and quality scores of the obstacle from different sensors within the past predetermined time, as stored in the cache queue.
In this embodiment, the different sensors may be different sensors of the same type or sensors of different types, all observing the same obstacle. The unmanned vehicle maintains a cache queue, which can store the states (converged, not converged, abnormal) of the speed information observed by each individual sensor within the past predetermined time (e.g., 0.5 seconds); the quality scores are then computed during traversal and comparison, a storage scheme that saves memory. Alternatively, the cache queue may store the speed information observed by each sensor within the past predetermined time together with the corresponding quality scores; this second scheme reduces the amount of computation.
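A minimal sketch of such a cache queue, pruning entries older than the 0.5-second window whenever a new observation arrives. The class and field names are assumptions for illustration.

```python
from collections import deque

WINDOW = 0.5  # seconds of history to retain, per the text

class ObservationCache:
    """Buffer of recent per-sensor observations with their states.

    Entries older than the window are pruned when a new observation
    arrives; timestamps are assumed monotonically non-decreasing.
    """
    def __init__(self):
        self._queue = deque()

    def push(self, timestamp, sensor, speed_info, state):
        self._queue.append((timestamp, sensor, speed_info, state))
        # Drop observations that have fallen out of the 0.5 s window.
        while self._queue and timestamp - self._queue[0][0] > WINDOW:
            self._queue.popleft()

    def entries(self):
        return list(self._queue)
```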
Step 203: if speed information that meets the predetermined condition and has a quality score higher than the current quality score exists in the cache queue, replace the current speed information with that speed information.
In this embodiment, the motion observation of the current frame is scored to obtain the current quality score; the quality scores of the observations from different sensors in the cache queue are then traversed (observations from the same sensor have already undergone quality evaluation via the convergence judgment) and compared with the current quality score. If a better observation is found in the cache queue (i.e., one of its quality scores exceeds the corresponding score of the current frame, for example its speed-magnitude score is greater than that of the current frame), that observation is used to try to refine the current observation. The refinement must satisfy certain constraints, for example that the angle between the two speed directions is within a certain range, and that the refined speed magnitude changes in a direction consistent with the current trend: if the current motion trend is acceleration, the refined speed may only be larger than the current speed, otherwise jitter would be introduced. If the refinement succeeds, one or more of the dynamic/static state, speed magnitude, and speed direction in the currently observed speed information are replaced with the values of the better observation, and after the cache queue has been traversed, the quality score of the current speed information is updated accordingly.
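The angle and trend constraints on the refinement can be sketched as follows for the speed-magnitude case. The 30-degree angle bound is an illustrative assumption; the text only says the angle must be "within a certain range".

```python
import math

MAX_ANGLE_DEG = 30.0  # assumed bound on the angle between the velocities

def try_refine(current, candidate, accelerating):
    """Return the candidate (vx, vy) velocity if the refine constraints
    hold, otherwise keep the current velocity.

    `accelerating` indicates whether the current motion trend is
    speeding up; the trend check follows the constraint described above.
    """
    dot = current[0] * candidate[0] + current[1] * candidate[1]
    norm = math.hypot(*current) * math.hypot(*candidate)
    if norm == 0.0:
        return current
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    if angle > MAX_ANGLE_DEG:
        return current  # directions disagree too much
    cur_speed, cand_speed = math.hypot(*current), math.hypot(*candidate)
    # The replacement must move the speed along the current trend.
    if accelerating and cand_speed < cur_speed:
        return current
    if not accelerating and cand_speed > cur_speed:
        return current
    return candidate
```

For example, while accelerating, a faster candidate in the same direction is accepted, but a slower one, or one at a wide angle, is rejected.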
Optionally, if no speed information that meets the predetermined condition and has a quality score higher than the current quality score exists in the cache queue, the current speed information is not replaced: the original speed information is retained and step 204 is entered.
Optionally, after the cache queue has been traversed, the current speed information and its state may be stored in the cache; alternatively, the current speed information and its quality score may be stored in the cache.
And step 204, filtering according to the current speed information to obtain and output the fusion speed information of the obstacle.
In the present embodiment, the filtering may be performed by a filter such as a Kalman filter. Multi-sensor information fusion with a Kalman filter is a common technique in the prior art, so a detailed description is omitted.
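As a minimal illustration of the filtering step, a scalar Kalman predict/update for a fused speed estimate might look as follows. The identity motion model and the noise values are illustrative assumptions; the actual filter would be multi-dimensional and fuse several sensors.

```python
def kalman_update(x_est, p_est, z, r, q=0.01):
    """One scalar Kalman predict/update step for a fused speed estimate.

    x_est, p_est: prior state and variance; z: new speed observation;
    r: observation noise variance; q: process noise (values illustrative).
    """
    # Predict: identity motion model, inflate variance by process noise.
    p_pred = p_est + q
    # Update: blend prediction and observation via the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_est + k * (z - x_est)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

Feeding a 12 m/s observation into a 10 m/s prior with equal variances pulls the estimate roughly halfway between the two and shrinks the variance.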
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for outputting information according to the present embodiment. In the application scenario of fig. 3, the time stamp of each sensor is actually different, i.e. the observations of the individual sensors do not enter the driving control device in parallel, but in series. For example:
1. At the current time t, the speed information observed by sensor A arrives and is recorded as At; the cache queue holds the speed information [Ct-5, At-4, Bt-3, Bt-2, Ct-1] together with the prediction state (convergence, non-convergence, abnormal) of each entry;
2. After At arrives, quality evaluation is performed (a history queue of sensor A's observations is stored in advance, including track id, speed, direction, dynamic and static state, scene, and the like) to obtain the observation state of At (convergence, non-convergence, abnormal);
3. After the observation state of At is obtained, refinement is performed: the refine state is initialized to the At observation value and the refine score map to the score map of At (obtained from the At observation state), and the queue [Ct-5, At-4, Bt-3, Bt-2, Ct-1] from step 1 is traversed (At-4 is not used for refinement, as it comes from the same sensor). For example, in the first comparison between Ct-1 and At, since the observation states of both are known, a set of score maps can be queried from the score map template, and the dynamic and static state, speed magnitude, and speed direction scores of Ct-1 and At are compared. If the speed-magnitude score of Ct-1 is greater than that of At, it is determined whether the refine condition (on direction and speed change) is satisfied; if so, the speed in the refine state is changed to the Ct-1 speed and the speed score in the refine score map to the Ct-1 speed score; if not, nothing is changed;
4. After Ct-1, Bt-2, Bt-3, and Ct-5 have been traversed in turn, a refine state and a refine score map are obtained, and a final belief score (the final quality score) is obtained by weighting the refine score map;
5. If At does not complete refinement (the refine condition is never satisfied), the refine state and refine score map are At's own state and score map;
6. Filtering and noise variance modeling are carried out on the current speed information and the final quality score to obtain the fusion speed information of the obstacle.
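Steps 2–5 above (initialize the refine state from At, skip same-sensor history, and replace per-component values when a cached observation scores higher and the refine condition holds) can be sketched as follows. The dict layout, component names, and the `can_refine` hook are assumptions for illustration only.

```python
def refine_observation(current, queue, can_refine):
    """Sketch of the refine traversal (names and structure are assumed).

    current / queue entries: dicts with 'sensor', per-component values, and
    a 'scores' dict; can_refine: stand-in for the angle/trend condition check.
    """
    # Refine state and refine score map start as At's own values (step 3).
    state = {k: current[k] for k in ("motion_state", "speed", "heading")}
    scores = dict(current["scores"])
    for obs in queue:                      # e.g. [Ct-1, Bt-2, Bt-3, Ct-5]
        if obs["sensor"] == current["sensor"]:
            continue                       # At-4 is skipped: same sensor as At
        for comp in ("motion_state", "speed", "heading"):
            # Replace a component only if the cached observation scores higher
            # on it AND the refine condition is satisfied.
            if obs["scores"][comp] > scores[comp] and can_refine(comp, state, obs):
                state[comp] = obs[comp]
                scores[comp] = obs["scores"][comp]
    return state, scores
```

A usage example: a sensor-B entry with a better speed score refines only the speed component, while a same-sensor entry is ignored regardless of its scores.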
The method provided by the above embodiment of the present disclosure performs quality modeling using the timing information and scene information of each single sensor, and then uses the short-time consistency of motion to compensate for the deficiencies of individual sensors. For example, when the current observation has a poor speed magnitude and heading but a better observation source exists in the short term, it is preferable not to update the motion model with the current observation: unless the noise of the current observation can be modeled well, the motion model would be biased by it. The disclosed method can replace the current poor observed speed and angle with the better observation's speed and angle, which to a certain extent is equivalent to not filtering on the current observation. Meanwhile, the observation quality score given by the method reflects the reliability of the observation: the higher the quality score, the more reliable the observation.
With further reference to fig. 4, a flow 400 of yet another embodiment of a method for outputting information is shown. The process 400 of the method for outputting information includes the steps of:
step 401, in response to receiving current speed information of an obstacle observed by a sensor, scoring the quality of the current speed information to obtain a current quality score.
Step 402, obtaining the speed information and quality scores of the obstacles from different sensors in the past predetermined time, which are stored in a buffer queue.
In step 403, if speed information meeting the predetermined condition and having a quality score higher than the current quality score exists in the cache queue, the speed information is used to replace the current speed information.
Steps 401 to 403 are substantially the same as steps 201 to 203, and therefore will not be described again.
Step 404, calculating a final quality score according to the current speed information.
In this embodiment, the final quality score is obtained by weighted normalization of the components of the quality score (dynamic and static state, speed magnitude, speed direction, etc.), where the weight of the dynamic and static state is larger than the weight of the speed magnitude and the weight of the speed direction; the weights of the speed magnitude and the speed direction may be equal. For example, the weights of the dynamic and static state, the speed magnitude, and the speed direction may be set to 0.5, 0.25, and 0.25, respectively. If the dynamic and static state is judged incorrectly, the speed magnitude and speed direction are unlikely to be judged correctly.
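A sketch of the weighted normalization, using the example weights 0.5/0.25/0.25 above; the component names and the assumption that each component score lies in [0, 1] are illustrative.

```python
def final_quality_score(scores, weights=(0.5, 0.25, 0.25)):
    """Weighted, normalized final quality score from the three components
    (dynamic/static state, speed magnitude, speed direction).

    scores: dict of component scores, assumed to lie in [0, 1];
    weights: example split from the text, state weight largest.
    """
    w_state, w_mag, w_dir = weights
    total = (w_state * scores["motion_state"]
             + w_mag * scores["speed"]
             + w_dir * scores["heading"])
    return total / sum(weights)  # normalize by the weight sum
```

With all components at 1.0 the score is 1.0; a correct dynamic/static state alone contributes half of the maximum, reflecting its larger weight.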
And 405, performing filtering and noise variance modeling according to the current speed information and the final quality score to obtain the fusion speed information of the obstacle.
In the present embodiment, the noise variance of the filter is set according to the final quality score; the filtering strategy of the filter is selected according to the scene information observed by the sensor; and the updated current speed information is input to the filter to obtain the fusion speed information of the obstacle. The noise variance modeling is related to the quality score: a higher quality score means a smaller noise variance.
Meanwhile, in order to improve speed stability, the motion scenes of obstacles are classified. In high-speed straight-ahead motion, due to motion inertia, a smoother filtering strategy is used; in low-speed scenes (often turning, U-turns, and the like), a more sensitive filtering strategy is adopted. By classifying scenes, speed changes in different scenes can be modeled more finely, making the motion estimation more accurate.
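The two scene-dependent choices — an observation-noise variance derived from the final quality score and a smoother versus more sensitive strategy — might be wired up as below. The inverse mapping and the concrete process-noise values are assumptions, not values from the patent.

```python
def configure_filter(final_score, scene, base_var=1.0):
    """Map the final quality score and scene label to filter settings.

    Higher quality score -> smaller observation-noise variance; high-speed
    straight scenes get a smoother strategy (small process noise), low-speed
    scenes a more responsive one. All numeric values are illustrative.
    """
    noise_var = base_var / max(final_score, 1e-3)  # higher score, lower noise
    if scene == "high_speed_straight":
        process_noise = 0.01   # smoother: trust the motion model more
    else:                      # e.g. low-speed turning / U-turn scenes
        process_noise = 0.1    # more sensitive: react faster to observations
    return noise_var, process_noise
```

A high-quality observation in a high-speed straight scene thus receives both a low observation-noise variance and a smooth filtering strategy.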
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the flow 400 of the method for outputting information in the present embodiment adds a step that expands the function of the filter. The scheme described in this embodiment can therefore incorporate more noise-variance-related data, thereby achieving robustness to noise.
The solution described in this embodiment can solve the following problems:
1. refined modeling of a scene
The motion estimation capability of a single sensor depends on the scene, and no sensor performs perfectly in all scenes, so finely modeling the scene and numerically reflecting a single sensor's motion performance is very important. The invention adopts a quality observation model to discretize the speed state into three states — convergence, non-convergence, and abnormal — and each state has a corresponding quality score template that describes the observation quality in multiple dimensions. By matching the scene with the speed state, the impact of the scene on the observation quality can be modeled, and downstream modules are influenced through the quality scores.
2. Tolerance to noise
The modeling of observation quality is completed through time-series analysis of data-association quality and analysis of the speed time-series state. Poor-quality observations (which can be regarded as noise) have the opportunity to be refined by better observations in the short-time cache queue and thereby participate in filtering; otherwise they enter filtering with a very low observation score, participating with very high observation noise or even not participating at all, thus achieving robustness to noise.
3. Multi-sensor capability complementation
Based on the refinement process, other heterogeneous observations within a short time window of the current observation can be found, the capability differences between observations are described by quality scores, and complementary advantages between different sensors are realized by refining low-quality-score observations with high-quality-score observations.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for outputting information, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for outputting information of the present embodiment includes: a scoring unit 501, an acquisition unit 502, a refining unit 503 and a filtering unit 504. The scoring unit 501 is configured to score the quality of the current speed information in response to receiving the current speed information of the obstacle observed by the sensor, so as to obtain a current quality score; an acquisition unit 502 configured to acquire speed information and quality scores from different sensors in a past predetermined time held in a buffer queue; a refining unit 503 configured to replace the current speed information with the speed information if speed information satisfying a predetermined condition and having a quality score higher than the current quality score exists in the cache queue; and a filtering unit 504 configured to perform filtering according to the current speed information, obtain and output fusion speed information of the obstacle.
In this embodiment, the specific processing of the scoring unit 501, the obtaining unit 502, the refining unit 503 and the filtering unit 504 of the apparatus 500 for outputting information may refer to step 201, step 202, step 203 and step 204 in the corresponding embodiment of fig. 2.
In some optional implementations of this embodiment, the filtering unit 504 is further configured to: calculate a final quality score according to the current speed information; and perform filtering and noise variance modeling according to the current speed information and the final quality score to obtain the fusion speed information of the obstacle.
In some optional implementations of this embodiment, the apparatus 500 further comprises a storage unit (not shown in the drawings) configured to: storing the current speed information and the state of the current speed information into a cache queue, wherein the state comprises the following steps: convergence, non-convergence, anomaly; or storing the current speed information and the current quality score into a buffer queue.
In some optional implementations of this embodiment, the current speed information includes: dynamic and static states, speed magnitude and speed direction; and the scoring unit 501 is further configured to: acquire scene information observed by the sensor; determine the state of the current speed information; acquire a quality score template of the sensor according to the scene information and the state; and find the quality scores corresponding to the dynamic and static states, the speed magnitude and the speed direction according to the quality score template.
In some optional implementations of this embodiment, the scoring unit 501 is further configured to: determine the stability of the speed magnitude and the stability of the speed direction; if both the speed magnitude and the speed direction are stable, the state of the speed information is convergence; if only one of the speed magnitude and the speed direction is stable, the state of the speed information is non-convergence; if both the speed magnitude and the speed direction are unstable, the state of the speed information is abnormal.
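The three-state discretization above can be sketched directly from the two stability flags (a hypothetical helper for illustration):

```python
def speed_state(mag_stable, dir_stable):
    """Discretize an observation into the three states described above,
    from the stability of the speed magnitude and the speed direction."""
    if mag_stable and dir_stable:
        return "convergence"       # both components stable
    if mag_stable or dir_stable:
        return "non-convergence"   # exactly one component stable
    return "abnormal"              # neither component stable
```

Each returned state would then select the corresponding quality score template.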
In some optional implementations of this embodiment, the refining unit 503 is further configured to: traverse the quality scores of observations from different sensors in the cache queue and compare them with the current quality score; and if a dynamic and static state, speed magnitude, or speed direction with a higher score is found in the cache queue and a predetermined condition is met, replace the corresponding value in the current speed information with the higher-scoring dynamic and static state, speed magnitude, or speed direction, where the predetermined condition is that the included angle between the speed directions of the two pieces of speed information is within a certain range, and the direction of change of the replaced speed magnitude is consistent with the direction of change of the current speed magnitude.
In some optional implementations of this embodiment, the filtering unit 504 is further configured to: weights are respectively set for the dynamic and static states, the speed magnitude and the speed direction, wherein the weight of the dynamic and static states is greater than the weight of the speed magnitude and the weight of the speed direction, and the weight of the speed magnitude is equal to the weight of the speed direction; and calculating the weighted sum of the scores of the dynamic and static states, the scores of the speed and the scores of the speed directions, and normalizing the weighted sum to be used as a final quality score.
In some optional implementations of this embodiment, the filtering unit 504 is further configured to: set the noise variance of the filter according to the final quality score; select the filtering strategy of the filter according to the scene information observed by the sensor; and input the updated current speed information into the filter to obtain the fusion speed information of the obstacle.
Referring now to FIG. 6, a schematic diagram of an electronic device (e.g., the driving control device of FIG. 1) 600 suitable for use in implementing embodiments of the present disclosure is shown. The driving control apparatus shown in fig. 6 is only an example, and should not bring any limitation to the functions and the range of use of the embodiment of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of embodiments of the present disclosure. It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to the fact that the current speed information of the obstacle observed by the sensor is received, scoring the quality of the current speed information to obtain a current quality score; acquiring speed information and quality scores from different sensors in past preset time stored in a cache queue; if speed information meeting the preset conditions and having a quality score higher than the current quality score exists in the cache queue, replacing the current speed information with the speed information; and filtering according to the current speed information to obtain and output the fusion speed information of the obstacle.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a scoring unit, an obtaining unit, a refining unit, and a filtering unit. The names of these units do not in some cases constitute a limitation on the unit itself; for example, the acquiring unit may also be described as a "unit that acquires speed information and quality scores of the obstacle from different sensors in a past predetermined time held in the buffer queue".
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is possible without departing from the inventive concept. For example, the above features and (but not limited to) the features disclosed in this disclosure having similar functions are replaced with each other to form the technical solution.

Claims (18)

1. A method for outputting information, comprising:
in response to receiving current speed information of an obstacle observed by a sensor, scoring the quality of the current speed information to obtain a current quality score;
acquiring speed information and quality scores of the obstacles from different sensors in past preset time and stored in a cache queue;
if speed information which meets a preset condition and has a quality score higher than the current quality score exists in the cache queue, replacing the current speed information with the speed information;
and filtering according to the current speed information to obtain and output the fusion speed information of the obstacle.
2. The method of claim 1, wherein the filtering according to the current speed information to obtain the fusion speed information of the obstacle comprises:
calculating a final quality score according to the current speed information;
and carrying out filtering and noise variance modeling according to the current speed information and the final quality score to obtain the fusion speed information of the obstacle.
3. The method according to claim 1 or 2, wherein the method further comprises:
storing the current speed information and the state of the current speed information into the cache queue, wherein the state comprises: convergence, non-convergence, anomaly; or
And storing the current speed information and the current quality fraction into the cache queue.
4. The method of claim 1 or 2, wherein the current speed information comprises: dynamic and static states, speed magnitude and speed direction; and
the scoring the quality of the current speed information to obtain a current quality score includes:
acquiring scene information observed by the sensor;
determining a state of the current speed information;
acquiring a quality score template of the sensor according to the scene information and the state;
and finding out quality scores corresponding to the dynamic and static states, the speed magnitude and the speed direction according to the quality score template.
5. The method of claim 4, wherein the determining the state of the current speed information comprises:
determining the stability of the speed magnitude and the stability of the speed direction;
if the speed magnitude and the speed direction are stable, the state of the speed information is convergence;
if only one of the speed and the speed direction is stable, the state of the speed information is not converged;
if the speed magnitude and the speed direction are unstable, the state of the speed information is abnormal.
6. The method of claim 4, wherein if speed information that meets a predetermined condition and has a quality score higher than the current quality score exists in the cache queue, replacing the current speed information with the speed information comprises:
traversing the quality scores observed by different sensors in the cache queue, and comparing the quality scores with the current quality scores;
and if a dynamic and static state, a speed magnitude, or a speed direction with a higher score is found in the cache queue and a predetermined condition is met, replacing the corresponding value in the current speed information with the higher-scoring dynamic and static state, speed magnitude, or speed direction, wherein the predetermined condition is that the included angle between the speed directions of the two pieces of speed information is within a certain range, and the direction of change of the replaced speed magnitude is consistent with the direction of change of the current speed magnitude.
7. The method of claim 2, wherein the calculating a final quality score according to the current speed information comprises:
weights are respectively set for the dynamic and static states, the speed magnitude and the speed direction, wherein the weight of the dynamic and static states is greater than the weight of the speed magnitude and the weight of the speed direction;
and calculating the weighted sum of the scores of the dynamic and static states, the score of the speed magnitude and the score of the speed direction, and normalizing the weighted sum to be used as a final quality score.
8. The method of claim 2, wherein the performing filtering and noise variance modeling according to the current speed information and the final quality score to obtain the fusion speed information of the obstacle comprises:
setting the noise variance of a filter according to the final quality score;
selecting a filtering strategy of a filter according to the scene information observed by the sensor;
and inputting the current speed information into the filter to obtain the fusion speed information of the obstacle.
9. An apparatus for outputting information, comprising:
the system comprises a scoring unit and a control unit, wherein the scoring unit is configured to score the quality of current speed information in response to receiving the current speed information of an obstacle observed by a sensor to obtain a current quality score;
an acquisition unit configured to acquire speed information and quality scores of the obstacles from different sensors in a past predetermined time held in a buffer queue;
a refining unit configured to replace the current speed information with speed information if the speed information meeting a predetermined condition and having a quality score higher than the current quality score exists in the cache queue;
and the filtering unit is configured to filter according to the current speed information to obtain and output the fusion speed information of the obstacle.
10. The apparatus of claim 9, wherein the filtering unit is further configured to:
calculating a final quality score according to the current speed information;
and carrying out filtering and noise variance modeling according to the current speed information and the final quality score to obtain the fusion speed information of the obstacle.
11. The apparatus of claim 9 or 10, wherein the apparatus further comprises a storage unit configured to:
storing the current speed information and the state of the current speed information into the cache queue, wherein the state comprises: convergence, non-convergence, anomaly; or
And storing the current speed information and the current quality fraction into the cache queue.
12. The apparatus of claim 9 or 10, wherein the current speed information comprises: dynamic and static states, speed magnitude and speed direction; and
the scoring unit is further configured to:
acquiring scene information observed by the sensor;
determining a state of the current speed information;
acquiring a quality score template of the sensor according to the scene information and the state;
and finding out quality scores corresponding to the dynamic and static states, the speed magnitude and the speed direction according to the quality score template.
13. The apparatus of claim 12, wherein the scoring unit is further configured to:
determining the stability of the speed magnitude and the stability of the speed direction;
if the speed magnitude and the speed direction are stable, the state of the speed information is convergence;
if only one of the speed and the speed direction is stable, the state of the speed information is not converged;
if the speed magnitude and the speed direction are unstable, the state of the speed information is abnormal.
14. The apparatus of claim 12, wherein the refining unit is further configured to:
traversing the quality scores observed by different sensors in the cache queue, and comparing the quality scores with the current quality scores;
and if a dynamic and static state, a speed magnitude, or a speed direction with a higher score is found in the cache queue and a predetermined condition is met, replacing the corresponding value in the current speed information with the higher-scoring dynamic and static state, speed magnitude, or speed direction, wherein the predetermined condition is that the included angle between the speed directions of the two pieces of speed information is within a certain range, and the direction of change of the replaced speed magnitude is consistent with the direction of change of the current speed magnitude.
15. The apparatus of claim 10, wherein the filtering unit is further configured to:
weights are respectively set for the dynamic and static states, the speed magnitude and the speed direction, wherein the weight of the dynamic and static states is greater than the weight of the speed magnitude and the weight of the speed direction;
and calculating the weighted sum of the scores of the dynamic and static states, the score of the speed magnitude and the score of the speed direction, and normalizing the weighted sum to be used as a final quality score.
16. The apparatus of claim 10, wherein the filtering unit is further configured to:
setting the noise variance of a filter according to the final quality score;
selecting a filtering strategy of a filter according to the scene information observed by the sensor;
and inputting the current speed information into the filter to obtain the fusion speed information of the obstacle.
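One natural reading of claim 16 is a Kalman-style update whose measurement-noise variance is derived from the final quality score, so that higher-quality observations pull the fused speed harder. The inverse score-to-noise mapping and all constants below are illustrative assumptions, shown for a single scalar speed component:

```python
def kalman_fuse_speed(prev_speed, prev_var, measured_speed, quality_score,
                      process_var=0.1, base_noise=2.0):
    """One scalar Kalman update with the measurement-noise variance set
    from the final quality score (higher score -> smaller noise variance).
    Returns the fused speed estimate and its updated variance."""
    r = base_noise / max(quality_score, 1e-3)   # noise variance from score
    var = prev_var + process_var                # predict step
    gain = var / (var + r)                      # Kalman gain
    fused = prev_speed + gain * (measured_speed - prev_speed)
    return fused, (1.0 - gain) * var
```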
17. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.
18. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-8.
CN201911198018.6A 2019-11-29 2019-11-29 Method and apparatus for outputting information Active CN110843792B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911198018.6A CN110843792B (en) 2019-11-29 2019-11-29 Method and apparatus for outputting information

Publications (2)

Publication Number Publication Date
CN110843792A CN110843792A (en) 2020-02-28
CN110843792B (en) 2021-05-25

Family

ID=69606144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911198018.6A Active CN110843792B (en) 2019-11-29 2019-11-29 Method and apparatus for outputting information

Country Status (1)

Country Link
CN (1) CN110843792B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111427037B (en) * 2020-03-18 2022-06-03 北京百度网讯科技有限公司 Obstacle detection method and device, electronic equipment and vehicle-end equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107015559A (en) * 2015-10-19 2017-08-04 Ford Global Technologies LLC Probabilistic inference for object tracking using hash-weighted integrals and sums
CN107161141A (en) * 2017-03-08 2017-09-15 Shenzhen Suteng Innovation Technology Co Ltd Driverless automobile system and automobile
CN108983213A (en) * 2018-09-07 2018-12-11 Baidu Online Network Technology (Beijing) Co Ltd Method, apparatus, device and storage medium for determining the stationary state of an obstacle
CN109635868A (en) * 2018-12-10 2019-04-16 Baidu Online Network Technology (Beijing) Co Ltd Method, apparatus, electronic device and storage medium for determining the category of an obstacle

Similar Documents

Publication Publication Date Title
US12017663B2 (en) Sensor aggregation framework for autonomous driving vehicles
CN108921200B (en) Method, apparatus, device and medium for classifying driving scene data
CN109937343B (en) Evaluation framework for prediction trajectories in automated driving vehicle traffic prediction
CN111919225B (en) Training, testing, and validating autonomous machines using a simulated environment
US10671068B1 (en) Shared sensor data across sensor processing pipelines
CN111240312B (en) Learning-based dynamic modeling method for automatically driven vehicles
US11704554B2 (en) Automated training data extraction method for dynamic models for autonomous driving vehicles
US11269329B2 (en) Dynamic model with learning based localization correction system
JP2021514885A (en) Feature extraction method based on deep learning used for LIDAR positioning of autonomous vehicles
KR20180049029A (en) Control error correction planning method for autonomous vehicle driving
CN112896191B (en) Track processing method and device, electronic equipment and computer readable medium
US10909377B2 (en) Tracking objects with multiple cues
CN112415558B (en) Processing method of travel track and related equipment
CN114179832B (en) Lane changing method for automatic driving vehicle
CN114758502A (en) Double-vehicle combined track prediction method and device, electronic equipment and automatic driving vehicle
CN110843792B (en) Method and apparatus for outputting information
CN113119999A (en) Method, apparatus, device, medium, and program product for determining automatic driving characteristics
Liu et al. Precise Positioning and Prediction System for Autonomous Driving Based on Generative Artificial Intelligence
CN114970112B (en) Method, device, electronic equipment and storage medium for automatic driving simulation
US11036225B2 (en) Method for evaluating localization system of autonomous driving vehicles
CN114394111B (en) Lane changing method for automatic driving vehicle
CN115675528A (en) Automatic driving method and vehicle based on similar scene mining
CN113327456A (en) Lane structure detection method and device
CN115583243B (en) Method for determining lane line information, vehicle control method, device and equipment
WO2022198590A1 (en) Calibration method and apparatus, intelligent driving system, and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant