CN114268787A - AR-HUD-based delay compensation method, device, equipment and storage medium - Google Patents


Info

Publication number: CN114268787A (granted publication: CN114268787B)
Application number: CN202111566281.3A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: projection, time, delay, motion, projection object
Inventors: 佘明钢, 张鑫, 康书芳, 刘征征, 牛津, 李军华, 王玉龙
Applicant/Assignee: Dongsoft Group Dalian Co ltd; Neusoft Corp
Legal status: Granted; Active
Classifications: Navigation; Traffic Control Systems
Abstract

The application provides an AR-HUD-based delay compensation method, device, equipment, and storage medium. The method comprises the following steps: determining the projection delay incurred by a projection object in the projection processing flow of an augmented reality head-up display (AR-HUD); performing delay compensation on historical motion data of the projection object using the projection delay, and predicting a motion trajectory of the projection object; and determining, according to the motion trajectory, the projection position of the projection object at an expected projection time. The method analyzes in advance the projection position that the projection object will occupy at a future projection time, so that when the expected projection time arrives, the virtual image of the projection object can be displayed immediately at the pre-determined projection position, without computing the projection position at that moment. On the basis of an accurately determined motion trajectory, this avoids the projection offset caused by the projection object moving during the position-calculation stage, and improves how closely the projected virtual image fits the real object in the AR-HUD.

Description

AR-HUD-based delay compensation method, device, equipment and storage medium
Technical Field
Embodiments of the present application relate to the technical field of data processing, and in particular to an AR-HUD-based delay compensation method, device, equipment, and storage medium.
Background
With the rapid development of Augmented Reality (AR) and Head-Up Display (HUD) technologies, existing AR-HUD products can be used in a vehicle: information about various projection objects is combined with vehicle information and surrounding-environment information to form corresponding virtual images, and the corresponding projection positions are calculated so that the virtual images can be projected in front of the driver, allowing the driver to concentrate on driving.
However, from the moment the AR-HUD starts collecting the driving-related information of a projection object, through generating the corresponding virtual image, to finally calculating the projection position and executing the actual projection operation, a certain processing delay accumulates. As a result, by the time the AR-HUD executes the actual projection operation, the driving-related information of the projection object at the actual projection time has already changed, so the projection of the object is offset and the virtual image no longer fits the object's actual position.
Disclosure of Invention
The application provides an AR-HUD-based delay compensation method, device, equipment, and storage medium. By analyzing the projection delay that a projection object incurs in the projection processing flow and applying delay compensation to the projection position of the projection object, the method guarantees the projection accuracy of the projection object in the AR-HUD, prevents the projection position from deviating due to the projection delay, and improves how closely the projected virtual image of the projection object fits its actual position.
In a first aspect, an embodiment of the present application provides an AR-HUD-based delay compensation method, where the method includes:
determining the projection delay incurred by a projection object in the projection processing flow of an augmented reality head-up display (AR-HUD);
performing delay compensation on historical motion data of the projection object using the projection delay, and predicting a motion trajectory of the projection object;
and determining, according to the motion trajectory, the projection position of the projection object at an expected projection time.
Further, the projection delay includes at least a motion acquisition delay and a position calculation delay of the projection object.
Further, the performing delay compensation on the historical motion data of the projection object using the projection delay and predicting the motion trajectory of the projection object includes:
determining the data reception time corresponding to the historical motion data of the projection object;
performing delay compensation on the data reception time using the motion acquisition delay to obtain the actual data acquisition time corresponding to the historical motion data;
and predicting the motion trajectory of the projection object according to the data acquisition times and the historical motion data.
Further, before determining the projection position of the projection object at the expected projection time according to the motion trajectory, the method further includes:
performing delay compensation on the projection processing time at which the projection object is currently located, using the position calculation delay, to obtain the corresponding expected projection time.
Further, the determining the projection delay incurred by the projection object in the projection processing flow of the augmented reality head-up display (AR-HUD) includes:
estimating the projection delay of the projection object at an initial projection time;
continuously correcting the projection delay at the earlier of two adjacent projection times, using the difference between the historical motion data acquired at the later projection time and the target motion data determined at the earlier projection time, to obtain the projection delay at the later projection time;
where the target motion data is the projection position of the projection object at the later projection time, determined after delay compensation using the projection delay at the earlier projection time.
Further, the predicting the motion trajectory of the projection object includes:
performing delay compensation on the historical motion data of the projection object using the projection delay, and determining a historical motion trajectory of the projection object;
and determining, according to the historical motion trajectory and the historical travel trajectory of the host vehicle, the relative motion trajectory of the projection object under the reference position origin of the host vehicle as the motion trajectory of the projection object.
Further, the determining the projection position of the projection object at the expected projection time according to the motion trajectory includes:
predicting, according to the relative motion trajectory, the relative position information of the projection object under the reference position origin at the expected projection time;
and taking the position of the host vehicle at the expected projection time as a new reference position origin, and compensating the relative position information to obtain the projection position of the projection object at the expected projection time.
Further, the historical movement trajectory of the projection object and the historical travel trajectory of the host vehicle are determined using at least one of the following sensing devices:
a vehicle sensor, a vehicle speed sensor, a radar, a camera, an accelerometer, a gyroscope sensor, a Global Positioning System (GPS) module, and a Dead Reckoning (DR) sensor.
In a second aspect, an embodiment of the present application provides an AR-HUD based delay compensation apparatus, including:
the projection delay determining module is used for determining the projection delay incurred by a projection object in the projection processing flow of the augmented reality head-up display (AR-HUD);
the delay compensation module is used for performing delay compensation on historical motion data of the projection object using the projection delay, and predicting a motion trajectory of the projection object;
and the projection position determining module is used for determining, according to the motion trajectory, the projection position of the projection object at an expected projection time.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a processor and a memory, the memory being configured to store a computer program, and the processor being configured to invoke and execute the computer program stored in the memory to perform the AR-HUD-based delay compensation method provided in the first aspect of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium for storing a computer program, where the computer program causes a computer to execute the AR-HUD based delay compensation method as provided in the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, comprising computer program/instructions, wherein the computer program/instructions, when executed by a processor, implement the AR-HUD based delay compensation method as provided in the first aspect of the present application.
The AR-HUD-based delay compensation method, device, equipment, and storage medium provided by the embodiments of the present application first determine the projection delay incurred by a projection object in the projection processing flow of the AR-HUD, and use that projection delay to perform delay compensation on the historical motion data of the projection object before predicting its motion trajectory. The predicted motion trajectory is therefore closer to the real motion trajectory, which ensures the prediction accuracy. The projection position of the projection object at a future expected projection time is then determined from this trajectory, so the projection position at a future projection time is analyzed in advance; when the expected projection time arrives, the virtual image of the projection object can be displayed immediately at the pre-determined projection position, without computing the projection position at that moment. On the basis of an accurate motion trajectory, this further avoids the projection deviation caused by the projection object moving during the projection-calculation stage, and improves how closely the projected virtual image fits the real object in the AR-HUD.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a flowchart illustrating an AR-HUD based delay compensation method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating a projection processing flow of a projection object according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating another AR-HUD-based delay compensation method according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a process of converting the relative position between the host vehicle and the projection target according to an embodiment of the present application;
FIG. 5 is a schematic block diagram of an AR-HUD-based delay compensation device according to an embodiment of the present application;
fig. 6 is a schematic block diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the above drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the invention described herein can be implemented in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
During the driving of a vehicle, when the AR-HUD projects various driving references in the surrounding environment, the projection objects keep moving while the projection processing is delayed, which causes projection deviation. To solve this problem, the embodiments of the present application design a scheme that uses the projection delay present in the projection processing flow of a projection object to apply delay compensation to the projection position of that object. Delay compensation is performed on the historical motion data of the projection object using the projection delay in its projection processing flow, so that the motion trajectory of the projection object can be predicted more accurately, and the projection position of the projection object at a future expected projection time is then determined on that trajectory. The projection position at a future projection time is thereby analyzed in advance, so that when the expected projection time arrives, the virtual image of the projection object can be displayed immediately at the pre-determined projection position, without computing the projection position at that moment. On the basis of an accurate motion trajectory, this further avoids the projection deviation caused by the projection object moving during the projection-calculation stage, and improves how closely the projected virtual image fits the real object in the AR-HUD.
Fig. 1 is a flowchart illustrating an AR-HUD-based delay compensation method according to an embodiment of the present application. Referring to fig. 1, the method may specifically include the following steps:
and S110, determining the projection time delay of the projection object in the AR-HUD in the projection processing flow.
Specifically, the AR-HUD is a head-up display implemented with AR technology. During the driving of the vehicle, it combines the driving information of the host vehicle with the surrounding driving environment to generate a virtual image containing various driving-related information, and projects that virtual image onto a corresponding position in front of the driver, so that the driver does not need to look away from the road while driving, which ensures driving safety. The projection objects in the present application are mainly the driving references specified in the AR-HUD that need to be displayed at positions relative to the host vehicle in the surrounding driving environment, including but not limited to a preceding vehicle, pedestrians, obstacles, lane lines, a navigation destination, intersection guide points, and road guide lines encountered while the host vehicle is driving.
When the AR-HUD projects the virtual image of each projection object during driving, the projection processing flow adopted, as shown in fig. 2, is mainly as follows: first, through the various kinds of data interaction detection between the host vehicle and each projection object during driving, such as radar, cameras, or interaction devices for navigation analysis, the position information of the projection object over the host vehicle's historical driving process is collected, so as to analyze the motion trajectory of the projection object and generate its virtual image; second, the projection position at the current time is calculated from the motion trajectories of the host vehicle and the projection object; and third, the actual projection operation is performed at that projection position.
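To make the three stages concrete, the following minimal Python sketch (our illustration, not code from the patent; names such as Sample and collect_positions are assumptions) walks through the flow once:

    from dataclasses import dataclass

    @dataclass
    class Sample:
        t: float  # timestamp at which the host vehicle received the data (s)
        x: float  # projection-object position in world coordinates (m)
        y: float

    def collect_positions():
        # Stage 1: gather historical positions of the projection object via
        # radar, camera, or navigation analysis. Receiving the data incurs
        # the motion acquisition delay (dt1, discussed below).
        return [Sample(0.0, 10.0, 0.0), Sample(0.1, 10.5, 0.1)]

    def compute_projection_position(samples):
        # Stage 2: analyze the motion trajectory and compute the projection
        # position; time spent here is the position calculation delay (dt2).
        latest = samples[-1]
        return (latest.x, latest.y)

    def project(pos):
        # Stage 3: the actual projection operation onto the windshield.
        print(f"projecting virtual image at {pos}")

    project(compute_projection_position(collect_positions()))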
Here, it should be considered that in this projection processing flow the actual projection operation is performed only after the motion data acquisition and position calculation steps have been completed, so a certain projection delay is inevitably generated.
Therefore, before the actual projection operation is performed on the projection object in the AR-HUD, the processing complexity and computation overhead of the projection object in the projection processing flow need to be analyzed first, so as to calculate the projection delay present in the projection processing flow of the projection object.
As an optional implementation in the embodiment of the present application, the projection delay may be determined in either of the following two ways:
under the first condition, a large amount of historical projection data of the projection operation of the AR-HUD in the driving process of the vehicle is acquired, the projection difference between the actual motion position and the projection position of each projection object at different historical moments is continuously analyzed, so that the projection processing performance of the AR-HUD on the projection object is judged, the projection delay of the projection object in the AR-HUD is estimated in advance by adopting an estimation method, and the projection delay of the projection object in the projection processing flow can be directly acquired in the actual projection processing flow of the projection object subsequently.
In the second case, since the host vehicle and the projection object usually exhibit different processing differences during motion, the projection delay of the projection object also varies somewhat over time. Therefore, after the projection delay of the projection object in the AR-HUD has been estimated, that estimate can be used as the projection delay at the initial projection time. Starting from the initial projection time, at each projection time the projection delay at that time is used to perform delay compensation on the motion trajectory of the projection object, so that the projection position of the projection object at any projection time can be determined from the trajectory. To ensure the accuracy of the projection delay at each projection time, the present application continuously corrects the projection delay of the previous projection time before each calculation of the projection position, so as to obtain a more accurate projection delay for the current projection time. Specifically, the difference between the historical motion data acquired at the later of two adjacent projection times and the target motion data determined at the earlier projection time is continuously used to correct the projection delay at the earlier projection time, and the result is used as the projection delay at the later projection time. The target motion data determined at the earlier projection time is the projection position of the projection object at the later projection time, determined by performing delay compensation with the projection delay of the earlier projection time.
Taking two adjacent projection times as an example: when the projection object is projected at the later of the two times, its historical motion data at that time is first acquired. The actual projection operation at the earlier projection time has already been completed; when that projection was performed, the projection delay corrected at the earlier time was used to compensate the motion trajectory of the projection object and to determine the projection position of the object at the later projection time. This is the target motion data in the present application, which represents the motion position of the projection object at the later projection time. Therefore, when the projection object is projected at the later projection time, analyzing the difference between the historical motion data acquired at the later time and the target motion data determined at the earlier time reveals whether the projection position obtained after delay compensation with the earlier projection delay was accurate, and the projection delay of the earlier time is corrected accordingly to serve as the projection delay of the later time. For example, when the deviation between the two is within a preset reasonable range, the projection delay at the earlier projection time is reliable, and it is used directly as the projection delay at the later projection time; when the deviation exceeds the preset range, the projection delay at the earlier time is unreliable and needs to be corrected so as to reduce the deviation, yielding the projection delay at the later time. Before the projection position is calculated at each projection time, the corresponding pair of adjacent projection times is determined and the same steps are executed, so that the projection delay is corrected at every projection time and its real-time accuracy is guaranteed.
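A minimal Python sketch of this correction rule (our interpretation; the patent gives no concrete update formula, so the proportional conversion of position error to time error below, and names such as update_projection_delay, are assumptions):

    def update_projection_delay(prev_delay, observed_pos, predicted_pos,
                                rel_speed, tol=0.05, gain=0.5):
        """Correct the projection delay used at the earlier projection time.

        observed_pos: position (m) measured at the later projection time.
        predicted_pos: the "target motion data" predicted for that time at
            the earlier projection time, after delay compensation.
        rel_speed: relative speed (m/s) of the projection object, used to
            map a position deviation to a time deviation (our assumption).
        """
        deviation = observed_pos - predicted_pos
        if abs(deviation) <= tol or rel_speed == 0.0:
            # Deviation within the preset reasonable range: the earlier
            # delay is still reliable and is reused as-is.
            return prev_delay
        # Otherwise correct the delay in the direction that shrinks the
        # deviation, and use the result at the later projection time.
        return prev_delay + gain * deviation / rel_speed

    # e.g. predicted 12.0 m but observed 12.4 m at 10 m/s relative speed:
    print(update_projection_delay(0.08, 12.4, 12.0, 10.0))  # approx. 0.1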
S120: performing delay compensation on the historical motion data of the projection object using the projection delay, and predicting the motion trajectory of the projection object.
After the projection delay in the projection processing flow has been determined, note that the projection position of the projection object at each time is usually determined by predicting the motion position of the object at that time; therefore, before the projection position is calculated, the accuracy of the motion trajectory of the projection object must first be ensured.
In the projection processing flow of the projection object, at least one sensing device among a vehicle sensor, a vehicle speed sensor, a radar, a camera, an accelerometer, a gyroscope sensor, a global positioning system (GPS) module, and a dead reckoning (DR) sensor mounted on the host vehicle and the projection object is used, and through the interactive calibration of various data between the host vehicle and the projection object, the historical motion data of the projection object at each historical motion time can be acquired. The reception time of the historical motion data at each historical time can then be delay-compensated with the projection delay of the projection object to obtain the actual acquisition time of each piece of historical motion data. A more accurate motion trajectory of the projection object can then be obtained by curve fitting over the historical motion data and their actual acquisition times.
It should be noted that, as shown in fig. 2, the projection delay in the present application includes at least the motion acquisition delay Δt1 and the position calculation delay Δt2 of the projection object.
When data interaction detection is performed between the host vehicle and the projection object, the motion trajectory of the projection object is generally analyzed by taking the time at which the host vehicle receives the relevant motion data of the projection object as the acquisition time of that historical data. A certain transmission delay occurs when the projection object transmits its motion data to the host vehicle; this is the motion acquisition delay in the present application.
Furthermore, each time the AR-HUD calculates the projection position of the projection object at the current time, the interval from the current time until the actual projection operation is executed after the position has been computed, that is, the computation overhead of the projection position, constitutes the position calculation delay.
Therefore, in the present application, performing delay compensation on the historical motion data of the projection object using the projection delay may specifically be: determining the data reception time corresponding to the historical motion data of the projection object; performing delay compensation on that reception time using the motion acquisition delay to obtain the actual data acquisition time corresponding to the historical motion data; and predicting the motion trajectory of the projection object according to the data acquisition times and the historical motion data. That is, the time at which each piece of historical motion data of the projection object is received is taken as its nominal acquisition time; subtracting the motion acquisition delay from that time directly yields the actual acquisition time of each piece of historical motion data, and curve fitting over the historical motion data at these acquisition times subsequently yields the motion trajectory of the projection object.
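The timestamp compensation and curve fit might look like the following sketch (our illustration under assumptions: numpy polynomial fitting stands in for the unspecified curve-fitting method, degree 2 is arbitrary, and the sample values are hypothetical):

    import numpy as np

    def compensate_and_fit(recv_times, xs, ys, dt1, degree=2):
        """Shift receive timestamps back by the motion acquisition delay
        dt1 to recover the actual acquisition times, then fit x(t), y(t)."""
        t = np.asarray(recv_times, dtype=float) - dt1
        fx = np.polynomial.Polynomial.fit(t, xs, degree)
        fy = np.polynomial.Polynomial.fit(t, ys, degree)
        return fx, fy

    # Hypothetical samples: receive times (s), positions (m), dt1 = 30 ms.
    fx, fy = compensate_and_fit([0.00, 0.10, 0.20, 0.30],
                                [10.0, 10.5, 11.1, 11.8],
                                [0.00, 0.05, 0.09, 0.12],
                                dt1=0.03)
    print(fx(0.40), fy(0.40))  # trajectory evaluated at a later time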
S130: determining, according to the motion trajectory, the projection position of the projection object at the expected projection time.
Considering that a certain calculation delay always exists when the projection position of the projection object is calculated at the current time, once the motion trajectory of the projection object has been predicted, the projection position at a future expected projection time can be determined in advance from that trajectory. Since the projection position for each projection time is thus calculated ahead of time, the actual projection operation can be executed directly and quickly when the expected projection time arrives, with no position calculation needed at that moment; this solves the problem that the projection deviates because the projection object moves within the delay generated by calculating the projection position.
Further, since there is always some chance that the projection object will not move along the predicted motion trajectory, the expected projection time in the present application should, to ensure projection accuracy, not be too far from the current time. That is, each advance determination of the projection position covers only the near future, a range close enough that the projection object will not move abnormally off the predicted trajectory.
For example, the present application may perform delay compensation on the projection processing time at which the projection object is currently located using the position calculation delay, so as to obtain the corresponding expected projection time. That is, the expected projection time used for each advance determination of the projection position is obtained by directly adding the position calculation delay to the projection processing time at which the projection object is currently located.
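In code this step is trivial; the following lines continue the sketch above (all values illustrative):

    def expected_projection_time(now, dt2):
        # expected projection time = current projection processing time
        # plus the position calculation delay dt2
        return now + dt2

    t_exp = expected_projection_time(now=0.35, dt2=0.05)
    # Evaluating the pre-fitted trajectory at t_exp fixes the projection
    # position before the expected projection time actually arrives.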
In the technical solution provided by this embodiment of the application, the projection delay incurred by the projection object in the projection processing flow of the AR-HUD is first determined, and delay compensation is performed on the historical motion data of the projection object using that delay before the motion trajectory is predicted. The predicted motion trajectory is therefore closer to the real motion trajectory, ensuring the prediction accuracy. The projection position of the projection object at a future expected projection time is then determined from the trajectory, so the projection position at a future projection time is analyzed in advance; when the expected projection time arrives, the virtual image of the projection object can be displayed immediately at the pre-determined projection position, without computing the projection position at that moment. On the basis of an accurate motion trajectory, this further avoids the projection deviation caused by the projection object moving during the projection-calculation stage, and improves how closely the projected virtual image fits the real object in the AR-HUD.
Further, the host vehicle may continuously be in an irregular driving state, and generating the virtual image itself takes a certain time, so by the time the virtual image is actually projected, the projection position may have shifted because of the vehicle's motion and the virtual image of the projection object may no longer fit the actual position. Moreover, when the AR-HUD displays the virtual images of the projection objects in front of the host vehicle, those virtual images are viewed from the perspective of the host vehicle's real-time traveling state. Therefore, to solve the projection deviation caused by the irregular motion of the host vehicle when the AR-HUD projects the driving references in the surrounding environment, the motion trajectory predicted in the present application can be the movement of the projection object relative to the host vehicle: the trajectory predicted in the world coordinate system is converted into a coordinate system relative to the host vehicle's real-time traveling state, separating the relative motion trajectory of the projection object from the travel trajectory of the host vehicle. This enables real-time projection compensation of the projection object under the host vehicle's motion, preventing the projection position from deviating due to that motion.
Fig. 3 is a flowchart illustrating another delay compensation method based on AR-HUD according to an embodiment of the present application. As shown in fig. 3, the method may specifically include the following steps:
S310: determining the projection delay incurred by the projection object in the projection processing flow of the AR-HUD.
S320: performing delay compensation on the historical motion data of the projection object using the projection delay, and determining the historical motion trajectory of the projection object.
Optionally, through the various kinds of data interaction detection between the host vehicle and each projection object during driving, such as radar, cameras, or interaction devices for navigation analysis, the position information of each projection object over the host vehicle's historical driving process is first acquired as the historical motion data of the projection object in the present application. The reception time of the historical motion data at each historical time can then be delay-compensated with the projection delay of the projection object to obtain the actual acquisition time of each piece of historical motion data. A more accurate historical motion trajectory of the projection object can then be obtained by curve fitting over the historical motion data and their actual acquisition times.
S330: determining, according to the historical motion trajectory and the historical travel trajectory of the host vehicle, the relative motion trajectory of the projection object under the reference position origin of the host vehicle as the motion trajectory of the projection object.
During driving, the current travel position of the host vehicle can be determined in real time through the vehicle sensor, vehicle speed sensor, radar, camera, accelerometer, gyroscope sensor, global positioning system (GPS) module, dead reckoning (DR) sensor, and the like mounted on the host vehicle, so as to obtain the historical travel trajectory of the host vehicle.
Here, the historical motion trajectory of the projection object is a trajectory determined in the world coordinate system. While the host vehicle travels, the projection object may stay at a fixed position in the world coordinate system, as with fixed facilities such as obstacles around the host vehicle or intersection guide points, or it may move constantly, as with a preceding vehicle or a pedestrian; the historical motion trajectory can therefore exist in various states. What must be analyzed is the motion state of the projection object relative to the host vehicle. Note that the relative motion position of the projection object at a given historical time changes with whichever of the host vehicle's travel positions is chosen as reference: if the travel position of the host vehicle at time t1 is set as the origin of the relative motion analysis, the relative motion positions of the projection object differ from those obtained with the travel position at time t2 as origin, so the relative motion trajectories of the projection object determined against the travel positions of the host vehicle at t1 and at t2 also differ. Therefore, to analyze the relative motion state of the projection object uniquely, the travel position of the host vehicle at some historical time may be taken as the reference position origin of the host vehicle; the motion position of the projection object at each historical time in its historical motion trajectory is then converted into a relative motion position with respect to this reference position origin, and combining the relative motion positions determined at all the historical times yields the relative motion trajectory of the projection object under the reference position origin of the host vehicle, from which the subsequent motion state of the projection object under that origin can be predicted.
The historical motion trajectory of the projection object and the historical travel trajectory of the host vehicle are determined using at least one of the following sensing devices: a vehicle sensor, a vehicle speed sensor, a radar, a camera, an accelerometer, a gyroscope sensor, a Global Positioning System (GPS) module, and a Dead Reckoning (DR) sensor.
For example, the reference position origin of the host vehicle in the present application may be determined as follows: a historical travel position of the host vehicle at a preset historical time is screened out of the historical travel trajectory of the host vehicle, and that historical travel position is used as the reference position origin of the host vehicle.
Then, by converting the historical travel trajectory of the host vehicle under its reference position origin, that is, converting the historical travel trajectory from the world coordinate system into the relative coordinate system represented by the reference position origin, the position conversion relationship at each historical time can be determined from the host vehicle's position information before and after conversion at that time. The historical motion trajectory of the projection object can then be converted relative to the host vehicle using these per-time position conversion relationships.
For example, the historical travel positions at the historical times in the historical travel trajectory of the host vehicle may be expressed as t1: v1(x1, y1), t2: v2(x2, y2), t3: v3(x3, y3), and so on. After t1: v1(x1, y1) is taken as the reference position origin of the host vehicle, the relative travel positions of the host vehicle under that origin at the historical times may be expressed as t1: Rv1(0, 0), t2: Rv2(Rx2, Ry2), t3: Rv3(Rx3, Ry3), and so on. Then, using the historical travel position of the host vehicle at each historical time together with its converted relative travel position at that time, the position conversion relationship at each historical time can be obtained, for example the coordinate-system conversion matrices M1, M2, M3.
Taking the preceding-vehicle warning scenario as an example, the projection objects in the present application may be the preceding vehicles encountered while the host vehicle is driving. The historical travel positions at the historical times in the historical motion trajectory of a preceding vehicle may be obtained as t1: p1(x1, y1), t2: p2(x2, y2), t3: p3(x3, y3), and so on, along with the historical travel positions at the historical times in the historical travel trajectory of the host vehicle, t1: v1(x1, y1), t2: v2(x2, y2), t3: v3(x3, y3), and so on.
Then, as shown in fig. 4, with t1: v1(x1, y1) as the reference position origin, the relative travel positions of the host vehicle under that origin, t1: Rv1(0, 0), t2: Rv2(Rx2, Ry2), t3: Rv3(Rx3, Ry3), and so on, can be obtained, creating the relative coordinate system represented by the reference position origin of the host vehicle. The position conversion relationships M1, M2, M3, and so on, for the historical times can then be obtained using the travel position information before and after the host vehicle is converted from world coordinates into the relative coordinate system at each historical time.
Then, according to the position conversion relationship at each historical time, the historical motion trajectory of the projection object is converted under the reference position origin of the host vehicle, yielding the relative motion trajectory of the projection object under that origin. That is, the historical motion position of the projection object at each historical time is first determined from its historical motion trajectory. For each historical time, that historical motion position is converted with the position conversion relationship that maps the world coordinate system into the relative coordinate system represented by the reference position origin at that time, giving the relative motion position of the projection object under the reference position origin at that historical time; executing the same step at every historical time gives the relative motion positions of the projection object at all the historical times. Finally, curve fitting over these relative motion positions yields the relative motion trajectory of the projection object under the reference position origin of the host vehicle.
Continuing the preceding-vehicle warning example, as shown in fig. 4, once the position conversion relationships M1, M2, M3, and so on, at the historical times have been obtained, they can be applied to the historical travel positions t1: p1(x1, y1), t2: p2(x2, y2), t3: p3(x3, y3), and so on, in the historical motion trajectory of the preceding vehicle, executing the conversion at each historical time to obtain the relative motion positions of the preceding vehicle under the reference position origin of the host vehicle: t1: p1'(x1', y1'), t2: p2'(x2', y2'), t3: p3'(x3', y3'), and so on. Curve fitting these relative motion positions over the historical times then generates the corresponding curve-fitting equations Fx(t, x) = 0 and Fy(t, y) = 0 as the relative motion trajectory of the preceding vehicle under the reference position origin of the host vehicle.
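A compact Python sketch of this conversion and fit (our illustration: a translation-only conversion stands in for the matrices M1, M2, M3 and ignores the host vehicle's heading, and numpy polynomial fits stand in for Fx(t, x) = 0 and Fy(t, y) = 0; all values are hypothetical):

    import numpy as np

    def relative_trajectory(times, host_world, obj_world, ref_index=0):
        """Convert the object's world-frame history into the relative frame
        anchored at the host's reference position, then fit x(t) and y(t).

        Translation only: a heading-aware system would instead apply a
        per-time rigid transform (the patent's M1, M2, M3).
        """
        host = np.asarray(host_world, dtype=float)
        obj = np.asarray(obj_world, dtype=float)
        rel = obj - host[ref_index]  # object positions under the origin
        t = np.asarray(times, dtype=float)
        fx = np.polynomial.Polynomial.fit(t, rel[:, 0], 2)
        fy = np.polynomial.Polynomial.fit(t, rel[:, 1], 2)
        return fx, fy

    # Hypothetical preceding-vehicle scenario, reference origin at t1.
    times = [0.0, 0.1, 0.2, 0.3]
    host = [(0.0, 0.0), (1.0, 0.0), (2.1, 0.1), (3.0, 0.1)]
    lead = [(12.0, 0.0), (13.2, 0.1), (14.5, 0.1), (15.9, 0.2)]
    fx, fy = relative_trajectory(times, host, lead)
    # Substituting a future time into fx, fy mirrors evaluating
    # Fx(t, x) = 0 and Fy(t, y) = 0 at the expected projection time.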
S340: predicting, according to the relative motion trajectory, the relative position information of the projection object under the reference position origin at the expected projection time.
After the more accurate, delay-compensated relative motion trajectory of the projection object has been obtained, the position information of the projection object at a future expected projection time can be predicted directly from that trajectory and used as the relative position information of the projection object with respect to the reference position origin of the host vehicle at the expected projection time. The position information of the projection object relative to the host vehicle at the expected projection time can then be analyzed, and the real-time motion of the host vehicle used to apply motion compensation to the projection object.
For example, as shown in fig. 4, according to the relative motion trajectory of the preceding vehicle under the reference position origin of the host vehicle, the expected projection time in the present application may be substituted into Fx(t, x) = 0 and Fy(t, y) = 0, so as to predict in advance the relative position information of the projection object under the reference position origin at the expected projection time.
S350: taking the position of the host vehicle at the expected projection time as a new reference position origin, and compensating the relative position information to obtain the projection position of the projection object at the expected projection time.
When the AR-HUD displays the virtual images of the projection objects in front of the host vehicle, the displayed virtual images are viewed from the perspective of the host vehicle's real-time traveling state. Therefore, after the relative position information of the projection object under the reference position origin at the expected projection time has been obtained, the relative motion position of the projection object with respect to the host vehicle's position at the expected projection time must be analyzed further.
Specifically, the position of the host vehicle at the expected projection time can first be acquired through the vehicle sensor, vehicle speed sensor, radar, camera, accelerometer, gyroscope sensor, GPS module, DR sensor, and the like mounted on the host vehicle. Then, with the position of the host vehicle at the expected projection time taken as a new reference position origin, the relative position information of the projection object at the expected projection time under the previous reference position origin is converted under the new reference position origin. The relative position information is thereby compensated, the relative position information under the new reference position origin is obtained, and the projection position of the projection object at the expected projection time can be analyzed.
Optionally, the projection position of the projection object at the expected projection time may be obtained through the following operations: converting the position of the host vehicle at the expected projection time under the reference position origin of the host vehicle to obtain the relative travel position of the host vehicle under that origin at the expected projection time; taking the position of the host vehicle at the expected projection time as a new reference position origin, and determining, using the relative travel position, the position conversion relationship of the expected projection time under the new reference position origin; and calculating the projection position of the projection object at the expected projection time using that position conversion relationship and the relative position information at the expected projection time.
That is, since the relative position information of the projection object at the expected projection time was obtained under the previous reference position origin, analyzing the projection position after converting once more to the new reference position origin first requires the position conversion relationship between the coordinate systems represented by the previous and the new reference position origins.
Therefore, the position of the host vehicle in the original world coordinate system at the expected projection time is first converted into the coordinate system represented by the previous reference position origin, giving the relative travel position of the host vehicle under that origin at the expected projection time. Then, when the host vehicle position at the expected projection time is taken as the new reference position origin, the position conversion relationship for the expected projection time, from the previous reference position origin to the new one, can be determined using the host vehicle's relative travel position under the previous origin and its origin coordinates under the new one. Finally, according to this position conversion relationship, the relative position information of the projection object at the expected projection time under the previous reference position origin is converted under the new reference position origin, giving the relative position information under the new origin and thereby the projection position of the projection object at the expected projection time.
For example, as shown in fig. 4, let the relative travel position of the host vehicle at the expected projection time under the reference position origin of the host vehicle be expressed as t: Rv(Rx, Ry), and let the predicted relative position information of the projection object under the reference position origin at the expected projection time be expressed as t: p'(x', y'). Then, from the relative travel position t: Rv(Rx, Ry) of the host vehicle under the previous reference position origin at the expected projection time, and t: Rv(0, 0), which expresses the host vehicle position at the expected projection time when it is taken as the new reference position origin, the position conversion relationship M for converting the relative coordinate system represented by the previous reference position origin into the one represented by the new origin at the expected projection time is determined. The relative position information after this conversion is then obtained as p = M × p', which determines the projection position of the projection object at the expected projection time.
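A final sketch of this re-anchoring step (our illustration; translation-only again, so the conversion M reduces to subtracting the host vehicle's relative travel position Rv, and the numbers are hypothetical):

    import numpy as np

    def reanchor(p_rel, host_rel_at_texp):
        """Convert the predicted relative position p' from the previous
        reference origin to the new origin, i.e. the host vehicle's
        position at the expected projection time (a translation-only
        stand-in for the patent's p = M × p')."""
        return np.asarray(p_rel, dtype=float) - np.asarray(host_rel_at_texp,
                                                           dtype=float)

    # p' = (12.9, 0.3): object under the old origin at the expected time.
    # Rv = (3.6, 0.1): host vehicle under the old origin at that time.
    print(reanchor((12.9, 0.3), (3.6, 0.1)))  # approx. [9.3 0.2]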
According to the technical solution provided by this embodiment of the application, on top of the delay compensation applied to the motion trajectory of the projection object, the predicted motion trajectory in the world coordinate system is converted into the relative coordinate system of the host vehicle's real-time traveling state, so that the real-time projection of the projection object is also motion-compensated for the host vehicle's movement. Joint compensation of the projection object under both the host vehicle's motion and the projection delay is thereby achieved, the projection position is prevented from deviating due to vehicle motion and projection delay, and the fit between the projected virtual image and the actual position of the projection object in the AR-HUD is further improved.
Fig. 5 is a schematic block diagram of an AR-HUD-based delay compensation device according to an embodiment of the present application. As shown in fig. 5, the apparatus 500 may include:
a projection delay determining module 510, configured to determine a projection delay existing in a projection processing flow of a projection object in the augmented reality head-up display AR-HUD;
a delay compensation module 520, configured to perform delay compensation on historical motion data of the projection object by using the projection delay, and predict a motion trajectory of the projection object;
a projection position determining module 530, configured to determine, according to the motion trajectory, a projection position of the projection object at an expected projection time.
Further, the projection delay includes at least a motion acquisition delay and a position calculation delay of the projection object.
Further, the delay compensation module 520 may be specifically configured to:
determining the data reception time corresponding to the historical motion data of the projection object;
performing delay compensation on the data reception time using the motion acquisition delay to obtain the actual data acquisition time corresponding to the historical motion data;
and predicting the motion trajectory of the projection object according to the data acquisition times and the historical motion data.
Further, the delay compensation apparatus 500 based on AR-HUD may further include:
an expected projection determining module, configured to perform delay compensation on the projection processing time at which the projection object is currently located, using the position calculation delay, to obtain the corresponding expected projection time.
Further, the projection delay determining module 510 may be specifically configured to:
estimating the projection delay of the projection object at an initial projection time;
continuously correcting the projection delay at the earlier of two adjacent projection times, using the difference between the historical motion data acquired at the later projection time and the target motion data determined at the earlier projection time, to obtain the projection delay at the later projection time;
where the target motion data is the projection position of the projection object at the later projection time, determined after delay compensation using the projection delay at the earlier projection time.
Further, the delay compensation module 520 may be specifically configured to:
performing delay compensation on the historical motion data of the projection object using the projection delay, and determining a historical motion trajectory of the projection object;
and determining, according to the historical motion trajectory and the historical travel trajectory of the host vehicle, the relative motion trajectory of the projection object under the reference position origin of the host vehicle as the motion trajectory of the projection object.
Further, the projection position determining module 530 may be specifically configured to:
predicting, according to the relative motion trajectory, the relative position information of the projection object under the reference position origin at the expected projection time;
and taking the position of the host vehicle at the expected projection time as a new reference position origin, and compensating the relative position information to obtain the projection position of the projection object at the expected projection time.
Further, the historical movement trajectory of the projection object and the historical travel trajectory of the host vehicle are determined using at least one of the following sensing devices:
the system comprises a vehicle sensor, a vehicle speed sensor, a radar, a camera, an accelerometer, a gyroscope sensor, a Global Positioning System (GPS) module and a Dead Reckoning (DR) sensor.
In the embodiment of the present application, the projection delay incurred by the projection object in the projection processing flow of the AR-HUD is first determined, and delay compensation is performed on the historical motion data of the projection object using that delay before the motion trajectory is predicted. The predicted motion trajectory is therefore closer to the real motion trajectory, ensuring the prediction accuracy. The projection position of the projection object at a future expected projection time is then determined from the trajectory, so the projection position at a future projection time is analyzed in advance; when the expected projection time arrives, the virtual image of the projection object can be displayed immediately at the pre-determined projection position, without computing the projection position at that moment. On the basis of an accurate motion trajectory, this further avoids the projection deviation caused by the projection object moving during the projection-calculation stage, and improves how closely the projected virtual image fits the real object in the AR-HUD.
It is to be understood that the apparatus embodiments correspond to the method embodiments, so similar descriptions can be found there; to avoid repetition, they are not repeated here. Specifically, the apparatus 500 shown in Fig. 5 can perform any method embodiment provided in the present application, and the above and other operations and/or functions of the modules in the apparatus 500 implement the corresponding flows of those methods; for brevity, they are not described again.
The apparatus 500 of the embodiments of the present application is described above in connection with the drawings from the perspective of functional modules. It should be understood that the functional modules may be implemented by hardware, by instructions in software, or by a combination of hardware and software modules. Specifically, the steps of the method embodiments in the present application may be implemented by integrated logic circuits of hardware in a processor and/or instructions in the form of software, and the steps of the method disclosed in conjunction with the embodiments in the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. Alternatively, the software modules may be located in random access memory, flash memory, read only memory, programmable read only memory, electrically erasable programmable memory, registers, and the like, as is well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and completes the steps in the above method embodiments in combination with hardware thereof.
Fig. 6 is a schematic block diagram of an electronic device 600 provided in an embodiment of the present application.
As shown in fig. 6, the electronic device 600 may include:
a memory 610 and a processor 620, the memory 610 being configured to store a computer program and to transfer the program code to the processor 620. In other words, the processor 620 may call and execute a computer program from the memory 610 to implement the method in the embodiment of the present application.
For example, the processor 620 may be configured to perform the above-described method embodiments according to instructions in the computer program.
In some embodiments of the present application, the processor 620 may include, but is not limited to:
general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like.
In some embodiments of the present application, the memory 610 includes, but is not limited to:
volatile memory and/or non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
In some embodiments of the present application, the computer program may be partitioned into one or more modules, which are stored in the memory 610 and executed by the processor 620 to perform the methods provided herein. The one or more modules may be a series of computer program instruction segments capable of performing certain functions, the instruction segments describing the execution of the computer program in the electronic device.
As shown in fig. 6, the electronic device may further include:
a transceiver 630, the transceiver 630 may be connected to the processor 620 or the memory 610.
The processor 620 may control the transceiver 630 to communicate with other devices, and specifically, may transmit information or data to the other devices or receive information or data transmitted by the other devices. The transceiver 630 may include a transmitter and a receiver. The transceiver 630 may further include one or more antennas.
It should be understood that the various components in the electronic device are connected by a bus system that includes a power bus, a control bus, and a status signal bus in addition to a data bus.
Embodiments of the present application also provide a computer storage medium having a computer program stored thereon, where the computer program, when executed by a computer, enables the computer to execute the method of the above method embodiments. In other words, the present application also provides a computer program product containing instructions, which when executed by a computer, cause the computer to execute the method of the above method embodiments.
When the embodiments are implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium accessible by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the module is merely a logical division, and other divisions may be realized in practice, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. For example, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. An AR-HUD-based delay compensation method is characterized by comprising the following steps:
determining a projection delay of a projection object in an augmented reality head-up display (AR-HUD) in a projection processing flow;
performing delay compensation on historical motion data of the projection object using the projection delay, and predicting a motion trajectory of the projection object;
and determining a projection position of the projection object at an expected projection time according to the motion trajectory.
2. The method of claim 1, wherein the projection delay comprises at least a motion acquisition delay and a position calculation delay of the projection object.
3. The method of claim 2, wherein performing delay compensation on the historical motion data of the projection object using the projection delay and predicting the motion trajectory of the projection object comprises:
determining the nominal data acquisition time corresponding to the historical motion data of the projection object;
performing delay compensation on that time using the motion acquisition delay, to obtain the actual data acquisition time corresponding to the historical motion data;
and predicting the motion trajectory of the projection object according to the actual data acquisition time and the historical motion data.
4. The method of claim 2, further comprising, before determining the projection position of the projection object at the expected projection time according to the motion trajectory:
performing delay compensation on the current projection processing time of the projection object using the position calculation delay, to obtain the corresponding expected projection time.
5. The method according to claim 1, wherein determining the projection delay of the projection object in the augmented reality head-up display (AR-HUD) in the projection processing flow comprises:
predicting the projection delay of the projection object at an initial projection time;
and continuously correcting, for each pair of adjacent projection times, the projection delay at the previous projection time using the difference between the historical motion data acquired at the next projection time and the target motion data determined at the previous projection time, and using the corrected value as the projection delay at the next projection time;
wherein the target motion data is the projection position of the projection object at the next projection time, as determined after delay compensation with the projection delay at the previous projection time.
6. The method of claim 1, wherein predicting the motion trajectory of the projection object comprises:
performing delay compensation on the historical motion data of the projection object using the projection delay, to determine a historical motion trajectory of the projection object;
and determining, according to the historical motion trajectory and a historical travel trajectory of the host vehicle, a relative motion trajectory of the projection object with respect to a reference position origin of the host vehicle, as the motion trajectory of the projection object.
7. The method of claim 6, wherein determining the projection position of the projection object at the expected projection time according to the motion trajectory comprises:
predicting, according to the relative motion trajectory, relative position information of the projection object with respect to the reference position origin at the expected projection time;
and taking the position of the host vehicle at the expected projection time as a new reference position origin and compensating the relative position information, to obtain the projection position of the projection object at the expected projection time.
8. The method according to claim 6, wherein the historical motion trajectory of the projection object and the historical travel trajectory of the host vehicle are determined using at least one of the following sensing devices:
a vehicle sensor, a vehicle speed sensor, a radar, a camera, an accelerometer, a gyroscope, a Global Positioning System (GPS) module, and a dead reckoning (DR) sensor.
9. An AR-HUD based delay compensation apparatus, comprising:
a projection delay determining module, configured to determine a projection delay of a projection object in an augmented reality head-up display (AR-HUD) in a projection processing flow;
a delay compensation module, configured to perform delay compensation on historical motion data of the projection object using the projection delay and to predict a motion trajectory of the projection object;
and a projection position determining module, configured to determine a projection position of the projection object at an expected projection time according to the motion trajectory.
10. An electronic device, comprising:
a processor and a memory, the memory being configured to store a computer program and the processor being configured to invoke and execute the computer program stored in the memory, to perform the AR-HUD-based delay compensation method of any one of claims 1-8.
11. A computer-readable storage medium storing a computer program for causing a computer to execute the AR-HUD based delay compensation method according to any one of claims 1 to 8.
12. A computer program product comprising computer programs/instructions, characterized in that the computer programs/instructions, when executed by a processor, implement the AR-HUD based delay compensation method according to any of claims 1-8.
CN202111566281.3A 2021-12-20 2021-12-20 Delay compensation method, device, equipment and storage medium based on AR-HUD Active CN114268787B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111566281.3A CN114268787B (en) 2021-12-20 2021-12-20 Delay compensation method, device, equipment and storage medium based on AR-HUD

Publications (2)

Publication Number Publication Date
CN114268787A true CN114268787A (en) 2022-04-01
CN114268787B CN114268787B (en) 2024-06-11

Family

ID=80828119

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111566281.3A Active CN114268787B (en) 2021-12-20 2021-12-20 Delay compensation method, device, equipment and storage medium based on AR-HUD

Country Status (1)

Country Link
CN (1) CN114268787B (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101404122A (en) * 2007-10-02 2009-04-08 爱信艾达株式会社 Driving support device, driving support method, and computer program
CN112129313A (en) * 2019-06-25 2020-12-25 安波福电子(苏州)有限公司 AR navigation compensation system based on inertial measurement unit
US20210125411A1 (en) * 2019-10-24 2021-04-29 Lg Electronics Inc. Ar mobility and method of controlling ar mobility
US20210157325A1 (en) * 2019-11-26 2021-05-27 Zoox, Inc. Latency accommodation in trajectory generation
CN111260950A (en) * 2020-01-17 2020-06-09 清华大学 Trajectory prediction-based trajectory tracking method, medium and vehicle-mounted equipment
CN111277796A (en) * 2020-01-21 2020-06-12 深圳市德赛微电子技术有限公司 Image processing method, vehicle-mounted vision auxiliary system and storage device
CN113022580A (en) * 2021-03-17 2021-06-25 地平线(上海)人工智能技术有限公司 Trajectory prediction method, trajectory prediction device, storage medium and electronic equipment
CN113665589A (en) * 2021-09-18 2021-11-19 广州小鹏自动驾驶科技有限公司 Vehicle longitudinal control method and device, vehicle-mounted terminal and storage medium
CN113687598A (en) * 2021-10-25 2021-11-23 南京信息工程大学 Prediction feedforward tracking control method and device based on internal model and storage medium thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SUKAN, MENGU: "Augmented Reality Interfaces for Enabling Fast and Accurate Task Localization", COLUMBIA UNIVERSITY, 31 December 2017 (2017-12-31) *
CHEN Weihai; ZHANG Xiaomei; LYU Zhanggang; WU Xingming: "Force-delay flexible virtual-rod modeling for automatic tracking of independent vehicles", Robot (机器人), no. 05, 15 September 2011 (2011-09-15) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114860111A (en) * 2022-05-23 2022-08-05 Oppo广东移动通信有限公司 Touch track updating method and device, electronic equipment and storage medium
CN115171384A (en) * 2022-07-04 2022-10-11 南京四维智联科技有限公司 Key vehicle position delay compensation method and device in vehicle-mounted display process
CN115171384B (en) * 2022-07-04 2024-05-14 南京四维智联科技有限公司 Method and device for compensating position delay of key vehicle in vehicle-mounted display process

Also Published As

Publication number Publication date
CN114268787B (en) 2024-06-11

Similar Documents

Publication Publication Date Title
CN108828527B (en) Multi-sensor data fusion method and device, vehicle-mounted equipment and storage medium
CN111721285B (en) Apparatus and method for estimating location in automatic valet parking system
US20200088858A1 (en) Multi-sensor calibration method, multi-sensor calibration device, computer device, medium and vehicle
CN113715814B (en) Collision detection method, device, electronic equipment, medium and automatic driving vehicle
EP3621286B1 (en) Method, and apparatus for clock synchronization, device, storage medium and vehicle
CN114268787A (en) AR-HUD-based delay compensation method, device, equipment and storage medium
JP7258233B2 (en) backward horizon state estimator
US11648936B2 (en) Method and apparatus for controlling vehicle
CN110779538B (en) Allocating processing resources across local and cloud-based systems relative to autonomous navigation
US10990111B2 (en) Position determination apparatus and method for vehicle
US11042761B2 (en) Method and system for sensing an obstacle, and storage medium
CN110654380B (en) Method and device for controlling a vehicle
US11946746B2 (en) Method for satellite-based detection of a vehicle location by means of a motion and location sensor
Sandblom et al. Sensor data fusion for multiple configurations
CN113311905B (en) Data processing system
KR20210073281A (en) Method and apparatus for estimating motion information
CN111721305A (en) Positioning method and apparatus, autonomous vehicle, electronic device, and storage medium
CN113112643A (en) Evaluation method and device for predicted trajectory, electronic device and storage medium
CN111624550B (en) Vehicle positioning method, device, equipment and storage medium
US11158193B2 (en) Position estimation apparatus, position estimation method, and computer readable medium
JP7210429B2 (en) Method, Apparatus and Robotic Apparatus, and Computer Storage Medium for Improving Robustness of Visual Inertial Odometry Systems
CN112824835A (en) Vehicle positioning method, device and computer readable storage medium
CN114241041A (en) AR-HUD-based motion compensation method, device, equipment and storage medium
CN108776333B (en) Data secondary cascade fusion method and system, vehicle-mounted equipment and storage medium
US20220319054A1 (en) Generating scene flow labels for point clouds using object labels

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant