CN114332235A - Calibration method of vehicle-mounted AR-HUD - Google Patents
- Publication number
- CN114332235A (application number CN202111395591.3A)
- Authority
- CN
- China
- Prior art keywords
- calibration
- hud
- coordinates
- vehicle
- coordinate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Instrument Panels (AREA)
Abstract
The invention discloses a calibration method of a vehicle-mounted AR-HUD (augmented reality head-up display) and belongs to the technical field of augmented reality display methods. The method comprises: placing a plurality of calibration reference objects at preset intervals in a real scene; actually measuring the distance from a camera mounted on the automobile to be calibrated to each calibration reference object, and determining the coordinates of each calibration reference object in a world coordinate system (the world coordinates); starting the AR-HUD, displaying the position of each calibration reference object in the screen projection picture of the AR-HUD, and determining the pixel coordinates of each calibration reference object in a calibration feature image in the screen projection picture (the pixel coordinates); and determining the calibration coefficients of the vehicle-mounted AR-HUD according to the world coordinates and the pixel coordinates.
Description
Technical Field
The invention belongs to the technical field of augmented reality display methods, and particularly relates to a calibration method of a vehicle-mounted AR-HUD.
Background
An Augmented Reality Head-Up Display (AR-HUD) is a driving-assistance device currently used in passenger cars. It displays prompt information such as vehicle speed, engine speed, navigation and fault warnings within the driver's head-up field of view, combined with the surrounding real scene. This reduces how often the driver lowers his or her head to read the instruments, improves driving safety, and enhances the driver's perception of the surrounding environment.
Since the AR-HUD has no camera-like image capturing capability and cannot directly acquire the projection image presented on its display surface, the correspondence between the two-dimensional coordinates of its projection plane and the three-dimensional coordinates in the world coordinate system cannot be calculated directly from pixel correspondences between different image spaces.
At present, research on the calibration of a single camera is mature. In the Zhang Zhengyou calibration method (Zhang's method), for example, the camera is calibrated through its computed intrinsic parameters, and the calibrated camera is then used in a projective-geometry correction system to acquire images in real time. For calibrating an AR-HUD system, however, no mature algorithm can effectively and accurately compute the internal calibration parameters of the AR-HUD.
Because different vehicle models differ mainly in size and in camera installation, the calibration of the AR-HUD needs to be adjusted for each model to meet the demands of different scenarios.
In view of the above, the present invention is particularly proposed.
Disclosure of Invention
The invention aims to provide a calibration method for a vehicle-mounted AR-HUD, namely a method for calculating the calibration coefficients of the vehicle-mounted AR-HUD, which serve as the theoretical basis for displays such as whether the automobile is drifting out of its lane. The technical scheme has several beneficial technical effects, described as follows:
A calibration method of a vehicle-mounted AR-HUD is provided, comprising the following steps:
placing a plurality of calibration reference objects at preset intervals in a real scene;
actually measuring the distance from a camera mounted on the automobile to be calibrated to each calibration reference object, and determining the coordinates of each calibration reference object in a world coordinate system, namely the world coordinates;
starting the AR-HUD, displaying the position of each calibration reference object in a screen projection picture of the AR-HUD, and determining the pixel coordinates of each calibration reference object in a calibration feature image in the screen projection picture of the AR-HUD, namely the pixel coordinates;
and determining a calibration coefficient of the vehicle-mounted AR-HUD according to the world coordinate and the pixel coordinate.
In a preferred or optional embodiment, automobiles of a plurality of different vehicle models, each fitted with an AR-HUD, are calibrated, and the calibration coefficients of the vehicle-mounted AR-HUD are determined for each vehicle model.
In a preferred or alternative embodiment, the method of determining calibration coefficients for an in-vehicle AR-HUD based on the world coordinates and pixel coordinates comprises,
the world coordinates and the pixel coordinates satisfy:

[X_C, Y_C, Z_C]^T = R · [X_W, Y_W, Z_W]^T + T

wherein X_W, Y_W and Z_W are the coordinates of the plurality of calibration reference objects in world coordinates, and X_C, Y_C and Z_C are their coordinates in pixel coordinates;

and the calibration coefficients R and T of the vehicle-mounted AR-HUD are derived in reverse from the coordinates of the plurality of calibration reference objects in world coordinates and in pixel coordinates.
In a preferred or alternative embodiment, the world coordinates and the pixel coordinates satisfy the Zhang Zhengyou camera calibration method (Zhang's method).
Compared with the prior art, the technical scheme provided by the invention has the following beneficial effects:
according to the method, the world coordinate is generated by actually measuring the distance from the camera to the calibration, and the generated pixel coordinate is started by the AR-HUD to check the vehicle-mounted AR-HUD. The spatial coordinate parameter conversion algorithm can solve the following problems: and (3) displaying the AR of the Lane Departure Warning (LDW Lane Departure Warning), namely solving the pixel coordinates on the AR-HUD picture according to the world coordinates of the Lane line provided by the front-view camera of the automobile, drawing an AR Warning symbol and attaching the AR Warning symbol to the real Lane line.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides a calibration method of an on-vehicle AR-HUD, which comprises the following steps:
and S101, placing a plurality of calibration reference objects, such as water bottles, boxes or balls with the same shape, at preset intervals in a real scene.
S102, actually measuring the distance from a camera mounted on the automobile to be calibrated to each calibration reference object, and determining the coordinates of each calibration reference object in a world coordinate system, namely the world coordinates. The actual distance from the camera to each calibration reference object is measured with an infrared measuring device and comprises the distances in the x, y and z directions, or only in the x and y directions; with the camera as the coordinate origin, the world coordinates are calibrated using coordinate-parameter solving software in the prior art.
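As a minimal sketch of step S102 (with illustrative distance values, not taken from the patent): because the camera is the coordinate origin, the measured x/y/z distances directly become each reference's world coordinates.

```python
# Step S102 sketch: measured camera-to-reference distances become world
# coordinates, with the camera as the origin. All numbers are illustrative.
measured = [
    {"x": 0.0, "y": -1.5, "z": 10.0},  # reference 1: 10 m ahead, 1.5 m below the camera
    {"x": 1.8, "y": -1.5, "z": 10.0},  # reference 2: offset 1.8 m to the side
    {"x": 0.0, "y": -1.5, "z": 20.0},  # reference 3: 20 m ahead
]
world_coords = [(m["x"], m["y"], m["z"]) for m in measured]
```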
And S103, starting the AR-HUD, displaying the position of each calibration reference object in the screen projection picture of the AR-HUD, and determining the pixel coordinates of each calibration reference object in the calibration feature image in the screen projection picture, namely the pixel coordinates. For example, the screen projection picture carries a grid of coordinate cells, and the coordinates of a calibration reference object are determined by reading off the grid cell it occupies in the screen projection picture.
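The grid read-off of step S103 can be sketched as follows; the cell size in pixels is an assumption, since the patent does not specify the grid resolution.

```python
def grid_to_pixel(col, row, cell_px=(32, 32)):
    """Pixel coordinates of the centre of the grid cell occupied by a
    calibration reference in the screen projection picture.

    cell_px is the assumed (width, height) of one grid cell in pixels.
    """
    return ((col + 0.5) * cell_px[0], (row + 0.5) * cell_px[1])
```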
And S104, determining the calibration coefficients of the vehicle-mounted AR-HUD according to the world coordinates and the pixel coordinates. The world coordinates and the pixel coordinates satisfy the Zhang Zhengyou camera calibration method (Zhang's method), namely:

[X_C, Y_C, Z_C]^T = R · [X_W, Y_W, Z_W]^T + T

wherein X_W, Y_W and Z_W are the coordinates of the plurality of calibration reference objects in world coordinates, and X_C, Y_C and Z_C are their coordinates in pixel coordinates. The calibration coefficients R and T of the vehicle-mounted AR-HUD are derived in reverse from these coordinates, for example by a matrix solving method.
When automobiles of different vehicle models are each fitted with an AR-HUD and calibrated, the calibration coefficients of the vehicle-mounted AR-HUD can be determined for each vehicle model. The results form a complete data set, from which coordinates can be output according to the vehicle model.
The calibration coefficients R and T of the vehicle-mounted AR-HUD are important parameters of the algorithm in the AR-HUD software of the automotive component, realizing the virtual-real combination characteristic (i.e., AR-enhanced display). The AR-HUD software may be called AR Creator; the calibration coefficients R and T of the vehicle-mounted AR-HUD are input into it for the AR display of a Lane Departure Warning (LDW): the pixel coordinates on the AR-HUD picture are obtained from the world coordinates of the lane line provided by the automobile's front-view camera, and an AR warning symbol is drawn and attached to the real lane line.
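How software such as AR Creator might use R and T for the LDW display can be sketched as below. The pinhole intrinsics fx, fy, cx, cy are assumptions, since the patent only mentions the extrinsic coefficients R and T.

```python
def world_to_pixel(p_w, R, T, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Map a lane-line point from world coordinates into the AR-HUD picture
    via p_c = R * p_w + T followed by a pinhole projection.

    fx, fy, cx, cy are assumed intrinsics of the screen projection picture.
    """
    xc = sum(R[0][j] * p_w[j] for j in range(3)) + T[0]
    yc = sum(R[1][j] * p_w[j] for j in range(3)) + T[1]
    zc = sum(R[2][j] * p_w[j] for j in range(3)) + T[2]
    # Perspective division, then scaling and shifting into picture coordinates.
    return (fx * xc / zc + cx, fy * yc / zc + cy)
```

A lane-line polyline from the front-view camera would be mapped point by point with this function before drawing the AR warning symbol.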
The products provided by the present invention are described in detail above. The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the core concepts of the present invention. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the invention without departing from the inventive concept, and those improvements and modifications also fall within the scope of the claims of the invention.
Claims (6)
1. A calibration method for a vehicle-mounted AR-HUD is characterized by comprising the following steps:
placing a plurality of calibration reference objects at preset intervals in a real scene;
actually measuring the distance from a camera mounted on the automobile to be calibrated to each calibration reference object, and determining the coordinates of each calibration reference object in a world coordinate system, namely the world coordinates;
starting the AR-HUD, displaying the position of each calibration reference object in a screen projection picture of the AR-HUD, and determining the pixel coordinates of each calibration reference object in a calibration feature image in the screen projection picture of the AR-HUD, namely the pixel coordinates;
and determining a calibration coefficient of the vehicle-mounted AR-HUD according to the world coordinate and the pixel coordinate.
2. The method according to claim 1, wherein automobiles of a plurality of different vehicle models, each fitted with an AR-HUD, are calibrated, and the calibration coefficients of the vehicle-mounted AR-HUD are determined for each vehicle model.
3. The method according to claim 1, wherein the method of determining calibration coefficients for an in-vehicle AR-HUD based on the world coordinates and pixel coordinates comprises,
the world coordinates and the pixel coordinates satisfy:

[X_C, Y_C, Z_C]^T = R · [X_W, Y_W, Z_W]^T + T

wherein X_W, Y_W and Z_W are the coordinates of the plurality of calibration reference objects in world coordinates, and X_C, Y_C and Z_C are their coordinates in pixel coordinates;

and the calibration coefficients R and T of the vehicle-mounted AR-HUD are derived in reverse from the coordinates of the plurality of calibration reference objects in world coordinates and in pixel coordinates.
4. The method of claim 3, wherein the world coordinates and the pixel coordinates satisfy the Zhang Zhengyou camera calibration method (Zhang's method).
5. The method of claim 1, wherein determining the coordinates of each of the calibration references in a world coordinate system comprises:
and taking the camera as the origin of coordinates, and calibrating world coordinates by coordinate parameter solving software.
6. The method according to claim 3, characterized in that the calibration coefficients R and T of the on-board AR-HUD are calculated according to a matrix solving method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111395591.3A CN114332235A (en) | 2021-11-23 | 2021-11-23 | Calibration method of vehicle-mounted AR-HUD |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114332235A (en) | 2022-04-12
Family
ID=81045818
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111395591.3A Pending CN114332235A (en) | 2021-11-23 | 2021-11-23 | Calibration method of vehicle-mounted AR-HUD |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114332235A (en) |
- 2021-11-23: application CN202111395591.3A filed in China; patent status Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||