CN116027283A - Method and device for automatic calibration of a road side sensing unit


Publication number
CN116027283A
Authority
CN
China
Prior art keywords: sensor, calibration, interest, detection result, coordinate system
Legal status: Pending
Application number
CN202111253457.XA
Other languages
Chinese (zh)
Inventor
梁津垚
李昕润
M·P·察普夫
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to CN202111253457.XA
Publication of CN116027283A

Classifications

  • Traffic Control Systems (AREA)

Abstract

The invention relates to the field of sensor calibration. The invention relates to a method for automatic calibration of a road side sensing unit comprising a first sensor and a second sensor, the first sensor and the second sensor having at least partially overlapping detection ranges, the method comprising the steps of: S1: acquiring, by the first sensor, a first detection result of a determined section of a fixed road feature in which a calibration reference is integrated; S2: acquiring, by the second sensor, a second detection result of the determined section of the fixed road feature; and S3: determining a calibration parameter between the first sensor and the second sensor based on a comparison of the first detection result and the second detection result. The invention also relates to a device for automatic calibration of a road side sensing unit, to a deployment apparatus for a calibration reference, and to a computer program product.

Description

Method and device for automatic calibration of a road side sensing unit
Technical Field
The invention relates to a method for automatic calibration of a road side sensing unit, to a device for automatic calibration of a road side sensing unit, to a deployment apparatus for calibrating a reference object, and to a computer program product.
Background
With the rapid development of internet-of-vehicles technology, more and more road-side devices have environment-sensing and information-exchange capabilities. By means of a plurality of sensors arranged at the roadside, real-time traffic conditions can be captured and valuable information can be processed; these sensors can also provide non-line-of-sight traffic information to vehicles, which greatly improves traffic safety.
For a road side sensing unit, reliable information acquisition is usually realized based on the fusion results of multiple sensors; common sensors include cameras, millimeter-wave radars and lidars. However, reliable information input presupposes that the sensors are calibrated to each other. In practice, different sensors are installed according to an initially set relative positional relationship; this relative positional relationship may, however, change depending on the external environment. It is therefore often necessary to periodically calibrate the rotation-translation relationship between the sensors.
In existing joint calibration strategies for a camera and a radar, a test vehicle or a manual calibration board must be periodically arranged in the fields of view of the camera and the radar while both sensors acquire calibration point information, so that the positional relationship between the camera and the radar can be resolved once enough calibration points are available. However, this calibration scheme has a number of disadvantages. In particular, the calibration board is not fixed in the road environment and may drift due to external interference, so it cannot be ensured that all objects to be calibrated share a uniform reference standard. In addition, deploying the calibration board always requires manual intervention, which increases the calibration difficulty and limits the application scenario, especially for road sections with heavy traffic.
In this context, it is desirable to provide an improved calibration scheme for a road side sensor set.
Disclosure of Invention
It is an object of the present invention to provide a method for automatic calibration of a road side sensing unit, an apparatus for automatic calibration of a road side sensing unit, a deployment device for calibrating a reference object and a computer program product for solving at least part of the problems of the prior art.
According to a first aspect of the present invention, there is provided a method for automatic calibration of a road side sensing unit comprising a first sensor and a second sensor, the first sensor and the second sensor comprising at least partially overlapping detection ranges, the method comprising the steps of:
S1: acquiring, by a first sensor, a first detection result of a determined section of a fixed road feature in which a calibration reference is integrated;
S2: acquiring, by a second sensor, a second detection result of the determined section of the fixed road feature; and
S3: determining a calibration parameter between the first sensor and the second sensor based on a comparison of the first detection result and the second detection result.
The invention is based on the following technical concept: by integrating the calibration reference into a fixed road feature, the calibration reference is not frequently shifted by changing external environmental conditions, and it is no longer necessary to repeatedly carry a calibration reference into the surroundings of the road side sensing unit by hand; this facilitates a reliable mapping between the detection results of the multiple sensors and greatly saves labor costs. In addition, since the calibration reference is embedded directly into the fixed road feature, the known geometric rules of the road feature (for example, that two lane marking lines are parallel to each other, or that the sequence of lamp posts on either side of the road extends along a straight line) can be used as prior information, so that fewer calibration references are required when performing feature point extraction or coordinate transformation on the detection results, and both the manufacturing process of conventional calibration boards and the related calibration algorithms are simplified.
Optionally, the method further comprises the steps of:
acquiring reference position information in the world coordinate system about the determined section of the fixed road feature, said reference position information being in particular pre-stored in the calibration reference; and
adjusting the determined calibration parameter between the first sensor and the second sensor based on the reference position information.
In particular, the following technical advantages are achieved: after the internal synchronization of the road side sensor group is completed, the results of the individual sensors can additionally be unified into a real-world coordinate system, so that the information acquired by the road side sensing unit is suitable for a wider range of application scenarios.
Optionally, the step S3 includes:
extracting, from the first detection result and the second detection result respectively, a region of interest delimited by the calibration reference;
determining first position coordinates of the corner points of the region of interest in a first sensor coordinate system and second position coordinates of the corner points of the region of interest in a second sensor coordinate system; and
calculating a rotation-translation matrix between the first sensor coordinate system and the second sensor coordinate system based on the first position coordinates and the second position coordinates of the corner points.
In particular, the following technical advantages are achieved: thus, the joint calibration process does not need to depend on the real world coordinates of the target point.
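The calculation of a rotation-translation matrix from corresponding corner coordinates can be sketched with the SVD-based (Kabsch) method. The following numpy sketch is illustrative only; the function name and implementation are assumptions and not part of the patent:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~ R @ src + t,
    given corresponding 3-D corner points in two sensor coordinate systems
    (SVD-based Kabsch method; illustrative sketch)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Even the four coplanar corners of a rectangular region of interest suffice here, since the reflection ambiguity of the plane is resolved by the determinant check.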
Optionally, the step S3 includes:
extracting, from the first detection result and the second detection result respectively, a region of interest delimited by the calibration reference;
projecting the region of interest detected by the first sensor and the region of interest detected by the second sensor into the same coordinate system; and
adjusting a rotation-translation matrix between the first sensor coordinate system and the second sensor coordinate system according to the deviation between the projection results.
In particular, the following technical advantages are achieved: final calibration is completed according to the degree of overlap between the projections of corresponding points, multiple data acquisitions at different angles are not required, high-precision extrinsic calibration can be achieved even with few target points, and time expenditure is saved.
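A step-wise adjustment of angle and translation against the deviation between projected corner points could, purely as an illustrative sketch, be done with a greedy coordinate search; all names, the 2-D restriction and the search strategy below are assumptions, not the patent's specific procedure:

```python
import itertools
import numpy as np

def refine_yaw_translation(pts_a, pts_b, yaw0=0.0, t0=(0.0, 0.0),
                           steps=(0.01, 0.05), iters=50):
    """Greedily adjust yaw and translation (with step halving) so that the
    corners detected by sensor B, projected into sensor A's frame, overlap
    the corners detected by sensor A. Illustrative sketch only."""
    def cost(yaw, tx, ty):
        c, s = np.cos(yaw), np.sin(yaw)
        proj = pts_b @ np.array([[c, -s], [s, c]]).T + np.array([tx, ty])
        return float(np.sum((proj - pts_a) ** 2))   # squared deviation

    params = np.array([yaw0, t0[0], t0[1]])
    step = np.array([steps[0], steps[1], steps[1]])
    best = cost(*params)
    for _ in range(iters):
        improved = False
        for i, sgn in itertools.product(range(3), (+1, -1)):
            trial = params.copy()
            trial[i] += sgn * step[i]
            c = cost(*trial)
            if c < best:
                params, best, improved = trial, c, True
        if not improved:
            step *= 0.5          # refine once no coarse move helps
    return params, best
```

A real implementation would typically replace the greedy search with a proper nonlinear least-squares solver.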
Optionally, the calibration reference delimits a rectangular region of interest, two parallel edges of which are parallel to the longitudinal extension direction of the fixed road feature and/or to the extension direction of the fixed road feature sequence, and the other two parallel edges of which are perpendicular to the longitudinal extension direction of the fixed road feature and/or to the extension direction of the fixed road feature sequence.
In particular, the following technical advantages are achieved: in the process of forming the calibration reference object, the existing geometric outline of the fixed road features is fully utilized, and the deployment quantity and difficulty of the calibration reference object are effectively reduced.
Optionally, at least one calibration reference is arranged at each of at least three corner points of the rectangular region of interest.
Here, because the calibration references follow the shape or distribution rule of the road feature, an approximate quadrilateral region can be fitted without marking all four corner points, which effectively reduces the number of calibration references to be deployed.
Optionally, the calibration reference comprises an array of planar identifiers comprising ArUco codes, QR codes, bar codes, symbols, characters or numbers, and/or a stereoscopic identifier comprising a corner reflector.
In particular, the following technical advantages are achieved: by arranging the calibration reference in array form, the information of the region of interest in sensor space can be reconstructed directly by a machine vision algorithm, so that the calibration process does not depend on the real-world coordinates of the target points, the calibration reference does not need to pre-store information about road features, and the complexity of the calibration reference is reduced.
Optionally, the planar identifier is printed and/or glued in the lane marking and the contour of the planar identifier is adapted to the shape of the defined section of the lane marking.
In particular, the following technical advantages are achieved: this allows the calibration reference to persist longer in the road environment and makes it less susceptible to external influences. Furthermore, the contour adaptation ensures that the appearance of the original road environment is not significantly affected.
Optionally, numeric sequence number information is pre-stored in the planar identifier.
In particular, the following technical advantages are achieved: compared with complex coding patterns, encoding only numeric information simplifies the coding content and reduces the difficulty of application. In addition, for the machine vision algorithm, the numeric sequence number information defines the capture order of the corner points, which facilitates the subsequent shape fitting.
Optionally, the first sensor and/or the second sensor comprises an image sensor, and step S3 comprises:
determining the pixel distances between the corner points of the region of interest in an image captured by the image sensor;
acquiring the actual distances between the corner points of the region of interest;
solving a correspondence matrix between the pixel coordinate system of the image sensor and a ground plane coordinate system from the pixel distances and the actual distances; and
determining the coordinates of the corner points of the region of interest in the image sensor coordinate system based on the correspondence matrix.
In particular, the following technical advantages are achieved: the perspective relationship is skillfully utilized to complete the transformation from the source coordinates to the target coordinates, thereby restoring positions in the coordinate system of the image sensor.
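The correspondence matrix between the ground plane and the pixel plane is a homography, which can be solved, for example, with the standard direct linear transform (DLT). This numpy sketch and its function names are illustrative assumptions:

```python
import numpy as np

def solve_homography(ground_pts, pixel_pts):
    """Solve H with s*[u, v, 1]^T = H*[X, Y, 1]^T from >= 4 point pairs
    (direct linear transform; H has 8 degrees of freedom)."""
    A = []
    for (X, Y), (u, v) in zip(ground_pts, pixel_pts):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)      # right null vector of A
    return H / H[2, 2]

def apply_homography(H, pt):
    """Map a ground-plane point (X, Y) to pixel coordinates (u, v)."""
    X, Y = pt
    w = H[2, 0] * X + H[2, 1] * Y + H[2, 2]
    return ((H[0, 0] * X + H[0, 1] * Y + H[0, 2]) / w,
            (H[1, 0] * X + H[1, 1] * Y + H[1, 2]) / w)
```

Each point pair contributes two linear equations, so the four corners of the region of interest are exactly enough to determine H.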
According to a second aspect of the present invention, there is provided an apparatus for automatic calibration of a road side sensing unit, the apparatus being for performing the method according to the first aspect of the present invention, the apparatus comprising:
a first acquisition module configured to be able to acquire a first detection result of a determined section of the fixed road feature, in which a calibration reference is integrated, by a first sensor;
a second acquisition module configured to be able to acquire a second detection result of the determined section of the fixed road feature by a second sensor, and
a determination module configured to determine a calibration parameter between the first sensor and the second sensor based on a comparison of the first detection result and the second detection result.
According to a third aspect of the present invention, there is provided a roadside sensing unit including:
a first sensor and a second sensor, the first sensor and the second sensor comprising detection ranges that at least partially overlap; and
the apparatus according to the second aspect of the invention.
According to a fourth aspect of the invention there is provided a deployment device for a calibration reference configured to enable integration of the calibration reference used in the method according to the first aspect of the invention into a fixed road feature.
According to a fifth aspect of the present invention, there is provided a computer program product, wherein the computer program product comprises a computer program for implementing the method according to the first aspect of the present invention when executed by a computer.
Drawings
The principles, features and advantages of the present invention may be better understood by describing the present invention in more detail with reference to the drawings. The drawings include:
FIG. 1 illustrates a flow chart of a method for automatic calibration of a road side sensing unit according to an exemplary embodiment of the invention;
FIG. 2 shows a schematic diagram of a road side sensing unit including an apparatus for automatic calibration of the road side sensing unit, according to an exemplary embodiment of the present invention;
FIG. 3 illustrates a schematic diagram of a fixed road feature integrated with a calibration reference according to an exemplary embodiment of the present invention;
FIG. 4 shows a schematic view of a fixed road feature integrated with a calibration reference according to another exemplary embodiment of the invention;
FIG. 5 shows a schematic view of a fixed road feature integrated with a calibration reference according to another exemplary embodiment of the invention; and
FIG. 6 shows a schematic diagram of a deployment device for calibrating a reference, according to an exemplary embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous technical effects to be solved by the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and a plurality of exemplary embodiments. It should be understood that the detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 shows a flow chart of a method for automatic calibration of a road side sensing unit according to an exemplary embodiment of the invention.
In step S1, a first detection result, by a first sensor, of a determined section of a fixed road feature in which a calibration reference is integrated is acquired. If the first sensor is an image sensor, for example, it is used to record an image of the determined section of the fixed road feature, so that the calibration reference integrated in that section is captured as completely as possible.
In step S2, a second detection result of the determined section of the fixed road feature is acquired by the second sensor. If the second sensor is a lidar sensor, the fixed road feature with the calibration reference is scanned by means of the lidar sensor, and a corresponding point cloud set is acquired from the scan. If the second sensor is a millimeter-wave radar, it emits electromagnetic waves into the surroundings and receives the corresponding echoes, in particular the echoes caused by reflectors integrated in the fixed road feature.
In step S3, a calibration parameter between the first sensor and the second sensor is determined based on a comparison of the first detection result and the second detection result. In the sense of the present invention, a calibration parameter between a first sensor and a second sensor indicates, for example, the rotation-translation relationship of the second sensor relative to the first sensor, or vice versa.
In this case, for example, a region of interest delimited by the calibration reference is extracted from the first and second detection results, respectively, and the position coordinates of the corner points of the region of interest in the first and second sensor coordinate systems are determined. Then, based on the positional relationship of the same corner points in the two coordinate systems, a rotation-translation matrix between the first sensor coordinate system and the second sensor coordinate system is calculated.
Furthermore, given an initial relative positional relationship between the first sensor and the second sensor, the region of interest detected by the first sensor and the region of interest detected by the second sensor may also be projected into the same coordinate system. A rotation-translation matrix between the first sensor coordinate system and the second sensor coordinate system is then adjusted according to the deviation between the projection results. Here, for example, the degree of coordinate agreement between the two sensors may be determined from the degree of overlap between the projections of corresponding points of the region of interest; if the overlap does not meet a predefined standard, the angle, translation distance, etc. can be adjusted step by step with a determined step size to increase the overlap until the desired calibration result is achieved.
In one embodiment, the first sensor relates to an image sensor and the second sensor relates to a lidar sensor.
For the image sensor, for example, the region of interest delimited by the calibration reference can be extracted from the image. In particular, a calibration reference in the form of a coding pattern can be identified in the image by means of a suitable machine vision algorithm, and the numeric sequence number information pre-stored in the coding pattern can be determined from it. The coding patterns are then connected in the order of their sequence numbers to fit the basic shape of the region of interest in the image and to determine the pixel coordinates of the corner points in the image. In order to restore the position of each corner point in the coordinate space of the image sensor, the Z coordinates of the points on the ground plane (a virtual bird's-eye-view plane) in which the region of interest delimited by the calibration reference lies may be set to a fixed value (e.g. 0); a ground plane coordinate system is then established with an arbitrary point in the region of interest as the origin, and for simplicity the top-left-most point of the camera plane may be selected as the origin. By measuring the actual distances between the corner points, the coordinates of the corner points in the established ground plane coordinate system can be determined. Based on the pixel coordinates of each corner point and its coordinates in the ground plane coordinate system, a homography matrix H mapping the ground plane coordinate system to the pixel coordinate system can be determined. Under the previous assumption that the Z coordinates of all points on the established ground plane are 0, the homography matrix H can be considered equivalent to the combined intrinsic and extrinsic parameters of the image sensor, so the intrinsic and extrinsic parameters of the image sensor can be constrained by the homography matrix.
Owing to the scale ambiguity of an image sensor (in particular a monocular camera), the homography matrix H has 8 degrees of freedom, and each pair of matching corner points constrains only two equations, so 4 corner points are needed to solve the homography matrix H. Finally, the internal reference matrix K of the image sensor can be further derived based on a solution of the symmetric matrix, as shown in the following formula:
s · [u, v, 1]^T = K · [Xc, Yc, Zc]^T,  where K = | fx  0  cx |
                                                 |  0  fy  cy |
                                                 |  0   0   1 |
where K is the internal reference matrix of the image sensor; Xc, Yc, Zc are the pseudo three-dimensional coordinates of the target point in the image sensor coordinate system (Zc is assumed to be 0 here); u, v are the two-dimensional coordinates of the target point in the pixel coordinate system of the image sensor; fx, fy are the pixel representations of the focal length f of the image sensor in the x and y directions; and cx, cy are the origin translations of the pixel coordinate system relative to the image sensor coordinate system. For simplicity, the internal reference matrix of the image sensor is denoted by K in the following.
For the lidar sensor, for example, the plane of the region of interest delimited by the calibration reference can be segmented from the point cloud acquired by the lidar using the RANSAC algorithm; the basic shape of the region of interest is then fitted to obtain the three-dimensional coordinates of the corresponding corner points in the lidar coordinate system. The three-dimensional coordinates of the corner points can also be obtained using a visual operation interface in the ROS tool.
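A minimal numpy sketch of RANSAC plane segmentation follows, for illustration only; a real deployment would typically use a library routine such as Open3D's `segment_plane`, and all names and thresholds here are assumptions:

```python
import numpy as np

def ransac_plane(points, n_iters=200, thresh=0.02, rng=None):
    """Fit a plane n.p + d = 0 to an (N, 3) point cloud by RANSAC.
    Returns the plane parameters (n, d) and a boolean inlier mask."""
    rng = np.random.default_rng(rng)
    best_model, best_inliers = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                 # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (n, d), inliers
    return best_model, best_inliers
```

The corner points of the region of interest would then be fitted within the returned inlier set.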
The relative positional relationship between the image sensor and the lidar can then be determined by the EPnP algorithm. Here, on the premise that the three-dimensional position coordinates of the four corner points in both the image sensor coordinate system and the lidar coordinate system are known, the conversion from the lidar coordinate system to the image sensor coordinate system is performed for each pair of corresponding corner points by the following formula:
[Xc, Yc, Zc]^T = R · [X_L, Y_L, Z_L]^T + T
where Xc, Yc, Zc are the pseudo three-dimensional coordinates of the target point in the image sensor coordinate system (Zc is assumed to be 0 here), and X_L, Y_L, Z_L are the three-dimensional coordinates of the target point in the lidar coordinate system. After all four pairs of corresponding corner points have been traversed, the rotation matrix R and the translation vector T between the image sensor coordinate system and the lidar coordinate system are solved. The final calibration aims at determining the relative positional relationship between the two sensor coordinate systems as a whole, i.e. the rotation-translation matrix
| R  T |
| 0  1 |
(the 4 × 4 homogeneous transformation composed of the rotation matrix R and the translation vector T)
Here, since the pseudo three-dimensional coordinates (i.e. two-dimensional coordinates) of the target point in the image sensor coordinate space were obtained under the corresponding assumption, the point cloud may, when acquiring the lidar point cloud data, be adjusted to a two-dimensional point cloud with a fixed height (for example 0 m) by adjusting the extrinsic matrix of the lidar. The conversion between the two-dimensional coordinates of the image sensor and the two-dimensional coordinates of the lidar is then realized by solving an affine matrix.
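Solving the affine matrix between the two sets of two-dimensional coordinates can be done by linear least squares; the following numpy sketch is an illustrative assumption, not the patent's specific solver:

```python
import numpy as np

def solve_affine_2d(src, dst):
    """Least-squares 2-D affine matrix A (2x3) with dst ~ A @ [x, y, 1]^T,
    from >= 3 non-collinear corresponding points."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    X = np.hstack([src, np.ones((len(src), 1))])   # homogeneous coords
    W, *_ = np.linalg.lstsq(X, dst, rcond=None)    # solve X @ W = dst
    return W.T
```

With the four corner points of the region of interest, the system is overdetermined and the least-squares solution absorbs small detection noise.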
In another embodiment, the first sensor relates to an image sensor and the second sensor relates to a millimeter wave radar sensor. In this case, a calibration reference in the form of a corner reflector is pre-embedded in the road sign or in the sequence of lamp posts on the road side, or a coding pattern of a highly reflective material is printed in four sections of the lane marking respectively, so that the region of interest can be limited by a structural or pattern array in a plane coplanar or parallel to the ground plane.
For the image sensor, two-dimensional position coordinates (pseudo three-dimensional position coordinates) of four corner points of the region of interest in the image sensor coordinate system can be determined in a similar manner to the above-described embodiment.
For the millimeter-wave radar sensor, the radial distance of each corner point relative to the millimeter-wave radar and its angle relative to the radar center plane may be determined from the echo reflected by the corner reflector, thereby obtaining the two-dimensional coordinate information of each corner point of the region of interest. Next, a rotation-translation matrix between the image sensor coordinate system and the millimeter-wave radar coordinate system is calculated from the formed pairs of corresponding corner points, for example by the ICP algorithm.
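Recovering two-dimensional Cartesian corner coordinates from the radial distance and azimuth reported by the radar can be sketched as follows; the axis convention (y along the radar boresight, azimuth measured from it) is an assumption for illustration:

```python
import math

def radar_to_cartesian(radial_dist, azimuth_rad):
    """Convert a millimeter-wave radar detection (range, azimuth relative
    to the radar center plane) into 2-D Cartesian coordinates, assuming
    y points along the radar boresight and x to the right."""
    return (radial_dist * math.sin(azimuth_rad),
            radial_dist * math.cos(azimuth_rad))
```

The resulting 2-D points form the radar-side input to the ICP-style alignment mentioned above.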
In an optional step S4, reference position information in the world coordinate system about the determined section of the fixed road feature is additionally acquired, and the calibration parameters already determined between the first sensor and the second sensor are adjusted based on this reference position information. Here, the reference information may be obtained in the following ways: 1) the reference position information is pre-stored in the calibration reference in coded form, so that the longitude and latitude coordinates of the fixed road feature can be read directly by identifying the calibration reference; 2) the longitude and latitude coordinates of the fixed road feature are retrieved from a road map or a traffic specification database via a communication interface in the road side sensing unit; 3) the longitude and latitude coordinates of the (determined section of the) fixed road feature are located on site manually by means of a positioning device, such as a GPS sensor. It should be noted that in the exemplary embodiment shown in fig. 1, step S4 is shown as being performed after the first three steps. However, step S4 can also be carried out, for example, directly after step S1, or in parallel with one or more of steps S2 and S3.
Fig. 2 shows a schematic diagram of a road side sensing unit 1 according to an exemplary embodiment of the invention, which road side sensing unit 1 comprises a device 10 for automatic calibration.
The road side sensing unit 1 may be, for example, a street lamp, a traffic sign, a gantry on a highway, a traffic light or another common road side monitoring device arranged at the roadside.
Fixed road features are understood to be common road elements present in a traffic environment, including in particular street lamps, traffic lights, road signs, zebra crossings, lane marking lines, road shoulders, road arrow markings, etc. Unlike dynamic environment objects, these fixed road features mostly belong to the road design and planning domain and therefore do not change arbitrarily.
In the sense of the present invention, a calibration reference is understood to be any meaningful pattern or structure suitable for mutual calibration between at least two sensors. For calibration between the image sensor and the lidar, the calibration reference can be printed or glued in the form of a planar identifier array in the lane marking. For calibration between an image sensor and a millimeter wave radar, the calibration reference can be constructed into a pattern of highly reflective material or a corner reflector structure and embedded in an array into an original ground reticle or guideboard. As an example, all calibration reference arrays are integrated in the same section of a fixed road feature with at least one side of each calibration reference being flush with the outer line of that section. As another example, the calibration references are distributed in sections of different fixed road features. It is to be appreciated that the plane of the region of interest defined by the calibration reference lies in or parallel to the road plane.
As shown in fig. 2, the roadside sensing unit 1 is equipped with at least a first sensor 20 and a second sensor 30. Here, the first sensor 20 and the second sensor 30 may be sensors of the same type (i.e. redundant sensors), or sensors based on different detection principles, including, for example, lidar sensors, millimeter-wave radar sensors, image sensors, ultrasonic sensors, and the like. In the exemplary embodiment shown in fig. 2, the first sensor 20 is an image sensor (e.g. a monocular camera) and the second sensor 30 is a lidar sensor, wherein the second sensor 30 has a field of view 100 that at least partially overlaps that of the first sensor 20. The image sensor can provide an image or video of the object under test. The lidar sensor can provide point cloud information of the object under test.
To achieve calibration of the first sensor 20 and the second sensor 30, a specific form of calibration reference is arranged in the overlapping field of view 100 of the first sensor 20 and the second sensor 30. It is essential here that the calibration reference is not carried by a person or a test vehicle to appear periodically in this overlapping field of view 100, but is instead integrated into a determined section of a fixed road feature.
Also arranged in the road side sensing unit 1 is a device 10 for automatic calibration, the device 10 comprising: a first acquisition module 11 configured to be able to acquire a first detection result of a determined section of the fixed road feature by the first sensor 20; a second acquisition module 12 configured to be able to acquire a second detection result of the determined section of the fixed road feature by the second sensor 30; a determination module 13 configured to determine a calibration parameter between the first sensor 20 and the second sensor 30 based on a comparison of the first detection result and the second detection result.
Fig. 3 shows a schematic view of a fixed road feature integrated with a calibration reference according to an exemplary embodiment of the invention.
As shown in fig. 3, a rectangular array of four ArUco codes A, B, C, D is applied, for example by printing or pasting, into the lane markings 201, 202. Each ArUco code A, B, C, D is configured to have the same width as the lane markings 201, 202 and is pre-encoded with the sequence numbers "1", "2", "3", "4" in clockwise order. Two of the four ArUco codes (A and D) are embedded in the left lane marking 201 at a certain interval, and the other two (B and C) are embedded at the corresponding positions of the right lane marking 202. As a result, two parallel edges of the rectangular region of interest 200 defined by the ArUco code array are parallel to the longitudinal extension of the lane markings 201, 202, and the other two parallel edges are perpendicular to it.
As another example, the calibration reference at one corner of the rectangular region of interest 200 may be omitted, so that only three ArUco codes A, B, C are applied in the lane markings 201, 202; even these three suffice to define the basic shape of the rectangular region of interest 200 in the detection result of a sensor. This exploits the fact that the existing lane markings 201, 202 are parallel to each other: once three corners are known, the exact position of the fourth corner can easily be determined from the parallel relationship.
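The parallelogram completion just described can be stated in one line: with three consecutive corners A, B, C known, the missing corner is D = A + C − B. A minimal sketch (the helper name `fourth_corner` is illustrative, not from the source):

```python
import numpy as np

def fourth_corner(a, b, c):
    """Given three consecutive corners a, b, c of a parallelogram
    (e.g. the rectangle spanned by two parallel lane markings),
    the missing fourth corner is a + c - b, since the diagonals
    of a parallelogram bisect each other."""
    return np.asarray(a, float) + np.asarray(c, float) - np.asarray(b, float)
```

For example, with corners (0, 0), (4, 0), (4, 2) of a rectangle, the completion yields (0, 2).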
On the right side of fig. 3, a region of interest 210 in the imaging plane of the image sensor and a region of interest 220 in a bird's-eye view (i.e., the ground plane) are shown. In the imaging plane, for example, a pixel coordinate system is established with the upper left corner of the image as the origin, and the pixel coordinates of the four corner points of the rectangular region of interest 210 are determined in it. In the ground plane, given the actual distances between the calibration references (obtained, for example, by means of the auxiliary ranging function of other sensors or by manual field measurement), a ground plane coordinate system can be established with the upper-left corner point as the origin, the Z coordinate being assumed to be 0 here. The coordinates of the corner points in the ground plane coordinate system can then be determined. As already explained in connection with the joint calibration procedure in fig. 1, the homography matrix H, and from it the coordinates of the corner points of the region of interest 200 in the image sensor coordinate space, can be determined by means of the perspective relationship between the two planes.
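The perspective relationship between the ground plane and the imaging plane can be estimated from the four corner correspondences with the standard direct linear transform (DLT). The following is a generic illustration of that relationship, not the patent's specific procedure (function names are ours):

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: estimate H (3x3, normalised so H[2,2] = 1)
    such that each dst point ~ H applied to the matching src point,
    from four 2-D correspondences, e.g. ground-plane corners -> pixel corners."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, p):
    """Map a 2-D point through H using homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

With the four ground-plane corner coordinates as `src` and their measured pixel coordinates as `dst`, `fit_homography` yields H; `apply_homography` then transfers any further ground point into the image.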
Fig. 4 shows a schematic view of a fixed road feature integrated with a calibration reference according to another exemplary embodiment of the invention.
As shown in fig. 4, calibration references A, B, C, D in the form of ArUco codes are likewise integrated in a lane marking 201. Unlike the embodiment shown in fig. 3, however, the four ArUco codes A, B, C, D are not distributed over defined sections of two mutually parallel lane markings 201, 202, but are applied together in a single lane marking 201. The outer edges of two of the ArUco codes (A and D) are flush with the outer edge of one side of the lane marking, and the outer edges of the other two (B and C) are flush with the outer edge of the other side, so that the rectangular array as a whole conforms to the outer contour of the lane marking. As in fig. 3, the rectangular region of interest 200 is likewise bounded by the calibration references A, B, C, D.
Alternatively or additionally, it is also conceivable to print the above-mentioned coding patterns in a highly reflective material, or to integrate other forms of reflective identifiers at each coding pattern, so as to support calibration of more types of sensors.
It is also conceivable not to apply the calibration references A, B, C, D directly in the lane markings 201 of the road surface, but on curbs or shoulders extending along both sides of the road, so that the visual perception of autonomous vehicles is less disturbed by coding patterns embedded in the lane markings.
Fig. 5 shows a schematic view of a fixed road feature integrated with a calibration reference according to another exemplary embodiment of the invention.
As shown in fig. 5, lampposts 203 are arranged at certain intervals on both sides of the road, and the calibration references A, B, C, D are integrated on these lampposts 203. All calibration references A, B, C, D are located at the same position on their respective lampposts, so that the rectangular frame 200 jointly enclosed by the four calibration references A, B, C, D is parallel to the road plane.
It is also conceivable to additionally pre-store reference position information about the fixed road feature 203 in coded form in the calibration references A, B, C, D. For example, the longitude and latitude coordinates of the lamppost 203 in a real-world coordinate system may be read from the calibration reference A by the respective sensors of the roadside sensing unit. From this, the reference position information at the other corner positions B, C, D can be calculated. With this information, the roadside multi-sensor setup can be aligned with a real-world coordinate system in addition to being internally calibrated, so that the traffic information collected at the roadside is suitable for a wider range of application scenarios.
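As a sketch of that last step, once the world position of one corner reference (e.g. A) has been decoded, the remaining corners follow from the known geometry of the rectangular frame. The example below works in local planar metres and assumes corner order A → B → C → D; the function name, heading, and side lengths are illustrative assumptions, not values given in the source:

```python
import numpy as np

def frame_corners(ref_a_xy, heading_rad, width, depth):
    """From the decoded position of corner A (local planar metres),
    the frame's heading, and its two side lengths, derive the
    positions of corners B, C, D of the rectangle A -> B -> C -> D."""
    along = np.array([np.cos(heading_rad), np.sin(heading_rad)])   # A -> B direction
    across = np.array([-np.sin(heading_rad), np.cos(heading_rad)])  # B -> C direction
    a = np.asarray(ref_a_xy, float)
    b = a + width * along
    c = b + depth * across
    d = a + depth * across
    return b, c, d
```

A conversion from these local offsets back to longitude/latitude would additionally require a geodetic projection, which is omitted here.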
FIG. 6 shows a schematic diagram of a deployment device 300 for calibrating a reference, according to an exemplary embodiment of the present invention.
In this exemplary embodiment, the calibration references A, B, C, D to be integrated into the fixed road feature 201 take the form of an array of coding patterns, and the fixed road feature 201 is a lane marking. The deployment device 300 for the calibration references A, B, C, D comprises, for example, a moving mechanism 301, an application mechanism 302, and a storage mechanism 303. The list of calibration references to be integrated, the integration position coordinates, and the integration trajectory are pre-stored in the storage mechanism 303; this information has been received from outside via a communication interface (not shown), for example during the initialization phase of the deployment device 300. Based on the pre-stored information, the moving mechanism 301 drives the deployment device 300 along the predetermined trajectory and, upon reaching a specified location, the application mechanism 302 prints, pastes, photo-etches, or otherwise permanently embeds the coding pattern into the fixed road feature 201.
As another example, the deployment device 300 may be configured as, or be part of, a machine dedicated to printing lane markings 201. The deployment device 300 can then embed the calibration references A, B, C, D while printing the lane markings 201, without a second printing pass.
As another example, the calibration references A, B, C, D are not in the form of a coding pattern but, for example, in the form of a corner reflector or another three-dimensional identifier, in which case the application mechanism 302 is configured in another suitable way to apply the particular type of calibration reference A, B, C, D.
Although specific embodiments of the invention have been described in detail herein, they are presented for purposes of illustration only and are not to be construed as limiting the scope of the invention. Various substitutions, alterations, and modifications can be made without departing from the spirit and scope of the invention.

Claims (15)

1. A method for automatic calibration of a roadside sensing unit (1), the roadside sensing unit (1) comprising a first sensor (20) and a second sensor (30), the first sensor (20) and the second sensor (30) having at least partially overlapping detection ranges, the method comprising the steps of:
S1: acquiring, by the first sensor (20), a first detection result of a defined section of the fixed road feature (201), in which defined section calibration references (A, B, C, D) are integrated;
S2: acquiring, by the second sensor (30), a second detection result of the defined section of the fixed road feature (201); and
S3: determining a calibration parameter between the first sensor (20) and the second sensor (30) based on a comparison of the first detection result and the second detection result.
2. The method of claim 1, wherein the method further comprises the steps of:
-acquiring reference position information in a world coordinate system about the determined section of the fixed road feature (201), said reference position information being pre-stored in particular in the calibration references (A, B, C, D); and
-adjusting the determined calibration parameter between the first sensor (20) and the second sensor (30) based on the reference position information.
3. The method according to claim 1 or 2, wherein the step S3 comprises:
extracting a region of interest (200) defined by the calibration references (A, B, C, D) from the first detection result and the second detection result, respectively;
determining first position coordinates of the corner points of the region of interest (200) in a first sensor coordinate system and second position coordinates in a second sensor coordinate system; and
calculating a rotation-translation matrix between the first sensor coordinate system and the second sensor coordinate system based on the first position coordinates and the second position coordinates of the corner points.
4. A method according to any one of claims 1 to 3, wherein said step S3 comprises:
extracting a region of interest (200) defined by the calibration references (A, B, C, D) from the first detection result and the second detection result, respectively;
projecting the region of interest (200) detected by the first sensor (20) and the region of interest (200) detected by the second sensor (30) into the same coordinate system; and
adjusting a rotation-translation matrix between the first sensor coordinate system and the second sensor coordinate system according to the deviation between the projection results.
5. The method according to any one of claims 1 to 4, wherein the plane in which the region of interest (200) defined by the calibration references (A, B, C, D) lies is coplanar with, or parallel to, the road plane.
6. The method according to any one of claims 1 to 5, wherein the calibration references (A, B, C, D) delimit a rectangular region of interest (200), two parallel edges of the rectangular region of interest (200) being parallel to the longitudinal extension direction of the fixed road feature (201) and/or to the extension direction of the fixed road feature sequence, and the other two parallel edges of the rectangular region of interest (200) being perpendicular thereto.
7. The method according to claim 6, wherein a calibration reference (A, B, C, D) is arranged at each of at least three corner points of the rectangular region of interest (200).
8. The method according to any one of claims 1 to 7, wherein the calibration references (A, B, C, D) comprise an array of planar identifiers, the planar identifiers comprising ArUco codes, QR codes, bar codes, symbols, characters or numbers, and/or stereoscopic identifiers comprising corner reflectors.
9. The method according to claim 8, wherein the planar identifier is printed and/or glued in a lane marking, the contour of the planar identifier being adapted to the shape of a determined section of the lane marking.
10. The method according to claim 8 or 9, wherein digital sequence number information is pre-stored in the planar identifier.
11. The method according to any one of claims 1 to 10, wherein the first sensor (20) and/or the second sensor (30) comprises an image sensor, and wherein in the step S3:
determining pixel distances between corner points of the region of interest (200) in an image captured by an image sensor;
acquiring actual distances between corner points of the region of interest (200);
solving a correspondence matrix between a pixel coordinate system of the image sensor and a ground plane coordinate system according to the pixel distance and the actual distance; and
determining coordinates of each corner point of the region of interest (200) in the image sensor coordinate system based on the correspondence matrix.
12. An apparatus (10) for automatic calibration of a roadside sensing unit (1), the apparatus (10) being configured to perform the method according to any one of claims 1 to 11, the apparatus (10) comprising:
a first acquisition module (11) configured to acquire a first detection result of the determined section of the fixed road feature (201) by a first sensor (20);
a second acquisition module (12) configured to acquire a second detection result of the determined section of the fixed road feature (201) by a second sensor (30); and
a determination module (13) configured to determine a calibration parameter between the first sensor (20) and the second sensor (30) based on a comparison of the first detection result and the second detection result.
13. A roadside sensing unit (1), the roadside sensing unit (1) comprising:
-a first sensor (20) and a second sensor (30), the first sensor (20) and the second sensor (30) having detection ranges that at least partially overlap; and
the device (10) according to claim 12.
14. A deployment device (300) for calibration references (A, B, C, D), configured to integrate the calibration references (A, B, C, D) used in the method according to any one of claims 1 to 11 into a fixed road feature (201).
15. A computer program product, comprising a computer program which, when executed by a computer, implements the method according to any one of claims 1 to 11.
CN202111253457.XA 2021-10-27 2021-10-27 Method and device for automatic calibration of a road side sensing unit Pending CN116027283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111253457.XA CN116027283A (en) 2021-10-27 2021-10-27 Method and device for automatic calibration of a road side sensing unit

Publications (1)

Publication Number Publication Date
CN116027283A (en) 2023-04-28

Family

ID=86073172


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116449347A (en) * 2023-06-14 2023-07-18 蘑菇车联信息科技有限公司 Calibration method and device of roadside laser radar and electronic equipment
CN116449347B (en) * 2023-06-14 2023-10-03 蘑菇车联信息科技有限公司 Calibration method and device of roadside laser radar and electronic equipment
CN116843747A (en) * 2023-08-31 2023-10-03 北京路凯智行科技有限公司 Calibration method and calibration system for camera and laser radar
CN116843747B (en) * 2023-08-31 2024-01-26 北京路凯智行科技有限公司 Calibration method and calibration system for camera and laser radar

Legal Events

Date Code Title Description
PB01 Publication