CN116989825A - Combined calibration method and system for road side laser radar-camera-UTM coordinate system - Google Patents

Combined calibration method and system for road side laser radar-camera-UTM coordinate system

Info

Publication number
CN116989825A
CN116989825A (application number CN202310972417.3A)
Authority
CN
China
Prior art keywords
coordinate system
laser radar
camera
utm
lidar
Prior art date
Legal status
Pending
Application number
CN202310972417.3A
Other languages
Chinese (zh)
Inventor
陈仕韬
严宇宸
张皓霖
翟帅帅
郑南宁
Current Assignee
Ningbo Shun'an Artificial Intelligence Research Institute
Xian Jiaotong University
Original Assignee
Ningbo Shun'an Artificial Intelligence Research Institute
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Ningbo Shun'an Artificial Intelligence Research Institute, Xian Jiaotong University filed Critical Ningbo Shun'an Artificial Intelligence Research Institute
Priority to CN202310972417.3A
Publication of CN116989825A

Links

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/42 - Simultaneous measurement of distance and other co-ordinates
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for mapping or imaging
    • G01S19/00 - Satellite radio beacon positioning systems; determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/40 - Correcting position, velocity or attitude
    • G01S19/41 - Differential correction, e.g. DGPS [differential GPS]
    • G01S19/42 - Determining position
    • G01S19/43 - Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Manufacturing & Machinery (AREA)
  • Image Processing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a roadside lidar-camera-UTM coordinate system joint calibration method and system. The method comprises the following steps: selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM; sequentially extracting the GNSS RTK device acquisition points from the lidar point cloud to obtain the coordinate sequence P_Lidar in the lidar coordinate system; sequentially extracting the GNSS RTK device acquisition points in the camera image to obtain the coordinate sequence P_Pixel in the pixel coordinate system; and, based on P_UTM, P_Lidar and P_Pixel, obtaining the extrinsic transformation relations among the lidar coordinate system, the camera coordinate system and the UTM coordinate system through nonlinear optimization, so that joint extrinsic calibration of the roadside lidar-camera-UTM coordinate system can be achieved more conveniently, economically and accurately. The application avoids the high cost of acquiring a high-precision map with no loss of calibration accuracy, allows repeated acquisition at different traffic intersections, does not interfere with normal traffic, and reduces the safety risks to acquisition personnel.

Description

Combined calibration method and system for road side laser radar-camera-UTM coordinate system
Technical Field
The application belongs to the field of intelligent transportation, and particularly relates to a method and a system for joint calibration of a roadside lidar-camera-UTM coordinate system.
Background
With the development of intelligent transportation, intelligent roadside devices are required to perceive traffic conditions accurately and in real time. At present, roadside devices mainly comprise sensors such as lidars and RGB cameras. To ensure that the lidar and the RGB camera give a coordinated, consistent spatial description of the scene, their extrinsic parameters must be calibrated.
To interface with upstream and downstream algorithms, the perception output of roadside devices must further be converted into a unified coordinate system, for which the UTM coordinate system is generally chosen in the intelligent transportation field. The conventional approach registers roadside devices to the UTM coordinate system by matching the lidar point cloud against a high-precision map to localize the lidar in UTM coordinates, and then obtains the camera's UTM pose through the lidar-RGB camera extrinsic calibration.
However, acquiring a high-precision map is very difficult and carries extremely high economic and policy costs; building a high-precision map at every intersection merely to register roadside devices to the UTM coordinate system is an extremely cost-inefficient scheme. Improvements are therefore needed in the art.
Disclosure of Invention
In order to solve the problems in the prior art, the embodiments of the application aim to provide a joint calibration method, system and device capable of realizing joint extrinsic calibration of the roadside lidar-camera-UTM coordinate system more conveniently, economically and accurately.
In order to achieve the above object, a first aspect of the present application provides an extrinsic calibration method for the roadside lidar-camera-UTM coordinate system, comprising the following steps:
selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM;
sequentially extracting the GNSS RTK device acquisition points from the lidar point cloud, and obtaining the coordinate sequence P_Lidar in the lidar coordinate system;
sequentially extracting the GNSS RTK device acquisition points in the camera image, and obtaining the coordinate sequence P_Pixel in the pixel coordinate system;
and, based on P_UTM, P_Lidar and P_Pixel, obtaining the extrinsic transformation relations among the lidar coordinate system, the camera coordinate system and the UTM coordinate system through nonlinear optimization.
Further, selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM comprises the following steps: selecting the common area perceived by both the camera and the lidar as the acquisition area, and judging the common area by observing scene features in the camera image and the lidar point cloud when the camera-lidar extrinsics are unknown; selecting at least 3 acquisition points in the common area, distributed as uniformly as possible over the acquisition area and not collinear; and converting the Earth longitude-latitude coordinates into plane coordinates through a projection algorithm, the UTM coordinates adding altitude information on top of the plane coordinates.
Further, sequentially extracting the GNSS RTK device acquisition points from the lidar point cloud and obtaining the coordinate sequence P_Lidar in the lidar coordinate system comprises the following steps:
obtaining a plane equation through plane fitting of the point cloud corresponding to the ground plane;
for each acquisition point, extracting the point cloud corresponding to the GNSS RTK device and obtaining a line equation through line fitting, wherein the GNSS RTK device is a straight pole that must be held vertical to the ground during measurement, and the line fitting of the GNSS RTK device in the lidar point cloud uses the RANSAC algorithm;
and computing the coordinates of each acquisition point in the lidar coordinate system from the fitted line equation and ground-plane equation.
Further, sequentially selecting the GNSS RTK device acquisition points in the camera image and obtaining the coordinate sequence P_Pixel in the pixel coordinate system comprises the following steps:
detecting the GNSS RTK device in the camera image and acquiring its precise position in the image;
and extracting the intersection point of the GNSS RTK device and the ground to obtain the pixel coordinates (u, v) of the acquisition point, the pixel coordinate system taking the upper-left corner of the image as the origin, with u increasing to the right and v increasing downward.
Furthermore, when detecting the GNSS RTK device in the camera image, the detection adopts a deep-learning object detection algorithm such as YOLO or SSD, realized by labeling and training on the GNSS RTK device in advance, or adopts manual selection.
Further, based on P_UTM, P_Lidar and P_Pixel, obtaining the extrinsic transformation relations between the lidar and camera coordinate systems and between the lidar and UTM coordinate systems through nonlinear optimization comprises the following steps:
based on the UTM coordinate sequence P_UTM and the lidar coordinate sequence P_Lidar, obtaining the extrinsic relation between the lidar coordinate system and the UTM coordinate system through nonlinear optimization;
based on the projection of the lidar coordinate sequence P_Lidar onto the image and the pixel coordinate sequence P_Pixel, obtaining the extrinsic relation between the lidar coordinate system and the camera coordinate system through nonlinear optimization.
Further, when obtaining the extrinsic relation between the lidar coordinate system and the UTM coordinate system through nonlinear optimization based on the UTM coordinate sequence P_UTM and the lidar coordinate sequence P_Lidar, n matching point pairs are selected in total, and the coordinate sequences P_UTM and P_Lidar are written in homogeneous matrix form:

$$P_{UTM}=\begin{bmatrix}x_1^{U}&\cdots&x_n^{U}\\y_1^{U}&\cdots&y_n^{U}\\z_1^{U}&\cdots&z_n^{U}\\1&\cdots&1\end{bmatrix},\qquad P_{Lidar}=\begin{bmatrix}x_1^{L}&\cdots&x_n^{L}\\y_1^{L}&\cdots&y_n^{L}\\z_1^{L}&\cdots&z_n^{L}\\1&\cdots&1\end{bmatrix}$$

The equation to be optimized is:

$$T_{Lidar}^{UTM\,\ast}=\arg\min_{T}\sum_{i=1}^{n}\left\|T\,p_i^{Lidar}-p_i^{UTM}\right\|^{2}$$

where $T_{Lidar}^{UTM\,\ast}$ is the optimal extrinsic from the lidar coordinate system to the UTM coordinate system.
Further, based on the projection of the lidar coordinate sequence P_Lidar onto the image and the pixel coordinate sequence P_Pixel, obtaining the extrinsic relation between the lidar coordinate system and the camera coordinate system through nonlinear optimization comprises the following steps:
a) First, the camera intrinsic matrix K and distortion coefficients D are obtained. The camera intrinsic matrix K is 3×3, of the form:

$$K=\begin{bmatrix}f_x&0&c_x\\0&f_y&c_y\\0&0&1\end{bmatrix}$$

where $f_x$ and $f_y$ are the focal lengths in the horizontal and vertical directions and $c_x$ and $c_y$ are the principal-point pixel offsets; the distortion coefficient vector D is 5-dimensional, of the form:

D = [k1, k2, p1, p2, k3]

where k1, k2, k3 are the first three radial distortion coefficients and p1, p2 are the two tangential distortion coefficients;
b) P_Lidar is transformed from the lidar coordinate system to the camera coordinate system: $P_{Camera}=T_{Lidar}^{Camera}\,P_{Lidar}$;
c) projective transformation is performed with the intrinsic matrix K and converted to homogeneous coordinates, giving $(u, v, 1)^{T}=\tfrac{1}{Z}\,K\,P_{Camera}$, where Z is the third coordinate of $K\,P_{Camera}$;
d) distortion is applied according to the Taylor expansions of radial and tangential distortion:

Radial distortion:

$$x = x_0\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6),\qquad y = y_0\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$$

Tangential distortion:

$$x = x_0 + 2 p_1 x_0 y_0 + p_2\,(r^2 + 2 x_0^2),\qquad y = y_0 + p_1\,(r^2 + 2 y_0^2) + 2 p_2 x_0 y_0$$

where $r^2 = x_0^2 + y_0^2$, $(x_0, y_0)$ are the coordinates before distortion and $(x, y)$ are the coordinates after distortion;
e) an optimization equation is constructed:

$$T_{Lidar}^{Camera\,\ast}=\arg\min_{T}\sum_{i=1}^{n}\left\|D\!\left(K\,T\,p_i^{Lidar}\right)-p_i^{Pixel}\right\|^{2}$$

where n is the number of sampling points, D(·) denotes the process of applying distortion, and $T_{Lidar}^{Camera\,\ast}$ is the optimal extrinsic from the lidar coordinate system to the camera coordinate system.
In a second aspect of the present application, a joint calibration system is provided, including a data acquisition and uploading module, a target detection and extraction module, and an external parameter calculation and verification module;
the data acquisition and uploading module acquires longitude, latitude and altitude information by using GNSS RTK equipment and uploads the longitude, latitude and altitude information to the database; inputting camera internal parameters and distortion coefficients by a user, and uploading the camera internal parameters and distortion coefficients to a database; the laser radar acquires the point cloud during sampling and uploads the point cloud to a database; the camera acquires pictures during sampling and uploads the pictures to the database;
the target detection and extraction module detects and extracts an intersection point of the GNSS RTK equipment in the point cloud and the ground, and detects and extracts an intersection point of the GNSS RTK equipment in the image and the ground;
the external parameter calculation and verification module is used for calculating external parameters of the laser radar coordinate system and the UTM coordinate system, calculating external parameters of the laser radar coordinate system and the camera coordinate system, converting the laser radar point cloud into the UTM coordinate system, comparing and verifying with a high-precision map, and projecting the laser radar point cloud onto an image for verification.
In a third aspect of the present application, a joint calibration device is provided, including a GNSS RTK apparatus, a lidar, a camera, a memory, and a processor; the GNSS RTK device is used for measuring UTM coordinates of the determined location; the laser radar is used for collecting three-dimensional point cloud data of target equipment; the camera is used for collecting image data of the target equipment; the memory is used for storing the data of the equipment and the uploading data of the user; the processor is used for executing the method steps of the joint calibration method of the road side laser radar-camera-UTM coordinate system.
Compared with the prior art, the application has at least the following beneficial effects:
according to the joint calibration method of the road side laser radar-camera-UTM coordinate system, UTM coordinate information of a plurality of points is collected through GNSS RTK equipment, and simultaneously, the laser radar and the camera respectively collect three-dimensional point cloud and image data at corresponding moments; detecting and extracting three-dimensional point clouds and image data of each acquisition point to obtain coordinates of the acquisition point under a laser radar coordinate system and a pixel coordinate system; calculating coordinate values of each acquisition point in different coordinate systems to obtain external parameter transformation relations among a laser radar coordinate system, a camera coordinate system and a UTM coordinate system; the GNSS RTK equipment is used for directly selecting points to acquire coordinates, so that the high cost of acquiring a high-precision map is saved, and the precision is not lost; because GNSS RTK equipment is nimble portable, is straight-bar form, and alone can use by hand, can gather at different traffic intersections, does not disturb normal traffic, has avoided using big demarcation plate or demarcation car to the influence of traffic when putting according to different positions in the middle of the road, has also reduced the potential safety hazard of gathering personnel. By means of the data information of the plurality of acquisition points, the external parameter calibration of the laser radar and the camera is realized, more matching point pairs can be selected, and in the scheme of using the calibration plate traditionally, each calibration plate can only have four matching point pairs, and the distribution is centralized, so that the calibration of the road side large visual angle is not facilitated, and therefore, the scheme can obtain the external parameter of the laser radar and the camera more accurately.
Drawings
Fig. 1 is a schematic flow chart of a road side laser radar-camera-UTM coordinate system joint calibration method without depending on a high-precision map.
FIG. 2 is a schematic illustration of handheld GNSS RTK device measurements.
Fig. 3 is a schematic flow diagram of laser radar point cloud processing.
Fig. 4 is a schematic diagram of the calibrated lidar point cloud matched against a high-precision map for verification.
Fig. 5 is a schematic diagram of the laser radar point cloud projected into an image after calibration.
FIG. 6 is a schematic diagram of the web application interface of the designed joint calibration system.
Detailed Description
The present application will be described in further detail with reference to specific embodiments and drawings in order to make the objects, technical solutions and advantages of the present application more apparent. It should be noted that the specific embodiments described herein are for the purpose of illustrating the application only and are not to be construed as limiting the application.
A first aspect of the embodiment of the present application provides a laser radar-camera-UTM coordinate system joint calibration method, and fig. 1 is a flowchart of the method.
As shown in fig. 1, the method may include the steps of:
1) Selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM.
2) Sequentially extracting the GNSS RTK device acquisition points from the lidar point cloud, and obtaining the coordinate sequence P_Lidar in the lidar coordinate system.
3) Sequentially extracting the GNSS RTK device acquisition points in the camera image, and obtaining the coordinate sequence P_Pixel in the pixel coordinate system.
4) Based on P_UTM, P_Lidar and P_Pixel, obtaining the extrinsic transformation relations among the lidar, camera and UTM coordinate systems through nonlinear optimization.
It should be noted that the coordinate systems used in the embodiments of the application are: the UTM (Universal Transverse Mercator) coordinate system T_UTM, the lidar coordinate system T_Lidar, the camera coordinate system T_Camera, and the pixel coordinate system T_Pixel, where the pixel coordinate system T_Pixel is two-dimensional and the others are three-dimensional.
In the embodiments of the application, the transformation from the lidar coordinate system to the UTM coordinate system, T_Lidar→UTM, and the transformation from the camera coordinate system to the UTM coordinate system, T_Camera→UTM, are the parameters ultimately required by the calibration method. Each transformation is a 4×4 matrix comprising a 3×3 rotation matrix and a three-dimensional translation vector, where rotations are expressed in radians and translations in meters; a composition sketch follows.
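The following Python sketch (ours, not from the patent text) illustrates how the two calibrated 4×4 extrinsics compose: given T_Lidar→UTM and T_Lidar→Camera, the camera-to-UTM transform follows by inversion and multiplication.

```python
import numpy as np

def compose_camera_to_utm(T_lidar_to_utm: np.ndarray,
                          T_lidar_to_camera: np.ndarray) -> np.ndarray:
    """Camera -> UTM = (Lidar -> UTM) @ (Camera -> Lidar)."""
    T_camera_to_lidar = np.linalg.inv(T_lidar_to_camera)
    return T_lidar_to_utm @ T_camera_to_lidar

def transform_point(T: np.ndarray, p_xyz: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p_xyz, 1.0))[:3]
```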
According to the embodiments of the application, selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM comprises: selecting the common area perceived by both the camera and the lidar as the acquisition area; when the camera-lidar extrinsics are unknown, the common area is judged by observing scene features in the camera image and the lidar point cloud.
According to the embodiments of the application, selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM further comprises: selecting at least 3 acquisition points in the common area, distributed as uniformly as possible over the acquisition area and not collinear. It should be noted that the more acquisition points there are, the more accurate the final calibration, and that the projections of the acquisition points should be distributed as uniformly as possible over the image, avoiding clustering at the image edges.
According to the embodiments of the application, selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM further comprises: obtaining longitude, latitude and altitude for each acquisition point with the GNSS RTK device and converting them into the corresponding UTM coordinates (E, N, H). It should be noted that UTM coordinates are projected coordinates: the Earth longitude-latitude coordinates are converted into plane coordinates through a projection algorithm. The UTM system divides the Earth into 60 zones in the east-west direction, i.e. longitude zones 1-60, and into 20 bands between 80°S and 84°N in the north-south direction, i.e. latitude bands C-X. Each longitude-latitude cell so divided is identified by a unique designator, referred to as a coordinate band. Knowing the longitude-latitude and the local UTM coordinate band, the longitude-latitude can be converted into the UTM coordinates (E, N). In the embodiments of the application, altitude information is added on top of the two-dimensional UTM coordinates, yielding the three-dimensional coordinates (E, N, H), which makes it more convenient to describe the conversion relations between the UTM coordinate system and the lidar and camera coordinate systems; a conversion sketch follows.
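As an illustration, a minimal sketch of this longitude/latitude/altitude to (E, N, H) conversion using the pyproj library; the UTM zone EPSG code is a placeholder assumption and must be chosen to match the site longitude (the patent does not name a conversion library).

```python
from pyproj import Transformer

def llh_to_utm(lat: float, lon: float, alt: float,
               utm_epsg: str = "EPSG:32651") -> tuple[float, float, float]:
    """Convert WGS-84 lat/lon/alt to UTM (E, N, H).

    EPSG:32651 (UTM zone 51N) is only an example; pick the zone
    covering the acquisition site.
    """
    tf = Transformer.from_crs("EPSG:4326", utm_epsg, always_xy=True)
    easting, northing = tf.transform(lon, lat)
    return easting, northing, alt  # altitude carried through as H
```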
According to one embodiment of the application, the GNSS RTK device acquisition points are sequentially extracted from the lidar point cloud, and the coordinate sequence P_Lidar in the lidar coordinate system is obtained as follows:
1) A plane equation is obtained by plane fitting the point cloud corresponding to the ground plane. In one embodiment of the application, the plane fitting uses the RANSAC (random sample consensus) algorithm, and the resulting plane equation is ax + by + cz + d = 0.
2) For each acquisition point, the point cloud corresponding to the GNSS RTK device is extracted, and a line equation is obtained by line fitting. It should be noted that the GNSS RTK device is a straight pole fitted with a bubble level, and must be held vertical to the ground during measurement. In one embodiment of the application, the RANSAC algorithm is used to fit a line to the GNSS RTK device in the lidar point cloud, yielding the line equation

$$\frac{x-x_1}{a_0}=\frac{y-y_1}{b_0}=\frac{z-z_1}{c_0}$$

where $(x_1, y_1, z_1)$ is a point on the line and $a_0, b_0, c_0 \neq 0$.
3) The coordinates (x, y, z) of each acquisition point in the lidar coordinate system are computed from the fitted line equation and ground-plane equation. In the embodiments of the application, the acquisition point is the intersection of the line and the ground plane; writing the line parametrically as $p(t)=(x_1,y_1,z_1)^{T}+t\,(a_0,b_0,c_0)^{T}$ and substituting into the plane equation gives

$$t=-\frac{a x_1 + b y_1 + c z_1 + d}{a a_0 + b b_0 + c c_0},\qquad (x, y, z)^{T}=(x_1,y_1,z_1)^{T}+t\,(a_0,b_0,c_0)^{T}$$

A numpy sketch of the plane fit, line fit and intersection follows.
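A minimal numpy sketch of steps 1)-3): RANSAC plane fit of the ground, RANSAC line fit of the pole, and the line-plane intersection. Thresholds and iteration counts are illustrative assumptions, not values from the patent.

```python
import numpy as np

def ransac_plane(pts, iters=1000, thresh=0.05, seed=0):
    """Fit ax + by + cz + d = 0 to ground points; returns (a, b, c, d)."""
    rng = np.random.default_rng(seed)
    best, best_n = None, 0
    for _ in range(iters):
        p1, p2, p3 = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)
        if np.linalg.norm(n) < 1e-9:
            continue  # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        d = -n @ p1
        inliers = int(np.sum(np.abs(pts @ n + d) < thresh))
        if inliers > best_n:
            best, best_n = (n[0], n[1], n[2], d), inliers
    return best

def ransac_line(pts, iters=500, thresh=0.03, seed=0):
    """Fit a 3D line (point p0, unit direction v) to the pole points."""
    rng = np.random.default_rng(seed)
    best, best_n = None, 0
    for _ in range(iters):
        p1, p2 = pts[rng.choice(len(pts), 2, replace=False)]
        v = p2 - p1
        if np.linalg.norm(v) < 1e-9:
            continue
        v = v / np.linalg.norm(v)
        # distance of every point to the candidate line
        dist = np.linalg.norm(np.cross(pts - p1, v), axis=1)
        inliers = int(np.sum(dist < thresh))
        if inliers > best_n:
            best, best_n = (p1, v), inliers
    return best

def line_plane_intersection(p0, v, plane):
    """Intersect p(t) = p0 + t v with the plane ax + by + cz + d = 0."""
    a, b, c, d = plane
    n = np.array([a, b, c])
    t = -(n @ p0 + d) / (n @ v)
    return p0 + t * v  # acquisition-point coordinates in the lidar frame
```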
according to one embodiment of the application, GNSS RTK device acquisition points are sequentially selected in the camera image, and a coordinate sequence P under a pixel coordinate system is obtained Pixel The method comprises the following steps:
1) The GNSS RTK device is detected in the image. The detection can use a deep-learning object detection algorithm such as YOLO or SSD, with the GNSS RTK device labeled and trained on in advance. Alternatively, manual selection can be used; it requires human intervention but is more accurate and saves time, compute and other costs. In one embodiment of the application, manual selection is used to find the precise position of the GNSS RTK device in the image.
2) The intersection point of the GNSS RTK device and the ground is extracted, giving the pixel coordinates (u, v) of the acquisition point. In one embodiment of the application, the pixel coordinate system takes the upper-left corner of the image as the origin, with u increasing to the right and v increasing downward; a manual-selection sketch follows.
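A minimal OpenCV sketch of the manual selection mode: clicking the pole-ground intersection in the displayed image records (u, v) in exactly this pixel convention (origin at the top-left, u right, v down). Window and function names are ours.

```python
import cv2

def pick_points(image_path: str) -> list[tuple[int, int]]:
    """Click each pole/ground intersection; press any key to finish."""
    img = cv2.imread(image_path)
    clicks: list[tuple[int, int]] = []

    def on_mouse(event, u, v, flags, param):
        if event == cv2.EVENT_LBUTTONDOWN:
            clicks.append((u, v))                       # pixel coordinates
            cv2.circle(img, (u, v), 4, (0, 0, 255), -1)  # mark the click
            cv2.imshow("pick", img)

    cv2.imshow("pick", img)
    cv2.setMouseCallback("pick", on_mouse)
    cv2.waitKey(0)
    cv2.destroyAllWindows()
    return clicks
```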
According to one embodiment of the application, based on P_UTM, P_Lidar and P_Pixel, the extrinsic transformation relations between the lidar and camera coordinate systems and between the lidar and UTM coordinate systems are obtained through nonlinear optimization as follows:
1) Based on the UTM coordinate sequence P_UTM and the lidar coordinate sequence P_Lidar, the extrinsic relation between the lidar coordinate system and the UTM coordinate system is obtained through nonlinear optimization. In one embodiment of the application, n matching point pairs are selected in total, and the coordinate sequences P_UTM and P_Lidar can each be written in homogeneous matrix form:

$$P_{UTM}=\begin{bmatrix}x_1^{U}&\cdots&x_n^{U}\\y_1^{U}&\cdots&y_n^{U}\\z_1^{U}&\cdots&z_n^{U}\\1&\cdots&1\end{bmatrix},\qquad P_{Lidar}=\begin{bmatrix}x_1^{L}&\cdots&x_n^{L}\\y_1^{L}&\cdots&y_n^{L}\\z_1^{L}&\cdots&z_n^{L}\\1&\cdots&1\end{bmatrix}$$

The equation to be optimized is

$$T_{Lidar}^{UTM\,\ast}=\arg\min_{T}\sum_{i=1}^{n}\left\|T\,p_i^{Lidar}-p_i^{UTM}\right\|^{2}$$

where $T_{Lidar}^{UTM\,\ast}$ is the optimal extrinsic from the lidar coordinate system to the UTM coordinate system; a solver sketch follows.
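A sketch of this optimization in Python; the patent does not name a solver, so a generic 6-DoF nonlinear least-squares over a rotation vector and translation is assumed here (the same point-to-point problem also admits a closed-form SVD solution, e.g. Umeyama alignment).

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def fit_lidar_to_utm(P_lidar: np.ndarray, P_utm: np.ndarray) -> np.ndarray:
    """Find the 4x4 T minimizing sum_i ||T p_i^Lidar - p_i^UTM||^2.

    P_lidar, P_utm: (n, 3) arrays of matched points, n >= 3.
    """
    # Optimize relative to the UTM centroid: raw UTM values are huge
    # (~1e5-1e7 m), which would dominate the numerics otherwise.
    origin = P_utm.mean(axis=0)
    Q = P_utm - origin

    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        return ((P_lidar @ R.T + x[3:]) - Q).ravel()

    sol = least_squares(residuals, x0=np.zeros(6))
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    T[:3, 3] = sol.x[3:] + origin  # shift translation back into UTM
    return T
```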
2) Based on the projection of the lidar coordinate sequence P_Lidar onto the image and the pixel coordinate sequence P_Pixel, the extrinsic relation between the lidar coordinate system and the camera coordinate system is obtained through nonlinear optimization. In one embodiment of the application, the following steps are required:
a) First, the camera intrinsic matrix K and distortion coefficients D are obtained. The camera intrinsic matrix K is 3×3, of the form:

$$K=\begin{bmatrix}f_x&0&c_x\\0&f_y&c_y\\0&0&1\end{bmatrix}$$

where $f_x$ and $f_y$ are the focal lengths in the horizontal and vertical directions and $c_x$ and $c_y$ are the principal-point pixel offsets; the distortion coefficient vector D is 5-dimensional, of the form:

D = [k1, k2, p1, p2, k3]

where k1, k2, k3 are the first three radial distortion coefficients and p1, p2 are the two tangential distortion coefficients.
b) P_Lidar is transformed from the lidar coordinate system to the camera coordinate system: $P_{Camera}=T_{Lidar}^{Camera}\,P_{Lidar}$.
c) Projective transformation is performed with the intrinsic matrix K and converted to homogeneous coordinates, giving $(u, v, 1)^{T}=\tfrac{1}{Z}\,K\,P_{Camera}$, where Z is the third coordinate of $K\,P_{Camera}$.
d) Distortion is applied according to the Taylor expansions of radial and tangential distortion:

Radial distortion:

$$x = x_0\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6),\qquad y = y_0\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$$

Tangential distortion:

$$x = x_0 + 2 p_1 x_0 y_0 + p_2\,(r^2 + 2 x_0^2),\qquad y = y_0 + p_1\,(r^2 + 2 y_0^2) + 2 p_2 x_0 y_0$$

where $r^2 = x_0^2 + y_0^2$, $(x_0, y_0)$ are the coordinates before distortion and $(x, y)$ are the coordinates after distortion.
e) An optimization equation is constructed:

$$T_{Lidar}^{Camera\,\ast}=\arg\min_{T}\sum_{i=1}^{n}\left\|D\!\left(K\,T\,p_i^{Lidar}\right)-p_i^{Pixel}\right\|^{2}$$

where n is the number of sampling points, D(·) denotes the process of applying distortion, and $T_{Lidar}^{Camera\,\ast}$ is the optimal extrinsic from the lidar coordinate system to the camera coordinate system; a code sketch follows.
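A sketch of steps b)-e) in Python using OpenCV, whose iterative PnP solver minimizes exactly this reprojection error under the [k1, k2, p1, p2, k3] distortion model; the patent only says "nonlinear optimization", so posing the problem as PnP is our assumption (it needs at least 4 point pairs).

```python
import cv2
import numpy as np

def fit_lidar_to_camera(P_lidar: np.ndarray, P_pixel: np.ndarray,
                        K: np.ndarray, D: np.ndarray) -> np.ndarray:
    """Find T minimizing sum_i ||D(K T p_i^Lidar) - p_i^Pixel||^2."""
    ok, rvec, tvec = cv2.solvePnP(
        P_lidar.astype(np.float64), P_pixel.astype(np.float64),
        K, D, flags=cv2.SOLVEPNP_ITERATIVE)
    assert ok, "PnP failed"
    T = np.eye(4)
    T[:3, :3] = cv2.Rodrigues(rvec)[0]   # rotation vector -> matrix
    T[:3, 3] = tvec.ravel()
    return T

def mean_reprojection_error(P_lidar, P_pixel, K, D, T) -> float:
    """Mean pixel error; useful for the projection-based verification step."""
    rvec = cv2.Rodrigues(np.ascontiguousarray(T[:3, :3]))[0]
    proj, _ = cv2.projectPoints(P_lidar.astype(np.float64), rvec,
                                T[:3, 3], K, D)
    return float(np.linalg.norm(proj.reshape(-1, 2) - P_pixel, axis=1).mean())
```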
In a second aspect of the application, there is provided a joint calibration system comprising:
1) Data acquisition and uploading module: acquires longitude, latitude and altitude information with the GNSS RTK device and uploads it to the database; the user inputs the camera intrinsic parameters and distortion coefficients, which are uploaded to the database; the lidar acquires the point cloud at sampling time and uploads it to the database; and the camera acquires the picture at sampling time and uploads it to the database.
2) Target detection and extraction module: detects and extracts the intersection point of the GNSS RTK device with the ground in the point cloud, and detects and extracts the intersection point of the GNSS RTK device with the ground in the image.
3) External parameter calculation and verification module: calculates the extrinsics between the lidar and UTM coordinate systems, calculates the extrinsics between the lidar and camera coordinate systems, converts the lidar point cloud into the UTM coordinate system for comparison and verification against a high-precision map, and projects the lidar point cloud onto the image for verification.
In a third aspect of the present application, there is provided a joint calibration device comprising: a GNSS RTK device for measuring the UTM coordinates of a determined location; a lidar for collecting three-dimensional point cloud data of the target device; a camera for collecting image data of the target device; a memory for storing device data and user-uploaded data; and a processor for executing the above method steps and computing the extrinsics.

Claims (10)

1. A roadside lidar-camera-UTM coordinate system joint calibration method, characterized by comprising the following steps:
selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM;
sequentially extracting the GNSS RTK device acquisition points from the lidar point cloud, and obtaining the coordinate sequence P_Lidar in the lidar coordinate system;
sequentially extracting the GNSS RTK device acquisition points in the camera image, and obtaining the coordinate sequence P_Pixel in the pixel coordinate system;
and, based on P_UTM, P_Lidar and P_Pixel, obtaining the extrinsic transformation relations among the lidar coordinate system, the camera coordinate system and the UTM coordinate system through nonlinear optimization.
2. The roadside lidar-camera-UTM coordinate system joint calibration method of claim 1, wherein selecting GNSS RTK device acquisition points and acquiring the UTM coordinate sequence P_UTM comprises the following steps: selecting the common area perceived by both the camera and the lidar as the acquisition area, and judging the common area by observing scene features in the camera image and the lidar point cloud when the camera-lidar extrinsics are unknown; selecting at least 3 acquisition points in the common area, distributed as uniformly as possible over the acquisition area and not collinear; and converting the Earth longitude-latitude coordinates into plane coordinates through a projection algorithm, the UTM coordinates adding altitude information on top of the plane coordinates.
3. The roadside lidar-camera-UTM coordinate system joint calibration method according to claim 1, wherein the GNSS RTK device acquisition points are sequentially extracted from the lidar point cloud and the coordinate sequence P_Lidar in the lidar coordinate system is obtained by the following steps:
obtaining a plane equation through plane fitting of the point cloud corresponding to the ground plane;
for each acquisition point, extracting the point cloud corresponding to the GNSS RTK device and obtaining a line equation through line fitting, wherein the GNSS RTK device is a straight pole that must be held vertical to the ground during measurement, and the line fitting of the GNSS RTK device in the lidar point cloud uses the RANSAC algorithm;
and computing the coordinates of each acquisition point in the lidar coordinate system from the fitted line equation and ground-plane equation.
4. The roadside lidar-camera-UTM coordinate system joint calibration method according to claim 1, wherein the GNSS RTK device acquisition points are sequentially selected in the camera image and the coordinate sequence P_Pixel in the pixel coordinate system is obtained by the following steps:
detecting the GNSS RTK device in the camera image and acquiring its precise position in the image;
and extracting the intersection point of the GNSS RTK device and the ground to obtain the pixel coordinates (u, v) of the acquisition point, the pixel coordinate system taking the upper-left corner of the image as the origin, with u increasing to the right and v increasing downward.
5. The roadside lidar-camera-UTM coordinate system joint calibration method according to claim 1, wherein, when detecting the GNSS RTK device in the camera image, the detection adopts a deep-learning object detection algorithm such as YOLO or SSD, realized by labeling and training on the GNSS RTK device in advance, or adopts manual selection.
6. The roadside lidar-camera-UTM coordinate system joint calibration method of claim 1, wherein, based on P_UTM, P_Lidar and P_Pixel, obtaining the extrinsic transformation relations between the lidar and camera coordinate systems and between the lidar and UTM coordinate systems through nonlinear optimization comprises the following steps:
based on the UTM coordinate sequence P_UTM and the lidar coordinate sequence P_Lidar, obtaining the extrinsic relation between the lidar coordinate system and the UTM coordinate system through nonlinear optimization;
based on the projection of the lidar coordinate sequence P_Lidar onto the image and the pixel coordinate sequence P_Pixel, obtaining the extrinsic relation between the lidar coordinate system and the camera coordinate system through nonlinear optimization.
7. The roadside lidar-camera-UTM coordinate system joint calibration method according to claim 1, wherein, when obtaining the extrinsic relation between the lidar coordinate system and the UTM coordinate system through nonlinear optimization based on the UTM coordinate sequence P_UTM and the lidar coordinate sequence P_Lidar, n matching point pairs are selected in total, and the coordinate sequences P_UTM and P_Lidar are written in homogeneous matrix form:

$$P_{UTM}=\begin{bmatrix}x_1^{U}&\cdots&x_n^{U}\\y_1^{U}&\cdots&y_n^{U}\\z_1^{U}&\cdots&z_n^{U}\\1&\cdots&1\end{bmatrix},\qquad P_{Lidar}=\begin{bmatrix}x_1^{L}&\cdots&x_n^{L}\\y_1^{L}&\cdots&y_n^{L}\\z_1^{L}&\cdots&z_n^{L}\\1&\cdots&1\end{bmatrix}$$

the equation to be optimized is:

$$T_{Lidar}^{UTM\,\ast}=\arg\min_{T}\sum_{i=1}^{n}\left\|T\,p_i^{Lidar}-p_i^{UTM}\right\|^{2}$$

where $T_{Lidar}^{UTM\,\ast}$ is the optimal extrinsic from the lidar coordinate system to the UTM coordinate system.
8. The roadside lidar-camera-UTM coordinate system joint calibration method according to claim 1, wherein, based on the projection of the lidar coordinate sequence P_Lidar onto the image and the pixel coordinate sequence P_Pixel, the extrinsic relation between the lidar coordinate system and the camera coordinate system is obtained through nonlinear optimization by the following steps:

a) First, the camera intrinsic matrix K and distortion coefficients D are obtained. The camera intrinsic matrix K is 3×3, of the form:

$$K=\begin{bmatrix}f_x&0&c_x\\0&f_y&c_y\\0&0&1\end{bmatrix}$$

where $f_x$ and $f_y$ are the focal lengths in the horizontal and vertical directions and $c_x$ and $c_y$ are the principal-point pixel offsets; the distortion coefficient vector D is 5-dimensional, of the form:

D = [k1, k2, p1, p2, k3]

where k1, k2, k3 are the first three radial distortion coefficients and p1, p2 are the two tangential distortion coefficients;

b) P_Lidar is transformed from the lidar coordinate system to the camera coordinate system: $P_{Camera}=T_{Lidar}^{Camera}\,P_{Lidar}$;

c) projective transformation is performed with the intrinsic matrix K and converted to homogeneous coordinates, giving $(u, v, 1)^{T}=\tfrac{1}{Z}\,K\,P_{Camera}$, where Z is the third coordinate of $K\,P_{Camera}$;

d) distortion is applied according to the Taylor expansions of radial and tangential distortion:

radial distortion:

$$x = x_0\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6),\qquad y = y_0\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$$

tangential distortion:

$$x = x_0 + 2 p_1 x_0 y_0 + p_2\,(r^2 + 2 x_0^2),\qquad y = y_0 + p_1\,(r^2 + 2 y_0^2) + 2 p_2 x_0 y_0$$

where $r^2 = x_0^2 + y_0^2$, $(x_0, y_0)$ are the coordinates before distortion and $(x, y)$ are the coordinates after distortion;

e) an optimization equation is constructed:

$$T_{Lidar}^{Camera\,\ast}=\arg\min_{T}\sum_{i=1}^{n}\left\|D\!\left(K\,T\,p_i^{Lidar}\right)-p_i^{Pixel}\right\|^{2}$$

where n is the number of sampling points, D(·) denotes the process of applying distortion, and $T_{Lidar}^{Camera\,\ast}$ is the optimal extrinsic from the lidar coordinate system to the camera coordinate system.
9. The road side laser radar-camera-UTM coordinate system combined calibration system is characterized by comprising a data acquisition and uploading module, a target detection and extraction module and an external parameter calculation and verification module;
the data acquisition and uploading module acquires longitude, latitude and altitude information by using GNSS RTK equipment and uploads the longitude, latitude and altitude information to the database; inputting camera internal parameters and distortion coefficients by a user, and uploading the camera internal parameters and distortion coefficients to a database; the laser radar acquires the point cloud during sampling and uploads the point cloud to a database; the camera acquires pictures during sampling and uploads the pictures to the database;
the target detection and extraction module detects and extracts an intersection point of the GNSS RTK equipment in the point cloud and the ground, and detects and extracts an intersection point of the GNSS RTK equipment in the image and the ground;
the external parameter calculation and verification module is used for calculating external parameters of the laser radar coordinate system and the UTM coordinate system, calculating external parameters of the laser radar coordinate system and the camera coordinate system, converting the laser radar point cloud into the UTM coordinate system, comparing and verifying with a high-precision map, and projecting the laser radar point cloud onto an image for verification.
10. A joint calibration device, characterized by comprising a GNSS RTK device, a lidar, a camera, a memory and a processor; the GNSS RTK device is used for measuring the UTM coordinates of a determined location; the lidar is used for collecting three-dimensional point cloud data of the target device; the camera is used for collecting image data of the target device; the memory is used for storing device data and user-uploaded data; and the processor is used for executing the method steps of the roadside lidar-camera-UTM coordinate system joint calibration method of any one of claims 1-8.
CN202310972417.3A 2023-08-03 2023-08-03 Combined calibration method and system for road side laser radar-camera-UTM coordinate system Pending CN116989825A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310972417.3A CN116989825A (en) 2023-08-03 2023-08-03 Combined calibration method and system for road side laser radar-camera-UTM coordinate system


Publications (1)

Publication Number Publication Date
CN116989825A true CN116989825A (en) 2023-11-03

Family

ID=88527909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310972417.3A Pending CN116989825A (en) 2023-08-03 2023-08-03 Combined calibration method and system for road side laser radar-camera-UTM coordinate system

Country Status (1)

Country Link
CN (1) CN116989825A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118209081A (en) * 2024-05-16 2024-06-18 杭州计算机外部设备研究所(中国电子科技集团公司第五十二研究所) Method for measuring distance and positioning target by multi-photoelectric linkage with turntable


Similar Documents

Publication Publication Date Title
CN110570466B (en) Method and device for generating three-dimensional live-action point cloud model
CN111473739B (en) Video monitoring-based surrounding rock deformation real-time monitoring method for tunnel collapse area
CN112836737A (en) Roadside combined sensing equipment online calibration method based on vehicle-road data fusion
CN110031829B (en) Target accurate distance measurement method based on monocular vision
CN107560592B (en) Precise distance measurement method for photoelectric tracker linkage target
CN110782498B (en) Rapid universal calibration method for visual sensing network
CN113205604A (en) Feasible region detection method based on camera and laser radar
CN113223075A (en) Ship height measuring system and method based on binocular camera
CN116597013B (en) Satellite image geometric calibration method based on different longitude and latitude areas
CN112270320B (en) Power transmission line tower coordinate calibration method based on satellite image correction
CN111996883B (en) Method for detecting width of road surface
CN113569647B (en) AIS-based ship high-precision coordinate mapping method
CN116989825A (en) Combined calibration method and system for road side laser radar-camera-UTM coordinate system
CN106152978B (en) A kind of untouchable area measurement method based on image analysis
CN106504287A (en) Monocular vision object space alignment system based on template
CN102012213B (en) New method for measuring foreground height through single image
CN111311659B (en) Calibration method based on three-dimensional imaging of oblique plane mirror
CN110033492A (en) Camera marking method and terminal
CN104599281A (en) Panoramic image and remote sensing image registration method based on horizontal line orientation consistency
CN115511878A (en) Side slope earth surface displacement monitoring method, device, medium and equipment
CN112365545A (en) Calibration method of laser radar and visible light camera based on large-plane composite target
CN106709432B (en) Human head detection counting method based on binocular stereo vision
CN103453882B (en) A kind of ceiling of clouds measuring system based on aircraft and ceiling of clouds measuring method
CN106845360A (en) High-resolution crop surface model construction method based on unmanned aerial vehicle remote sensing
CN112446915A (en) Picture-establishing method and device based on image group

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination