CN114693807A - Method and system for reconstructing mapping data of power transmission line image and point cloud - Google Patents

Method and system for reconstructing mapping data of power transmission line image and point cloud

Info

Publication number
CN114693807A
Authority
CN
China
Prior art keywords
image
current image
coordinates
point cloud
mapping data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210405718.3A
Other languages
Chinese (zh)
Other versions
CN114693807B (en)
Inventor
毛锋
戴永东
王茂飞
姚建光
高超
吴奇伟
王神玉
仲坚
张泽
鞠玲
翁蓓蓓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center
State Grid Jiangsu Electric Power Co Ltd
Taizhou Power Supply Co of State Grid Jiangsu Electric Power Co Ltd
Original Assignee
State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center
State Grid Jiangsu Electric Power Co Ltd
Taizhou Power Supply Co of State Grid Jiangsu Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center, State Grid Jiangsu Electric Power Co Ltd, Taizhou Power Supply Co of State Grid Jiangsu Electric Power Co Ltd filed Critical State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center
Priority to CN202210405718.3A priority Critical patent/CN114693807B/en
Publication of CN114693807A publication Critical patent/CN114693807A/en
Application granted granted Critical
Publication of CN114693807B publication Critical patent/CN114693807B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/243Classification techniques relating to the number of classes
    • G06F18/2433Single-class perspective, e.g. one-against-all classification; Novelty detection; Outlier detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and a system for reconstructing mapping data of an image and point cloud of a power transmission line. The method comprises the following steps: acquiring a current image through a shooting device; acquiring pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data; comparing the current image with the reference image, and judging whether the current image is abnormal; when the current image is abnormal, calibrating the shooting device to obtain parameter information; and reconstructing the mapping data according to the parameter information. The method can improve the precision of distance measurement tasks performed using the current image and correct distance measurement errors.

Description

Method and system for reconstructing mapping data of power transmission line image and point cloud
Technical Field
The invention relates to the technical field of data processing, in particular to a method and a system for reconstructing mapping data of an image and point cloud of a power transmission line.
Background
Laser radar scanning is a relatively new three-dimensional data acquisition technology: massive point cloud data can be rapidly acquired by laser radar scanners carried on different platforms such as tripods, automobiles, airplanes and satellites. The point cloud data contains rich information, such as the longitude and latitude coordinates, intensity, multiple echoes and color of each point, and is applied in fields such as surveying and mapping, forestry, agriculture and digital cities.
The visible-light two-dimensional images collected by unmanned aerial vehicles can restore a three-dimensional power scene only through a reconstruction process, and the result may deviate from the actual situation. The technical advantage of laser radar point cloud mapping is that the spatial information of a power scene can be accurately restored and distances can be measured; its disadvantages are that color information in the power scene cannot be restored, the visualization effect is poor, and it is difficult to perform accurate classification of power objects using laser radar point cloud data alone. Therefore, in practical applications, pixels in the two-dimensional image are often made to correspond to point cloud space coordinates in the laser point cloud data: for the image and the point cloud data of the same power transmission line site, a mapping relationship can be established between target pixel coordinates in the image and the point cloud data of the target point (the spatial coordinates of the target point). Based on this mapping relationship, a distance measurement task can be performed; for example, the actual real-world distance between the targets corresponding to two target pixel points in the image can be determined.
However, when the photographing device becomes abnormal due to aging, displacement, etc., it outputs an abnormal image; the mapping relationship established between the original point cloud data and the reference image cannot be applied to the abnormal image, and a large distance measurement error occurs when the abnormal image is used for a distance measurement task, so the distance measurement accuracy cannot be guaranteed. Patent document CN102982548A provides a multi-view stereoscopic video capture system and a camera parameter calibration method thereof: acquiring the internal and external parameters of each camera in the system; acquiring multi-view images of a common scene at the same time through each camera, and performing feature point detection and matching on the multi-view images to obtain matching points among the view images; reconstructing the three-dimensional space point cloud coordinates of the matching points among the viewpoint images using the camera parameters; performing sparse bundle adjustment optimization based on the three-dimensional space point cloud coordinates and the internal and external camera parameters to obtain a reprojection error, and optimizing the reprojection error and the internal and external camera parameters; judging whether to carry out secondary optimization according to the optimized reprojection error; and judging whether to recalibrate the parameters according to the secondary optimization result. However, that method cannot solve the problem that an abnormal image, caused by aging or displacement of the shooting device, yields inaccurate point cloud coordinates and thereby affects the distance measurement task.
Disclosure of Invention
The invention provides a method and a system for reconstructing mapping data of an image and point cloud of a power transmission line, which can adjust the existing mapping data, reestablish the mapping relation aiming at the current image, improve the precision of a distance measurement task performed by utilizing the current image and correct a distance measurement error.
A reconstruction method of mapping data of an image and point cloud of a power transmission line comprises the following steps:
acquiring a current image through a shooting device;
acquiring pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data;
comparing the current image with the reference image, and judging whether the current image is abnormal;
when the current image is abnormal, calibrating the shooting device to obtain parameter information;
and reconstructing the mapping data according to the parameter information.
Further, comparing the current image with the reference image to determine whether the current image is abnormal includes:
selecting a feature point in a current image, comparing the pixel coordinate of the feature point in the current image with the pixel coordinate of the corresponding feature point in a reference image, if the pixel coordinate of the feature point in the current image is equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is normal, and if the pixel coordinate of the feature point in the current image is not equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is determined to be an abnormal image.
Further, judging whether the current image is abnormal further comprises:
and calculating multiple groups of offsets of the pixel coordinates of the multiple characteristic points in the current image and the pixel coordinates of the corresponding multiple characteristic points in the reference image, if the multiple groups of offsets are equal, determining that the current image is offset, and if the multiple groups of offsets are not equal, determining that the current image is distorted.
Further, the parameter information includes internal parameters, external parameters and distortion parameters of the photographing device.
Further, reconstructing the mapping data according to the parameter information includes:
and reconstructing the mapping data of the current image with the offset according to the external reference, the internal reference, the pixel coordinates of a plurality of characteristic points in the current image and the space coordinates of point clouds in the point cloud data corresponding to the external reference, the internal reference and the current image.
Further, reconstructing the mapping data according to the parameter information includes:
and reconstructing the mapping data of the distorted current image according to the internal parameters, the external parameters and the distortion parameters.
Further, reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters includes:
calculating the space coordinates of the characteristic point cloud in the point cloud data corresponding to the normal pixel coordinates in the normal image according to the internal parameters and the external parameters;
calculating the coordinates of the normal pixels corresponding to the coordinates of target pixels in the distorted current image according to the distortion parameters;
and reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the space coordinates of the characteristic point cloud.
Further, the distortion parameters include a radial deformation coefficient and a tangential deformation coefficient, and the target pixel coordinate is calculated by the following formulas:
x' = (u - cx)/fx, y' = (v - cy)/fy;
x" = x'×(1 + k1×r^2 + k2×r^4) + 2×p1×x'×y' + p2×(r^2 + 2×x'^2);
y" = y'×(1 + k1×r^2 + k2×r^4) + 2×p2×x'×y' + p1×(r^2 + 2×y'^2);
ud = fx×x" + cx, vd = fy×y" + cy;
wherein r^2 = x'^2 + y'^2; k1 and k2 are the radial deformation coefficients; p1 and p2 are the tangential deformation coefficients; (ud, vd) are the pixel coordinates in the distorted current image; (u, v) are the normal pixel coordinates in the normal image; x' and y' are intermediate quantities in the transformation from the camera coordinate system to the image coordinate system; x" and y" are the coordinates of the distorted position; (cx, cy) is the offset of the optical axis from the coordinate center of the projection plane; and fx and fy are the focal lengths of the camera.
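As an illustration, the distortion mapping above can be sketched in Python; the calibration values used below are illustrative assumptions, not values from this patent:

```python
# Sketch of the radial/tangential distortion mapping: given normal pixel
# coordinates (u, v) and assumed calibration values, compute the corresponding
# pixel coordinates (ud, vd) in the distorted current image.

def distort_pixel(u, v, fx, fy, cx, cy, k1, k2, p1, p2):
    # Normalize to the camera coordinate system (intermediate quantities x', y').
    xp = (u - cx) / fx
    yp = (v - cy) / fy
    r2 = xp * xp + yp * yp
    # Radial term plus tangential terms (x'', y'').
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xpp = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp * xp)
    ypp = yp * radial + 2 * p2 * xp * yp + p1 * (r2 + 2 * yp * yp)
    # Map back to pixel coordinates.
    return fx * xpp + cx, fy * ypp + cy

# All numbers below are assumed for illustration.
ud, vd = distort_pixel(320.0, 240.0, 1000.0, 1000.0, 320.0, 240.0, 0, 0, 0, 0)
```

With all distortion coefficients set to zero the mapping reduces to the identity, which is a quick sanity check on an implementation.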
Further, after calculating the coordinates of the normal pixel corresponding to the coordinates of the target pixel in the distorted current image, the method further comprises:
and rounding the target pixel coordinates of the non-integer.
A mapping data reconstruction system of a power transmission line image and point cloud comprises a shooting device and a server, wherein the server comprises a processor and a storage device, the storage device stores a plurality of instructions, and the processor is used for reading the instructions and executing the method.
The method and the system for reconstructing the mapping data of the power transmission line image and the point cloud provided by the invention at least have the following beneficial effects:
the current image acquired by the shooting device can be effectively identified, so that the mapping data is reconstructed according to the abnormal condition (offset or distortion), the mapping relation between the current image shot by the shooting device and the point cloud data is reestablished, the precision of a distance measurement task performed by using the current image is improved, and the distance measurement error is corrected.
Drawings
Fig. 1 is a flowchart of an embodiment of a method for reconstructing mapping data of an image and a point cloud of a power transmission line according to the present invention.
Fig. 2 is a schematic diagram of a distorted image in an application scene of the reconstruction method of the mapping data of the power transmission line image and the point cloud provided by the invention.
Fig. 3 is a schematic structural diagram of an embodiment of the apparatus for reconstructing mapping data of an image of a power transmission line and a point cloud provided in the present invention.
Fig. 4 is a flowchart of an embodiment of the reconstruction system of the mapping data of the power transmission line image and the point cloud provided by the present invention.
Detailed Description
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
In order to facilitate understanding of the present application, some concepts related to the present application will be described.
Laser Imaging Detection and Ranging (LIDAR): outgoing light (such as a laser beam with a wavelength of about 900 nm) is emitted and reflected when it encounters an obstacle; the processing unit calculates the distance between the obstacle and the laser radar according to the time difference between the reflected light and the outgoing light. In addition, the processing unit can estimate the reflectivity of the target from the cross section of the reflected light signal obtained after receiving the reflected light. Airborne laser radar has a small volume and a high degree of integration, and its application scenarios are increasing.
Point cloud data (point cloud data) refers to a collection of vectors in a three-dimensional coordinate system. The scan data is recorded in the form of dots, each dot containing three-dimensional coordinates, some of which may contain color information (e.g., red, green, blue) or Intensity information (Intensity).
In the related art, map surveying can be performed using an airborne laser radar or similar means, yielding a map with the position information of many objects. However, the point cloud data contains no image information and is poor in intuitiveness; if image data could be used directly to measure the distance of the target object, the convenience of power transmission line inspection would be effectively improved.
In order to realize accurate measurement by directly using the power transmission line image, each pixel in the image needs to have spatial coordinate information. The two types of data (image data and point cloud data) aiming at the same target object are fused, so that the fused image of the power transmission line can be used for measuring the clearance distance and the like of the power transmission line, and the precision reaches the sub-meter level.
Referring to fig. 1, in some embodiments, there is provided a method for reconstructing mapping data of a power transmission line image and a point cloud, including:
s1, acquiring a current image through a shooting device;
s2, obtaining pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of the point cloud data;
s3, comparing the current image with the reference image, and judging whether the current image is abnormal;
s4, when the current image is abnormal, calibrating the shooting device to obtain parameter information;
and S5, reconstructing the mapping data according to the parameter information.
Specifically, in step S1, the camera may be mounted on a tower on the power transmission line, and may capture an image of the power transmission line.
Further, in step S2, the photographing device photographs an image in a normal state for establishing a mapping relationship with the point cloud data, where the image is a reference image, the point cloud data is obtained by the laser radar, and a mapping relationship between pixel coordinates of the reference image and point cloud space coordinates of the point cloud data is established and mapping data is formed for storage.
In this step, the mapping relationship between the reference image and the point cloud data includes the mapping relationship between the coordinates of the target pixel in the reference image and the spatial coordinates of the target point in the point cloud data. The target point may be a feature point (e.g., a corner point, an end point, a vertex, or a center point) of a device (e.g., a cross arm, an insulator, etc.) on the power transmission line site. For example, the mapping data may include the mapping relationship between the pixel coordinates of the center point of an insulator in the reference image and the spatial coordinates (e.g., world coordinates including longitude, latitude, and height) of that center point in the point cloud data; for instance, the pixel coordinates (u1, v1) of the insulator center point in the reference image correspond to its spatial coordinates (X1, Y1, Z1) in the point cloud data. The mapping relationship may be a functional relationship: after the target pixel coordinates in the reference image are obtained, the spatial coordinates of the target point in the point cloud data can be obtained according to the mapping relationship; conversely, after the spatial coordinates of the target point in the point cloud data are obtained, the coordinates of the target pixel in the reference image can be obtained according to the same mapping relationship.
It can be understood that after the mapping data is obtained, the mapping relation between the target pixel coordinates in the reference image and the spatial coordinates of the target point in the point cloud data can be obtained. That is to say, a target pixel point is selected from the reference image, the spatial coordinate corresponding to the target pixel point can be obtained by using the mapping data, and the actual distance between two target pixel points in the reference image in the real world can be further determined. Therefore, the distance measurement task can be carried out according to the reference image, and the method can be suitable for monitoring the power transmission line site. When the shooting device is used as a monitoring camera installed on a power transmission line site, under the condition that the shooting device is not abnormal due to aging, displacement and the like, the image shot by the shooting device is relatively stable and unchanged. That is, in the case where the photographing device is not abnormal due to aging, displacement, or the like, the current image photographed by the photographing device is equivalent to the reference image. Therefore, the distance measurement task can be performed by using the current image shot by the shooting device according to the established mapping relation between the reference image shot by the shooting device and the point cloud data. However, after the photographing device is abnormal due to aging, displacement, and the like, the current image photographed by the photographing device will be abnormal, the current image photographed by the photographing device is not identical to the reference image, and if the abnormal current image is used for performing the distance measurement task, a large distance measurement error occurs, and the distance measurement accuracy cannot be guaranteed.
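For illustration only, the ranging use of the mapping data described above can be sketched as a lookup table from pixel coordinates to spatial coordinates; the pixel values, the spatial values, and the use of a local metric (meter-based) coordinate frame below are all assumptions, not the patent's storage format:

```python
# Minimal sketch: mapping data as a dict from reference-image pixel
# coordinates to point cloud spatial coordinates (local frame, meters).
import math

mapping_data = {
    (1250, 840): (10.0, 20.0, 18.0),   # e.g. insulator center point
    (1310, 860): (10.0, 24.0, 15.0),   # e.g. cross-arm end point
}

def real_world_distance(px_a, px_b, mapping):
    # Look up the spatial coordinates of both target pixels and return the
    # Euclidean distance between the two target points.
    return math.dist(mapping[px_a], mapping[px_b])

d = real_world_distance((1250, 840), (1310, 860), mapping_data)  # 5.0 m here
```

In practice the lookup would be backed by the stored mapping relation (functional or tabulated) rather than a literal dict, but the ranging step is the same: pixel coordinates in, spatial coordinates out, then a distance between target points.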
Therefore, it is necessary to determine whether an abnormality occurs in the photographing device by using the current image acquired by the photographing device.
Further, in step S3, comparing the current image with the reference image to determine whether the current image is abnormal includes:
selecting a feature point in a current image, comparing the pixel coordinate of the feature point in the current image with the pixel coordinate of the corresponding feature point in a reference image, if the pixel coordinate of the feature point in the current image is equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is normal, and if the pixel coordinate of the feature point in the current image is not equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is determined to be an abnormal image.
Further, judging whether the current image is abnormal further comprises:
and calculating multiple groups of offsets of the pixel coordinates of the multiple characteristic points in the current image and the pixel coordinates of the corresponding multiple characteristic points in the reference image, if the multiple groups of offsets are equal, determining that the current image is offset, and if the multiple groups of offsets are not equal, determining that the current image is distorted.
The distortion may include lens distortion, such as barrel distortion and pincushion distortion, among others. Lens distortion is a generic term for the intrinsic perspective distortion of an optical lens, i.e., distortion due to perspective. Referring to fig. 2, barrel distortion and pincushion distortion are shown.
The wide-angle lens brings barrel distortion while obtaining a wide field of view and a special photographing effect. Barrel distortion does not affect imaging definition, but affects imaging position accuracy, which brings errors or even misjudgments to image analysis and image measurement. The barrel distortion brought to the visual system by the wide-angle lens is nonlinear, the deformation is small at the center of the image, and the deformation is larger as the distance from the center of the image is larger. Pincushion distortion is a phenomenon in which a picture "shrinks" toward the middle due to a lens. The pincushion distortion phenomenon is most easily perceived when using a telephoto lens or using the telephoto end of a zoom lens. In a scene monitored by a power transmission line, because the distance between the electric towers is long, a long-focus lens or a wide-angle lens is often required to be used, and the image distortion phenomenon caused by lens distortion is obvious.
Further, in step S4, when the current image is abnormal, the imaging device is calibrated to obtain parameter information. In this step, the shooting device may be calibrated by using a checkerboard calibration method to obtain parameter information of the shooting device.
Specifically, calibrating the shooting device by the checkerboard calibration method includes: preparing a checkerboard of known size; shooting the checkerboard at different angles with the shooting device to obtain a group of images; detecting feature points in the images, such as calibration board corner points, to obtain the pixel coordinate values of the corner points; calculating the physical coordinate values of the corner points from the known checkerboard size and the world coordinate system origin; and calibrating the camera from the pixel coordinates of each corner point and its physical coordinates in the world coordinate system to obtain the internal and external parameter matrices and the distortion parameters of the camera.
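The physical corner coordinates in the checkerboard procedure above follow directly from the known square size. A sketch under the assumption of a 9×6 inner-corner board with 25 mm squares (both values illustrative); the corner detection and the calibration itself would be done with a library such as OpenCV, indicated only in comments since they need captured images:

```python
# Generate the physical (world) coordinates of the checkerboard inner corners:
# one (X, Y, Z) point per corner, Z = 0 on the board plane, with the world
# origin at one corner of the board.

def board_object_points(cols, rows, square_size):
    return [(c * square_size, r * square_size, 0.0)
            for r in range(rows) for c in range(cols)]

obj_pts = board_object_points(9, 6, 25.0)  # 9x6 inner corners, 25 mm squares

# For each captured view one would then detect the pixel corners, e.g. with
# cv2.findChessboardCorners(gray, (9, 6)), and finally estimate the intrinsic
# matrix and distortion coefficients with cv2.calibrateCamera(...).
```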
The parameter information includes internal parameters, external parameters, and distortion parameters of the photographing device.
Further, in step S5, reconstructing the mapping data for the current image that has been shifted but not distorted, based on the parameter information, includes:
and reconstructing the mapping data of the current image with the offset according to the external reference, the internal reference, the pixel coordinates of a plurality of characteristic points in the current image and the corresponding space coordinates of point clouds in the point cloud data.
Specifically, the internal parameters include the physical size dx of each pixel along the horizontal axis x of the image, the physical size dy of each pixel along the vertical axis y of the image, a warping (skew) factor r of the image physical coordinates, the focal length f, and the numbers of pixels u0 and v0 by which the image center pixel coordinates differ from the image origin pixel coordinates in the horizontal and vertical directions, respectively; that is, (u0, v0) are the pixel coordinates of the intersection point of the camera optical axis and the image plane.
The external parameters include a rotation matrix R and a translation vector T, which are transformed from the spatial coordinate system to the camera coordinate system.
Further, in the case that the current image is not distorted, for the same target, the spatial position of the target in the point cloud data, the photographing center (optical center) of the shooting device, and the image position of the target in the current image are collinear; based on this collinearity, an expression as shown in formula (1) is constructed:
x = -f × [a1×(XA - XS) + b1×(YA - YS) + c1×(ZA - ZS)] / [a3×(XA - XS) + b3×(YA - YS) + c3×(ZA - ZS)];
y = -f × [a2×(XA - XS) + b2×(YA - YS) + c2×(ZA - ZS)] / [a3×(XA - XS) + b3×(YA - YS) + c3×(ZA - ZS)]; (1)
in the formula (1), (x, y) are image plane coordinates of the target point. (u)0,v0) F represents the focal length, which is the pixel coordinate of the intersection of the optical axis of the camera and the image plane. (X)s,YS,ZS) As coordinates of the center of the camera under the coordinate system of the point cloud, (X)A,YA,ZA) Representing the spatial coordinates of the target point in the point cloud data. a isi,bi,ci(i is 1,2,3) is a rotation matrix of the image.
Wherein, the expression of the rotation matrix R is shown in formula (2):
R = [a1, b1, c1; a2, b2, c2; a3, b3, c3]; (2)
since the image plane coordinates (x, y) of the target point and the pixel coordinates (u, v) of the target point can be transformed correspondingly. By calibrating the camera, the internal part is known to participate in external parameters, i.e. (u) is known0,v0)、f、(Xs,YS,ZS). Thus, the spatial coordinates (X) of the target point in the point cloud data are used as the basisA,YA,ZA) Then the target pixel coordinates (u, v) in the current image may be determined. In this way, a mapping relation between the coordinates of the target pixel in the current image shot by the shooting device and the spatial coordinates of the target point in the point cloud data is constructed.
In another embodiment, according to the internal and external parameters in the parameter information, a mapping relationship between the current image captured by the capturing device and the point cloud data can be constructed according to a spatial transformation relationship.
The conversion relationship between the pixel coordinates and the space coordinates may be expressed as shown in formula (3):
Zc × [u, v, 1]^T = [1/dx, 0, u0; 0, 1/dy, v0; 0, 0, 1] × [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] × [R, T; 0^T, 1] × [XW, YW, ZW, 1]^T; (3)
in formula (3), dxAnd dyRespectively, the physical size of each pixel on the horizontal axis x and the vertical axis y of the image, (u)0,v0) F represents the focal length of the shooting device, and the parameters are internal parameters. (u, v) are the pixel coordinates of the target point, (X)w,YW,ZW) Is the spatial coordinates of the target point in the point cloud data. R denotes a rotation matrix and T denotes a translation vector.
In some embodiments, the rotation matrix R may also be expressed as shown in equation (4):
R = Ry(φ)·Rx(ω)·Rz(κ) = [cos φ, 0, −sin φ; 0, 1, 0; sin φ, 0, cos φ] × [1, 0, 0; 0, cos ω, −sin ω; 0, sin ω, cos ω] × [cos κ, −sin κ, 0; sin κ, cos κ, 0; 0, 0, 1];   (4)
For example, φ, ω and κ represent the rotation angles of the camera coordinate axes around the y-axis, x-axis and z-axis of the point cloud coordinate system respectively, i.e. the attitude angles. The translation vector T is shown in formula (5):
T=[tx,ty,tz]; (5)
Wherein, tx、ty、tzRespectively representing the position coordinate values of the camera center under the point cloud coordinate system.
Since the internal and external parameters of the camera are known, the target pixel coordinates (u, v) in the current image can be determined from the spatial coordinates (Xw, Yw, Zw) of the target point in the point cloud data. In this way, a mapping relationship between the target pixel coordinates in the current image captured by the capturing device and the spatial coordinates of the target point in the point cloud data is constructed.
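The space-to-pixel conversion of formula (3), together with the rotation matrix of formula (4) and the translation vector of formula (5), can be sketched as follows. The rotation order Ry·Rx·Rz and the helper names are illustrative assumptions; K stands for the product of the two internal parameter matrices in formula (3):

```python
import numpy as np

def euler_to_R(phi, omega, kappa):
    """Rotation matrix from the attitude angles (rotations about the
    y-, x- and z-axes of the point cloud frame; order assumed)."""
    Ry = np.array([[np.cos(phi), 0, -np.sin(phi)],
                   [0, 1, 0],
                   [np.sin(phi), 0, np.cos(phi)]])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(omega), -np.sin(omega)],
                   [0, np.sin(omega), np.cos(omega)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                   [np.sin(kappa), np.cos(kappa), 0],
                   [0, 0, 1]])
    return Ry @ Rx @ Rz

def project_pinhole(P_world, K, R, T):
    """Map a space point (Xw, Yw, Zw) to pixel coordinates (u, v):
    Zc·[u, v, 1]^T = K·(R·P + T), as in formula (3)."""
    p_cam = R @ np.asarray(P_world, float) + np.asarray(T, float)
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

A point on the optical axis projects to the principal point, and shifting it sideways by one unit at depth 5 moves the pixel by fx/5.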
Further, in step S5, if the current image has been distorted, reconstructing the mapping data according to the parameter information includes:
and reconstructing the mapping data of the distorted current image according to the internal parameters, the external parameters and the distortion parameters.
Specifically, reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters includes:
calculating, according to the internal parameters and the external parameters, the spatial coordinates of the feature point cloud in the point cloud data corresponding to the normal pixel coordinates in the normal image; specifically, in this step, the spatial coordinates of the target point in the point cloud data corresponding to the normal pixel coordinates in the normal image are calculated according to the internal and external parameters in the parameter information, using formula (1) or formula (3);
calculating the coordinates of the normal pixels corresponding to the coordinates of target pixels in the distorted current image according to the distortion parameters;
reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the spatial coordinates of the feature point cloud, and specifically, reconstructing the mapping data according to the above expressions (1) - (5).
The mathematical model of distortion is explained below.
The quantities involved are: an internal parameter matrix A(dx, dy, r, u, v, f) of the camera, an external parameter matrix [R|T], distortion coefficients [k1, k2, k3, p1, p2], the physical dimensions dx and dy of one pixel, the focal length f, and the warping factor r of the image physical coordinates.
The radial distortion mathematical model is shown as formula (6):
x' = x×(1 + k1×r² + k2×r⁴ + k3×r⁶);  y' = y×(1 + k1×r² + k2×r⁴ + k3×r⁶);   (6)
wherein r² = x'² + y'²; the radial distortion is large at the image edge.
The tangential distortion mathematical model is shown as formula (7):
x' = x + [2×p1×x×y + p2×(r² + 2×x²)];  y' = y + [p1×(r² + 2×y²) + 2×p2×x×y];   (7)
The five quantities k1, k2, k3, p1 and p2 are the distortion parameters; u and v are pixel coordinates in the distorted image, and u' and v' are the corrected pixel coordinates.
It can be understood that xc, yc, zc are the coordinates of a pixel point in the camera coordinate system, and x', y' are intermediate quantities from the camera coordinate system to the image coordinate system. The normal position coordinates of a pixel in the pixel coordinate system of an image can be expressed as formulas (8) to (10):
x'=xc/zc; (8)
y'=yc/zc; (9)
u = fx×x' + cx;  v = fy×y' + cy;   (10)
x″ and y″ are the distortion position coordinates and can be expressed as formulas (11) to (13):
x″ = x'×(1 + k1×r² + k2×r⁴) + 2×p1×x'×y' + p2×(r² + 2×x'²);   (11)
y″ = y'×(1 + k1×r² + k2×r⁴) + 2×p2×x'×y' + p1×(r² + 2×y'²);   (12)
u_d = fx×x″ + cx;  v_d = fy×y″ + cy;   (13)
wherein r² = x'² + y'², r denotes the warping factor of the image physical coordinates, k1 and k2 are the radial deformation coefficients, p1 and p2 are the tangential deformation coefficients, u_d and v_d are the pixel coordinates in the distorted current image, u and v are the normal pixel coordinates in the normal image, x' and y' are the intermediate quantities from the camera coordinate system to the image coordinate system, x″ and y″ are the coordinates of the distorted position, cx and cy are the offsets of the optical axis from the center of the projection plane coordinates, and fx and fy are the focal lengths of the camera.
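The chain from a normal pixel coordinate to its distorted counterpart, i.e. formulas (8) to (13), can be sketched as follows. This is a sketch under one assumption: the intermediate quantities are obtained from the pixel coordinates as x' = (u − cx)/fx and y' = (v − cy)/fy, which is the usual reading of formulas (8) to (10); the function name is illustrative:

```python
def distort_pixel(u, v, fx, fy, cx, cy, k1, k2, p1, p2):
    """Map a normal (undistorted) pixel coordinate (u, v) to the pixel
    coordinate (u_d, v_d) in the distorted image, per formulas (8)-(13)."""
    # Intermediate coordinates x', y' (camera frame to image frame).
    xp = (u - cx) / fx
    yp = (v - cy) / fy
    r2 = xp * xp + yp * yp
    # Distorted position coordinates x'', y'' -- formulas (11) and (12).
    xpp = xp * (1 + k1 * r2 + k2 * r2 ** 2) + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp * xp)
    ypp = yp * (1 + k1 * r2 + k2 * r2 ** 2) + 2 * p2 * xp * yp + p1 * (r2 + 2 * yp * yp)
    # Back to pixel coordinates -- formula (13).
    return fx * xpp + cx, fy * ypp + cy
```

With all four distortion coefficients set to zero the mapping reduces to the identity, and the principal point (cx, cy) is a fixed point for any coefficients, since x' = y' = 0 there.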
In order to determine an image without distortion based on an image with known distortion, the mapping relationship can be derived by a distortion model.
The relationship between the normal image imgR and the distorted image imgD is as shown in equations (14) to (16):
x' = (u − cx)/fx,  y' = (v − cy)/fy;   (14)
x″ = x'×(1 + k1×r² + k2×r⁴) + 2×p1×x'×y' + p2×(r² + 2×x'²),  y″ = y'×(1 + k1×r² + k2×r⁴) + 2×p2×x'×y' + p1×(r² + 2×y'²);   (15)
u_d = fx×x″ + cx,  v_d = fy×y″ + cy;   (16)
wherein r² = x'² + y'², r denotes the warping factor of the image physical coordinates, k1 and k2 are the radial deformation coefficients, p1 and p2 are the tangential deformation coefficients, u_d and v_d are the pixel coordinates in the distorted current image, u and v are the normal pixel coordinates in the normal image, x' and y' are the intermediate quantities from the camera coordinate system to the image coordinate system, x″ and y″ are the coordinates of the distorted position, cx and cy are the offsets of the optical axis from the center of the projection plane coordinates, and fx and fy are the focal lengths of the camera along the two axes, which are generally equal.
Since it has been determined that the current image captured by the capturing device is distorted, the current image is a distorted abnormal image. In this way, after the normal pixel coordinates (u, v) in the normal image are calculated in the above step, the target pixel coordinates (u_d, v_d) in the current image corresponding to the normal pixel coordinates (u, v) can be calculated from the above formulas (14) to (16). Thereby, the mapping relationship between the target pixel coordinates in the current image captured by the capturing device and the spatial coordinates of the target point in the point cloud data is established.
It is understood that the normal pixel coordinates (u, v) of the normal image are integers, for example, the normal pixel coordinates (1,1), but the calculated target pixel coordinates in the current image corresponding to the normal pixel coordinates may be non-integers, for example, the calculated target pixel coordinates (1.1, 1.4).
In some embodiments, when the calculated target pixel coordinates are non-integer, an integer pixel coordinate adjacent to the non-integer coordinate in the current image is taken as the target pixel coordinate, for example by rounding. For example, when the calculated target pixel coordinates are (1.1, 1.2), the integer pixel coordinate (1, 1) in the current image may be used as the target pixel coordinate. In this way, a mapping relationship between the target pixel coordinates in the current image and the spatial coordinates of the target point in the point cloud data can be constructed.
In another embodiment, when the calculated target pixel coordinate is a non-integer pixel coordinate, the current image is scaled up, and one integer pixel coordinate corresponding to the non-integer pixel coordinate is selected as the target pixel coordinate in the scaled-up current image. For example, when the calculated target pixel coordinate is (1.1, 1.2), the current image may be enlarged by 10 times, and the integer pixel coordinate (11, 12) in the enlarged current image may be selected as the target pixel coordinate. In this way, a mapping relationship between the coordinates of the target pixel in the enlarged current image and the spatial coordinates of the target point in the point cloud data can be constructed.
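The two strategies for handling non-integer target pixel coordinates described above can be sketched in one helper. The function name and the single-parameter interface are illustrative assumptions; with scale = 1 the coordinate is rounded to the nearest integer pixel, and with scale > 1 the current image is conceptually enlarged and an integer pixel in the enlarged image is used:

```python
def snap_pixel(u_d, v_d, scale=1):
    """Return an integer target pixel coordinate for a possibly
    non-integer (u_d, v_d): either by rounding in the original image
    (scale=1) or by rounding in an image enlarged `scale` times."""
    if scale == 1:
        return round(u_d), round(v_d)
    return round(u_d * scale), round(v_d * scale)
```

The enlargement strategy trades memory for precision: in a 10×-enlarged image a residual of 0.1 pixel in the original becomes a whole pixel, so less information is lost to rounding.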
It can be seen from this embodiment that, with the method provided in the embodiment of the present application, the mapping relationship between the reference image and the point cloud data established in the original mapping data can be adjusted by using the newly established mapping relationship between the current image and the point cloud data, so as to obtain the adjusted mapping data. Based on the mapping relation between the current image and the point cloud data in the adjusted mapping data, the current image shot by the shooting device can be utilized to carry out the distance measurement task, and the distance measurement precision is high and the error is small.
The method provided by the embodiment at least comprises the following beneficial effects:
the current image acquired by the shooting device can be effectively identified, so that the mapping data is reconstructed according to the abnormal condition (offset or distortion), the mapping relation between the current image shot by the shooting device and the point cloud data is reestablished, the precision of a distance measurement task performed by using the current image is improved, and the distance measurement error is corrected.
Further, referring to fig. 3, in some embodiments, there is also provided an apparatus for reconstructing mapping data of a power transmission line image and a point cloud, including:
the acquisition module 201 is used for acquiring a current image through a shooting device;
a data obtaining module 202, configured to obtain pre-stored mapping data, where the mapping data includes a mapping relationship between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data;
a judging module 203, configured to compare the current image with the reference image, and judge whether the current image is abnormal;
a calibration module 204, configured to calibrate the shooting device to obtain parameter information when the current image is abnormal;
a reconstructing module 205, configured to reconstruct the mapping data according to the parameter information.
Specifically, the determining module 203 is further configured to:
selecting a feature point in a current image, comparing the pixel coordinate of the feature point in the current image with the pixel coordinate of the corresponding feature point in a reference image, if the pixel coordinate of the feature point in the current image is equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is normal, and if the pixel coordinate of the feature point in the current image is not equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is determined to be an abnormal image.
The determining module 203 is further configured to:
and calculating multiple groups of offsets of the pixel coordinates of the multiple characteristic points in the current image and the pixel coordinates of the corresponding multiple characteristic points in the reference image, if the multiple groups of offsets are equal, determining that the current image is offset, and if the multiple groups of offsets are not equal, determining that the current image is distorted.
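The decision rule above (equal offsets imply a shifted image, unequal offsets imply distortion) can be sketched as follows. The function name and the pixel tolerance `tol` are illustrative assumptions; the patent text compares the offsets for exact equality:

```python
def classify_anomaly(current_pts, reference_pts, tol=0.5):
    """Classify an abnormal image as 'offset' or 'distorted': compute
    the offset of each feature point between the current image and the
    reference image; if all offsets agree (within tol pixels) the whole
    image has shifted, otherwise it is distorted."""
    offsets = [(cu - ru, cv - rv)
               for (cu, cv), (ru, rv) in zip(current_pts, reference_pts)]
    du0, dv0 = offsets[0]
    equal = all(abs(du - du0) <= tol and abs(dv - dv0) <= tol
                for du, dv in offsets)
    return "offset" if equal else "distorted"
```

A uniform shift of every feature point by the same vector yields "offset", while a shift that varies across the image yields "distorted", matching the two reconstruction branches handled by the reconstruction module 205.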
The parameter information includes internal parameters, external parameters and distortion parameters of the photographing device.
Further, the reconstruction module 205 is further configured to:
and reconstructing the mapping data of the current image with the offset according to the external reference, the internal reference, the pixel coordinates of a plurality of characteristic points in the current image and the corresponding space coordinates of point clouds in the point cloud data.
Further, the reconstruction module 205 is further configured to:
and reconstructing the mapping data of the distorted current image according to the internal parameter, the external parameter and the distortion parameter.
Further, the reconstruction module 205 is further configured to:
according to the internal parameters and the external parameters, calculating the space coordinates of the feature point cloud in the point cloud data corresponding to the normal pixel coordinates in the normal image;
calculating the coordinates of the normal pixels corresponding to the coordinates of target pixels in the distorted current image according to the distortion parameters;
and reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the space coordinates of the characteristic point cloud.
For a specific reconstruction method, please refer to the above embodiments, which are not described herein again.
Referring to fig. 4, in some embodiments, there is also provided a mapping data reconstruction system of a power transmission line image and a point cloud, including a camera 301 and a server 302, where the server 302 includes a processor 3021 and a storage 3022, the storage 3022 stores a plurality of instructions, and the processor 3021 is configured to read the plurality of instructions and execute the above-mentioned method.
The Processor may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage devices may include various types of storage units such as system memory, Read Only Memory (ROM), and permanent storage.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention. It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A reconstruction method of mapping data of an image and a point cloud of a power transmission line is characterized by comprising the following steps:
acquiring a current image through a shooting device;
acquiring pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data;
comparing the current image with the reference image, and judging whether the current image is abnormal;
when the current image is abnormal, calibrating the shooting device to obtain parameter information;
and reconstructing the mapping data according to the parameter information.
2. The method of claim 1, wherein comparing the current image with the reference image to determine whether the current image is abnormal comprises:
selecting a feature point in a current image, comparing the pixel coordinate of the feature point in the current image with the pixel coordinate of the corresponding feature point in a reference image, if the pixel coordinate of the feature point in the current image is equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is normal, and if the pixel coordinate of the feature point in the current image is not equal to the pixel coordinate of the corresponding feature point in the reference image, the current image is determined to be an abnormal image.
3. The method of claim 2, wherein determining whether the current image is abnormal further comprises:
and calculating multiple groups of offsets of the pixel coordinates of the multiple characteristic points in the current image and the pixel coordinates of the corresponding multiple characteristic points in the reference image, if the multiple groups of offsets are equal, determining that the current image is offset, and if the multiple groups of offsets are not equal, determining that the current image is distorted.
4. The method according to claim 1 or 3, wherein the parameter information includes internal parameters, external parameters, and distortion parameters of the photographing device.
5. The method of claim 4, wherein reconstructing the mapping data according to the parameter information comprises:
and reconstructing the mapping data of the current image with the offset according to the external parameters, the internal parameters, the pixel coordinates of a plurality of feature points in the current image and the spatial coordinates of the corresponding point clouds in the point cloud data.
6. The method of claim 4, wherein reconstructing the mapping data according to the parameter information comprises:
and reconstructing the mapping data of the distorted current image according to the internal parameters, the external parameters and the distortion parameters.
7. The method of claim 6, wherein reconstructing the mapping data from the internal parameters, external parameters, and distortion parameters comprises:
calculating the space coordinates of the characteristic point cloud in the point cloud data corresponding to the normal pixel coordinates in the normal image according to the internal reference and the external reference;
calculating the coordinates of the normal pixels corresponding to the coordinates of target pixels in the distorted current image according to the distortion parameters;
and reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the space coordinates of the characteristic point cloud.
8. The method of claim 7, wherein the distortion parameters include a radial deformation coefficient and a tangential deformation coefficient, and wherein the target pixel coordinates are calculated by the following equation:
x' = (u − cx)/fx;  y' = (v − cy)/fy;
x″ = x'×(1 + k1×r² + k2×r⁴) + 2×p1×x'×y' + p2×(r² + 2×x'²);
y″ = y'×(1 + k1×r² + k2×r⁴) + 2×p2×x'×y' + p1×(r² + 2×y'²);
u_d = fx×x″ + cx;  v_d = fy×y″ + cy;
wherein r² = x'² + y'², r denotes the warping factor of the image physical coordinates, k1 and k2 are the radial deformation coefficients, p1 and p2 are the tangential deformation coefficients, u_d and v_d are the pixel coordinates in the distorted current image, u and v are the normal pixel coordinates in the normal image, x' and y' are the intermediate quantities from the camera coordinate system to the image coordinate system, x″ and y″ are the coordinates of the distorted position, cx and cy are the offsets of the optical axis from the center of the projection plane coordinates, and fx and fy are the focal lengths of the camera.
9. The method of claim 7, wherein after calculating the normal pixel coordinates corresponding to target pixel coordinates in the distorted current image, further comprising:
and rounding the target pixel coordinates of the non-integer.
10. The system for reconstructing the mapping data of the power transmission line image and the point cloud is characterized by comprising a shooting device and a server, wherein the server comprises a processor and a storage device, the storage device stores a plurality of instructions, and the processor is used for reading the instructions and executing the method according to any one of claims 1 to 9.
CN202210405718.3A 2022-04-18 2022-04-18 Method and system for reconstructing mapping data of power transmission line image and point cloud Active CN114693807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210405718.3A CN114693807B (en) 2022-04-18 2022-04-18 Method and system for reconstructing mapping data of power transmission line image and point cloud


Publications (2)

Publication Number Publication Date
CN114693807A true CN114693807A (en) 2022-07-01
CN114693807B CN114693807B (en) 2024-02-06

Family

ID=82142473


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437303A (en) * 2023-12-18 2024-01-23 江苏尚诚能源科技有限公司 Method and system for calibrating camera external parameters

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100061601A1 (en) * 2008-04-25 2010-03-11 Michael Abramoff Optimal registration of multiple deformed images using a physical model of the imaging distortion
CN102436660A (en) * 2011-11-08 2012-05-02 北京新岸线网络技术有限公司 Automatic correction method and device of 3D camera image
CN105222788A (en) * 2015-09-30 2016-01-06 清华大学 The automatic correcting method of the aircraft course deviation shift error of feature based coupling
CN109410207A (en) * 2018-11-12 2019-03-01 贵州电网有限责任公司 A kind of unmanned plane line walking image transmission line faultlocating method based on NCC feature
CN113887641A (en) * 2021-10-11 2022-01-04 山东信通电子股份有限公司 Hidden danger target determination method, device and medium based on power transmission channel
CN114050650A (en) * 2021-11-12 2022-02-15 国网冀北电力有限公司电力科学研究院 Intelligent tower based on power transmission line regional autonomous system architecture
CN114255396A (en) * 2021-11-01 2022-03-29 南方电网数字电网研究院有限公司 Power transmission line environment reconstruction method, system and device and controller



Also Published As

Publication number Publication date
CN114693807B (en) 2024-02-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant