CN114913242A - Camera calibration method, device, system, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114913242A
CN114913242A (application CN202210535889.8A)
Authority
CN
China
Prior art keywords: camera, calibration, vector, coordinate system, mobile robot
Prior art date
Legal status: Withdrawn
Application number
CN202210535889.8A
Other languages
Chinese (zh)
Inventor
朱欣
李笑千
张展鹏
成慧
蔡源浔
Current Assignee
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202210535889.8A
Publication of CN114913242A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The embodiments of the present application provide a camera calibration method, apparatus, system, electronic device, and storage medium. The method comprises the following steps: while a mobile robot moves along a first trajectory, acquiring calibration board image information collected by a camera fixed on the mobile robot and odometry information collected by an odometer fixed on the mobile robot; determining first pose transformation data of the camera during the motion according to the intrinsic parameters of the camera and the calibration board image information; determining second pose transformation data of the mobile robot during the motion according to the odometry information; and performing extrinsic calibration of the camera based on the first pose transformation data and the second pose transformation data to obtain an extrinsic calibration result, where the extrinsic calibration result represents the transformation relationship between the camera coordinate system of the camera and the body coordinate system of the mobile robot. The embodiments of the present application can reduce calibration complexity and improve calibration efficiency and the applicability to different calibration environments.

Description

Camera calibration method, device, system, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of camera calibration technologies, and in particular, to a method, an apparatus, a system, an electronic device, and a storage medium for camera calibration.
Background
Simultaneous Localization and Mapping (SLAM) is a core technology that enables a robot to build a map of an unknown environment and localize itself within it. SLAM is mainly applied in robotics, Augmented Reality (AR), Virtual Reality (VR), autonomous driving, and other fields.
SLAM often needs to fuse information from different sensors. Before their data can be fused, the relative relationship between the different sensors must first be calibrated. This relative relationship is called the extrinsic parameter (external parameter), and the quality of the extrinsic calibration directly affects the final localization and mapping results.
Existing extrinsic calibration methods usually rely on an expensive motion capture system or on specific landmarks in a specific scene; their calibration steps are complex, their calibration efficiency is low, and they are not suitable for mass-production environments.
Disclosure of Invention
The present application provides a camera calibration method, apparatus, system, electronic device, and storage medium, which can reduce calibration complexity and improve calibration efficiency and the applicability to different calibration environments.
In a first aspect, an embodiment of the present application provides a camera calibration method, where the method includes:
while a mobile robot moves along a first trajectory, acquiring calibration board image information collected by a camera fixed on the mobile robot and odometry information collected by an odometer fixed on the mobile robot;
determining first pose transformation data of the camera during the motion according to the intrinsic parameters of the camera and the calibration board image information;
determining second pose transformation data of the mobile robot during the motion according to the odometry information;
and performing extrinsic calibration of the camera based on the first pose transformation data and the second pose transformation data to obtain an extrinsic calibration result, where the extrinsic calibration result represents the transformation relationship between the camera coordinate system of the camera and the body coordinate system of the mobile robot.
In the above embodiment, the first pose transformation data of the camera during the motion are determined from the calibration board image information, and the second pose transformation data of the mobile robot during the motion are determined from the odometry information; extrinsic calibration of the camera is then achieved by combining the two. The calibration process does not depend on a complex motion capture system and is not limited to a specific scene or salient landmarks; only a calibration board and an odometer are used. The procedure is simple to operate and low in cost, which helps to reduce calibration difficulty and calibration time, improve calibration efficiency, and make the choice of calibration environment more flexible.
In some possible embodiments, the determining first pose transformation data of the camera during the motion according to the intrinsic parameters of the camera and the calibration board image information includes:
obtaining at least one group of calibration board images according to the calibration board image information, where each group of calibration board images comprises an adjacent first-frame calibration board image and second-frame calibration board image;
obtaining a first pose of the camera relative to the calibration board according to the intrinsic parameters of the camera and the coordinates of each calibration point in the first-frame calibration board image in the world coordinate system and the pixel coordinate system;
obtaining a second pose of the camera relative to the calibration board according to the intrinsic parameters of the camera and the coordinates of each calibration point in the second-frame calibration board image in the world coordinate system and the pixel coordinate system;
and determining a pose transformation vector of the camera between each two adjacent frames based on the first pose and the second pose, the first pose transformation data comprising the pose transformation vector of the camera between each two adjacent frames.
In the above embodiment, the pose transformation vector of the camera between two adjacent frames is computed from the calibration board images of those frames. Because the absolute scale of the calibration board serves as a reference, no scale estimator needs to be introduced into the pose computation, which reduces the complexity and difficulty of the computation, avoids errors caused by missing scale information, yields more accurate camera pose transformation data, and thus improves calibration accuracy.
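The per-frame board-relative poses described above are typically obtained with a Perspective-n-Point solver (e.g. OpenCV's `cv2.solvePnP`, not shown here). The step that combines two such poses into an inter-frame camera motion can be sketched as follows. This is an illustrative sketch rather than the patent's implementation; the convention that each pose maps camera coordinates to board coordinates is an assumption.

```python
import numpy as np

def pose_to_T(R, t):
    """Pack rotation matrix R and translation t into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_motion(T_board_cam1, T_board_cam2):
    """Camera motion between two adjacent frames.

    Both inputs map camera coordinates to calibration-board coordinates
    (as returned by a PnP solve against the board). Because the board
    provides absolute scale, the relative motion needs no scale estimator:
        T_cam1_cam2 = T_board_cam1^{-1} @ T_board_cam2
    """
    return np.linalg.inv(T_board_cam1) @ T_board_cam2
```

Composing a known pose with a known inter-frame motion and then calling `camera_motion` recovers that motion exactly, which is a quick way to sanity-check the convention.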
In some possible embodiments, the determining second pose transformation data of the mobile robot during the motion process according to the odometry information includes:
obtaining at least one group of odometry data according to the odometry information, where each group of odometry data comprises adjacent first-frame odometry data and second-frame odometry data;
obtaining a third pose of the mobile robot relative to the world coordinate system according to the first-frame odometry data;
obtaining a fourth pose of the mobile robot relative to the world coordinate system according to the second-frame odometry data;
determining a pose transformation vector of the mobile robot between each two adjacent frames based on the third pose and the fourth pose, the second pose transformation data comprising the pose transformation vector of the mobile robot between each two adjacent frames.
In the above embodiment, the pose transformation vector of the mobile robot between two adjacent frames is computed from the odometry data of those frames. Odometry data are accurate and easy to acquire directly, which helps to reduce the complexity and difficulty of pose computation, yields more reliable pose transformation data for the mobile robot, and thus improves calibration accuracy.
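For a wheeled robot, the odometer typically reports planar world-frame poses (x, y, theta); the inter-frame pose transformation described above then reduces to a 2-D relative motion. The sketch below assumes such a planar odometer (an assumption for illustration, not taken from the patent):

```python
import math

def odom_delta(pose1, pose2):
    """Relative planar motion between two adjacent odometry frames.

    Each pose is (x, y, theta) in the world frame. The returned translation
    is expressed in the body frame of the first pose.
    """
    x1, y1, th1 = pose1
    x2, y2, th2 = pose2
    dx_w, dy_w = x2 - x1, y2 - y1
    # Rotate the world-frame displacement into the first body frame.
    c, s = math.cos(-th1), math.sin(-th1)
    dx = c * dx_w - s * dy_w
    dy = s * dx_w + c * dy_w
    # Wrap the heading change into [-pi, pi).
    dth = (th2 - th1 + math.pi) % (2.0 * math.pi) - math.pi
    return dx, dy, dth
```

Expressing the translation in the first body frame (rather than the world frame) is what makes these deltas directly comparable with the camera's inter-frame motions in the calibration equations that follow.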
In some possible embodiments, the pose transformation vector of the camera between each two adjacent frames includes a first translation vector and a first rotation vector, wherein the first translation vector represents a position vector of the camera pointing from the second frame to the first frame, and the first rotation vector represents a rotation vector of the camera rotating from the first frame to the second frame;
the pose transformation vector of the mobile robot between each two adjacent frames comprises a second translation vector and a second rotation vector, wherein the second translation vector represents a position vector of the mobile robot pointing to the first frame from the second frame, and the second rotation vector represents a rotation vector of the mobile robot rotating from the first frame to the second frame;
the performing extrinsic calibration of the camera based on the first pose transformation data and the second pose transformation data to obtain an extrinsic calibration result includes:
establishing a rotation transformation equation based on the first rotation vector, the second rotation vector, and a rotation vector to be calibrated, where the rotation vector to be calibrated represents the rotation transformation relationship between the camera coordinate system and the body coordinate system;
establishing a translation transformation equation based on the first translation vector, the second translation vector, a third rotation vector, the rotation vector to be calibrated, and a translation vector to be calibrated, where the third rotation vector is the first rotation vector or the second rotation vector, and the translation vector to be calibrated represents the displacement transformation relationship between the camera coordinate system and the body coordinate system;
and solving the rotation transformation equation and the translation transformation equation to obtain the extrinsic calibration result, where the extrinsic calibration result comprises the calibration results of the rotation vector to be calibrated and the translation vector to be calibrated.
In the above embodiment, a rotation transformation equation and a translation transformation equation are established from the inter-frame translation and rotation vectors of the camera, the inter-frame translation and rotation vectors of the mobile robot, and the rotation and translation vectors to be calibrated. By solving these equations, the extrinsic calibration result of the camera can be obtained quickly, which improves calibration efficiency.
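Equations of this form have the structure of the classical hand-eye calibration problem AX = XB. As an illustrative sketch (with assumed conventions, not the patent's solver), the rotation part can be solved linearly by writing each pair's constraint q_c ⊗ q_x = q_x ⊗ q_b in quaternion form and taking the SVD null vector of the stacked system:

```python
import numpy as np

def quat_mul_mats(q):
    """Left/right quaternion multiplication matrices for q = [w, x, y, z],
    so that L(p) @ q == p * q and R(p) @ q == q * p (Hamilton product)."""
    w, x, y, z = q
    L = np.array([[w, -x, -y, -z],
                  [x,  w, -z,  y],
                  [y,  z,  w, -x],
                  [z, -y,  x,  w]])
    R = np.array([[w, -x, -y, -z],
                  [x,  w,  z, -y],
                  [y, -z,  w,  x],
                  [z,  y, -x,  w]])
    return L, R

def solve_rotation(cam_quats, body_quats):
    """Solve q_c * q_x = q_x * q_b for the extrinsic rotation q_x.

    Each frame pair contributes (L(q_c) - R(q_b)) @ q_x = 0; stacking all
    pairs and minimizing ||A q_x|| subject to ||q_x|| = 1 is exactly the
    SVD null-vector solve below (rotations about at least two distinct
    axes are needed for a unique solution)."""
    rows = [quat_mul_mats(qc)[0] - quat_mul_mats(qb)[1]
            for qc, qb in zip(cam_quats, body_quats)]
    _, _, Vt = np.linalg.svd(np.vstack(rows))
    q_x = Vt[-1]
    return q_x / np.linalg.norm(q_x)
```

With noisy data the same SVD solve returns the least-squares minimizer of the stacked residual, so the closed-form step and the residual-minimization view coincide here.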
In some possible embodiments, the establishing a rotation transformation equation based on the first rotation vector, the second rotation vector, and the rotation vector to be calibrated includes:
multiplying the first rotation vector by the rotation vector to be calibrated to obtain a first rotation mode for rotating the camera coordinate system of the first frame to the body coordinate system of the second frame;
multiplying the rotation vector to be calibrated by the second rotation vector to obtain a second rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame;
and establishing a rotation transformation equation based on the equivalence relation between the first rotation mode and the second rotation mode.
In the above embodiment, the rotational transformation equation is established by using the equivalence relation between two rotation modes of the camera coordinate system of the first frame rotating to the body coordinate system of the second frame, and accordingly, the rotational transformation relation between the camera coordinate system and the body coordinate system can be obtained through calculation.
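The equivalence of the two rotation modes can be checked numerically. The sketch below uses an assumed extrinsic rotation X and robot rotation R_b (illustrative values, not from the patent); under the convention used here, the camera's inter-frame rotation induced by R_b is R_c = X R_b X^T, and the two composition orders coincide:

```python
import numpy as np

def rotx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rotz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Assumed extrinsic rotation (camera -> body) and robot rotation between frames.
X = rotx(0.4) @ rotz(1.1)
R_b = rotz(0.7)
R_c = X @ R_b @ X.T  # induced camera rotation under this convention

# Both rotation modes reach the same frame: rotate in the camera frame and
# then map to the body frame, or map first and then rotate in the body frame.
assert np.allclose(R_c @ X, X @ R_b)
```

This R_c X = X R_b identity is exactly the constraint the rotation transformation equation encodes for each pair of adjacent frames.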
In some possible embodiments, the third rotation vector is the second rotation vector, and the establishing a translation transformation equation based on the first translation vector, the second translation vector, the third rotation vector, the rotation vector to be calibrated, and the translation vector to be calibrated includes:
multiplying the first translation vector by the rotation vector to be calibrated and adding the translation vector to be calibrated to the product, to obtain a first translation mode of translating the body coordinate system of the second frame to the camera coordinate system of the first frame;
multiplying the translation vector to be calibrated by the second rotation vector and adding the second translation vector to the product, to obtain a second translation mode of translating the body coordinate system of the second frame to the camera coordinate system of the first frame;
and establishing a translation transformation equation based on the equivalence relation between the first translation mode and the second translation mode.
In the above embodiment, the translation transformation equation is established by using the equivalence relation between the two translation modes of translating the body coordinate system of the second frame to the camera coordinate system of the first frame, and accordingly, the translation transformation relation between the camera coordinate system and the body coordinate system can be obtained through calculation.
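Written out, the two translation modes give, per frame pair, R_x t_c + t_x = R_b t_x + t_b, which rearranges to the linear system (R_b - I) t_x = R_x t_c - t_b. A hedged sketch of solving it by least squares (assumed conventions, not the patent's solver):

```python
import numpy as np

def solve_translation(R_x, cam_translations, body_motions):
    """Solve the translation part of the hand-eye equations.

    For each adjacent frame pair the two translation modes coincide:
        R_x @ t_c + t_x = R_b @ t_x + t_b
    Rearranged into (R_b - I) @ t_x = R_x @ t_c - t_b and stacked over all
    pairs, t_x follows from a linear least-squares solve. Note that purely
    planar motion leaves the component of t_x along the rotation axis
    unobservable, so rotations about distinct axes are needed here.
    """
    A_rows, b_rows = [], []
    for t_c, (R_b, t_b) in zip(cam_translations, body_motions):
        A_rows.append(R_b - np.eye(3))
        b_rows.append(R_x @ t_c - t_b)
    A = np.vstack(A_rows)
    b = np.concatenate(b_rows)
    t_x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t_x
```

Solving the rotation first and substituting R_x here is the usual two-stage decoupling: the translation equation is linear once the rotation is known.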
In some possible embodiments, the rotation transformation equation further includes a rotation noise term, and the translation transformation equation further includes the rotation noise term and a translation noise term; the solving the rotation transformation equation and the translation transformation equation to obtain the extrinsic calibration result includes:
obtaining a first residual term caused by the rotation noise according to the rotation transformation equation, and obtaining the calibration result of the rotation vector to be calibrated by minimizing the first residual term;
and obtaining a second residual term caused by the rotation noise and the translation noise according to the translation transformation equation, and obtaining the calibration result of the translation vector to be calibrated by minimizing the second residual term.
In the above embodiment, the influence of noise in the actual calibration process is taken into account: the residual terms caused by noise are extracted from the rotation and translation transformation equations, and a more accurate extrinsic calibration result is obtained by minimizing these residual terms.
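With noise-free data both equations hold exactly, so the residuals below are zero; with real data their magnitudes reflect the rotation and translation noise, and minimizing their sum over all frame pairs yields the calibration result. A minimal sketch of the two per-pair residuals (assumed matrix conventions, not the patent's exact cost function):

```python
import numpy as np

def rotation_residual(R_c, R_b, R_x):
    """Per-pair residual of the rotation equation R_c @ R_x = R_x @ R_b."""
    return np.linalg.norm(R_c @ R_x - R_x @ R_b)

def translation_residual(t_c, R_b, t_b, R_x, t_x):
    """Per-pair residual of the translation equation
    R_x @ t_c + t_x = R_b @ t_x + t_b."""
    return np.linalg.norm(R_x @ t_c + t_x - (R_b @ t_x + t_b))
```

Summed or averaged over all pairs, these residuals also provide a natural scalar measure of calibration quality.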
In some possible embodiments, after the extrinsic calibration result is obtained, the method further includes:
judging the obtained extrinsic calibration result;
and, when it is determined that the extrinsic calibration result does not satisfy the calibration success condition, performing extrinsic calibration of the camera again, until the obtained extrinsic calibration result satisfies the calibration success condition or the number of extrinsic calibration attempts reaches a first preset number.
In the above embodiment, a result-judging mechanism and a retry mechanism are added for extrinsic calibration: after the extrinsic calibration result is obtained, it is judged automatically, and when the result indicates that the extrinsic calibration has failed, the calibration is performed again automatically, without manual intervention, which helps to improve the success rate of the extrinsic calibration.
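The judge-and-retry loop described above can be sketched as follows. The attempt budget, the residual-based success condition, and both callables are hypothetical stand-ins for the patent's "first preset number" and "calibration success condition":

```python
MAX_ATTEMPTS = 3           # hypothetical "first preset number" of attempts
RESIDUAL_THRESHOLD = 1e-2  # hypothetical calibration-success condition

def calibrate_with_retry(run_calibration, residual_of):
    """Run extrinsic calibration, judge the result, and retry automatically.

    `run_calibration` performs one full calibration and returns its result;
    `residual_of` maps a result to a scalar quality measure. Returns the
    last result plus a success flag; no manual intervention is required.
    """
    result = None
    for _ in range(MAX_ATTEMPTS):
        result = run_calibration()
        if residual_of(result) < RESIDUAL_THRESHOLD:
            return result, True
    return result, False
```

Bounding the number of attempts keeps a persistently failing setup (e.g. a board outside the camera's view) from looping forever, while transient failures are retried without operator involvement.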
In some possible embodiments, before the acquiring, while the mobile robot moves along the first trajectory, calibration board image information collected by the camera fixed on the mobile robot and odometry information collected by the odometer fixed on the mobile robot, the method further includes:
acquiring calibration board image data collected by the camera while the mobile robot moves along a second trajectory;
performing intrinsic calibration of the camera based on the calibration board image data to obtain an intrinsic calibration result;
and determining the intrinsic parameters of the camera according to the obtained intrinsic calibration result.
In the above embodiment, an intrinsic calibration step is added before the extrinsic calibration, and the camera intrinsic parameters required by the subsequent extrinsic calibration are determined from the intrinsic calibration result, which helps to improve the reliability of the camera intrinsic parameters.
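In practice the intrinsic calibration step would use a routine such as OpenCV's `calibrateCamera` over the board images; what it produces, in the pinhole model, is the intrinsic matrix K that the later per-frame pose (PnP) computation consumes. A sketch of that model, with hypothetical parameter values:

```python
import numpy as np

# Hypothetical intrinsic calibration result: focal lengths fx, fy and
# principal point cx, cy, assembled into the pinhole intrinsic matrix K.
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(K, pts_cam):
    """Project 3-D points given in the camera frame to pixel coordinates.

    This is the model in which the intrinsic parameters are used by the
    per-frame pose computation against the calibration board."""
    uvw = (K @ pts_cam.T).T
    return uvw[:, :2] / uvw[:, 2:3]
```

For example, a point on the optical axis projects to the principal point, which is a convenient self-check on any calibrated K.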
In some possible embodiments, the determining the intrinsic parameters of the camera according to the obtained intrinsic calibration result includes:
judging the obtained intrinsic calibration result;
determining the intrinsic calibration result as the intrinsic parameters of the camera when it is determined that the intrinsic calibration result satisfies the calibration success condition;
and, when it is determined that the intrinsic calibration result does not satisfy the calibration success condition, performing intrinsic calibration of the camera again, until the obtained intrinsic calibration result satisfies the calibration success condition or the number of intrinsic calibration attempts reaches a second preset number.
In the above embodiment, a result-judging mechanism and a retry mechanism are added for intrinsic calibration: after the intrinsic calibration result is obtained, it is judged automatically, and when the result indicates that the intrinsic calibration has failed, the calibration is performed again automatically, without manual intervention, which helps to improve the success rate of the intrinsic calibration.
In a second aspect, an embodiment of the present application provides a camera calibration apparatus, where the apparatus includes:
an acquiring unit, configured to acquire, while a mobile robot moves along a first trajectory, calibration board image information collected by a camera fixed on the mobile robot and odometry information collected by an odometer fixed on the mobile robot;
a first determining unit, configured to determine first pose transformation data of the camera during the motion according to the intrinsic parameters of the camera and the calibration board image information;
a second determining unit, configured to determine second pose transformation data of the mobile robot during the motion according to the odometry information;
and an extrinsic calibration unit, configured to perform extrinsic calibration of the camera based on the first pose transformation data and the second pose transformation data to obtain an extrinsic calibration result, where the extrinsic calibration result represents the transformation relationship between the camera coordinate system of the camera and the body coordinate system of the mobile robot.
In some possible embodiments, the first determining unit is specifically configured to:
obtain at least one group of calibration board images according to the calibration board image information, where each group of calibration board images comprises an adjacent first-frame calibration board image and second-frame calibration board image;
obtain a first pose of the camera relative to the calibration board according to the intrinsic parameters of the camera and the coordinates of each calibration point in the first-frame calibration board image in the world coordinate system and the pixel coordinate system;
obtain a second pose of the camera relative to the calibration board according to the intrinsic parameters of the camera and the coordinates of each calibration point in the second-frame calibration board image in the world coordinate system and the pixel coordinate system;
and determine a pose transformation vector of the camera between each two adjacent frames based on the first pose and the second pose, the first pose transformation data comprising the pose transformation vector of the camera between each two adjacent frames.
In some possible embodiments, the second determining unit is specifically configured to:
obtain at least one group of odometry data according to the odometry information, where each group of odometry data comprises adjacent first-frame odometry data and second-frame odometry data;
obtain a third pose of the mobile robot relative to the world coordinate system according to the first-frame odometry data;
obtain a fourth pose of the mobile robot relative to the world coordinate system according to the second-frame odometry data;
and determine a pose transformation vector of the mobile robot between each two adjacent frames based on the third pose and the fourth pose, the second pose transformation data comprising the pose transformation vector of the mobile robot between each two adjacent frames.
In some possible embodiments, the pose transformation vector of the camera between each two adjacent frames includes a first translation vector and a first rotation vector, wherein the first translation vector represents a position vector of the camera pointing from the second frame to the first frame, and the first rotation vector represents a rotation vector of the camera rotating from the first frame to the second frame;
the pose transformation vector of the mobile robot between each two adjacent frames comprises a second translation vector and a second rotation vector, wherein the second translation vector represents a position vector of the mobile robot pointing to the first frame from the second frame, and the second rotation vector represents a rotation vector of the mobile robot rotating from the first frame to the second frame;
the extrinsic calibration unit is specifically configured to:
establish a rotation transformation equation based on the first rotation vector, the second rotation vector, and a rotation vector to be calibrated, where the rotation vector to be calibrated represents the rotation transformation relationship between the camera coordinate system and the body coordinate system;
establish a translation transformation equation based on the first translation vector, the second translation vector, a third rotation vector, the rotation vector to be calibrated, and a translation vector to be calibrated, where the third rotation vector is the first rotation vector or the second rotation vector, and the translation vector to be calibrated represents the displacement transformation relationship between the camera coordinate system and the body coordinate system;
and solve the rotation transformation equation and the translation transformation equation to obtain the extrinsic calibration result, where the extrinsic calibration result comprises the calibration results of the rotation vector to be calibrated and the translation vector to be calibrated.
In some possible embodiments, the external reference calibration unit, when establishing the rotation transformation equation based on the first rotation vector, the second rotation vector, and the rotation vector to be calibrated, is specifically configured to:
multiplying the first rotation vector by the rotation vector to be calibrated to obtain a first rotation mode for rotating the camera coordinate system of the first frame to the body coordinate system of the second frame;
multiplying the rotation vector to be calibrated by the second rotation vector to obtain a second rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame;
and establishing a rotation transformation equation based on the equivalence relation between the first rotation mode and the second rotation mode.
In some possible embodiments, the third rotation vector is the second rotation vector; when establishing the translation transformation equation based on the first translation vector, the second translation vector, the third rotation vector, the rotation vector to be calibrated, and the translation vector to be calibrated, the extrinsic calibration unit is specifically configured to:
multiply the first translation vector by the rotation vector to be calibrated and add the translation vector to be calibrated to the product, to obtain a first translation mode of translating the body coordinate system of the second frame to the camera coordinate system of the first frame;
multiply the translation vector to be calibrated by the second rotation vector and add the second translation vector to the product, to obtain a second translation mode of translating the body coordinate system of the second frame to the camera coordinate system of the first frame;
and establish a translation transformation equation based on the equivalence relation between the first translation mode and the second translation mode.
In some possible embodiments, the rotation transformation equation further includes a rotation noise term, and the translation transformation equation further includes the rotation noise term and a translation noise term; when solving the rotation transformation equation and the translation transformation equation to obtain the extrinsic calibration result, the extrinsic calibration unit is specifically configured to:
obtain a first residual term caused by the rotation noise according to the rotation transformation equation, and obtain the calibration result of the rotation vector to be calibrated by minimizing the first residual term;
and obtain a second residual term caused by the rotation noise and the translation noise according to the translation transformation equation, and obtain the calibration result of the translation vector to be calibrated by minimizing the second residual term.
In some possible embodiments, the apparatus further includes a first judging unit and a first retry unit, where the first judging unit is configured to judge the obtained extrinsic calibration result, and the first retry unit is configured to perform extrinsic calibration of the camera again when it is determined that the extrinsic calibration result does not satisfy the calibration success condition, until the obtained extrinsic calibration result satisfies the calibration success condition or the number of extrinsic calibration attempts reaches a first preset number.
In some possible embodiments, the acquiring unit is further configured to acquire calibration board image data collected by the camera while the mobile robot moves along a second trajectory; the apparatus further includes an intrinsic calibration unit and a third determining unit, where the intrinsic calibration unit is configured to perform intrinsic calibration of the camera based on the calibration board image data to obtain an intrinsic calibration result, and the third determining unit is configured to determine the intrinsic parameters of the camera according to the obtained intrinsic calibration result.
In some possible embodiments, the apparatus further includes a second judging unit and a second retry unit, where the second judging unit is configured to judge the obtained intrinsic calibration result; the third determining unit is specifically configured to determine the intrinsic calibration result as the intrinsic parameters of the camera when it is determined that the intrinsic calibration result satisfies the calibration success condition; and the second retry unit is configured to perform intrinsic calibration of the camera again when it is determined that the intrinsic calibration result does not satisfy the calibration success condition, until the obtained intrinsic calibration result satisfies the calibration success condition or the number of intrinsic calibration attempts reaches a second preset number.
In a third aspect, an embodiment of the present application provides a camera calibration system, where the system includes: a mobile robot, a calibration board, a bracket, and a processor; a camera and an odometer are fixedly arranged on the mobile robot; the calibration board is arranged on the bracket and is positioned in the shooting range of the camera; the processor is communicatively connected to the camera and the odometer, and is configured to implement the method according to the first aspect and any one of its possible implementations.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory for storing computer program code comprising computer instructions, wherein, when the processor executes the computer instructions, the electronic device performs the method of the first aspect and any one of its possible implementations as described above.
In a fifth aspect, an embodiment of the present application provides an electronic device, including: a processor, a transmitting device, an input device, an output device, and a memory for storing computer program code comprising computer instructions, which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible implementations.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program, where the computer program includes program instructions which, when executed by a processor, cause the processor to execute the method according to the first aspect and any one of the possible implementation manners of the first aspect.
In a seventh aspect, the present application provides a computer program product comprising a computer program or instructions which, when run on a computer, cause the computer to execute the method according to the first aspect and any one of the possible implementation manners described above.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the technical aspects of the disclosure.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic view of an application scenario of a camera calibration method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a camera calibration method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a physical relationship between a camera coordinate system and a body coordinate system according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a camera internal and external reference calibration behavior tree according to an embodiment of the present disclosure;
fig. 5 is a schematic flowchart of a camera internal and external reference calibration method according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of a camera internal and external reference calibration method according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of a calibration board image taken by a top view camera of a mobile robot according to an embodiment of the present disclosure;
FIG. 8 is a diagram illustrating the results of an internal reference calibration provided by an embodiment of the present application;
FIG. 9 is a diagram illustrating the results of an external reference calibration provided by an embodiment of the present application;
fig. 10 is a schematic diagram illustrating an actually measured distance between the optical center of the camera and the mobile robot body according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a camera calibration apparatus according to an embodiment of the present application;
fig. 12 is a schematic hardware structure diagram of a camera calibration apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
The terms "first," "second," and the like herein are used for distinguishing between different objects and not necessarily for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
The term "and/or" herein is merely an associative relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Simultaneous Localization and Mapping (SLAM) is a core technology that enables a robot to construct an environment map and localize itself in an unknown environment. SLAM technology is mainly applied in fields such as robotics, Augmented Reality (AR), Virtual Reality (VR), and automatic driving.
SLAM technologies mainly include visual SLAM (using a monocular camera, a binocular camera, a depth camera, etc.) and laser SLAM (using a lidar). Mobile robots based on visual SLAM are increasingly common: compared with a lidar, a camera is smaller, consumes less power, and costs less, while providing rich visual information about the physical world.
Before SLAM can be used, the internal and external parameters of the sensors must be calibrated, and the quality of this calibration directly affects the final localization and mapping results. Internal references mainly refer to parameters of a sensor itself, such as the focal length and distortion coefficients of a camera, or the differential wheel radius and wheel spacing coefficient of a wheel odometer. External references mainly refer to the relative relationship between different sensors, such as the transformation relationship between a camera and an odometer. During localization and mapping, the information from the camera and the odometer needs to be fused; fusing the two naively introduces large positioning errors, and accurate fusion requires accurate external reference calibration.
Existing external reference calibration methods either rely on a costly motion capture system (such as the full-body motion capture system OptiTrack or the optical motion capture system Vicon) or use specific landmarks in a specific scene for calibration. These methods are expensive or limited to particular calibration scenes; they share the drawbacks of complicated calibration steps and low calibration efficiency, and are not suitable for mass production environments.
Based on this, the embodiment of the application provides a camera calibration method to reduce calibration complexity and improve calibration efficiency and calibration environment applicability.
The camera calibration method may be executed by a camera calibration apparatus, for example a terminal device, a server, or another processing device, where the terminal device may be a User Equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. In some possible implementations, the camera calibration method may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of a camera calibration method according to an embodiment of the present application, where the application scenario relates to a mobile robot 1, a calibration board 2, a bracket 3, and a processor (not shown in the figure). Wherein, the mobile robot 1 is fixedly provided with a camera 4 and a milemeter (not shown in the figure); the calibration plate 2 is arranged on the bracket 3 and is positioned in the shooting range of the camera 4; the processor is in communication with the camera 4 and the odometer.
The processor acquires information acquired by the camera 4 and the odometer, and calibrates a transformation relation between a camera coordinate system corresponding to the camera 4 and a body coordinate system corresponding to the mobile robot 1 according to the information acquired by the camera 4 and the odometer, namely, calibrates external parameters of the camera 4. Optionally, the processor is further configured to calibrate the internal parameters of the camera 4 according to the information collected by the camera 4. Optionally, the processor is also used to control the mobile robot 1 to perform planar motion on the ground.
Optionally, the support 3 is a gimbal. One end of the universal support is fixed, the other end of the universal support is connected with the calibration plate 2, and the pose of the calibration plate 2 can be changed by adjusting each joint of the universal support, so that the calibration plate 2 is in the shooting range of the camera 4. Further, it is also possible to ensure that the entire calibration plate 2 is in the image taken by the camera 4 by adjusting the height of the gimbal.
Exemplarily, as shown in fig. 1, the camera 4 is located at the top of the mobile robot 1 and its shooting direction is upward, that is, the camera 4 is a top view camera of the mobile robot 1, the calibration board 2 is suspended right above the mobile robot 1, the surface of the calibration board 2 is parallel to the ground and opposite to the camera 4, and this scene is suitable for calibrating the top view camera of the mobile robot 1.
It should be understood that the position of the camera 4 on the mobile robot 1 and the pose of the calibration plate 2 can be set according to actual requirements to adapt to different calibration requirements. For example, if it is desired to calibrate the front-view camera of the mobile robot 1, the camera 4 may be provided on the mobile robot 1 and its shooting direction is directed toward the advancing direction of the mobile robot 1, the gimbal may be adjusted so that the calibration plate 2 is located right in front of the mobile robot, and the surface of the calibration plate 2 is perpendicular to the ground and opposite to the camera 4, thereby being suitable for calibrating the front-view camera of the mobile robot 1.
Optionally, the calibration board is a calibration board based on a visual fiducial system (AprilTag).
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a camera calibration method according to an embodiment of the present disclosure, where the camera calibration method includes the following steps S201 to S204.
S201, in the process that the mobile robot moves according to the first track, acquiring calibration board image information collected by a camera fixed on the mobile robot and odometer information collected by an odometer fixed on the mobile robot.
The first track is a motion track required by the mobile robot when the camera on the mobile robot is subjected to external reference calibration. The first track may be set according to actual requirements, and may be an S-type or Z-type track, for example. The calibration board image information is image information obtained by shooting the calibration board by a camera. The odometer information may be information related to a motion state of the mobile robot, such as a motion distance, a wheel speed, and the like.
Specifically, in the process that the mobile robot moves according to the first track, the calibration board is kept fixed, the camera collects images of the calibration board according to a preset frame rate, and the obtained image information of the calibration board is a calibration board image sequence which comprises at least two frames of calibration board images. The acquisition time corresponding to different frames of calibration plate images is in sequence, and the acquisition time corresponding to each frame of calibration plate image can be used as a time stamp of the frame of calibration plate image to add the time stamp to each frame of calibration plate image.
In the process that the mobile robot moves according to the first track, the odometer acquires odometer data according to a preset frequency, and the obtained odometer information is an odometer data sequence which comprises at least two frames of odometer data. The acquisition time corresponding to different frames of odometry data is in sequence, the acquisition time corresponding to each frame of odometry data can be used as a time stamp of the frame of odometry data, and the time stamp is added to each frame of odometry data.
Optionally, the calibration board image sequence and the odometer data sequence are aligned, for example, the frequency of the collected data of the camera and the odometer is synchronized, or the collected data is adaptively deleted, so that each frame of calibration board image is synchronized with the time stamp of each frame of the odometer data, thereby facilitating the subsequent calculation. In addition, for data collected by the odometer, data which indicate that the mobile robot has small displacement or no difference exists between the left wheel speed and the right wheel speed can be removed, so that data which have little significance for subsequent calculation can be eliminated, the calculated amount is reduced, and the calibration efficiency is improved.
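The alignment and filtering described above can be illustrated with a minimal sketch, in which the function name, data layout, and thresholds are assumptions for illustration only: each calibration board frame is paired with the odometry sample closest in time, and odometry that shows no wheel-speed difference or negligible motion is discarded.

```python
# Illustrative sketch (names and thresholds are assumptions, not from this application):
# pair each calibration-board image with the odometry sample nearest in time,
# then drop odometry with no wheel-speed difference or negligible motion.
def align_by_timestamp(image_stamps, odom_samples, max_dt=0.05, eps=1e-6):
    """image_stamps: list of timestamps (seconds);
    odom_samples: list of (timestamp, v_left, v_right) tuples."""
    pairs = []
    for t_img in image_stamps:
        # nearest odometry sample in time
        t_odo, v_l, v_r = min(odom_samples, key=lambda s: abs(s[0] - t_img))
        if abs(t_odo - t_img) > max_dt:
            continue  # no odometry sample close enough in time
        if abs(v_l - v_r) < eps or (abs(v_l) < eps and abs(v_r) < eps):
            continue  # little value for subsequent calculation: drop it
        pairs.append((t_img, (t_odo, v_l, v_r)))
    return pairs
```

Dropping near-stationary samples before calibration reduces the amount of computation without losing informative motion, consistent with the filtering described above.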
Optionally, the distance of the first trajectory is smaller than a preset distance, wherein the preset distance may be understood as the maximum distance at which the odometer can maintain a higher accuracy. The value of the preset distance may be set in combination with the actual demand, for example, the preset distance is 1 meter. The odometer has the characteristic of high precision within a short distance, and the uncertainty of the odometer may gradually increase along with the increase of the motion track of the mobile robot. Therefore, the distance of the motion trail of the mobile robot is controlled to be smaller than the preset distance, so that the mobile robot can move in a short distance, and the reliability of the odometer information is guaranteed.
S202, determining first pose transformation data of the camera in the motion process according to the internal reference of the camera and the image information of the calibration plate.
The internal reference of the camera refers to parameters related to the characteristics of the camera itself, such as a focal length and a pixel size. For example, the internal parameters of the camera may be obtained from factory parameters of the camera, or may be obtained by calibrating the internal parameters in advance. The camera's internal parameters are used to implement the transformation from the camera coordinate system to the pixel coordinate system of the image.
After the calibration plate image is obtained, the two-dimensional pixel coordinates of each feature point on the calibration plate under a pixel coordinate system can be determined, and in addition, because the calibration plate carries scale information, the three-dimensional world coordinates of each feature point on the calibration plate under a world coordinate system are also determined. The relative pose between the camera and the calibration plate can be estimated by utilizing the three-dimensional world coordinates and the two-dimensional pixel coordinates of each feature point and the internal parameters of the camera. Compared with other environmental features (such as road signs) lacking in scale information, the use of the calibration plate is beneficial to eliminating estimation errors caused by the lack of scale information and improving the accuracy of camera pose estimation.
The first pose transformation data of the camera in the motion process can be understood as the relative poses of the camera at different times. Specifically, in the moving process of the mobile robot, the camera moves along with the movement of the mobile robot, and the calibration plate is kept fixed, so that the pose of the camera relative to the calibration plate changes along with time, and the relative poses of the camera at different times can be determined according to the pose change of the camera relative to the calibration plate.
And S203, determining second pose transformation data of the mobile robot in the motion process according to the odometer information.
By using the odometer information, the real-time pose of the mobile robot in the odometer coordinate system can be estimated. Illustratively, the mobile robot is a differentially driven mobile robot, and the odometer information includes displacement and left and right wheel speeds of the mobile robot.
The second pose transformation data of the mobile robot in the motion process can be understood as the relative poses of the mobile robot at different times. Specifically, before the mobile robot moves, the odometer coordinate system coincides with the body coordinate system, and the odometer coordinate system is regarded as a world coordinate system. During the movement of the mobile robot, the body coordinate system changes with the movement while the odometer coordinate system stays fixed, so the pose of the body coordinate system relative to the odometer coordinate system (namely, the pose of the mobile robot in the odometer coordinate system) changes over time, and the relative poses of the mobile robot at different times can be determined from this pose change.
And S204, performing camera external reference calibration based on the first pose transformation data and the second pose transformation data to obtain an external reference calibration result, wherein the external reference calibration result is used for representing a transformation relation between a camera coordinate system corresponding to the camera and a body coordinate system corresponding to the mobile robot.
After the first pose transformation data and the second pose transformation data are determined, the camera external parameters are calibrated by utilizing them. It will be appreciated that the camera is rigidly fixed to the mobile robot, so there is a fixed relative displacement and orientation between the camera coordinate system and the body coordinate system, which does not change with the movement of the mobile robot or with the passage of time. The camera external reference calibrated herein is used for characterizing the transformation relationship between the camera coordinate system and the body coordinate system, that is, for describing the relative displacement and orientation between the camera coordinate system and the body coordinate system.
Specifically, the first pose transformation data includes relative poses of the camera at different times, and the camera pose at the first time can be transformed to the camera pose at the second time and/or the camera pose at the second time can be transformed to the camera pose at the first time according to the first pose transformation data. The second pose transformation data comprises relative poses of the mobile robot at different times, and the pose of the mobile robot at the first time can be transformed into the pose of the mobile robot at the second time and/or the pose of the mobile robot at the second time can be transformed into the pose of the mobile robot at the first time according to the second pose transformation data. Because the transformation relation between the camera coordinate system and the body coordinate system is fixed, the camera pose and the mobile robot pose at the same time can be transformed into each other through the transformation relation, namely, the camera pose at the first time can be transformed into the mobile robot pose at the first time according to the transformation relation, and/or the camera pose at the second time can be transformed into the mobile robot pose at the second time. By integrating the conversion process, a plurality of conversion equations with the conversion relation as the quantity to be calibrated can be established, and the calibration result of the conversion relation can be obtained. 
In this embodiment, the first pose transformation data of the camera during the motion is determined using the calibration plate image information, the second pose transformation data of the mobile robot during the motion is determined using the odometer information, and camera external reference calibration is achieved by combining the first pose transformation data and the second pose transformation data. The calibration process does not depend on a complex motion capture system and is not limited by a specific scene or particular landmarks; only a calibration plate and the odometer are used. The operation is simple and the cost is low, which helps reduce calibration difficulty and time consumption, improves calibration efficiency, and makes the calibration environment more flexible.
In a possible implementation manner, determining the first pose transformation data of the camera during the motion process according to the internal reference of the camera and the calibration board image information may specifically include the following steps: obtaining at least one group of calibration plate images according to the calibration plate image information, wherein each group of calibration plate images comprise a first frame of calibration plate image and a second frame of calibration plate image which are adjacent; obtaining a first pose of the camera relative to the calibration plate according to the internal reference of the camera and the coordinates of each calibration point in the first frame of calibration plate image in a world coordinate system and a pixel coordinate system; acquiring a second pose of the camera relative to the calibration plate according to the internal reference of the camera and the coordinates of each calibration point in the second frame of calibration plate image in the world coordinate system and the pixel coordinate system; determining a pose transformation vector of the camera between each two adjacent frames based on the first pose and the second pose, the first pose transformation data comprising the pose transformation vector of the camera between each two adjacent frames.
The calibration plate image information comprises at least two frames of calibration plate images, and a group of calibration plate images are formed by two adjacent frames of calibration plate images. Each set of calibration board images includes a first frame of calibration board image and a second frame of calibration board image, where the first frame of calibration board image may be a calibration board image with an earlier timestamp (for short, a previous frame of calibration board image) or a calibration board image with a later timestamp (for short, a next frame of calibration board image). It can be understood that if the first frame of calibration plate image is the previous frame of calibration plate image, the second frame of calibration plate image is the next frame of calibration plate image; and if the first frame of calibration board image is the next frame of calibration board image, the second frame of calibration board image is the previous frame of calibration board image.
The coordinates of the index point in the world coordinate system are three-dimensional world coordinates, the coordinates of the index point in the pixel coordinate system are two-dimensional pixel coordinates, and the two-dimensional pixel coordinates can be understood as the projection of the index point in the three-dimensional space on the image. Specifically, a three-dimensional world coordinate and a two-dimensional pixel coordinate of each calibration Point form a 3D-2D Point pair, the 3D-2D Point pair and internal parameters of the camera are utilized, the Perspective-n-Point (PnP) problem is solved in an iterative mode on the basis of the minimized visual reprojection error, and the pose of the camera relative to the calibration plate can be obtained.
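For intuition, because the calibration board is planar, the relative pose can also be recovered in closed form from a homography (a Zhang-style decomposition). The following sketch is an illustrative alternative to the iterative PnP solve described above, not this application's exact method; it assumes noise-free correspondences and zero lens distortion, and all names are invented for illustration.

```python
import numpy as np

def pose_from_planar_points(K, pts_world_xy, pts_pixel):
    """Estimate the camera pose relative to a planar calibration board
    (board points at z = 0) from 3D-2D point pairs, via homography
    decomposition. A real pipeline would refine this with an iterative
    PnP solve minimizing the visual reprojection error."""
    # Direct Linear Transform for the homography: pixel ~ H @ [x, y, 1]
    A = []
    for (x, y), (u, v) in zip(pts_world_xy, pts_pixel):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    # Decompose: K^-1 @ H = lam * [r1 r2 t]
    B = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if lam * B[2, 2] < 0:
        lam = -lam  # the board must lie in front of the camera (t_z > 0)
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt2 = np.linalg.svd(R)  # project onto the nearest rotation matrix
    return U @ Vt2, t
```

Because the board carries absolute scale, the translation `t` is metric, which is the advantage over scale-free environmental features noted above.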
The first pose of the camera relative to the calibration plate is the pose of the camera relative to the calibration plate at the time indicated by the timestamp of the first frame of calibration plate image. The second pose of the camera relative to the calibration plate is the pose of the camera relative to the calibration plate at the time indicated by the timestamp of the second frame of calibration plate image. After the first pose and the second pose of the camera relative to the calibration board are obtained, a pose transformation vector of the camera between two adjacent frames can be obtained through vector operation, and the pose transformation vector can comprise a translation vector for describing relative displacement and a rotation vector for describing relative orientation.
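The vector operation above can be sketched as follows. The convention is an assumption made for this sketch: each pose (R_i, t_i) maps board coordinates to camera coordinates at frame i, and the rotation-vector conversion uses the standard axis-angle log map.

```python
import numpy as np

def relative_pose(R1, t1, R2, t2):
    """Pose transformation of the camera between two adjacent frames.
    Assumed convention: (R_i, t_i) maps board coordinates to camera
    coordinates at frame i, i.e. X_cam = R_i @ X_board + t_i.
    The returned (dR, dt) satisfies X_cam2 = dR @ X_cam1 + dt."""
    dR = R2 @ R1.T
    dt = t2 - dR @ t1
    return dR, dt

def rotation_vector(R):
    """Axis-angle (rotation vector) form of a rotation matrix."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if angle < 1e-12:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    return angle * axis
```

Here `dt` plays the role of the translation vector describing relative displacement and `rotation_vector(dR)` the rotation vector describing relative orientation.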
In the embodiment, the pose transformation vector of the camera between two adjacent frames is calculated by using the information of the images of the calibration plates of two adjacent frames, and the absolute scale information of the calibration plates is used as a reference, so that a scale estimator is not required to be introduced in the pose calculation process, the complexity and difficulty of pose calculation are favorably reduced, the calculation error caused by lack of scale information is reduced, more accurate camera pose transformation data is obtained, and the calibration precision is favorably improved.
In one possible implementation, determining second pose transformation data of the mobile robot during the movement according to the odometer information comprises: obtaining at least one group of odometry data according to the odometry information, wherein each group of odometry data comprises adjacent first frame odometry data and second frame odometry data; obtaining a third pose of the mobile robot relative to a world coordinate system according to the first frame of odometer data; obtaining a fourth pose of the mobile robot relative to a world coordinate system according to the second frame of odometer data; and determining a pose transformation vector of the mobile robot between every two adjacent frames based on the third pose and the fourth pose, wherein the second pose transformation data comprises the pose transformation vector of the mobile robot between every two adjacent frames.
The odometry information comprises at least two frames of odometry data, and two adjacent frames of odometry data form a group of odometry data. Each set of odometry data comprises first frame odometry data and second frame odometry data, wherein the first frame odometry data can be odometry data with an early time stamp (called previous frame odometry data for short) or odometry data with a later time stamp (called next frame odometry data for short). It can be understood that if the first frame of odometry data is the previous frame of odometry data, the second frame of odometry data is the next frame of odometry data; and if the first frame of odometry data is the next frame of odometry data, the second frame of odometry data is the previous frame of odometry data.
Alternatively, the odometry data includes left and right wheel speeds of the mobile robot, and the pose of the mobile robot with respect to the world coordinate system may be obtained by performing calculation based on the left and right wheel speeds of the mobile robot. Illustratively, during the movement of the mobile robot, the rotating speed of the left wheel and the rotating speed of the right wheel of the mobile robot are measured through an odometer sensor, the circumference of the wheel is known, so that the speed of the left wheel can be calculated according to the rotating speed of the left wheel and the circumference of the wheel, and the speed of the right wheel can be calculated according to the rotating speed of the right wheel and the circumference of the wheel. The average value of the left wheel speed and the right wheel speed is used as the linear velocity of the mobile robot, and then the displacement of the mobile robot can be obtained through calculation according to the linear velocity and the motion time of the mobile robot. Further, the wheel spacing between the left and right wheels is also known, so that from the wheel spacing and the speed difference between the left and right wheel speeds, the angular velocity of the mobile robot can be calculated. Then, based on the calculated linear velocity, displacement, and angular velocity of the mobile robot, the position and attitude (i.e., the angle of rotation) of the mobile robot with respect to the world coordinate system can be obtained.
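The dead-reckoning step described above can be sketched as a simple Euler integration; the function and parameter names are assumptions for illustration.

```python
import math

def integrate_odometry(x, y, theta, v_left, v_right, wheel_base, dt):
    """One Euler dead-reckoning step for a differential-drive robot.
    v_left / v_right are wheel linear speeds, each obtainable as the
    wheel circumference times the measured rotation rate; wheel_base is
    the left-right wheel spacing. Returns the updated planar pose."""
    v = (v_left + v_right) / 2.0             # body linear velocity
    omega = (v_right - v_left) / wheel_base  # body angular velocity
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Averaging the wheel speeds gives the linear velocity and dividing their difference by the wheel spacing gives the angular velocity, exactly as in the description above.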
The third pose of the mobile robot with respect to the world coordinate system is the pose of the mobile robot with respect to the world coordinate system at the time indicated by the time stamp of the first frame of odometer data. The fourth pose of the mobile robot with respect to the world coordinate system is the pose of the mobile robot with respect to the world coordinate system at the time indicated by the time stamp of the second frame of odometer data. After the third pose and the fourth pose of the mobile robot relative to the world coordinate system are obtained, a pose transformation vector of the mobile robot between two adjacent frames can be obtained through vector operation, and the pose transformation vector can comprise a translation vector for describing relative displacement and a rotation vector for describing relative orientation.
In this embodiment, the pose transformation vector of the mobile robot between two adjacent frames is calculated from the odometry data of those two frames. Because odometry data has high precision and is easy to acquire directly, this reduces the complexity and difficulty of the pose calculation, yields more reliable pose transformation data for the mobile robot, and thereby helps improve the calibration precision.
In one possible implementation, the pose transformation vector of the camera between each two adjacent frames comprises a first translation vector and a first rotation vector, where the first translation vector represents the position vector of the camera pointing from the second frame to the first frame, and the first rotation vector represents the rotation vector of the camera rotating from the first frame to the second frame. The pose transformation vector of the mobile robot between each two adjacent frames comprises a second translation vector and a second rotation vector, where the second translation vector represents the position vector of the mobile robot pointing from the second frame to the first frame, and the second rotation vector represents the rotation vector of the mobile robot rotating from the first frame to the second frame. Performing camera external reference calibration based on the first pose transformation data and the second pose transformation data to obtain an external reference calibration result may specifically include the following steps: establishing a rotation transformation equation based on the first rotation vector, the second rotation vector, and the rotation vector to be calibrated, where the rotation vector to be calibrated represents the rotation transformation relationship between the camera coordinate system and the body coordinate system; establishing a translation transformation equation based on the first translation vector, the second translation vector, the third rotation vector, the rotation vector to be calibrated, and the translation vector to be calibrated, where the third rotation vector is the first rotation vector or the second rotation vector, and the translation vector to be calibrated represents the displacement transformation relationship between the camera coordinate system and the body coordinate system; and solving the rotation transformation equation and the translation transformation equation to obtain the external reference calibration result, which comprises the calibration results of the rotation vector to be calibrated and of the translation vector to be calibrated.
The calibration result of the rotation vector to be calibrated is used for representing the rotation transformation relation between a camera coordinate system and a machine body coordinate system, the calibration result of the translation vector to be calibrated is used for representing the displacement transformation relation between the camera coordinate system and the machine body coordinate system, and the transformation relation between the camera coordinate system and the machine body coordinate system comprises the rotation transformation relation and the displacement transformation relation.
The translation transformation equation involves a rotation vector because, when translating from one coordinate system (the starting coordinate system) to another (the target coordinate system), if there is a relative rotation between the two, the rotation must be applied before the translation; for example, the target coordinate system is first rotated to the same orientation as the starting coordinate system, and the starting coordinate system is then translated to the target coordinate system.
In an example, the rotation vector to be calibrated is a rotation vector rotating from a camera coordinate system to a body coordinate system, and the translation vector to be calibrated is a position vector pointing from the body coordinate system to the camera coordinate system.
In another example, the rotation vector to be calibrated is a rotation vector rotating from a body coordinate system to a camera coordinate system, and the translation vector to be calibrated is a position vector pointing from the camera coordinate system to the body coordinate system.
Referring to fig. 3, fig. 3 is a schematic diagram of the positional relationship between the camera coordinate system and the body coordinate system according to an embodiment of the present application. The two coordinate axes represent the X axis and the Y axis; the mobile robot moves in the XY plane, so the Z axis is omitted. $B_i$ and $B_{i+1}$ denote the body coordinate systems of the first frame and the second frame, respectively; $C_i$ and $C_{i+1}$ denote the camera coordinate systems of the first frame and the second frame, respectively; $q^{B_{i+1}}_{B_i}$ denotes the rotation vector of the mobile robot rotating from the first frame to the second frame (i.e., the second rotation vector); $q^{C_{i+1}}_{C_i}$ denotes the rotation vector of the camera rotating from the first frame to the second frame (i.e., the first rotation vector); $q^{B}_{C}$ denotes the rotation vector rotating from the camera coordinate system to the body coordinate system (i.e., the rotation vector to be calibrated); ${}^{B_{i+1}}p_{B_i}$ denotes the position vector of the mobile robot pointing from the second frame to the first frame (i.e., the second translation vector); ${}^{C_{i+1}}p_{C_i}$ denotes the position vector of the camera pointing from the second frame to the first frame (i.e., the first translation vector); ${}^{B}p_{C}$ denotes the position vector pointing from the body coordinate system to the camera coordinate system (i.e., the translation vector to be calibrated).
As can be seen from fig. 3, the rotation transformation from the camera coordinate system of the first frame ($C_i$) to the body coordinate system of the second frame ($B_{i+1}$) comprises two paths. The first rotation transformation path rotates the camera coordinate system of the first frame ($C_i$) to the camera coordinate system of the second frame ($C_{i+1}$) based on the first rotation vector $q^{C_{i+1}}_{C_i}$, and then rotates the camera coordinate system of the second frame ($C_{i+1}$) to the body coordinate system of the second frame ($B_{i+1}$) based on the rotation vector to be calibrated $q^{B}_{C}$. The second rotation transformation path rotates the camera coordinate system of the first frame ($C_i$) to the body coordinate system of the first frame ($B_i$) based on the rotation vector to be calibrated $q^{B}_{C}$, and then rotates the body coordinate system of the first frame ($B_i$) to the body coordinate system of the second frame ($B_{i+1}$) based on the second rotation vector $q^{B_{i+1}}_{B_i}$. The rotation transformations realized by the first and second rotation transformation paths are equivalent, and the rotation transformation equation can be established on this basis.
The translation transformation from the body coordinate system of the second frame ($B_{i+1}$) to the camera coordinate system of the first frame ($C_i$) likewise comprises two paths. The first translation transformation path translates the body coordinate system of the second frame ($B_{i+1}$) to the camera coordinate system of the second frame ($C_{i+1}$) based on the translation vector to be calibrated (${}^{B}p_{C}$), and then translates the camera coordinate system of the second frame ($C_{i+1}$) to the camera coordinate system of the first frame ($C_i$) based on the rotation vector to be calibrated $q^{B}_{C}$ and the first translation vector ${}^{C_{i+1}}p_{C_i}$. The second translation transformation path translates the body coordinate system of the second frame ($B_{i+1}$) to the body coordinate system of the first frame ($B_i$) based on the second translation vector ${}^{B_{i+1}}p_{B_i}$, and then translates the body coordinate system of the first frame ($B_i$) to the camera coordinate system of the first frame ($C_i$) based on the second rotation vector $q^{B_{i+1}}_{B_i}$ and the translation vector to be calibrated (${}^{B}p_{C}$). The translation transformations realized by the first and second translation transformation paths are equivalent, and the translation transformation equation can be established on this basis.
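The equivalence of the two rotation paths and the two translation paths in fig. 3 can be checked numerically. The sketch below uses rotation matrices instead of quaternions for brevity; the extrinsic values and the inter-frame body motion are invented purely for the check.

```python
import numpy as np

# Numeric sanity check of the two equivalent transformation paths.
# R_x is the camera-to-body rotation and t_x the body-frame position of the
# camera (the quantities to calibrate); values here are assumed examples.

def rot_z(a):
    """Rotation about the Z axis (planar robot motion)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Ground-truth extrinsics (assumed values for the check).
R_x = rot_z(0.3)
t_x = np.array([0.10, 0.05, 0.20])

# Inter-frame body motion (as would come from odometry; invented here).
R_b = rot_z(0.1)
t_b = np.array([0.5, 0.02, 0.0])

# Camera motion implied by the extrinsics:
# R_b R_x = R_x R_c            =>  R_c = R_x^T R_b R_x
# R_b t_x + t_b = R_x t_c + t_x =>  t_c = R_x^T (R_b t_x + t_b - t_x)
R_c = R_x.T @ R_b @ R_x
t_c = R_x.T @ (R_b @ t_x + t_b - t_x)

# Rotation: path 1 (body motion then extrinsic) vs path 2 (extrinsic then camera motion).
rot_gap = np.abs(R_b @ R_x - R_x @ R_c).max()

# Translation: the two translation paths of fig. 3.
trans_gap = np.abs((R_b @ t_x + t_b) - (R_x @ t_c + t_x)).max()
```

Both gaps are zero up to floating-point error, which is exactly the equivalence used to establish the rotation and translation transformation equations.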
It will be appreciated that in the rotation transformation equation and the translation transformation equation, the first rotation vector $q^{C_{i+1}}_{C_i}$, the second rotation vector $q^{B_{i+1}}_{B_i}$, the first translation vector ${}^{C_{i+1}}p_{C_i}$, and the second translation vector ${}^{B_{i+1}}p_{B_i}$ are known quantities, while the rotation vector to be calibrated $q^{B}_{C}$ and the translation vector to be calibrated (${}^{B}p_{C}$) are the quantities to be estimated (i.e., unknown quantities). Illustratively, from the plurality of groups of calibration plate images acquired by the camera during the movement of the mobile robot, a plurality of groups of inter-frame pose transformation data of the camera and the mobile robot can be obtained, that is, a plurality of groups of known quantities; on the basis of the rotation transformation equation and the translation transformation equation, these groups of known quantities are combined to solve for the unknown quantities.
In the above embodiment, a rotation transformation equation and a translation transformation equation are established by using the translation vector and the rotation vector of the camera between frames, the translation vector and the rotation vector of the mobile robot between frames, and the rotation vector to be calibrated and the translation vector to be calibrated, and by solving the equations, the external reference calibration result of the camera can be quickly obtained, thereby improving the calibration efficiency.
It should be noted that, in the above embodiment, the inter-frame rotation vector is a rotation vector of the first frame rotating to the second frame, and the inter-frame translation vector is a position vector of the second frame pointing to the first frame, so that the amount of calculation introduced when the rotation transformation equation and the translation transformation equation are established can be minimized, which is helpful to reduce the calculation complexity and improve the calculation efficiency.
In other possible embodiments, the inter-frame rotation vector is a rotation vector of the first frame rotating to the second frame, the inter-frame translation vector may also be a position vector of the first frame pointing to the second frame, and the rotation transformation equation and the translation transformation equation may also be established by using the translation vector and the rotation vector of the camera between frames, the translation vector and the rotation vector of the mobile robot between frames, and the rotation vector to be calibrated and the translation vector to be calibrated, and the camera external reference calibration result is obtained by solving the equations.
In a possible implementation manner, establishing a rotation transformation equation based on the first rotation vector, the second rotation vector, and the rotation vector to be calibrated may specifically include the following steps: multiplying the first rotation vector by the rotation vector to be calibrated to obtain a first rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame; multiplying the rotation vector to be calibrated by the second rotation vector to obtain a second rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame; and establishing a rotation transformation equation based on the equivalence relation between the first rotation mode and the second rotation mode.
Optionally, the equivalence of the first rotation mode and the second rotation mode describes that, in the ideal (noise-free) case, the two rotation mappings from the camera coordinate system of the first frame ($C_i$) to the body coordinate system of the second frame ($B_{i+1}$) are equivalent.
Illustratively, the rotation transformation equation may be represented by the following formula (1):

$$q^{B_{i+1}}_{B_i} \otimes q^{B}_{C} = q^{B}_{C} \otimes q^{C_{i+1}}_{C_i} \tag{1}$$

where $q^{B_{i+1}}_{B_i} \otimes q^{B}_{C}$ represents the second rotation mode, $q^{B}_{C} \otimes q^{C_{i+1}}_{C_i}$ represents the first rotation mode, $q$ denotes a rotation quaternion, and $\otimes$ denotes the quaternion multiplication sign. The quaternion multiplication formula can be represented by the following formula (2):

$$q_1 \otimes q_2 = \begin{bmatrix} a_4\,\mathbf{b} + b_4\,\mathbf{a} + \mathbf{a} \times \mathbf{b} \\ a_4 b_4 - \mathbf{a} \cdot \mathbf{b} \end{bmatrix} \tag{2}$$

where $q_1$ and $q_2$ represent rotation quaternions. The derivation of formula (2) is as follows. Let:

$$q_1 = a_1\mathbf{i} + a_2\mathbf{j} + a_3\mathbf{k} + a_4 = \mathbf{a} + a_4$$

where $a_4$ denotes the real part of $q_1$; $a_1$, $a_2$, and $a_3$ denote the imaginary parts of $q_1$; $\mathbf{i}$, $\mathbf{j}$, and $\mathbf{k}$ represent unit vectors; and $\mathbf{a} = a_1\mathbf{i} + a_2\mathbf{j} + a_3\mathbf{k}$. Let:

$$q_2 = b_1\mathbf{i} + b_2\mathbf{j} + b_3\mathbf{k} + b_4 = \mathbf{b} + b_4$$

where $b_4$ denotes the real part of $q_2$; $b_1$, $b_2$, and $b_3$ denote the imaginary parts of $q_2$; and $\mathbf{b} = b_1\mathbf{i} + b_2\mathbf{j} + b_3\mathbf{k}$. Then:

$$q_1 \otimes q_2 = (\mathbf{a} + a_4)(\mathbf{b} + b_4) = a_4\,\mathbf{b} + b_4\,\mathbf{a} + \mathbf{a} \times \mathbf{b} + a_4 b_4 - \mathbf{a} \cdot \mathbf{b}$$

where the vector terms $a_4\,\mathbf{b} + b_4\,\mathbf{a} + \mathbf{a} \times \mathbf{b}$ form the imaginary part of the product and the scalar terms $a_4 b_4 - \mathbf{a} \cdot \mathbf{b}$ form the real part, which is exactly formula (2).
In the above embodiment, the rotational transformation equation is established by using the equivalence relation between two rotation modes of the camera coordinate system of the first frame rotating to the body coordinate system of the second frame, and accordingly, the rotational transformation relation between the camera coordinate system and the body coordinate system can be obtained through calculation.
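The quaternion product of formula (2) can be written compactly in code. The sketch below is a minimal illustration (not the patent's implementation), storing quaternions as [x, y, z, w] with the real part last, matching the convention of the derivation above.

```python
import numpy as np

# Quaternion product per formula (2): imaginary part a4*b + b4*a + a x b,
# real part a4*b4 - a.b. Quaternions are stored as [x, y, z, w].
def quat_mul(q1, q2):
    a, a4 = np.asarray(q1[:3], dtype=float), float(q1[3])
    b, b4 = np.asarray(q2[:3], dtype=float), float(q2[3])
    vec = a4 * b + b4 * a + np.cross(a, b)  # imaginary (vector) part
    w = a4 * b4 - a.dot(b)                  # real (scalar) part
    return np.concatenate([vec, [w]])
```

With this convention, multiplying by the identity quaternion [0, 0, 0, 1] leaves a quaternion unchanged, and two rotations about the same axis compose by adding their angles.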
In one possible embodiment, the third rotation vector is the second rotation vector; establishing a translation transformation equation based on the first translation vector, the second translation vector, the third rotation vector, the rotation vector to be calibrated and the translation vector to be calibrated, which may specifically include the following steps: multiplying the rotation vector to be calibrated by the first translation vector, and adding the multiplied rotation vector to be calibrated and the translation vector to be calibrated to obtain a first translation mode for translating the body coordinate system of the second frame to the camera coordinate system of the first frame; multiplying the second rotation vector by the translation vector to be calibrated, and adding the multiplied second rotation vector and the second translation vector to obtain a second translation mode for translating the body coordinate system of the second frame to the camera coordinate system of the first frame; and establishing a translation transformation equation based on the equivalence relation between the first translation mode and the second translation mode.
Optionally, the equivalence of the first translation mode and the second translation mode describes that, in the ideal (noise-free) case, the two translation mappings from the body coordinate system of the second frame ($B_{i+1}$) to the camera coordinate system of the first frame ($C_i$) are equivalent.
Illustratively, the translation transformation equation may be represented by the following formula (3):

$$R\!\left(q^{B_{i+1}}_{B_i}\right) {}^{B}p_{C} + {}^{B_{i+1}}p_{B_i} = R\!\left(q^{B}_{C}\right) {}^{C_{i+1}}p_{C_i} + {}^{B}p_{C} \tag{3}$$

where $R(q^{B_{i+1}}_{B_i})\,{}^{B}p_{C} + {}^{B_{i+1}}p_{B_i}$ represents the second translation mode, $R(q^{B}_{C})\,{}^{C_{i+1}}p_{C_i} + {}^{B}p_{C}$ represents the first translation mode, and $R(q)$ represents the conversion of the rotation quaternion $q$ into the corresponding rotation matrix $R$.
In the above embodiment, the translation transformation equation is established by using the equivalence relation between the two translation modes of translating the body coordinate system of the second frame to the camera coordinate system of the first frame, and accordingly, the translation transformation relation between the camera coordinate system and the body coordinate system can be obtained through calculation.
In a possible implementation, the rotational transformation equation further includes a rotational noise factor, and the translational transformation equation further includes a rotational noise factor and a translational noise factor; solving the rotation transformation equation and the translation transformation equation to obtain the external parameter calibration result may specifically include the following steps: obtaining a first residual error term caused by rotational noise according to a rotational transformation equation, and obtaining a calibration result of a to-be-calibrated rotational vector by minimizing the first residual error term; and obtaining a second residual error term caused by the rotation noise and the translation noise according to the translation transformation equation, and obtaining a calibration result of the translation vector to be calibrated by minimizing the second residual error term.
The rotation transformation equation (1) and the translation transformation equation (3) are transformation equations for the ideal, noise-free case; in the actual calibration process, noise may exist in the pose transformation vector calculations of both the camera and the mobile robot. The rotational noise factor characterizes the noise in the rotation transformation, and the translational noise factor characterizes the noise in the translation transformation. The rotation vectors and translation vectors obtained by actual calculation can thus be understood as jointly determined by the observed quantity in the ideal state and the noise (measurement error).
Illustratively, the actually calculated rotation vector is represented by the following formula (4):

$$\tilde{q} = q \otimes \delta q^{-1} \tag{4}$$

where $\tilde{q}$ represents the actually calculated rotation vector, $q$ represents the observed value of the rotation vector in the ideal state, and $\delta q^{-1}$ is the rotational noise expressed as a quaternion. Multiplying both sides of formula (4) on the right by $\delta q$ gives $\tilde{q} \otimes \delta q = q$. For ease of calculation, $\delta q$ represents a small rotation.

The actually calculated translation vector is expressed by the following formula (5):

$$\tilde{p} = p + \delta p \tag{5}$$

where $\tilde{p}$ represents the actually calculated translation vector, $p$ represents the observed value of the translation vector in the ideal state, and $\delta p$ is the translational noise.
Substituting formula (4) into formula (1) yields the following formula (6):

$$\tilde{q}^{B_{i+1}}_{B_i} \otimes \delta q_B \otimes q^{B}_{C} = q^{B}_{C} \otimes \tilde{q}^{C_{i+1}}_{C_i} \otimes \delta q_C \tag{6}$$

Using formula (2), each quaternion product can be written as a matrix-vector product; denoting by $[q]_L$ and $[q]_R$ the matrices such that $q \otimes s = [q]_L\,s = [s]_R\,q$ for any quaternions $q$ and $s$, formula (6) is rewritten into the following formula (7):

$$\big[\tilde{q}^{B_{i+1}}_{B_i}\big]_L \big[q^{B}_{C}\big]_R\, \delta q_B = \big[q^{B}_{C}\big]_L \big[\tilde{q}^{C_{i+1}}_{C_i}\big]_L\, \delta q_C \tag{7}$$

For ease of calculation, $\delta q$ is expressed by the following formula (8):

$$\delta q = \left(I_4 + \tfrac{1}{2} B(\delta\boldsymbol{\theta})\right) q_I \tag{8}$$

where $I_4$ represents the $4 \times 4$ identity matrix, $q_I = [0\ 0\ 0\ 1]^{\mathrm{T}}$ represents the identity quaternion, and $B(\delta\boldsymbol{\theta})$ represents the angle-dependent matrix converting the small rotation angle $\delta\boldsymbol{\theta}$ into the quaternion increment of $\delta q$. Specifically, a small rotation $\delta q$ can be expressed as:

$$\delta q = \begin{bmatrix} \mathbf{u} \sin(\delta\theta/2) \\ \cos(\delta\theta/2) \end{bmatrix} \approx \begin{bmatrix} \mathbf{u}\,\delta\theta/2 \\ 1 \end{bmatrix}$$

where $\delta\theta$ represents the rotation angle corresponding to $\delta q$, and $u_x$, $u_y$, and $u_z$ are the components of the unit rotation axis $\mathbf{u}$. Note that the subscripts on $\delta q$ and $\delta\theta$ denote coordinate systems: $\delta q_B$ represents the rotation generated by rotational noise in the body coordinate system and $\delta\theta_B$ its corresponding rotation angle, while $\delta q_C$ represents the rotation generated by rotational noise in the camera coordinate system and $\delta\theta_C$ its corresponding rotation angle.

Substituting formula (8) into formula (7) yields the following formula (9):

$$\big[\tilde{q}^{B_{i+1}}_{B_i}\big]_L \big[q^{B}_{C}\big]_R \left(I_4 + \tfrac{1}{2} B(\delta\boldsymbol{\theta}_B)\right) q_I = \big[q^{B}_{C}\big]_L \big[\tilde{q}^{C_{i+1}}_{C_i}\big]_L \left(I_4 + \tfrac{1}{2} B(\delta\boldsymbol{\theta}_C)\right) q_I \tag{9}$$

Expanding formula (9) and shifting the observed terms to the left and the noise terms to the right yields the following formula (10):

$$\tilde{q}^{B_{i+1}}_{B_i} \otimes q^{B}_{C} - q^{B}_{C} \otimes \tilde{q}^{C_{i+1}}_{C_i} = \tfrac{1}{2}\left(\big[q^{B}_{C}\big]_L \big[\tilde{q}^{C_{i+1}}_{C_i}\big]_L B(\delta\boldsymbol{\theta}_C) - \big[\tilde{q}^{B_{i+1}}_{B_i}\big]_L \big[q^{B}_{C}\big]_R B(\delta\boldsymbol{\theta}_B)\right) q_I \tag{10}$$

Let:

$$\eta_i = \tilde{q}^{B_{i+1}}_{B_i} \otimes q^{B}_{C} - q^{B}_{C} \otimes \tilde{q}^{C_{i+1}}_{C_i}$$

where $\eta_i$ represents the residual term caused by rotational noise between the $i$-th frame and the $(i+1)$-th frame. A loss function can be constructed from these residual terms, so that the external reference calibration problem is converted into a nonlinear optimization problem, and the final quantity to be estimated is obtained by minimizing the residual terms.
Optionally, in order to estimate the rotation transformation relationship $q^{B}_{C}$ between the camera coordinate system and the body coordinate system more accurately, the residual terms of all frame pairs are integrated and optimized jointly so that the overall residual reaches its minimum, thereby obtaining the optimal estimate of the rotation transformation relationship.
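One concrete way to minimise the stacked rotation residuals $\eta_i$ is to exploit the fact that the noise-free rotation equation is linear in the unknown unit quaternion: stacking the matrices $[\tilde{q}_B]_L - [\tilde{q}_C]_R$ over all frame pairs and taking the right singular vector with the smallest singular value gives the unit quaternion minimising the summed squared residual. This is a sketch of that classical linear solution, not necessarily the patent's optimizer; quaternions are [x, y, z, w] with the real part last.

```python
import numpy as np

# Left/right quaternion-product matrices: q1 (x) q2 = L(q1) q2 = R(q2) q1,
# for quaternions stored [x, y, z, w].
def left_mat(q):
    x, y, z, w = q
    return np.array([[ w, -z,  y, x],
                     [ z,  w, -x, y],
                     [-y,  x,  w, z],
                     [-x, -y, -z, w]])

def right_mat(q):
    x, y, z, w = q
    return np.array([[ w,  z, -y, x],
                     [-z,  w,  x, y],
                     [ y, -x,  w, z],
                     [-x, -y, -z, w]])

def solve_rotation(body_quats, cam_quats):
    """Estimate the camera-to-body rotation quaternion from paired
    inter-frame rotations, as the unit vector minimising the stacked
    residual (L(q_b) - R(q_c)) q."""
    A = np.vstack([left_mat(qb) - right_mat(qc)
                   for qb, qc in zip(body_quats, cam_quats)])
    _, _, vt = np.linalg.svd(A)
    q = vt[-1]                      # smallest-singular-value direction
    return q if q[3] >= 0 else -q   # resolve the sign ambiguity
```

Note that the recovered quaternion is only defined up to sign (q and -q encode the same rotation), and purely planar motion leaves part of the rotation unobservable, so the motion should excite more than one rotation axis where possible.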
The estimation process for the translation transformation relationship ${}^{B}p_{C}$ between the camera coordinate system and the body coordinate system is similar to that of $q^{B}_{C}$, and is specifically as follows. Shifting the terms of formula (3) yields the following formula (11):

$$R\!\left(q^{B_{i+1}}_{B_i}\right) {}^{B}p_{C} - {}^{B}p_{C} = R\!\left(q^{B}_{C}\right) {}^{C_{i+1}}p_{C_i} - {}^{B_{i+1}}p_{B_i} \tag{11}$$

Formula (11) may be converted into the following formula (12):

$$\left(R\!\left(q^{B_{i+1}}_{B_i}\right) - I_3\right) {}^{B}p_{C} = R\!\left(q^{B}_{C}\right) {}^{C_{i+1}}p_{C_i} - {}^{B_{i+1}}p_{B_i} \tag{12}$$

where $I_3$ represents the $3 \times 3$ identity matrix. Substituting formulas (4) and (5) into formula (12) yields the following formula (13):

$$\left(R\!\left(\tilde{q}^{B_{i+1}}_{B_i} \otimes \delta q_B\right) - I_3\right) {}^{B}p_{C} = R\!\left(q^{B}_{C}\right) \left({}^{C_{i+1}}\tilde{p}_{C_i} - \delta p_C\right) - \left({}^{B_{i+1}}\tilde{p}_{B_i} - \delta p_B\right) \tag{13}$$

For ease of calculation, $R(\tilde{q} \otimes \delta q)$ is expressed by the following formula (14):

$$R(\tilde{q} \otimes \delta q) = R(\tilde{q})\,R(\delta q) \approx R(\tilde{q})\left(I_3 + \mathrm{skew}(\delta\boldsymbol{\theta})\right) \tag{14}$$

where $\mathrm{skew}(\cdot)$ represents the skew-symmetric matrix of a vector. Substituting formula (14) into formula (13) yields the following formula (15):

$$R\!\left(\tilde{q}^{B_{i+1}}_{B_i}\right)\left(I_3 + \mathrm{skew}(\delta\boldsymbol{\theta}_B)\right) {}^{B}p_{C} - {}^{B}p_{C} = R\!\left(q^{B}_{C}\right) \left({}^{C_{i+1}}\tilde{p}_{C_i} - \delta p_C\right) - {}^{B_{i+1}}\tilde{p}_{B_i} + \delta p_B \tag{15}$$

Expanding formula (15) and shifting the observed terms to the left and the noise terms to the right yields the following formula (16):

$$\left(R\!\left(\tilde{q}^{B_{i+1}}_{B_i}\right) - I_3\right) {}^{B}p_{C} - R\!\left(q^{B}_{C}\right) {}^{C_{i+1}}\tilde{p}_{C_i} + {}^{B_{i+1}}\tilde{p}_{B_i} = \zeta_i \tag{16}$$

Let:

$$\zeta_i = -R\!\left(\tilde{q}^{B_{i+1}}_{B_i}\right)\mathrm{skew}(\delta\boldsymbol{\theta}_B)\, {}^{B}p_{C} - R\!\left(q^{B}_{C}\right)\delta p_C + \delta p_B$$

where $\zeta_i$ represents the residual term caused by rotational noise and translational noise between the $i$-th frame and the $(i+1)$-th frame. A loss function can be constructed from these residual terms, so that the external reference calibration problem is converted into a nonlinear optimization problem, and the final quantity to be estimated is obtained by minimizing the residual terms.
Optionally, in order to estimate the translation transformation relationship ${}^{B}p_{C}$ between the camera coordinate system and the body coordinate system more accurately, the residual terms of all frame pairs are integrated and optimized jointly so that the overall residual reaches its minimum, thereby obtaining the optimal estimate of the translation transformation relationship.
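Once the rotation extrinsic has been estimated, formula (12) is linear in the unknown translation, so the residuals $\zeta_i$ can be minimised with an ordinary linear least-squares solve over all frame pairs. The sketch below uses assumed names and numpy's `lstsq`; it is one standard way to do this, not necessarily the patent's solver. Note that for purely planar motion the component of the translation along the rotation axis is unobservable, which is why the check uses general 3-D rotations.

```python
import numpy as np

# Stack (R_b_i - I) t_x = R_x p_c_i - p_b_i over all frame pairs and solve
# for the body-frame camera position t_x in the least-squares sense.
def solve_translation(R_x, body_rots, body_trans, cam_trans):
    A = np.vstack([Rb - np.eye(3) for Rb in body_rots])
    b = np.concatenate([R_x @ pc - pb
                        for pb, pc in zip(body_trans, cam_trans)])
    t_x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t_x
```

Stacking all frame pairs before solving is exactly the "integrate all inter-frame residual terms" step: the least-squares solution minimises the overall residual rather than fitting any single frame pair.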
In the above embodiment, the noise influence in the actual calibration process is considered, the residual error term caused by the noise is extracted according to the rotation transformation equation and the translation transformation equation, and a more accurate external reference calibration result can be obtained by minimizing the residual error term.
In a possible implementation, after the external reference calibration result is obtained, the method may further include the following steps: judging the obtained external reference calibration result; and, when the external reference calibration result does not meet the external reference calibration success condition, re-calibrating the external reference of the camera until the obtained result meets the success condition or the number of external reference calibration attempts reaches a first preset number.
The external reference calibration success condition is the condition for judging whether the external reference calibration of the camera succeeded. If the external reference calibration result meets it, the external reference calibration of the camera succeeded, and the successfully calibrated camera external reference can be applied in subsequent tasks (such as localization and map construction) that need to fuse data collected by the camera and the odometer on the mobile robot.
Illustratively, after the camera is installed on the mobile robot, the translation and rotation of the camera relative to the body coordinate system can be measured physically. The translation error is obtained by comparing the translation in the external reference calibration result with the measured translation, and the rotation error by comparing the rotation in the calibration result with the measured rotation. If at least one of the translation error and the rotation error is greater than its corresponding error threshold, the external reference calibration of the camera is judged unsuccessful; if both errors are less than or equal to their corresponding thresholds, the calibration is judged successful.
The first preset number is the maximum number of times the external reference calibration of the camera may be repeated. If the external reference calibration result does not meet the success condition, the current calibration has failed; as long as the number of calibration attempts has not reached the first preset number, the external reference calibration is performed again, until either the calibration succeeds or the number of attempts reaches the first preset number, at which point the external reference calibration ends.
In the above embodiment, an external reference calibration result determination mechanism and an external reference calibration retry mechanism are added, after the external reference calibration result is obtained, the obtained external reference calibration result is automatically determined, and when the external reference calibration result indicates that the external reference calibration of the camera fails, the external reference calibration of the camera is automatically performed again without manual intervention, which is beneficial to improving the success rate of the external reference calibration of the camera.
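The judge-and-retry mechanism described above can be sketched as a simple loop. The threshold values, the scalar error model, and the `calibrate_extrinsics()` callback below are all hypothetical placeholders, not the patent's interface.

```python
# Retry loop for extrinsic calibration: compare each result against
# physically measured reference values and retry on failure.
MAX_ATTEMPTS = 3            # the "first preset number" of attempts (assumed)
TRANS_ERR_THRESHOLD = 0.05  # metres (assumed)
ROT_ERR_THRESHOLD = 0.1     # radians (assumed)

def calibrate_with_retry(calibrate_extrinsics, measured_trans, measured_rot):
    """Run calibration up to MAX_ATTEMPTS times; return (result, attempts)."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        trans, rot = calibrate_extrinsics()
        trans_err = abs(trans - measured_trans)
        rot_err = abs(rot - measured_rot)
        if trans_err <= TRANS_ERR_THRESHOLD and rot_err <= ROT_ERR_THRESHOLD:
            return (trans, rot), attempt   # success condition met
    return None, MAX_ATTEMPTS              # give up after the preset number
```

No manual intervention is needed: a failed judgment simply triggers the next attempt until the success condition or the attempt limit is reached.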
In a possible implementation, before performing the camera external reference calibration, the method may further include determining the internal parameters of the camera, which specifically comprises the following steps: acquiring calibration plate image data collected by the camera while the mobile robot moves according to the second trajectory; performing camera internal reference calibration based on the calibration plate image data to obtain an internal reference calibration result; and determining the internal reference of the camera according to the obtained internal reference calibration result.
The second trajectory is a motion trajectory required by the mobile robot when the camera on the mobile robot is internally calibrated. Illustratively, the second trajectory is a random winding trajectory. Optionally, in the process of randomly winding the mobile robot on the ground, the calibration plate also rotates randomly in a small range, so that the calibration plate image data acquired by the camera can contain richer pose change information, and the accuracy of camera internal reference calibration is improved.
The camera internal reference calibration can be performed by using an existing camera internal reference calibration algorithm, for example, an internal reference calibration algorithm based on a pinhole camera can be selected, or an internal reference calibration algorithm suitable for a wide-angle camera can be adopted to perform the camera internal reference calibration.
In the above embodiment, before the camera external reference calibration is performed, the camera internal reference calibration process is added, and the camera internal reference required in the subsequent camera external reference calibration is determined by using the internal reference calibration result, which is helpful for improving the reliability of the camera internal reference.
In a possible implementation manner, determining the internal reference of the camera according to the obtained internal reference calibration result may specifically include the following steps: judging the obtained internal reference calibration result; determining the internal reference calibration result as the internal reference of the camera under the condition that the internal reference calibration result meets the internal reference calibration success condition; and under the condition that the internal reference calibration result does not meet the internal reference calibration success condition, re-calibrating the internal reference of the camera until the obtained internal reference calibration result meets the internal reference calibration success condition or the number of times of calibrating the internal reference of the camera reaches a second preset number.
The internal reference calibration success condition is the condition for judging whether the internal reference calibration of the camera succeeded; if the internal reference calibration result meets it, the calibration succeeded, and the successfully calibrated camera internal reference is applied in the subsequent external reference calibration. The internal reference calibration result may be judged with an existing method; for example, whether the calibration succeeded can be determined automatically by checking whether the reprojection error of the internal reference calibration is greater than a threshold: if the reprojection error is greater than the threshold, the calibration is judged unsuccessful, and if it is less than or equal to the threshold, the calibration is judged successful.
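The reprojection-error judgment can be sketched as follows. The RMS error definition is a common choice and the threshold value is assumed, not taken from the patent; `detected` and `reprojected` are Nx2 arrays of pixel coordinates.

```python
import numpy as np

# Judge intrinsic calibration by RMS reprojection error against a threshold.
REPROJ_ERR_THRESHOLD = 0.5  # pixels (assumed)

def rms_reprojection_error(detected, reprojected):
    """Root-mean-square pixel distance between detected corners and the
    corners reprojected with the calibrated intrinsics."""
    diffs = np.asarray(detected, dtype=float) - np.asarray(reprojected, dtype=float)
    return float(np.sqrt((diffs ** 2).sum(axis=1).mean()))

def intrinsics_ok(detected, reprojected):
    return rms_reprojection_error(detected, reprojected) <= REPROJ_ERR_THRESHOLD
```

A sub-pixel RMS error is typically taken as a sign of a usable intrinsic calibration, which is why the assumed threshold here is below one pixel.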
The second preset number is the maximum number of times the internal reference calibration of the camera may be repeated. If the internal reference calibration result does not meet the success condition, the current internal reference calibration has failed; as long as the number of calibration attempts has not reached the second preset number, the internal reference calibration is performed again, until either the calibration succeeds or the number of attempts reaches the second preset number, at which point the internal reference calibration ends.
In this embodiment, an internal reference calibration result judgment mechanism and an internal reference calibration retry mechanism are added: after the internal reference calibration result is obtained, it is judged automatically, and if it indicates that the camera internal reference calibration has failed, the calibration is performed again automatically without manual intervention, which helps improve the success rate of the camera internal reference calibration.
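The threshold-based success check described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the concrete pixel threshold are assumptions.

```python
# Sketch of the internal reference calibration success condition: the
# calibration is judged successful when the mean reprojection error does
# not exceed a threshold (expressed here in pixels; the value 0.5 is an
# assumed example, not given in the text).
def intrinsic_calibration_succeeded(mean_reprojection_error, threshold_px=0.5):
    # Greater than the threshold -> calibration failed; otherwise succeeded.
    return mean_reprojection_error <= threshold_px
```

A routine such as OpenCV's `cv2.calibrateCamera`, which returns an overall reprojection error, could supply the input to such a check.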
In one possible implementation, the entire automated camera calibration process is implemented based on a behavior tree. Referring to fig. 4, fig. 4 is a schematic diagram of a camera internal and external reference calibration behavior tree according to an embodiment of the present application. The root node (Root) of the behavior tree has two sub-branches: the left branch performs automatic internal reference calibration, and the right branch performs automatic external reference calibration. Sequential triggering and execution of the two branches is implemented through a reactive sequence node (Reactive Sequence), which ensures that the right external reference calibration branch is triggered and executed only after the left internal reference calibration branch has executed successfully.
Specifically, as shown in fig. 4, the left branch implements the automatic camera internal reference calibration process. In the left branch, the node driving the mobile robot to rotate randomly and the node acquiring calibration board image data are bound together by a parallel node (Parallel), so that the mobile robot acquires calibration board image information while it moves. After the parallel node runs successfully, the reactive sequence node triggers automatic camera internal reference calibration and automatic judgment of the internal reference calibration result.
The right branch implements the automatic camera external reference calibration process. In the right branch, the successfully calibrated camera internal reference is loaded first; then the node driving the mobile robot to move along the S-shaped trajectory and the node acquiring calibration board image data and odometer data are bound together by a parallel node, so that the mobile robot acquires calibration board image information and odometer information simultaneously while it moves. After the parallel node runs successfully, the reactive sequence node triggers automatic camera external reference calibration and automatic judgment of the external reference calibration result.
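The node semantics used above can be sketched as follows. This is a minimal illustration of the reactive sequence and parallel behavior, not any particular behavior-tree library's implementation; the class names and the `Status` enum are assumptions.

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2

class ReactiveSequence:
    """Ticks children in order; a later child runs only after every
    earlier child has succeeded, mirroring how the external reference
    branch is triggered only after internal calibration succeeds."""
    def __init__(self, children):
        self.children = children  # list of callables returning Status

    def tick(self):
        for child in self.children:
            if child() != Status.SUCCESS:
                return Status.FAILURE
        return Status.SUCCESS

class Parallel:
    """Ticks all children together and succeeds only if every child
    succeeds, mirroring 'drive the robot' bound with 'record data'."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        results = [child() for child in self.children]
        ok = all(r == Status.SUCCESS for r in results)
        return Status.SUCCESS if ok else Status.FAILURE
```

With these pieces, the left branch of fig. 4 would be a `Parallel` (move + record) followed in a `ReactiveSequence` by calibration and result judgment.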
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating a camera internal and external reference calibration method according to an embodiment of the present application, including the following steps S501 to S506.
S501, performing camera internal reference calibration;
S502, judging whether the camera internal reference calibration is successfully executed; if not, entering step S503, and if so, entering step S504;
S503, judging whether the number of executions of the camera internal reference calibration has reached the second preset number; if not, returning to step S501, and if so, ending the calibration;
S504, loading the successfully calibrated camera internal reference, and executing camera external reference calibration;
S505, judging whether the camera external reference calibration is successfully executed; if not, entering step S506, and if so, ending the calibration;
S506, judging whether the number of executions of the camera external reference calibration has reached the first preset number; if not, returning to step S504, and if so, ending the calibration.
For the detailed description of step S501 to step S506, reference may be made to the foregoing embodiments, which are not repeated herein.
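The two-stage retry flow of steps S501 to S506 can be sketched as a small driver loop. This is an illustration only; the two `calibrate_*` callables are assumptions standing in for the calibration routines of the embodiments, each returning a `(success, parameters)` pair.

```python
def run_calibration_pipeline(calibrate_intrinsics, calibrate_extrinsics,
                             max_intrinsic_attempts, max_extrinsic_attempts):
    """Sketch of S501-S506: retry internal reference calibration up to
    the second preset number, then external reference calibration up to
    the first preset number. Returns (intrinsics, extrinsics), with None
    marking the stage that failed."""
    intrinsics = None
    for _ in range(max_intrinsic_attempts):      # S501-S503
        ok, intrinsics = calibrate_intrinsics()
        if ok:
            break
    else:
        return None, None                        # internal calibration failed
    for _ in range(max_extrinsic_attempts):      # S504-S506
        ok, extrinsics = calibrate_extrinsics(intrinsics)
        if ok:
            return intrinsics, extrinsics
    return intrinsics, None                      # external calibration failed
```

The external stage reuses the loaded internal reference on every retry, matching the return to step S504 rather than S501.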
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating a camera internal and external reference calibration method according to an embodiment of the present application, including the following steps S601 to S608.
S601, controlling the mobile robot to move according to a second track, and simultaneously storing the image data of the calibration board acquired by the camera in the moving process of the mobile robot;
S602, after the mobile robot stops moving, loading the stored calibration board image data, and performing camera internal reference calibration;
S603, judging whether the camera internal reference calibration is successful; if not, ending the calibration, and if so, entering step S604;
S604, storing the successfully calibrated camera internal reference;
S605, controlling the mobile robot to move according to the first track, and simultaneously storing the calibration board image information acquired by the camera and the odometer information acquired by the odometer during the movement of the mobile robot;
S606, after the mobile robot stops moving, loading the stored calibration board image information, the odometer information and the camera internal reference, and performing camera external reference calibration;
S607, judging whether the camera external reference calibration is successful; if not, ending the calibration, and if so, entering step S608;
S608, storing the successfully calibrated camera external reference.
For the detailed description of step S601 to step S608, reference may be made to the foregoing embodiments, which are not repeated herein.
In the above embodiments, by constructing a fully automatic internal and external reference calibration behavior tree, the mobile robot can be triggered to move along a specific trajectory while data are collected, and automatic calibration of the camera internal and external reference and automatic judgment of the calibration results are performed without manual intervention. This reduces labor cost, makes the method suitable for automatic calibration on a mobile robot production line, and provides strong support for mass-production ramp-up.
In an example, a calibration board image captured by a top-view camera of the mobile robot is shown in fig. 7, the internal reference calibration result is shown in fig. 8, the external reference calibration result is shown in fig. 9, and the actually measured distance between the optical center of the camera and the body of the mobile robot is shown in fig. 10. In fig. 9, rc (ypr) represents the relative rotation angle between the camera coordinate system and the body coordinate system, comprising a yaw angle (Yaw), a pitch angle (Pitch) and a roll angle (Roll), and trc represents the relative displacement between the camera coordinate system and the body coordinate system, comprising displacements along the X-axis and Y-axis directions.
In the rotation angle result, the rotations are extrinsic rotations about Z, Y and X in that order, which is equivalent to intrinsic rotations about X, Y and Z, and the negative sign indicates rotation opposite to the counterclockwise positive direction of the coordinate system. A yaw of -1.537 rad corresponds to rotating the camera coordinate system (X right, Y front, Z up) by about 88.06 degrees about the Z axis (pointing up), after which it aligns with the body coordinate system (X front, Y left, Z up). Pitch and Roll are essentially zero because the camera is nearly parallel to the XY plane of the body coordinate system. In the displacement result, 0.081 m is the relative coordinate of the camera on the X axis of the body coordinate system, which is consistent with the actual measurement shown in fig. 10.
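The numerical claim above can be checked directly: -1.537 rad is about -88.06 degrees, close to a quarter turn about Z. The sketch below uses a standard right-handed rotation matrix and is only a sanity check of the conversion, not part of the calibration method.

```python
import math
import numpy as np

yaw = -1.537
yaw_deg = math.degrees(yaw)  # about -88.06 degrees

def rot_z(theta):
    """Counterclockwise rotation about Z (right-handed convention)."""
    c, s = math.cos(theta), math.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Rotating the camera's X axis by roughly a quarter turn about Z brings
# it close to one of the body frame's horizontal axes.
x_cam = np.array([1.0, 0.0, 0.0])
x_rotated = rot_z(yaw) @ x_cam
```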
It will be understood by those of skill in the art that in the above method of the present embodiment, the order of writing the steps does not imply a strict order of execution and does not impose any limitations on the implementation, as the order of execution of the steps should be determined by their function and possibly inherent logic.
The foregoing description of the various embodiments is intended to highlight various differences between the embodiments, and the same or similar parts may be referred to each other, and for brevity, will not be described again herein.
Referring to fig. 11, fig. 11 is a schematic structural diagram of a camera calibration device according to an embodiment of the present application, where the camera calibration device 1100 includes: an acquisition unit 1101, a first determination unit 1102, a second determination unit 1103, and an external parameter calibration unit 1104, wherein:
an obtaining unit 1101, configured to obtain, during a process in which the mobile robot moves according to the first trajectory, calibration plate image information collected by a camera fixed to the mobile robot and odometer information collected by an odometer fixed to the mobile robot;
a first determining unit 1102, configured to determine first pose transformation data of the camera in a motion process according to internal parameters of the camera and image information of the calibration board;
a second determining unit 1103, configured to determine, according to the odometer information, second posture transformation data of the mobile robot during the movement;
and an external reference calibration unit 1104, configured to perform external reference calibration on the camera based on the first position and orientation transformation data and the second position and orientation transformation data to obtain an external reference calibration result, where the external reference calibration result is used to represent a transformation relationship between a camera coordinate system corresponding to the camera and a body coordinate system corresponding to the mobile robot.
In a possible implementation manner, the first determining unit 1102 is specifically configured to: obtaining at least one group of calibration plate images according to the calibration plate image information, wherein each group of calibration plate images comprise a first frame of calibration plate image and a second frame of calibration plate image which are adjacent; obtaining a first pose of the camera relative to the calibration plate according to the internal reference of the camera and the coordinates of each calibration point in the first frame of calibration plate image in a world coordinate system and a pixel coordinate system; obtaining a second pose of the camera relative to the calibration plate according to the internal reference of the camera and the coordinates of each calibration point in the second frame of calibration plate image in the world coordinate system and the pixel coordinate system; determining a pose transformation vector of the camera between each two adjacent frames based on the first pose and the second pose, the first pose transformation data comprising the pose transformation vector of the camera between each two adjacent frames.
In a possible implementation, the second determining unit 1103 is specifically configured to: obtaining at least one group of odometry data according to the odometry information, wherein each group of odometry data comprises adjacent first frame odometry data and second frame odometry data; obtaining a third posture of the mobile robot relative to a world coordinate system according to the first frame of odometer data; obtaining a fourth pose of the mobile robot relative to a world coordinate system according to the second frame of odometer data; and determining a pose transformation vector of the mobile robot between every two adjacent frames based on the third pose and the fourth pose, wherein the second pose transformation data comprises the pose transformation vector of the mobile robot between every two adjacent frames.
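Both determining units form an inter-frame pose transformation from two per-frame poses. A common way to write this, assumed here as an illustration with poses expressed as 4x4 homogeneous transforms relative to a shared reference frame (the calibration board or world frame), is:

```python
import numpy as np

def pose_delta(T_first, T_second):
    """Pose transformation between two adjacent frames: given the pose
    of each frame relative to a common reference frame, the motion from
    the first frame to the second is T_rel = inv(T_first) @ T_second."""
    return np.linalg.inv(T_first) @ T_second
```

The same operation applies to the camera poses recovered from the calibration board images and to the robot poses read from the odometer.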
In one possible implementation, the pose transformation vector of the camera between each two adjacent frames comprises a first translation vector and a first rotation vector, wherein the first translation vector represents a position vector of the camera pointing to the first frame from the second frame, and the first rotation vector represents a rotation vector of the camera rotating from the first frame to the second frame; the pose transformation vector of the mobile robot between each two adjacent frames comprises a second translation vector and a second rotation vector, wherein the second translation vector represents a position vector of the mobile robot pointing to the first frame from the second frame, and the second rotation vector represents a rotation vector of the mobile robot rotating from the first frame to the second frame; the external reference calibration unit 1104 is specifically configured to: establishing a rotation transformation equation based on the first rotation vector, the second rotation vector and the rotation vector to be calibrated, wherein the rotation vector to be calibrated is used for representing the rotation transformation relation between a camera coordinate system and a body coordinate system; establishing a translation transformation equation based on a first translation vector, a second translation vector, a third rotation vector, a rotation vector to be calibrated and a translation vector to be calibrated, wherein the third rotation vector is the first rotation vector or the second rotation vector, and the translation vector to be calibrated is used for representing a displacement transformation relation between a camera coordinate system and a body coordinate system; and solving the rotation transformation equation and the translation transformation equation to obtain an external reference calibration result, wherein the external reference calibration result comprises a calibration result of the rotation vector to be calibrated and the translation vector to be calibrated.
In a possible implementation manner, the external reference calibration unit 1104, when establishing the rotation transformation equation based on the first rotation vector, the second rotation vector and the rotation vector to be calibrated, is specifically configured to: multiplying the first rotation vector by the rotation vector to be calibrated to obtain a first rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame; multiplying the rotation vector to be calibrated by the second rotation vector to obtain a second rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame; and establishing a rotation transformation equation based on the equivalence relation between the first rotation mode and the second rotation mode.
In one possible embodiment, the third rotation vector is the second rotation vector; when establishing the translation transformation equation based on the first translation vector, the second translation vector, the third rotation vector, the rotation vector to be calibrated and the translation vector to be calibrated, the external reference calibration unit 1104 is specifically configured to: multiply the rotation vector to be calibrated by the first translation vector and add the product to the translation vector to be calibrated, obtaining a first translation mode that translates the body coordinate system of the second frame to the camera coordinate system of the first frame; multiply the second rotation vector by the translation vector to be calibrated and add the product to the second translation vector, obtaining a second translation mode that translates the body coordinate system of the second frame to the camera coordinate system of the first frame; and establish a translation transformation equation based on the equivalence relation between the first translation mode and the second translation mode.
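The two equivalence relations above are the classical hand-eye constraints. The sketch below checks them on synthetic data with a known ground-truth extrinsic transform; all numeric values are made up for illustration, and this is not the patent's solver.

```python
import numpy as np

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Assumed ground-truth extrinsics: rotation R_x and translation t_x
# relating the camera coordinate system and the body coordinate system.
R_x = rot_z(-1.537)
t_x = np.array([0.081, 0.0, 0.0])

# A made-up body motion between two adjacent frames ...
R_body = rot_z(0.2)
t_body = np.array([0.05, 0.01, 0.0])

# ... induces the camera motion via the two equations described above:
#   rotation:    R_cam @ R_x == R_x @ R_body
#   translation: R_x @ t_cam + t_x == R_body @ t_x + t_body
R_cam = R_x @ R_body @ np.linalg.inv(R_x)
t_cam = np.linalg.inv(R_x) @ (R_body @ t_x + t_body - t_x)
```

Reading the equations the other way around, observing many (camera motion, body motion) pairs constrains the unknown R_x and t_x.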
In a possible implementation, the rotational transformation equation further includes a rotational noise factor, and the translational transformation equation further includes a rotational noise factor and a translational noise factor; when the external reference calibration unit 1104 solves the rotation transformation equation and the translation transformation equation to obtain an external reference calibration result, it is specifically configured to: obtaining a first residual error term caused by rotational noise according to a rotational transformation equation, and obtaining a calibration result of a to-be-calibrated rotational vector by minimizing the first residual error term; and obtaining a second residual error term caused by the rotation noise and the translation noise according to the translation transformation equation, and obtaining a calibration result of the translation vector to be calibrated by minimizing the second residual error term.
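Minimizing the second residual term can be illustrated with an ordinary least-squares solve of the translation equation, assuming the rotation has already been calibrated. Rearranging each motion pair i as (I - R_body_i) t_x = t_body_i - R_x t_cam_i, the stacked system is solved below; this is a simplified stand-in for the patent's optimization, not its actual implementation.

```python
import numpy as np

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def solve_translation(R_x, motions):
    """Least-squares estimate of the translation vector to be calibrated
    from a list of (R_body, t_body, t_cam) motion pairs, minimizing the
    stacked residual of (I - R_body) t_x - (t_body - R_x t_cam)."""
    A_rows, b_rows = [], []
    for R_body, t_body, t_cam in motions:
        A_rows.append(np.eye(3) - R_body)
        b_rows.append(t_body - R_x @ t_cam)
    A = np.vstack(A_rows)
    b = np.hstack(b_rows)
    t_x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t_x
```

For a planar robot, rotations about Z leave the Z component of t_x unobservable; `lstsq` then returns the minimum-norm solution with zero Z, which is why the reported displacement comprises only X and Y components.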
In a possible implementation manner, the apparatus 1100 further includes a first determination unit and a first retry unit, wherein the first determination unit is configured to determine the obtained external reference calibration result; the first retry unit is used for re-calibrating the external reference of the camera under the condition that the external reference calibration result is determined not to meet the external reference calibration success condition until the obtained external reference calibration result meets the external reference calibration success condition or the number of times of external reference calibration of the camera reaches a first preset number.
In a possible embodiment, the obtaining unit 1101 is further configured to obtain calibration board image data collected by the camera during the moving of the mobile robot according to the second trajectory; the device 1100 further comprises an internal reference calibration unit and a third determination unit, wherein the internal reference calibration unit is used for calibrating the internal reference of the camera based on the image data of the calibration plate to obtain an internal reference calibration result; the third determining unit is used for determining the internal reference of the camera according to the obtained internal reference calibration result.
In a possible embodiment, the apparatus 1100 further comprises a second determination unit and a second retry unit, wherein the second determination unit is used for determining the obtained internal reference calibration result; the third determining unit is specifically configured to determine the internal reference calibration result as the internal reference of the camera when it is determined that the internal reference calibration result meets the internal reference calibration success condition; the second retry unit is used for re-calibrating the camera internal reference under the condition that the internal reference calibration result is determined not to meet the internal reference calibration success condition, until the obtained internal reference calibration result meets the internal reference calibration success condition or the number of times of calibrating the camera internal reference reaches a second preset number of times.
For the specific definition of the camera calibration device, reference may be made to the above definition of the camera calibration method, which is not described herein again. The respective units in the above camera calibration device may be implemented in whole or in part by software, by hardware, or by a combination thereof. The units may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the units.
Referring to fig. 12, fig. 12 is a schematic diagram of a hardware structure of a camera calibration device according to an embodiment of the present application, where the schematic diagram includes:
a processor 1201, a memory 1202, and a transceiver 1203. The processor 1201, the memory 1202 and the transceiver 1203 are coupled by a bus 1204, the memory 1202 is used for storing instructions, and the processor 1201 is used for executing the instructions stored by the memory 1202 to implement the steps of the above-described method.
The processor 1201 is configured to execute the instructions stored in the memory 1202 to control the transceiver 1203 to receive and transmit signals, thereby implementing the steps of the above-described method. The memory 1202 may be integrated with the processor 1201, or may be provided separately from the processor 1201.
As one implementation manner, the function of the transceiver 1203 may be implemented by a transceiver circuit or a dedicated transceiver chip, and the processor 1201 may be implemented by a dedicated processing chip, a processing circuit, a processor or a general-purpose chip.
As another implementation manner, the camera calibration apparatus provided in the embodiments of the present application may be implemented by a general-purpose computer: program codes realizing the functions of the processor 1201 and the transceiver 1203 are stored in the memory 1202, and a general-purpose processor implements those functions by executing the codes in the memory 1202.
For the concept, explanation, detailed description and other steps related to the technical solution provided in the embodiments of the present application, please refer to the description of the method or the method steps executed by the apparatus in other embodiments, which is not described herein again.
The embodiment of the present application further provides a camera calibration system, including: the system comprises a mobile robot, a calibration plate, a bracket and a processor; a camera and a milemeter are fixedly arranged on the mobile robot; the calibration plate is arranged on the bracket and is positioned in the shooting range of the camera; a processor is communicatively coupled to the camera and the odometer for performing the method as in the above-described method embodiments.
An embodiment of the present application further provides an electronic device, including: a processor and a memory for storing computer program code comprising computer instructions, the electronic device performing the method as in the above-described method embodiments, in case the processor executes the computer instructions.
An embodiment of the present application further provides an electronic device, including: a processor, transmitting means, input means, output means and a memory for storing computer program code comprising computer instructions, the electronic device performing the method as in the above-described method embodiments, in case the processor executes the computer instructions.
The embodiments of the present application also provide a computer-readable storage medium, in which a computer program is stored, where the computer program includes program instructions, and in a case where the program instructions are executed by a processor, the processor is caused to execute the method in the above method embodiments.
The embodiments of the present application also provide a computer program product, which includes a computer program or instructions, and in the case that the computer program or instructions runs on a computer, the computer is caused to execute the method in the above method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent application shall be subject to the appended claims.

Claims (14)

1. A camera calibration method is characterized by comprising the following steps:
in the process that the mobile robot moves according to the first track, acquiring calibration plate image information acquired by a camera fixed on the mobile robot and odometer information acquired by an odometer fixed on the mobile robot;
determining first pose transformation data of the camera in the motion process according to the internal reference of the camera and the image information of the calibration plate;
determining second position and posture transformation data of the mobile robot in the motion process according to the odometer information;
and performing camera external reference calibration based on the first position and posture transformation data and the second position and posture transformation data to obtain an external reference calibration result, wherein the external reference calibration result is used for representing a transformation relation between a camera coordinate system corresponding to the camera and a body coordinate system corresponding to the mobile robot.
2. The method of claim 1, wherein determining first pose transformation data of the camera during motion according to the internal parameters of the camera and the calibration board image information comprises:
obtaining at least one group of calibration plate images according to the calibration plate image information, wherein each group of calibration plate images comprise a first frame of calibration plate image and a second frame of calibration plate image which are adjacent;
obtaining a first pose of the camera relative to the calibration plate according to the internal parameters of the camera and the coordinates of each calibration point in the first frame of calibration plate image in a world coordinate system and a pixel coordinate system;
obtaining a second pose of the camera relative to the calibration plate according to the internal reference of the camera and the coordinates of each calibration point in the second frame of calibration plate image in the world coordinate system and the pixel coordinate system;
determining a pose transformation vector of the camera between each two adjacent frames based on the first pose and the second pose, the first pose transformation data comprising the pose transformation vector of the camera between each two adjacent frames.
3. The method of claim 2, wherein determining second pose transformation data for the mobile robot during motion based on the odometry information comprises:
obtaining at least one group of odometry data according to the odometry information, wherein each group of odometry data comprises adjacent first frame odometry data and second frame odometry data;
obtaining a third posture of the mobile robot relative to the world coordinate system according to the first frame of odometer data;
obtaining a fourth pose of the mobile robot relative to the world coordinate system according to the second frame of odometer data;
determining a pose transformation vector of the mobile robot between each two adjacent frames based on the third pose and the fourth pose, the second pose transformation data comprising the pose transformation vector of the mobile robot between each two adjacent frames.
4. The method according to claim 3, wherein the pose transformation vector of the camera between each two adjacent frames comprises a first translation vector and a first rotation vector, wherein the first translation vector represents a position vector of the camera pointing from the second frame to the first frame, and the first rotation vector represents a rotation vector of the camera rotating from the first frame to the second frame;
the pose transformation vector of the mobile robot between each two adjacent frames comprises a second translation vector and a second rotation vector, wherein the second translation vector represents a position vector of the mobile robot pointing to the first frame from the second frame, and the second rotation vector represents a rotation vector of the mobile robot rotating from the first frame to the second frame;
the camera external parameter calibration is performed based on the first position and orientation transformation data and the second position and orientation transformation data, and an external parameter calibration result is obtained, which includes:
establishing a rotation transformation equation based on the first rotation vector, the second rotation vector and a rotation vector to be calibrated, wherein the rotation vector to be calibrated is used for representing a rotation transformation relation between the camera coordinate system and the machine body coordinate system;
establishing a translation transformation equation based on the first translation vector, the second translation vector, a third rotation vector, the to-be-calibrated rotation vector and the to-be-calibrated translation vector, wherein the third rotation vector is the first rotation vector or the second rotation vector, and the to-be-calibrated translation vector is used for representing a displacement transformation relation between the camera coordinate system and the body coordinate system;
and solving the rotation transformation equation and the translation transformation equation to obtain an external reference calibration result, wherein the external reference calibration result comprises a calibration result of the rotation vector to be calibrated and the translation vector to be calibrated.
5. The method of claim 4, wherein establishing a rotational transformation equation based on the first rotation vector, the second rotation vector, and the rotation vector to be calibrated comprises:
multiplying the first rotation vector by the rotation vector to be calibrated to obtain a first rotation mode for rotating the camera coordinate system of the first frame to the body coordinate system of the second frame;
multiplying the rotation vector to be calibrated with the second rotation vector to obtain a second rotation mode of rotating the camera coordinate system of the first frame to the body coordinate system of the second frame;
and establishing a rotation transformation equation based on the equivalence relation between the first rotation mode and the second rotation mode.
6. The method of claim 4 or 5, wherein the third rotation vector is the second rotation vector, and the establishing a translation transformation equation based on the first translation vector, the second translation vector, the third rotation vector, the rotation vector to be calibrated, and the translation vector to be calibrated comprises:
multiplying the rotation vector to be calibrated by the first translation vector and adding the translation vector to be calibrated to the product, to obtain a first translation manner of translating from the body coordinate system of the second frame to the camera coordinate system of the first frame;
multiplying the second rotation vector by the translation vector to be calibrated and adding the second translation vector to the product, to obtain a second translation manner of translating from the body coordinate system of the second frame to the camera coordinate system of the first frame;
and establishing the translation transformation equation based on the equivalence between the first translation manner and the second translation manner.
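In the same spirit, the equivalence recited in claim 6 reads R_X · t_A + t_X = R_B · t_X + t_B, which rearranges to the linear system (I − R_B) · t_X = t_B − R_X · t_A per motion pair. A minimal least-squares sketch (the function name and stacking are illustrative assumptions):

```python
import numpy as np

def solve_translation(R_X, t_cam_list, R_body_list, t_body_list):
    """Least-squares t_X from (I - R_B_i) @ t_X = t_B_i - R_X @ t_A_i,
    stacked over all motion pairs."""
    A = np.vstack([np.eye(3) - R_B for R_B in R_body_list])
    b = np.concatenate([t_B - R_X @ t_A
                        for t_A, t_B in zip(t_cam_list, t_body_list)])
    t_X, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t_X
```

Note that for a ground robot whose motions rotate only about the vertical axis, (I − R_B) is rank-deficient along that axis, so the vertical component of t_X is unobservable from planar motion alone; the sketch assumes rotations about varied axes.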
7. The method according to any one of claims 4 to 6, wherein the rotation transformation equation further comprises a rotational noise factor, and the translation transformation equation further comprises the rotational noise factor and a translational noise factor; and the solving the rotation transformation equation and the translation transformation equation to obtain the extrinsic parameter calibration result comprises:
obtaining, from the rotation transformation equation, a first residual term caused by the rotational noise, and obtaining a calibration result of the rotation vector to be calibrated by minimizing the first residual term;
and obtaining, from the translation transformation equation, a second residual term caused by the rotational noise and the translational noise, and obtaining a calibration result of the translation vector to be calibrated by minimizing the second residual term.
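One plausible reading of the residual terms in claim 7 is the mismatch between the two equivalent rotation manners and the two equivalent translation manners; the claim does not fix the noise model or the norm, so the Frobenius/Euclidean choice below is an assumption:

```python
import numpy as np

def rotation_residual(R_X, R_cam_list, R_body_list):
    """First residual term: mismatch of R_A @ R_X vs R_X @ R_B."""
    return sum(np.linalg.norm(R_A @ R_X - R_X @ R_B)
               for R_A, R_B in zip(R_cam_list, R_body_list))

def translation_residual(R_X, t_X, t_cam_list, R_body_list, t_body_list):
    """Second residual term: mismatch of the two translation manners."""
    return sum(np.linalg.norm(R_X @ t_A + t_X - (R_B @ t_X + t_B))
               for t_A, R_B, t_B in zip(t_cam_list, R_body_list, t_body_list))
```

Minimizing the first residual over R_X, and then the second over t_X with R_X fixed, matches the two-stage solve of the closed-form sketches above; a joint nonlinear refinement is equally possible but not implied by the claim.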
8. The method according to any one of claims 1 to 7, further comprising, after the obtaining an extrinsic parameter calibration result:
evaluating the obtained extrinsic parameter calibration result;
and in a case that the extrinsic parameter calibration result does not meet an extrinsic calibration success condition, re-calibrating the extrinsic parameters of the camera until an obtained extrinsic parameter calibration result meets the extrinsic calibration success condition or the number of camera extrinsic calibrations reaches a first preset number.
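Claims 8 and 10 share the same retry pattern: re-run calibration until the result passes a success check or an attempt limit is hit. A generic sketch, with the calibration routine and success condition left as placeholder callables (they are assumptions, not interfaces defined by the patent):

```python
def calibrate_with_retries(run_calibration, is_successful, max_attempts):
    """Re-run calibration until the result meets the success condition
    or the attempt count reaches the preset limit.
    Returns (last_result, succeeded)."""
    result = None
    for _ in range(max_attempts):
        result = run_calibration()
        if is_successful(result):
            return result, True
    return result, False
```

For the extrinsic case of claim 8, `is_successful` might threshold the residual terms of claim 7; for the intrinsic case of claim 10, it might threshold the reprojection error, but neither criterion is specified in the claims.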
9. The method according to any one of claims 1 to 8, further comprising, before the obtaining calibration board image information collected by a camera fixed on the mobile robot and odometer information collected by an odometer fixed on the mobile robot while the mobile robot moves along the extrinsic calibration trajectory:
acquiring calibration board image data collected by the camera while the mobile robot moves along a second trajectory;
performing camera intrinsic parameter calibration based on the calibration board image data to obtain an intrinsic parameter calibration result;
and determining the intrinsic parameters of the camera according to the obtained intrinsic parameter calibration result.
10. The method according to claim 9, wherein the determining the intrinsic parameters of the camera according to the obtained intrinsic parameter calibration result comprises:
evaluating the obtained intrinsic parameter calibration result;
in a case that the intrinsic parameter calibration result meets an intrinsic calibration success condition, determining the intrinsic parameter calibration result as the intrinsic parameters of the camera;
and in a case that the intrinsic parameter calibration result does not meet the intrinsic calibration success condition, re-calibrating the intrinsic parameters of the camera until an obtained intrinsic parameter calibration result meets the intrinsic calibration success condition or the number of camera intrinsic calibrations reaches a second preset number.
11. A camera calibration apparatus, characterized in that the apparatus comprises:
an acquisition unit, configured to acquire calibration board image information collected by a camera fixed on a mobile robot and odometer information collected by an odometer fixed on the mobile robot while the mobile robot moves along a first trajectory;
a first determining unit, configured to determine first pose transformation data of the camera during the movement according to intrinsic parameters of the camera and the calibration board image information;
a second determining unit, configured to determine second pose transformation data of the mobile robot during the movement according to the odometer information;
and an extrinsic calibration unit, configured to perform camera extrinsic parameter calibration based on the first pose transformation data and the second pose transformation data to obtain an extrinsic parameter calibration result, wherein the extrinsic parameter calibration result represents a transformation relationship between a camera coordinate system corresponding to the camera and a body coordinate system corresponding to the mobile robot.
12. A camera calibration system, characterized in that the system comprises: a mobile robot, a calibration board, a bracket, and a processor; a camera and an odometer are fixedly mounted on the mobile robot; the calibration board is arranged on the bracket and located within the shooting range of the camera; and the processor is communicatively connected to the camera and the odometer, and is configured to perform the method according to any one of claims 1 to 10.
13. An electronic device, comprising at least one processor and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the at least one processor implements the method according to any one of claims 1 to 10 by executing the instructions stored in the memory.
14. A computer-readable storage medium, in which at least one instruction or at least one program is stored, the at least one instruction or at least one program being loaded and executed by a processor to implement the method according to any one of claims 1 to 10.
CN202210535889.8A 2022-05-17 2022-05-17 Camera calibration method, device, system, electronic equipment and storage medium Withdrawn CN114913242A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210535889.8A CN114913242A (en) 2022-05-17 2022-05-17 Camera calibration method, device, system, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210535889.8A CN114913242A (en) 2022-05-17 2022-05-17 Camera calibration method, device, system, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114913242A true CN114913242A (en) 2022-08-16

Family

ID=82769450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210535889.8A Withdrawn CN114913242A (en) 2022-05-17 2022-05-17 Camera calibration method, device, system, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114913242A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115847429A (en) * 2023-02-20 2023-03-28 库卡机器人(广东)有限公司 Parameter calibration method and device, mobile device and storage medium
CN115847429B (en) * 2023-02-20 2024-03-29 库卡机器人(广东)有限公司 Parameter calibration method, device, mobile device and storage medium
CN117994356A (en) * 2024-04-02 2024-05-07 菲特(天津)检测技术有限公司 Camera internal reference verification method and device assisted by robot

Similar Documents

Publication Publication Date Title
CN106767827B (en) Mobile robot point cloud map creation method based on laser data
KR20210084622A (en) Time synchronization processing methods, electronic devices and storage media
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
US20210183100A1 (en) Data processing method and apparatus
CN111795686A (en) Method for positioning and mapping mobile robot
CN111754579A (en) Method and device for determining external parameters of multi-view camera
CN114474056B (en) Monocular vision high-precision target positioning method for grabbing operation
CN111665512A (en) Range finding and mapping based on fusion of 3D lidar and inertial measurement unit
CN115272596A (en) Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene
CN111307174A (en) Calibration method of sensor, moving object and storage medium
CN113052897A (en) Positioning initialization method and related device, equipment and storage medium
CN113587934A (en) Robot, indoor positioning method and device and readable storage medium
CN110930444B (en) Point cloud matching method, medium, terminal and device based on bilateral optimization
CN114913242A (en) Camera calibration method, device, system, electronic equipment and storage medium
CN114758011B (en) Zoom camera online calibration method fusing offline calibration results
CN112284381B (en) Visual inertia real-time initialization alignment method and system
CN111998870A (en) Calibration method and device of camera inertial navigation system
CN113256728B (en) IMU equipment parameter calibration method and device, storage medium and electronic device
CN113077515B (en) Tight coupling initialization method for underwater vision inertial navigation pressure positioning
CN113252066A (en) Method and device for calibrating parameters of odometer equipment, storage medium and electronic device
CN112154480A (en) Positioning method and device of movable platform, movable platform and storage medium
CN113763481B (en) Multi-camera visual three-dimensional map construction and self-calibration method in mobile scene
CN112729349B (en) Method and device for on-line calibration of odometer, electronic equipment and storage medium
CN115294280A (en) Three-dimensional reconstruction method, apparatus, device, storage medium, and program product
CN112880675A (en) Pose smoothing method and device for visual positioning, terminal and mobile robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220816
