CN114972544B - Method, device and equipment for self-calibration of external parameters of depth camera and storage medium - Google Patents

Method, device and equipment for self-calibration of external parameters of depth camera and storage medium

Info

Publication number
CN114972544B
CN114972544B (application CN202210894348.4A)
Authority
CN
China
Prior art keywords
depth camera
projector
light
calibration
stripe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210894348.4A
Other languages
Chinese (zh)
Other versions
CN114972544A (en)
Inventor
黄煜
杨光
苏公喆
周佳骥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Star Ape Philosophy Technology Shanghai Co ltd
Xingyuanzhe Technology Shenzhen Co ltd
Original Assignee
Star Ape Philosophy Technology Shanghai Co ltd
Xingyuanzhe Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Star Ape Philosophy Technology Shanghai Co ltd, Xingyuanzhe Technology Shenzhen Co ltd
Priority to CN202210894348.4A
Publication of CN114972544A
Application granted
Publication of CN114972544B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a method, device, equipment and storage medium for external parameter self-calibration of a depth camera, where the depth camera comprises a light receiving sensor and a projector. The method comprises the following steps: acquiring a plurality of stripe structured-light images at different rotation angles, and computing from them the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value; acquiring calibration information generated by pre-calibration, and computing the corresponding pixel coordinate value on the projector from the absolute phase values and the calibration information; and computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, then decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera. The method rapidly computes the target external parameters of the depth camera, i.e., realizes external parameter self-calibration of the depth camera, so that when the camera parameters change the depth camera can be recalibrated without a calibration board or offline calibration.

Description

Method, device and equipment for self-calibration of external parameters of depth camera and storage medium
Technical Field
The invention relates to depth cameras, and in particular to a method, device, equipment and storage medium for depth camera external parameter self-calibration.
Background
In image measurement and machine vision applications, in order to determine the relationship between the three-dimensional geometric position of a point on the surface of an object in space and its corresponding point in an image, a geometric model of camera imaging must be established; the parameters of this geometric model are the camera parameters. In most cases these parameters must be obtained through experiments and calculation, a process known as camera calibration. Calibration of camera parameters is a critical step in image measurement or machine vision applications: the accuracy of the calibration result and the stability of the algorithm directly affect the accuracy of the results the camera produces.
Every 3D camera has internal (intrinsic) and external (extrinsic) parameters, which are key to obtaining high-quality, dense depth measurement data. Although a 3D camera is rigidly mounted and calibrated offline with a calibration board, its parameters tend to drift over time due to temperature changes, vibration, or accidental impacts to the system.
Therefore, the ability of a 3D camera to perform online self-calibration becomes a key factor for its application in various fields.
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a method, device, equipment and storage medium for depth camera external parameter self-calibration.
In the depth camera external parameter self-calibration method provided by the invention, the depth camera comprises a light receiving sensor and a projector, and the method comprises the following steps:
step S1: acquiring a plurality of stripe structured-light images at different rotation angles, and computing from them the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value;
step S2: acquiring calibration information generated by pre-calibration, and computing the corresponding pixel coordinate value on the projector from each absolute phase value on the light receiving sensor and the calibration information;
step S3: computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, and decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera.
Preferably, the step S1 includes the steps of:
step S101: controlling the projector in the depth camera to project stripe structured light onto any planar object in the scene, with the deflection angle of the planar object in the scene correspondingly changed for each projection of the stripe structured light;
step S102: controlling the light receiving sensor in the depth camera to receive the structured light reflected by the planar object, so as to generate a plurality of stripe structured-light images at different rotation angles;
step S103: computing, from the structured-light images, the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value.
Preferably, the step S2 includes the steps of:
step S201: acquiring calibration information generated by pre-calibration;
step S202: converting, using the absolute phase values and the calibration information, each pixel coordinate value on the light receiving sensor from the camera coordinate system into a pixel coordinate value in the projector coordinate system;
step S203: determining, from the pixel coordinate values in the projector coordinate system, the pixel coordinate value on the projector corresponding to each pixel coordinate value on the light receiving sensor.
Preferably, the step S3 includes the steps of:
step S301: removing distortion from the pixel coordinate values of the light receiving sensor and the pixel coordinate values of the projector according to the camera parameters and distortion coefficients in the calibration information, so as to generate target pixel coordinate values;
step S302: computing an essential matrix from the matched target pixel coordinate value pairs of the light receiving sensor and the projector;
step S303: decomposing the essential matrix into a first rotation matrix, a second rotation matrix, a first translation matrix and a second translation matrix;
step S304: selecting a rotation matrix and a translation matrix according to a preset screening condition, and generating the target camera external parameters of the depth camera from the selected rotation matrix and translation matrix.
Preferably, the step S101 includes the steps of:
step S1011: controlling the projector in the depth camera to project horizontal stripe structured light onto any planar object in any scene, forming a plurality of stripe light spots on the planar object;
step S1012: controlling the planar object in the scene to deflect by an angle, so that the plurality of stripe light spots formed on the planar object change in angle relative to the planar object;
step S1013: controlling the planar object to deflect to another, different angle, so that the plurality of stripe light spots change in angle relative to the planar object again.
Preferably, when the light receiving sensor includes a main camera and an auxiliary camera, the step S2 includes the steps of:
step S201A: acquiring calibration information generated by pre-calibration;
step S202A: converting, using the absolute phase values and the calibration information, each pixel coordinate value on the main camera from the main camera coordinate system into a pixel coordinate value in the auxiliary camera coordinate system;
step S203A: determining, from the pixel coordinate values in the auxiliary camera coordinate system, the pixel coordinate value on the auxiliary camera corresponding to each pixel coordinate value on the main camera.
Preferably, selecting a rotation matrix and a translation matrix according to the preset screening condition is implemented as follows:
after the preset XYZ coordinate (0, 0, 1) is transformed by a candidate set of rotation matrix and translation matrix, if Z is greater than 0 and X is less than 0, that set of rotation matrix and translation matrix is selected.
The invention further provides a depth camera external parameter self-calibration device, which comprises the following modules:
an image acquisition module, used for acquiring a plurality of stripe structured-light images at different rotation angles and computing from them the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value;
a coordinate calculation module, used for acquiring calibration information generated by pre-calibration and computing the corresponding pixel coordinate value on the projector from each absolute phase value on the light receiving sensor and the calibration information;
an external parameter generation module, used for computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, and decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera.
The invention further provides depth camera external parameter self-calibration equipment, which comprises:
a processor;
a memory module having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the depth camera external parameter self-calibration method via execution of the executable instructions.
According to the present invention, a computer readable storage medium is provided for storing a program which, when executed, performs the steps of the depth camera external parameter self-calibration method.
Compared with the prior art, the invention has the following beneficial effects:
according to the method, the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is obtained through calculation of a plurality of stripe structure light images with different rotation angles, and then the pixel coordinate value corresponding to the projector is obtained through calculation according to the absolute phase value, so that calculation of an essential matrix is achieved, and rapid calculation of target external parameters of the depth camera is achieved, namely external parameter self-calibration of the depth camera is achieved, and offline calibration of the depth camera can be achieved without a calibration board when camera parameters are changed.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort. Other features, objects and advantages of the present invention will become more apparent upon reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings.
Fig. 1 is a flowchart illustrating steps of a method for self-calibration of external parameters of a depth camera according to an embodiment of the present invention.
Fig. 2 is a flowchart of the steps for obtaining the absolute phase value corresponding to each pixel coordinate value according to an embodiment of the present invention.
Fig. 3 is a flowchart of the steps of controlling the projector to project stripe structured light at different angles according to an embodiment of the present invention.
Fig. 4 is a flowchart of the steps of determining the corresponding pixel coordinate values on the light receiving sensor and the projector according to an embodiment of the present invention.
Fig. 5 is a flowchart of the steps of determining the corresponding pixel coordinate values on the main camera and the auxiliary camera according to an embodiment of the present invention.
Fig. 6 is a flowchart of the steps of generating the target camera external parameters of the depth camera according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of an article picking robot applying the depth camera external parameter self-calibration method according to an embodiment of the present invention.
Fig. 8 is a schematic block diagram of the depth camera external parameter self-calibration apparatus according to an embodiment of the present invention.
Fig. 9 is a schematic structural diagram of an electronic device in an embodiment of the present invention.
Fig. 10 is a schematic structural diagram of a computer-readable storage medium in an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will aid those skilled in the art in further understanding the present invention, but are not intended to limit it in any manner. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the concept of the invention, all of which fall within the scope of the invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solutions of the present invention and how they solve the above technical problems are described in detail below with specific embodiments. The following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of the steps of the depth camera external parameter self-calibration method in an embodiment of the present invention. As shown in fig. 1, in the depth camera external parameter self-calibration method provided by the present invention, the depth camera includes a light receiving sensor and a projector, and the method includes the following steps:
step S1: acquiring a plurality of stripe structured-light images at different rotation angles, and computing from them the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value;
in the embodiment of the invention, the absolute phase diagram corresponding to the fringe pattern is analyzed according to the fringe structure light image. And taking the pixel coordinates on the absolute phase image as the pixel coordinates on a light receiving sensor. And calculating the pixel coordinate on the projector according to the absolute phase value corresponding to each pixel coordinate on the absolute phase diagram.
Fig. 2 is a flowchart of the steps for obtaining the absolute phase value corresponding to each pixel coordinate value in an embodiment of the present invention. As shown in fig. 2, the step S1 includes the following steps:
step S101: controlling the projector in the depth camera to project stripe structured light onto any planar object in the scene, with the deflection angle of the planar object in the scene correspondingly changed for each projection of the stripe structured light;
In the embodiment of the invention, the projector is a DLP light engine, and the different deflection angles of the stripe structured light are produced by controlling which pixels on the DLP device emit light.
Fig. 3 is a flowchart of the steps of controlling the projector to project stripe structured light at different angles in an embodiment of the present invention. As shown in fig. 3, the step S101 includes the following steps:
step S1011: controlling the projector in the depth camera to project horizontal stripe structured light onto any planar object in any scene, forming a plurality of stripe light spots on the planar object;
step S1012: controlling the planar object in the scene to deflect by an angle, so that the plurality of stripe light spots formed on the planar object change in angle relative to the planar object;
step S1013: controlling the planar object to deflect to another, different angle, so that the plurality of stripe light spots change in angle relative to the planar object again.
In an embodiment of the present invention, the projector in the depth camera is controlled to project a horizontal stripe structured-light pattern onto the planar object in the scene, and the absolute phase map in the horizontal direction is computed after the light receiving sensor receives it. The projector is then controlled to project a vertical stripe structured-light pattern onto the planar object, and the absolute phase map in the vertical direction is computed after the light receiving sensor receives it. From the horizontal and vertical absolute phase maps, the pixel coordinate correspondence between the light receiving sensor and the projector can be found.
In an embodiment of the present invention, after the projector in the depth camera has projected a set of horizontal and vertical structured-light stripes onto the planar object in the scene, the deflection angle of the planar object is changed, so that the light receiving sensor receives horizontal and vertical stripe structured-light images of the planar object at different deflection angles. The absolute phase maps of the planar object at the different deflection angles are thereby solved, and the pixel coordinate correspondence between the light receiving sensor and the projector on the surface of the planar object is found for each deflection angle.
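To make this correspondence concrete: if the fringe period of the projected patterns on the projector chip is known, each pair of absolute phase values maps linearly to a projector pixel. This is a minimal sketch under that assumption; `period_v_px`, `period_h_px` and the linear phase-to-pixel model are our illustration, not taken from the patent:

```python
import numpy as np

def projector_pixel(phi_vert, phi_horiz, period_v_px, period_h_px):
    """Projector pixel coordinates from the two absolute phase maps.

    phi_vert:    absolute phase of the vertical-fringe pattern (encodes the
                 projector column) at each camera pixel.
    phi_horiz:   absolute phase of the horizontal-fringe pattern (encodes the
                 projector row) at each camera pixel.
    period_*_px: fringe period on the projector chip, in projector pixels.
    """
    u_p = phi_vert / (2.0 * np.pi) * period_v_px    # projector column
    v_p = phi_horiz / (2.0 * np.pi) * period_h_px   # projector row
    return u_p, v_p
```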
In a modification of the present invention, the step S101 includes the steps of:
step S1011A: controlling the projector in the depth camera to project horizontal stripe structured light onto any scene;
step S1012A: controlling pixels at different positions in the projector to emit light, so that the angle of the stripe structured light relative to the horizontal direction changes and stripe structured light at another angle is projected;
step S1013A: continuing to control the change of the angle of the stripe structured light relative to the horizontal direction until the angle of the projected stripe structured light relative to the horizontal direction has rotated from 0° to 90°.
In a modification of the present invention, the projector in the depth camera is controlled to project stripe structured light onto the scene in sequence: first horizontal stripe structured light, then stripe structured light at 30° to the horizontal direction, then at 45°, then at 60°, and finally at 90° (vertical).
In another modification of the present invention, the order is reversed: the projector in the depth camera first projects vertical stripe structured light onto the scene, then stripe structured light at 60° to the horizontal direction, then at 45°, then at 30°, and finally horizontal stripe structured light.
step S102: controlling the light receiving sensor in the depth camera to receive the structured light reflected by the planar object, so as to generate a plurality of stripe structured-light images at different rotation angles;
step S103: computing, from the structured-light images, the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value.
step S2: acquiring calibration information generated by pre-calibration, and computing the corresponding pixel coordinate value on the projector from the absolute phase values and the calibration information;
Fig. 4 is a flowchart of the steps of determining the corresponding pixel coordinate values on the light receiving sensor and the projector in an embodiment of the present invention. As shown in fig. 4, the step S2 includes the following steps:
step S201: acquiring calibration information generated by pre-calibration;
step S202: converting, using the absolute phase values and the calibration information, each pixel coordinate value on the light receiving sensor from the camera coordinate system into a pixel coordinate value in the projector coordinate system;
step S203: determining, from the pixel coordinate values in the projector coordinate system, the pixel coordinate value on the projector corresponding to each pixel coordinate value on the light receiving sensor.
In the embodiment of the invention, the calibration information is generated by calibrating the depth camera offline using a plurality of calibration board images.
The camera coordinate system is a three-dimensional rectangular coordinate system with the optical center of the lens as origin OA, the directions parallel to the pixel rows and columns as the XA and YA axis directions respectively, and the ZA axis direction determined by the right-hand rule.
The projector coordinate system is a three-dimensional rectangular coordinate system with the optical center of the projector as origin OB, the directions parallel to the horizontal and vertical directions of the projector chip as the XB and YB axis directions respectively, and the optical axis as the ZB axis direction.
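For reference, the standard pinhole relation converting a pixel coordinate $(u, v)$ in either device into a normalized image coordinate (the form used for the distortion removal in step S301 below) is

$$x = \frac{u - c_x}{f_x}, \qquad y = \frac{v - c_y}{f_y},$$

where $(f_x, f_y)$ is the focal length and $(c_x, c_y)$ the principal point from the calibration information.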
Fig. 5 is a flowchart of the steps of determining the corresponding pixel coordinate values on the main camera and the auxiliary camera in an embodiment of the present invention. As shown in fig. 5, when the light receiving sensor includes a main camera and an auxiliary camera, the step S2 includes the following steps:
step S201A: acquiring calibration information generated by pre-calibration;
step S202A: converting, using the absolute phase values and the calibration information, each pixel coordinate value on the main camera from the main camera coordinate system into a pixel coordinate value in the auxiliary camera coordinate system;
step S203A: determining, from the pixel coordinate values in the auxiliary camera coordinate system, the pixel coordinate value on the auxiliary camera corresponding to each pixel coordinate value on the main camera.
In the embodiment of the invention, the calibration information is likewise generated by calibrating the depth camera offline using a plurality of calibration board images.
The main camera coordinate system is a three-dimensional rectangular coordinate system with the optical center of the main camera lens as origin OC, the directions parallel to the pixel rows and columns as the XC and YC axis directions respectively, and the ZC axis direction determined by the right-hand rule.
The auxiliary camera coordinate system is a three-dimensional rectangular coordinate system with the optical center of the auxiliary camera lens as origin OD, the directions parallel to the pixel rows and columns of the auxiliary camera as the XD and YD axis directions respectively, and the optical axis as the ZD axis direction.
step S3: computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, and decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera.
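Step S3 rests on standard epipolar geometry rather than anything specific to this patent: for normalized matched points $\mathbf{x}_c$ on the light receiving sensor and $\mathbf{x}_p$ on the projector, the essential matrix $E$ satisfies

$$\mathbf{x}_p^\top E\,\mathbf{x}_c = 0, \qquad E = [\mathbf{t}]_\times R,$$

so decomposing $E$ recovers the rotation $R$ and the translation $\mathbf{t}$ (up to scale) between the two devices, which together form the external parameters.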
Fig. 6 is a flowchart of the steps of generating the target camera external parameters of the depth camera in an embodiment of the present invention. As shown in fig. 6, the step S3 includes the following steps:
step S301: removing distortion from the pixel coordinate values of the light receiving sensor and the pixel coordinate values of the projector according to the camera parameters and distortion coefficients in the calibration information, so as to generate target pixel coordinate values;
step S302: computing an essential matrix from the matched target pixel coordinate value pairs of the light receiving sensor and the projector;
step S303: decomposing the essential matrix into a first rotation matrix, a second rotation matrix, a first translation matrix and a second translation matrix;
step S304: selecting a rotation matrix and a translation matrix according to a preset screening condition, and generating the target camera external parameters of the depth camera from the selected rotation matrix and translation matrix.
In the embodiment of the invention, selecting a rotation matrix and a translation matrix according to the preset screening condition is implemented as follows:
after the preset XYZ coordinate (0, 0, 1) is transformed by a candidate set of rotation matrix and translation matrix, if Z is greater than 0 and X is less than 0, that set of rotation matrix and translation matrix is selected.
In the embodiment of the present invention, the essential matrix is calculated and decomposed by calling the corresponding OpenCV functions. For example, the findEssentialMat() function is used to calculate the essential matrix, and the decomposeEssentialMat() function is used to decompose it.
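A minimal sketch of this step using the OpenCV calls named above; the variable names, RANSAC settings, and input point arrays are our assumptions, not code from the patent:

```python
import cv2
import numpy as np

# pts_cam / pts_proj: matched pixel coordinates on the light receiving sensor
# and the projector, shape (N, 1, 2), float32. K_cam, K_proj, dist_cam and
# dist_proj are the intrinsics and distortion coefficients from pre-calibration.
norm_cam = cv2.undistortPoints(pts_cam, K_cam, dist_cam)
norm_proj = cv2.undistortPoints(pts_proj, K_proj, dist_proj)

# Essential matrix from the undistorted, normalized correspondences
# (identity camera matrix, since the points are already normalized).
E, mask = cv2.findEssentialMat(norm_cam, norm_proj, np.eye(3),
                               method=cv2.RANSAC, threshold=1e-3)

# Two candidate rotations and one translation direction (sign ambiguous).
R1, R2, t = cv2.decomposeEssentialMat(E)

# Screening condition from the text: transform the preset point (0, 0, 1)
# with each candidate (R, t) and keep the pair for which Z > 0 and X < 0.
p0 = np.array([0.0, 0.0, 1.0])
candidates = [(R1, t.ravel()), (R1, -t.ravel()),
              (R2, t.ravel()), (R2, -t.ravel())]
R_sel, t_sel = next((R, tv) for R, tv in candidates
                    if (R @ p0 + tv)[2] > 0 and (R @ p0 + tv)[0] < 0)
```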
In the embodiment of the invention, with the baseline unchanged, calibrating from a single image and reconstructing from that calibration result yields three-dimensional point coordinates that differ considerably from the OpenCV offline calibration result, with distance errors of about 10-30 mm for the reconstructed points. With the baseline unchanged, self-calibrating from a plurality of stripe structured-light images at different rotation angles yields external parameters that differ little from the OpenCV offline calibration result, and the distance error of the final three-dimensional point reconstruction is about 1-2 mm.
Fig. 7 is a schematic structural diagram of an article picking robot in an embodiment of the present invention. As shown in fig. 7, the article picking robot provided by the present invention includes:
a first unit and a second unit, used for storing and/or transporting materials;
the depth camera 300, whose visual scanning area at least covers the first unit storing or transporting the materials, used for visually scanning the materials, acquiring a depth image of the materials, and generating pose information and a storage position of the materials from the depth image;
and the robot unit 100, communicatively connected to the depth camera 300, used for receiving the pose information and the storage position, determining the placement state of the target object from the pose and the storage position, and picking the target object according to the placement state.
In an embodiment of the present invention, the first unit may be configured as a storage unit 200;
the storage unit 200 is used for storing materials which are placed out of order, wherein the materials are the target objects, such as any articles like metal products and boxes;
and the robot unit 100 is in communication connection with the depth camera 300, and is configured to receive the position and posture information and the storage position, determine a placement state of the target object according to the position and the storage position, pick the target object according to the placement state, and transfer the target object to a second unit.
The second unit may be arranged to transport or store the sorted material, such as a support rack arranged to facilitate the orderly arrangement of the items,
the second unit may further include a transportation unit, so that the robot unit 100 can move the target object on the support frame to the transportation unit.
The depth camera 300 is disposed on a camera support.
The robot unit 100 includes a processor configured, via execution of executable instructions, to perform the steps of the depth camera external parameter self-calibration method: the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is computed from a plurality of stripe structured-light images at different rotation angles, the corresponding pixel coordinate value on the projector is then computed from the absolute phase value, and the essential matrix is computed from these, thereby rapidly computing the target external parameters of the depth camera, i.e., realizing external parameter self-calibration of the depth camera, so that when the camera parameters change the depth camera can be recalibrated without a calibration board or offline calibration.
Fig. 8 is a schematic block diagram of the depth camera external parameter self-calibration apparatus in an embodiment of the present invention. As shown in fig. 8, the depth camera external parameter self-calibration apparatus 100 provided by the present invention includes the following modules:
the image acquisition module 101, used for acquiring a plurality of stripe structured-light images at different rotation angles and computing from them the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value;
the coordinate calculation module 102, used for acquiring calibration information generated by pre-calibration and computing the corresponding pixel coordinate value on the projector from each absolute phase value on the light receiving sensor and the calibration information;
and the external parameter generation module 103, used for computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, and decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera.
The embodiment of the invention also provides depth camera external parameter self-calibration equipment, which includes a processor and a memory storing executable instructions of the processor, wherein the processor is configured to perform the steps of the depth camera external parameter self-calibration method via execution of the executable instructions.
As described above, in this embodiment the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is computed from a plurality of stripe structured-light images at different rotation angles, and the corresponding pixel coordinate value on the projector is then computed from the absolute phase value so that the essential matrix can be computed, thereby rapidly computing the target external parameters of the depth camera, i.e., realizing external parameter self-calibration of the depth camera, so that when the camera parameters change the depth camera can be recalibrated without a calibration board or offline calibration.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Accordingly, various aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module" or "platform."
Fig. 9 is a schematic structural diagram of an electronic device in an embodiment of the present invention. An electronic device 600 according to such an embodiment of the invention is described below with reference to fig. 9. The electronic device 600 shown in fig. 9 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
The storage unit stores program code which can be executed by the processing unit 610, so that the processing unit 610 performs the steps according to the various exemplary embodiments of the present invention described in the method sections above. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 6201 and/or a cache storage unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include programs/utilities 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 can be any bus representing one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, camera, depth camera, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in FIG. 9, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the invention also provides a computer readable storage medium for storing a program which, when executed, implements the steps of the depth camera external parameter self-calibration method. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code which, when run on a terminal device, causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the method sections above.
As shown above, when the program of the computer-readable storage medium of this embodiment is executed, the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is computed from a plurality of stripe structured-light images at different rotation angles, and the corresponding pixel coordinate value on the projector is then computed from the absolute phase value so that the essential matrix can be computed, thereby rapidly computing the target external parameters of the depth camera, i.e., realizing external parameter self-calibration of the depth camera, so that when the camera parameters change the depth camera can be recalibrated without a calibration board or offline calibration.
Fig. 10 is a schematic structural diagram of a computer-readable storage medium in an embodiment of the present invention. Referring to fig. 10, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including but not limited to electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium other than a readable storage medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In situations involving remote computing devices, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
In the embodiment of the invention, the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is computed from a plurality of stripe structured-light images at different rotation angles, and the corresponding pixel coordinate value on the projector is then computed from the absolute phase value so that the essential matrix can be computed, thereby rapidly computing the target external parameters of the depth camera, i.e., realizing external parameter self-calibration of the depth camera, so that when the camera parameters change the depth camera can be recalibrated without a calibration board or offline calibration.
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (9)

1. A depth camera external parameter self-calibration method, characterized in that the depth camera comprises a light receiving sensor and a projector, and the method comprises the following steps:
step S1: acquiring a plurality of stripe structured-light images at different rotation angles, and computing from them the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value;
the step S1 comprises the steps of:
step S101: controlling the projector in the depth camera to project stripe structured light onto any planar object in a scene, with the deflection angle of the planar object in the scene correspondingly changed for each projection of the stripe structured light; specifically, controlling the projector in the depth camera to project horizontal stripe structured light onto any scene, controlling part of the pixels at different positions in the projector to emit light so that the angle of the stripe structured light relative to the horizontal direction changes and stripe structured light at another angle is projected, and continuing to control the change of the angle of the stripe structured light relative to the horizontal direction until the angle of the projected stripe structured light relative to the horizontal direction has changed from 0° to 90°; or controlling the projector in the depth camera to project vertical stripe structured light onto any scene, controlling part of the pixels at different positions in the projector to emit light so that the angle of the stripe structured light relative to the horizontal direction changes and stripe structured light at another angle is projected, and continuing to control the change of the angle of the stripe structured light relative to the horizontal direction until the angle of the projected stripe structured light relative to the horizontal direction has changed from 90° to 0°;
step S102: controlling the light receiving sensor in the depth camera to receive the structured light reflected by the planar object, so as to generate a plurality of stripe structured-light images at different rotation angles;
step S103: computing, from the structured-light images, the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value;
step S2: acquiring calibration information generated by pre-calibration, and computing the corresponding pixel coordinate value on the projector from each absolute phase value on the light receiving sensor and the calibration information;
step S3: computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, and decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera.
2. The depth camera external parameter self-calibration method according to claim 1, wherein the step S2 comprises the steps of:
step S201: acquiring calibration information generated by pre-calibration;
step S202: converting, using the absolute phase values and the calibration information, each pixel coordinate value on the light receiving sensor from the camera coordinate system into a pixel coordinate value in the projector coordinate system;
step S203: determining, from the pixel coordinate values in the projector coordinate system, the pixel coordinate value on the projector corresponding to each pixel coordinate value on the light receiving sensor.
3. The depth camera external parameter self-calibration method according to claim 1, wherein the step S3 comprises the steps of:
step S301: removing distortion from the pixel coordinate values of the light receiving sensor and the pixel coordinate values of the projector according to the camera parameters and distortion coefficients in the calibration information, so as to generate target pixel coordinate values;
step S302: computing an essential matrix from the matched target pixel coordinate value pairs of the light receiving sensor and the projector;
step S303: decomposing the essential matrix into a first rotation matrix, a second rotation matrix, a first translation matrix and a second translation matrix;
step S304: selecting a rotation matrix and a translation matrix according to a preset screening condition, and generating the target camera external parameters of the depth camera from the selected rotation matrix and translation matrix.
4. The depth camera external parameter self-calibration method according to claim 1, wherein the step S101 comprises the steps of:
step S1011: controlling the projector in the depth camera to project horizontal stripe structured light onto any planar object in any scene, forming a plurality of stripe light spots on the planar object;
step S1012: controlling the planar object in the scene to deflect by an angle, so that the plurality of stripe light spots formed on the planar object change in angle relative to the planar object;
step S1013: controlling the planar object to deflect to another, different angle, so that the plurality of stripe light spots change in angle relative to the planar object again.
5. The depth camera external parameter self-calibration method according to claim 2, wherein, when the light receiving sensor includes a main camera and an auxiliary camera, the step S2 comprises the steps of:
step S201A: acquiring calibration information generated by pre-calibration;
step S202A: converting, using the absolute phase values and the calibration information, each pixel coordinate value on the main camera from the main camera coordinate system into a pixel coordinate value in the auxiliary camera coordinate system;
step S203A: determining, from the pixel coordinate values in the auxiliary camera coordinate system, the pixel coordinate value on the auxiliary camera corresponding to each pixel coordinate value on the main camera.
6. The depth camera external parameter self-calibration method according to claim 3, wherein selecting a rotation matrix and a translation matrix according to the preset screening condition is implemented as follows:
after the preset XYZ coordinate (0, 0, 1) is transformed by a candidate set of rotation matrix and translation matrix, if Z is greater than 0 and X is less than 0, that set of rotation matrix and translation matrix is selected.
7. A depth camera external parameter self-calibration device, characterized by comprising the following modules:
an image acquisition module, used for controlling the projector in the depth camera to project stripe structured light onto any planar object in a scene, with the deflection angle of the planar object in the scene correspondingly changed for each projection of the stripe structured light, controlling the light receiving sensor in the depth camera to receive the structured light reflected by the planar object so as to generate a plurality of stripe structured-light images at different rotation angles, and computing, from the structured-light images, the coordinate value of each pixel on the light receiving sensor and the corresponding absolute phase value; specifically, controlling the projector in the depth camera to project horizontal stripe structured light onto any scene, controlling part of the pixels at different positions in the projector to emit light so that the angle of the stripe structured light relative to the horizontal direction changes and stripe structured light at another angle is projected, and continuing to control the change of the angle of the stripe structured light relative to the horizontal direction until the angle of the projected stripe structured light relative to the horizontal direction has changed from 0° to 90°; or controlling the projector in the depth camera to project vertical stripe structured light onto any scene, controlling part of the pixels at different positions in the projector to emit light so that the angle of the stripe structured light relative to the horizontal direction changes and stripe structured light at another angle is projected, and continuing to control the change of the angle of the stripe structured light relative to the horizontal direction until the angle of the projected stripe structured light relative to the horizontal direction has changed from 90° to 0°;
a coordinate calculation module, used for acquiring calibration information generated by pre-calibration and computing the corresponding pixel coordinate value on the projector from each absolute phase value on the light receiving sensor and the calibration information;
and an external parameter generation module, used for computing an essential matrix from the matched pixel coordinate value pairs of the light receiving sensor and the projector, and decomposing it into a rotation matrix and a translation matrix so as to determine the target external parameters of the depth camera.
8. Depth camera external parameter self-calibration equipment, characterized by comprising:
a processor;
a memory module having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the depth camera extrinsic self-calibration method of any one of claims 1 to 6 via execution of the executable instructions.
9. A computer readable storage medium storing a program which when executed performs the steps of the depth camera extrinsic self-calibration method of any one of claims 1 to 6.
CN202210894348.4A 2022-07-28 2022-07-28 Method, device and equipment for self-calibration of external parameters of depth camera and storage medium Active CN114972544B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210894348.4A CN114972544B (en) 2022-07-28 2022-07-28 Method, device and equipment for self-calibration of external parameters of depth camera and storage medium

Publications (2)

Publication Number Publication Date
CN114972544A CN114972544A (en) 2022-08-30
CN114972544B (en) 2022-10-25

Family

ID=82969409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210894348.4A Active CN114972544B (en) 2022-07-28 2022-07-28 Method, device and equipment for self-calibration of external parameters of depth camera and storage medium

Country Status (1)

Country Link
CN (1) CN114972544B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018000039A1 (en) * 2016-06-29 2018-01-04 Seeing Machines Limited Camera registration in a multi-camera system
CN111207693A (en) * 2020-01-10 2020-05-29 西安交通大学 Three-dimensional measurement method of turbine blade ceramic core based on binocular structured light
CN111242990A (en) * 2020-01-06 2020-06-05 西南电子技术研究所(中国电子科技集团公司第十研究所) 360-degree three-dimensional reconstruction optimization method based on continuous phase dense matching
CN113205593A (en) * 2021-05-17 2021-08-03 湖北工业大学 High-light-reflection surface structure light field three-dimensional reconstruction method based on point cloud self-adaptive restoration
CN113566709A (en) * 2021-08-26 2021-10-29 苏州小优智能科技有限公司 Calibration method and device of structured light measurement system and electronic equipment
CN114459384A (en) * 2022-02-28 2022-05-10 嘉兴市像景智能装备有限公司 Phase shift profilometry based on multi-angle sine stripe light field fusion

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107110637B (en) * 2014-12-22 2019-11-01 赛博光学公司 The calibration of three-dimension measuring system is updated
CN104729429B (en) * 2015-03-05 2017-06-30 深圳大学 A kind of three dimensional shape measurement system scaling method of telecentric imaging

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Milling of rocket fuel tank wall panels based on in-situ measurement; Liu Xiao et al.; Machinery Manufacturing; 2017-11-30; full text *
Structured-light-based topography measurement of specular/diffuse composite surfaces; Zhang Zonghua et al.; Infrared and Laser Engineering; 2020-03-25 (Issue 03); full text *

Also Published As

Publication number Publication date
CN114972544A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN109829947B (en) Pose determination method, tray loading method, device, medium, and electronic apparatus
KR101549103B1 (en) Detection apparatus, Detection method and manipulator
JP5210203B2 (en) High-precision stereo camera calibration based on image differences
CN111127422A (en) Image annotation method, device, system and host
US20110164114A1 (en) Three-dimensional measurement apparatus and control method therefor
US20090002637A1 (en) Image Projection System and Image Geometric Correction Device
JP2017110991A (en) Measurement system, measurement method, robot control method, robot, robot system, and picking device
CN111652113B (en) Obstacle detection method, device, equipment and storage medium
US11126844B2 (en) Control apparatus, robot system, and method of detecting object
JPH0835818A (en) Image processing apparatus and method
CN115294208A (en) Temperature compensation system for depth camera
JPWO2018168757A1 (en) Image processing apparatus, system, image processing method, article manufacturing method, program
CN114972544B (en) Method, device and equipment for self-calibration of external parameters of depth camera and storage medium
CN114945091B (en) Temperature compensation method, device and equipment of depth camera and storage medium
CN112927340A (en) Three-dimensional reconstruction acceleration method, system and equipment independent of mechanical placement
US11856340B2 (en) Position specifying method and simulation method
CN117522992A (en) External parameter self-calibration system of depth camera
JP7024405B2 (en) Information processing equipment, programs and information processing methods
KR102586407B1 (en) Pixel compensating apparatus and display system having the same
CN114897997B (en) Camera calibration method, device, equipment and storage medium
CN115965697A Projector calibration method, calibration system and device based on the Scheimpflug principle
CN111260781B (en) Method and device for generating image information and electronic equipment
JP2023069373A (en) Marker detection apparatus and robot teaching system
CN117670765A (en) Article detection method, apparatus, device and storage medium
CN115272479A (en) Camera calibration system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 518102 Room 1201M1, Hengfang Science and Technology Building, No. 4008, Xinhu Road, Yongfeng Community, Xixiang Street, Baoan District, Shenzhen City, Guangdong Province

Patentee after: Star ape philosophy Technology (Shanghai) Co.,Ltd.

Patentee after: Xingyuanzhe Technology (Shenzhen) Co.,Ltd.

Address before: 518102 Room 1201M1, Hengfang Science and Technology Building, No. 4008, Xinhu Road, Yongfeng Community, Xixiang Street, Baoan District, Shenzhen City, Guangdong Province

Patentee before: Xingyuanzhe Technology (Shenzhen) Co.,Ltd.

Patentee before: Star ape philosophy Technology (Shanghai) Co.,Ltd.