Disclosure of Invention
In view of the defects of the prior art, the invention aims to provide a method, an apparatus, a device, and a storage medium for self-calibration of the extrinsic parameters of a depth camera.
According to the depth camera extrinsic-parameter self-calibration method provided by the invention, the depth camera comprises a light-receiving sensor and a projector, and the method comprises the following steps:
step S1: acquiring a plurality of fringe structured-light images with different rotation angles, and calculating, from the fringe structured-light images, the coordinate value of each pixel on the light-receiving sensor and its corresponding absolute phase value;
step S2: obtaining calibration information generated by pre-calibration, and calculating the corresponding pixel coordinate value on the projector from each absolute phase value on the light-receiving sensor and the calibration information;
step S3: calculating an essential matrix from the matched pairs of pixel coordinate values on the light-receiving sensor and the projector, and decomposing the essential matrix into a rotation matrix and a translation matrix so as to determine the target extrinsic parameters of the depth camera.
Preferably, the step S1 comprises the following steps:
step S101: controlling the projector in the depth camera to project fringe structured light onto any planar object in the scene, and changing the deflection angle of the planar object in the scene for each projection of the fringe structured light;
step S102: controlling the light-receiving sensor in the depth camera to receive the structured light reflected by the planar object, so as to generate a plurality of fringe structured-light images with different rotation angles;
step S103: calculating, from the structured-light images, the coordinate value of each pixel on the light-receiving sensor and its corresponding absolute phase value.
Preferably, the step S2 comprises the following steps:
step S201: obtaining calibration information generated by pre-calibration;
step S202: converting each pixel coordinate value on the light-receiving sensor from the camera coordinate system into a pixel coordinate value in the projector coordinate system by means of the absolute phase value and the calibration information;
step S203: determining, from the pixel coordinate values in the projector coordinate system, the pixel coordinate value on the projector corresponding to each pixel coordinate value on the light-receiving sensor.
Preferably, the step S3 comprises the following steps:
step S301: undistorting the pixel coordinate values of the light-receiving sensor and the pixel coordinate values of the projector according to the camera parameters and distortion coefficients in the calibration information, so as to generate target pixel coordinate values;
step S302: calculating an essential matrix from the matched pairs of target pixel coordinate values on the light-receiving sensor and the projector;
step S303: decomposing the essential matrix into a first rotation matrix, a second rotation matrix, a first translation matrix, and a second translation matrix;
step S304: selecting a rotation matrix and a translation matrix according to a preset screening condition, and generating the target extrinsic parameters of the depth camera from the selected rotation matrix and translation matrix.
Preferably, the step S101 comprises the following steps:
step S1011: controlling the projector in the depth camera to project horizontal fringe structured light onto any planar object in any scene, forming a plurality of fringe light spots on the planar object;
step S1012: deflecting the planar object in the scene by an angle, so that the plurality of fringe light spots formed on the planar object change in angle relative to the planar object;
step S1013: deflecting the planar object to another, different angle, so that the plurality of fringe light spots change in angle relative to the planar object again.
Preferably, when the light-receiving sensor comprises a primary camera and a secondary camera, the step S2 comprises the following steps:
step S201A: obtaining calibration information generated by pre-calibration;
step S202A: converting each pixel coordinate value on the primary camera from the primary-camera coordinate system into a pixel coordinate value in the secondary-camera coordinate system by means of the absolute phase value and the calibration information;
step S203A: determining, from the pixel coordinate values in the secondary-camera coordinate system, the pixel coordinate value on the secondary camera corresponding to each pixel coordinate value on the primary camera.
Preferably, the selection of a rotation matrix and a translation matrix according to the preset screening condition is implemented as follows:
the preset point with XYZ coordinates (0, 0, 1) is transformed by each set of rotation matrix and translation matrix, and the set of rotation matrix and translation matrix for which Z is greater than 0 and X is less than 0 after the transformation is screened out.
The invention further provides a depth camera extrinsic-parameter self-calibration apparatus, which comprises the following modules:
an image acquisition module, configured to acquire a plurality of fringe structured-light images with different rotation angles, and to calculate, from the fringe structured-light images, the coordinate value of each pixel on the light-receiving sensor and its corresponding absolute phase value;
a coordinate calculation module, configured to obtain calibration information generated by pre-calibration, and to calculate the corresponding pixel coordinate value on the projector from each absolute phase value on the light-receiving sensor and the calibration information;
an extrinsic-parameter generation module, configured to calculate an essential matrix from the matched pairs of pixel coordinate values on the light-receiving sensor and the projector, and to decompose the essential matrix into a rotation matrix and a translation matrix, thereby determining the target extrinsic parameters of the depth camera.
The invention further provides a depth camera extrinsic-parameter self-calibration device, which comprises:
a processor; and
a memory storing executable instructions of the processor;
wherein the processor is configured to perform, by executing the executable instructions, the steps of the depth camera extrinsic-parameter self-calibration method.
According to the present invention, a computer-readable storage medium is provided for storing a program which, when executed, performs the steps of the depth camera extrinsic-parameter self-calibration method.
Compared with the prior art, the invention has the following beneficial effects:
the method calculates the absolute phase value corresponding to each pixel coordinate value on the light-receiving sensor from a plurality of fringe structured-light images with different rotation angles, and then calculates the corresponding pixel coordinate value on the projector from the absolute phase values. This enables the essential matrix to be computed and the target extrinsic parameters of the depth camera to be obtained quickly; that is, extrinsic-parameter self-calibration of the depth camera is achieved, so that when the camera parameters change, the depth camera can be recalibrated off-line without a calibration board.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will aid those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any manner. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the concept of the invention; all such variations and modifications fall within the scope of the invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solutions of the present invention, and how they solve the above technical problems, will be described in detail below with specific embodiments. These specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of the steps of the depth camera extrinsic-parameter self-calibration method in an embodiment of the present invention. As shown in fig. 1, in the depth camera extrinsic-parameter self-calibration method provided by the present invention, the depth camera comprises a light-receiving sensor and a projector, and the method comprises the following steps:
step S1: acquiring a plurality of fringe structured-light images with different rotation angles, and calculating, from the fringe structured-light images, the coordinate value of each pixel on the light-receiving sensor and its corresponding absolute phase value;
In the embodiment of the invention, the absolute phase map corresponding to the fringe pattern is computed from the fringe structured-light images. The pixel coordinates on the absolute phase map are taken as the pixel coordinates on the light-receiving sensor, and the pixel coordinates on the projector are then calculated from the absolute phase value corresponding to each pixel coordinate on the absolute phase map.
Fig. 2 is a flowchart of the steps for obtaining the absolute phase value corresponding to each pixel coordinate value in an embodiment of the present invention. As shown in fig. 2, the step S1 comprises the following steps:
step S101: controlling the projector in the depth camera to project fringe structured light onto any planar object in the scene, and changing the deflection angle of the planar object in the scene for each projection of the fringe structured light;
In the embodiment of the invention, the projector is a DLP light engine, and the different deflection angles of the fringe structured light are realized by controlling which pixels on the DLP light engine emit light.
Fig. 3 is a flowchart of the steps of controlling the projector to project fringe structured light at different angles in an embodiment of the present invention. As shown in fig. 3, the step S101 comprises the following steps:
step S1011: controlling the projector in the depth camera to project horizontal fringe structured light onto any planar object in any scene, forming a plurality of fringe light spots on the planar object;
step S1012: deflecting the planar object in the scene by an angle, so that the plurality of fringe light spots formed on the planar object change in angle relative to the planar object;
step S1013: deflecting the planar object to another, different angle, so that the plurality of fringe light spots change in angle relative to the planar object again.
In an embodiment of the present invention, the projector in the depth camera is controlled to project a horizontal fringe structured-light pattern onto the planar object in the scene; the light-receiving sensor receives the pattern, and the absolute phase map in the horizontal direction is calculated. The projector is then controlled to project a vertical fringe structured-light pattern onto the planar object, and after the light-receiving sensor receives it, the absolute phase map in the vertical direction is calculated. From the absolute phase maps in the horizontal and vertical directions, the pixel-coordinate correspondence between the light-receiving sensor and the projector can be established.
In an embodiment of the present invention, after the projector in the depth camera has projected a set of horizontal and vertical structured-light fringes onto the planar object, the deflection angle of the planar object is changed, so that the light-receiving sensor captures horizontal and vertical fringe images of the planar object at different deflection angles. The absolute phase maps of the planar object at the different deflection angles are then solved, and the pixel-coordinate correspondence between the light-receiving sensor and the projector is established on the surface of the planar object at each deflection angle.
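The correspondence described above follows because vertical fringes encode the projector column and horizontal fringes encode the projector row of each camera pixel. A minimal sketch of that mapping is below; the fringe pitch (period in projector pixels) is an assumed value, and the patent does not commit to this exact formula.

```python
import numpy as np

def projector_pixel(phase_v, phase_h, pitch=16.0):
    """Map absolute phases to projector pixel coordinates.

    phase_v: absolute phase of the vertical-fringe pattern (encodes
             the projector column); phase_h: absolute phase of the
             horizontal-fringe pattern (encodes the projector row).
    pitch:   fringe period in projector pixels (assumed to be 16 here).
    """
    u_p = phase_v / (2 * np.pi) * pitch  # projector column
    v_p = phase_h / (2 * np.pi) * pitch  # projector row
    return u_p, v_p

# One full fringe period of phase (2*pi) spans `pitch` projector pixels.
u, v = projector_pixel(np.array([2 * np.pi]), np.array([np.pi]), pitch=16.0)
```

Pairing each camera pixel with the (u_p, v_p) computed this way yields the matched point pairs consumed by the essential-matrix step.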
In a modification of the present invention, the step S101 comprises the following steps:
step S1011A: controlling the projector in the depth camera to project horizontal fringe structured light onto any scene;
step S1012A: controlling pixels at different positions in the projector to emit light, so that the angle of the fringe structured light relative to the horizontal direction changes and fringe structured light at another angle is projected;
step S1013A: continuing to change the angle of the fringe structured light relative to the horizontal direction until the projected fringes have been rotated from 0° to 90°.
In a modification of the present invention, the projector in the depth camera is controlled to project, in sequence, horizontal fringe structured light onto the scene, then fringe structured light at 30° to the horizontal direction, then at 45°, then at 60°, and finally at 90°.
In a modification of the present invention, the projector in the depth camera is controlled to project, in sequence, vertical fringe structured light onto the scene, then fringe structured light at 60° to the horizontal direction, then at 45°, then at 30°, and finally horizontal fringe structured light.
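The rotated patterns of these modifications can be synthesized directly on the projector: a fringe at angle θ to the horizontal modulates intensity along the direction perpendicular to the stripes. The sketch below is an illustrative assumption (the patent does not specify the pattern formula); the period and image size are made-up values.

```python
import numpy as np

def fringe_pattern(h, w, theta_deg, period=16.0, shift=0.0):
    """Fringe pattern whose stripes make angle theta_deg with the horizontal.

    theta_deg = 0 gives horizontal stripes (intensity varies along y);
    theta_deg = 90 gives vertical stripes (intensity varies along x).
    period is the fringe pitch in pixels; shift is a phase offset in radians.
    """
    y, x = np.mgrid[0:h, 0:w]
    t = np.deg2rad(theta_deg)
    # Coordinate measured across the stripes (perpendicular to them).
    across = -x * np.sin(t) + y * np.cos(t)
    return 0.5 + 0.5 * np.cos(2 * np.pi * across / period + shift)

horiz = fringe_pattern(32, 32, 0)   # constant along each row
vert = fringe_pattern(32, 32, 90)   # constant along each column
```

Sweeping theta_deg through 0°, 30°, 45°, 60°, 90° (with several phase shifts per angle) reproduces the projection sequence described in the modifications above.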
step S102: controlling the light-receiving sensor in the depth camera to receive the structured light reflected by the planar object, so as to generate a plurality of fringe structured-light images with different rotation angles;
step S103: calculating, from the structured-light images, the coordinate value of each pixel on the light-receiving sensor and its corresponding absolute phase value.
Step S2: obtaining calibration information generated by pre-calibration, and calculating the corresponding pixel coordinate value on the projector from the absolute phase values and the calibration information;
Fig. 4 is a flowchart of the steps of determining the corresponding pixel coordinate values on the light-receiving sensor and the projector in an embodiment of the present invention. As shown in fig. 4, the step S2 comprises the following steps:
step S201: obtaining calibration information generated by pre-calibration;
step S202: converting each pixel coordinate value on the light-receiving sensor from the camera coordinate system into a pixel coordinate value in the projector coordinate system by means of the absolute phase value and the calibration information;
step S203: determining, from the pixel coordinate values in the projector coordinate system, the pixel coordinate value on the projector corresponding to each pixel coordinate value on the light-receiving sensor.
In the embodiment of the invention, the calibration information is generated by calibrating the depth camera off-line with a plurality of calibration-board images.
The camera coordinate system is a three-dimensional rectangular coordinate system whose origin OA is the optical center of the lens, whose XA and YA axes are parallel to the row and column directions of the pixels, respectively, and whose ZA axis is determined by the right-hand rule.
The projector coordinate system is a three-dimensional rectangular coordinate system whose origin OB is the optical center of the projector, whose XB and YB axes are parallel to the horizontal and vertical directions of the projector chip, respectively, and whose ZB axis is the optical axis.
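The conversion between the two coordinate systems uses the rotation R and translation t from the pre-calibration: a point X_A in the camera frame maps to X_B = R·X_A + t in the projector frame. A minimal sketch with assumed example values (the R and t below are illustrative, not values from the patent):

```python
import numpy as np

def cam_to_projector(points_cam, R, t):
    """Transform 3-D points from the camera frame OA to the projector
    frame OB using the pre-calibrated extrinsics: X_B = R @ X_A + t.

    points_cam: (N, 3) array of points in camera coordinates.
    R: (3, 3) rotation matrix; t: (3,) translation vector.
    """
    return points_cam @ R.T + t

# Assumed example: projector displaced 100 mm along the camera's X axis,
# frames otherwise aligned, so a point in front of the camera just shifts.
R = np.eye(3)
t = np.array([-100.0, 0.0, 0.0])
p = cam_to_projector(np.array([[0.0, 0.0, 500.0]]), R, t)
```

Projecting the transformed point through the projector's intrinsic matrix then gives the pixel coordinate on the projector referred to in step S203.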
Fig. 5 is a flowchart of the steps of determining the corresponding pixel coordinate values on the primary camera and the secondary camera in an embodiment of the present invention. As shown in fig. 5, when the light-receiving sensor comprises a primary camera and a secondary camera, the step S2 comprises the following steps:
step S201A: obtaining calibration information generated by pre-calibration;
step S202A: converting each pixel coordinate value on the primary camera from the primary-camera coordinate system into a pixel coordinate value in the secondary-camera coordinate system by means of the absolute phase value and the calibration information;
step S203A: determining, from the pixel coordinate values in the secondary-camera coordinate system, the pixel coordinate value on the secondary camera corresponding to each pixel coordinate value on the primary camera.
In the embodiment of the invention, the calibration information is generated by calibrating the depth camera off-line with a plurality of calibration-board images.
The primary-camera coordinate system is a three-dimensional rectangular coordinate system whose origin OC is the optical center of the lens, whose XC and YC axes are parallel to the row and column directions of the pixels, respectively, and whose ZC axis is determined by the right-hand rule.
The secondary-camera coordinate system is a three-dimensional rectangular coordinate system whose origin OD is the optical center of the secondary camera's lens, whose XD and YD axes are parallel to the horizontal and vertical directions of the secondary camera's sensor, respectively, and whose ZD axis is the optical axis.
Step S3: calculating an essential matrix from the matched pairs of pixel coordinate values on the light-receiving sensor and the projector, and decomposing the essential matrix into a rotation matrix and a translation matrix so as to determine the target extrinsic parameters of the depth camera.
Fig. 6 is a flowchart of the steps of generating the target extrinsic parameters of the depth camera in an embodiment of the present invention. As shown in fig. 6, the step S3 comprises the following steps:
step S301: undistorting the pixel coordinate values of the light-receiving sensor and the pixel coordinate values of the projector according to the camera parameters and distortion coefficients in the calibration information, so as to generate target pixel coordinate values;
step S302: calculating an essential matrix from the matched pairs of target pixel coordinate values on the light-receiving sensor and the projector;
step S303: decomposing the essential matrix into a first rotation matrix, a second rotation matrix, a first translation matrix, and a second translation matrix;
step S304: selecting a rotation matrix and a translation matrix according to a preset screening condition, and generating the target extrinsic parameters of the depth camera from the selected rotation matrix and translation matrix.
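The undistortion of step S301 maps pixels to undistorted normalized coordinates before the essential matrix is estimated. A minimal NumPy sketch is below; it models only two radial terms k1, k2 and inverts the distortion by fixed-point iteration, and all intrinsic values are assumed for illustration (the patent does not specify the distortion model).

```python
import numpy as np

def undistort_pixel(uv, fx, fy, cx, cy, k1, k2, iters=20):
    """Map a pixel (u, v) to undistorted normalized coordinates.

    Only radial distortion k1, k2 is modeled; a fixed-point iteration
    inverts x_d = x * (1 + k1*r^2 + k2*r^4).
    """
    xd = (uv[0] - cx) / fx
    yd = (uv[1] - cy) / fy
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y

# Round trip: distort a known normalized point, then undistort it.
fx = fy = 1200.0; cx, cy = 640.0, 480.0; k1, k2 = -0.1, 0.02  # assumed values
x0, y0 = 0.2, -0.1
r2 = x0 * x0 + y0 * y0
s = 1 + k1 * r2 + k2 * r2 * r2
u = fx * x0 * s + cx
v = fy * y0 * s + cy
x1, y1 = undistort_pixel((u, v), fx, fy, cx, cy, k1, k2)
```

In an OpenCV-based implementation this role is played by undistortPoints(), which additionally handles tangential and higher-order radial terms.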
In the embodiment of the invention, the selection of a rotation matrix and a translation matrix according to the preset screening condition is implemented as follows:
the preset point with XYZ coordinates (0, 0, 1) is transformed by each set of rotation matrix and translation matrix, and the set of rotation matrix and translation matrix for which Z is greater than 0 and X is less than 0 after the transformation is screened out.
In the embodiment of the present invention, the essential matrix is calculated and decomposed by calling the corresponding OpenCV functions: the findEssentialMat() function is used to calculate the essential matrix, and the decomposeEssentialMat() function is used to decompose it.
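The decomposition of steps S303-S304 can be sketched without OpenCV: an essential matrix E = [t]_x R admits two rotations and a translation known only up to sign, giving four (R, t) candidates, and the embodiment's screening rule keeps the pair for which the transformed preset point has Z > 0 and X < 0. The NumPy sketch below mirrors what cv2.decomposeEssentialMat() returns, verified on a synthetic noise-free pose (the pose values are made up for the check).

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x, so skew(v) @ u == np.cross(v, u)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def decompose_essential(E):
    """Return the four (R, t) candidates encoded in an essential matrix:
    two rotations, and the translation up to sign (as in OpenCV's
    decomposeEssentialMat)."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    R1, R2, t = U @ W @ Vt, U @ W.T @ Vt, U[:, 2]
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

def screen(candidates, point=np.array([0.0, 0.0, 1.0])):
    """Screening rule of the embodiment: keep the pair for which the
    transformed preset point has Z > 0 and X < 0."""
    for R, t in candidates:
        p = R @ point + t
        if p[2] > 0 and p[0] < 0:
            return R, t
    return None

# Synthetic check: a known pose with unit-norm t whose X component is negative.
a = np.deg2rad(10)
R_true = np.array([[np.cos(a), 0, np.sin(a)],
                   [0, 1, 0],
                   [-np.sin(a), 0, np.cos(a)]])
t_true = np.array([-0.8, 0.0, 0.6])
E = skew(t_true) @ R_true
cands = decompose_essential(E)
R_sel, t_sel = screen(cands)
```

Because E is scale- and sign-ambiguous, the true pose appears among the four candidates only up to the sign of t, which is exactly why a screening condition such as the one above is needed.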
In the embodiment of the invention, with the baseline unchanged, self-calibrating from a single image yields reconstructed three-dimensional point coordinates that differ considerably from the OpenCV off-line calibration result, with distance errors of the reconstructed points of about 10-30 mm. With the baseline likewise unchanged, self-calibrating from a plurality of fringe structured-light images with different rotation angles yields extrinsic parameters that differ little from the OpenCV off-line calibration result, and the distance error of the final three-dimensional reconstruction is about 1-2 mm.
Fig. 7 is a schematic structural diagram of an article-picking robot in an embodiment of the present invention. As shown in fig. 7, the article-picking robot provided by the present invention further comprises:
a first unit and a second unit, configured to store and/or transport materials;
a depth camera 300, whose visual scanning area at least covers the first unit storing or transporting the materials, configured to visually scan the materials, acquire a depth image of the materials, and generate pose information and a storage position of the materials from the depth image;
a robot unit 100, communicatively connected with the depth camera 300, configured to receive the pose information and the storage position, determine the placement state of the target object from the pose and the storage position, and pick the target object according to the placement state.
In an embodiment of the present invention, the first unit may be configured as a storage unit 200;
the storage unit 200 is used for storing materials placed out of order, the materials being the target objects, for example any articles such as metal products and boxes;
the robot unit 100, communicatively connected with the depth camera 300, is configured to receive the pose information and the storage position, determine the placement state of the target object from the pose and the storage position, pick the target object according to the placement state, and transfer the target object to the second unit.
The second unit may be arranged to transport or store the sorted materials, for example a support rack arranged to facilitate the orderly arrangement of the articles;
the second unit may further comprise a transport unit, so that the robot unit 100 can move the target object from the support rack to the transport unit.
The depth camera 300 is disposed on a camera support.
The robot unit 100 comprises a processor which, when performing the steps of the depth reconstruction method by executing executable instructions, calculates the absolute phase value corresponding to each pixel coordinate value on the light-receiving sensor from a plurality of fringe structured-light images with different rotation angles, and then calculates the corresponding pixel coordinate value on the projector from the absolute phase values, so as to compute the essential matrix and thereby quickly obtain the target extrinsic parameters of the depth camera; that is, extrinsic-parameter self-calibration of the depth camera is achieved, and when the camera parameters change, the depth camera can be recalibrated off-line without a calibration board.
Fig. 8 is a schematic block diagram of the depth camera extrinsic-parameter self-calibration apparatus in an embodiment of the present invention. As shown in fig. 8, the depth camera extrinsic-parameter self-calibration apparatus 100 provided by the present invention comprises the following modules:
an image acquisition module 101, configured to acquire a plurality of fringe structured-light images with different rotation angles, and to calculate, from the fringe structured-light images, the coordinate value of each pixel on the light-receiving sensor and its corresponding absolute phase value;
a coordinate calculation module 102, configured to obtain calibration information generated by pre-calibration, and to calculate, from each absolute phase value on the light-receiving sensor and the calibration information, the corresponding pixel coordinate value on the projector;
an extrinsic-parameter generation module 103, configured to calculate an essential matrix from the matched pairs of pixel coordinate values on the light-receiving sensor and the projector, and to decompose the essential matrix into a rotation matrix and a translation matrix, thereby determining the target extrinsic parameters of the depth camera.
The embodiment of the invention further provides a depth reconstruction device, which comprises a processor and a memory, the memory storing executable instructions of the processor, wherein the processor is configured to perform the steps of the depth reconstruction method by executing the executable instructions.
As described above, in this embodiment, the absolute phase value corresponding to each pixel coordinate value on the light-receiving sensor is calculated from a plurality of fringe structured-light images with different rotation angles, and the corresponding pixel coordinate value on the projector is then calculated from the absolute phase values, so that the essential matrix can be computed and the target extrinsic parameters of the depth camera obtained quickly; that is, extrinsic-parameter self-calibration of the depth camera is achieved, and when the camera parameters change, the depth camera can be recalibrated off-line without a calibration board.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Accordingly, various aspects of the present invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "platform."
Fig. 9 is a schematic structural diagram of a depth reconstruction apparatus in an embodiment of the present invention. An electronic device 600 according to such an embodiment of the invention is described below with reference to fig. 9. The electronic device 600 shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
The storage unit stores program code, which can be executed by the processing unit 610, so that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention described in the depth reconstruction method section of this specification. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 6201 and/or a cache storage unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include programs/utilities 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The bus 630 may be one or more of several types of bus structures, including a memory-unit bus or memory-unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, camera, depth camera, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in FIG. 9, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the invention further provides a computer-readable storage medium for storing a program which, when executed, implements the steps of the depth reconstruction method. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the depth reconstruction method section of this specification.
As shown above, when the program stored on the computer-readable storage medium of this embodiment is executed, the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is calculated from a plurality of stripe structured light images with different rotation angles. The corresponding pixel coordinate value on the projector is then calculated from each absolute phase value, enabling computation of the essential matrix. The target external parameters of the depth camera can thus be calculated quickly; that is, external parameter self-calibration of the depth camera is achieved, and the depth camera can be recalibrated offline, without a calibration board, when the camera parameters change.
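The phase-to-coordinate step summarized above can be sketched as follows. This is an illustrative example only, not the claimed implementation: it assumes an N-step phase-shifting model I_n = A + B·cos(φ + 2πn/N) for the stripe images and a simplified linear phase-to-column mapping in place of the pre-calibrated calibration information; the function names are hypothetical.

```python
import numpy as np

def wrapped_phase(images):
    """N-step phase-shift demodulation.

    images: array of shape (N, H, W); frame n is captured with an added
    phase shift of 2*pi*n/N. Returns the wrapped phase in (-pi, pi] for
    every pixel on the light receiving sensor.
    """
    n = images.shape[0]
    shifts = 2 * np.pi * np.arange(n) / n
    # Demodulate I_n = A + B*cos(phi + shift_n): the sine/cosine-weighted
    # sums isolate B*sin(phi) and B*cos(phi) up to a common factor.
    num = np.tensordot(np.sin(shifts), images, axes=1)
    den = np.tensordot(np.cos(shifts), images, axes=1)
    return np.arctan2(-num, den)

def projector_column(abs_phase, proj_width, num_periods):
    """Map an absolute (unwrapped) phase to a projector column for
    vertical stripes, assuming the phase grows linearly from 0 to
    2*pi*num_periods across the projector width."""
    return abs_phase * proj_width / (2 * np.pi * num_periods)
```

In practice, the wrapped phase would first be unwrapped to an absolute phase, and the mapping from absolute phase to projector pixel coordinate would use the calibration information generated by pre-calibration rather than this idealized linear model.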
Fig. 10 is a schematic structural diagram of a computer-readable storage medium in an embodiment of the present invention. Referring to Fig. 10, a program product 800 for implementing the above method according to an embodiment of the present invention is described. The program product may employ a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device such as a personal computer. However, the program product of the present invention is not limited in this regard; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's computing device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In scenarios involving a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
In the embodiment of the invention, the absolute phase value corresponding to each pixel coordinate value on the light receiving sensor is calculated from a plurality of stripe structured light images with different rotation angles, and the corresponding pixel coordinate value on the projector is then calculated from each absolute phase value, enabling computation of the essential matrix. The target external parameters of the depth camera can thus be calculated quickly; that is, external parameter self-calibration of the depth camera is achieved, and the depth camera can be recalibrated offline, without a calibration board, when the camera parameters change.
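The essential-matrix step described above can be sketched as follows. This is an illustrative example only, not the claimed implementation: it estimates the essential matrix from eight or more matched, normalized sensor/projector pixel coordinates using the classical eight-point algorithm and decomposes it via SVD into the four candidate rotation/translation pairs, from which the physically valid pair would be selected by checking that reconstructed points lie in front of both devices; the function names are hypothetical.

```python
import numpy as np

def essential_from_matches(x1, x2):
    """Eight-point estimate of the essential matrix from matched
    normalized coordinates x1 (light receiving sensor) and x2
    (projector), each of shape (N, 2) with N >= 8."""
    n = x1.shape[0]
    a = np.zeros((n, 9))
    for i in range(n):
        u1, v1 = x1[i]
        u2, v2 = x2[i]
        # One row of the linear system x2^T E x1 = 0
        a[i] = [u2*u1, u2*v1, u2, v2*u1, v2*v1, v2, u1, v1, 1.0]
    _, _, vt = np.linalg.svd(a)
    e = vt[-1].reshape(3, 3)
    # Enforce the essential-matrix constraint:
    # two equal singular values and one zero singular value
    u, s, vt = np.linalg.svd(e)
    sigma = (s[0] + s[1]) / 2
    return u @ np.diag([sigma, sigma, 0.0]) @ vt

def decompose_essential(e):
    """Return the four candidate (rotation, translation) pairs
    obtained by decomposing the essential matrix via SVD."""
    u, _, vt = np.linalg.svd(e)
    if np.linalg.det(u @ vt) < 0:   # ensure proper rotations (det = +1)
        vt = -vt
    w = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    t = u[:, 2]                      # translation, recovered up to scale
    return [(u @ w @ vt, t), (u @ w @ vt, -t),
            (u @ w.T @ vt, t), (u @ w.T @ vt, -t)]
```

Note that the translation is recovered only up to scale; an absolute scale would have to come from the pre-calibration information or the known stripe geometry.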
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another. The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.