WO2020073261A1 - Camera calibration device, system and method, and device with storage function - Google Patents

Camera calibration device, system and method, and device with storage function

Info

Publication number
WO2020073261A1
WO2020073261A1 · PCT/CN2018/109737 · CN2018109737W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
coordinate system
coordinates
reference point
substrate
Prior art date
Application number
PCT/CN2018/109737
Other languages
English (en)
French (fr)
Inventor
阳光
Original Assignee
深圳配天智能技术研究院有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳配天智能技术研究院有限公司
Priority to CN201880087316.5A (CN111630851A)
Priority to PCT/CN2018/109737 (WO2020073261A1)
Publication of WO2020073261A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof

Definitions

  • the present application relates to the field of vision technology, and in particular to a camera calibration device, system, method, and device with a storage function.
  • the binocular camera is a device of growing interest that can provide stereo vision. Based on the principle of binocular parallax, it uses imaging equipment to acquire two different images of the same object from different positions, and then computes the three-dimensional position, relative to the camera, of the object captured by the binocular camera.
  • two lenses of a binocular camera have optical axes parallel to each other. The two lenses are arranged side by side in a direction perpendicular to the optical axis and have framing windows that are separated from each other, and have different viewing angles for the same imaged object.
  • the binocular stereo vision measurement method has the advantages of high efficiency, appropriate accuracy, simple system structure, low cost, etc. It is very suitable for online, non-contact product detection and quality control on the manufacturing site.
  • in practical binocular applications, a change in the relative position of the two lenses changes the external parameters between the cameras, such as the translation and rotation matrices, and recalibration cannot be performed immediately, so the matched data acquired cannot be used effectively. In most cases the binocular system therefore requires a very stable fixed bracket, which both increases cost and still cannot guarantee that changes in the relative positional relationship between the cameras are handled effectively.
  • the technical problem that this application mainly solves is to provide a camera calibration device, system, method and device with storage function, which can quickly obtain the positional relationship between cameras in each frame when the relative positional relationship between cameras changes.
  • the first technical solution adopted by the present application is to provide a camera calibration device, which includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marking layer, and the marking layer includes at least one reference point.
  • the second technical solution adopted in this application is to provide a camera calibration system. The system includes a camera calibration device, which includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marking layer, and the marking layer includes at least one reference point.
  • the third technical solution adopted by the present application is to provide a camera calibration method, which includes: photographing, by the camera, a reference point within its shooting range to obtain the coordinates of the reference point in the image coordinate system; and obtaining the coordinates of the camera in the reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras.
  • the fourth technical solution adopted by the present application is to provide a device with a storage function, the device stores program data, and the program data can be executed to implement the following method:
  • the camera photographs a reference point within its shooting range to obtain the coordinates of the reference point in the image coordinate system;
  • the coordinates of the camera in the reference coordinate system are obtained from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras.
  • the camera calibration device of the present application includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera; the substrate is provided with a marking layer that includes at least one reference point. The camera photographs a reference point within its shooting range to obtain the reference point's coordinates in the image coordinate system, and the camera's coordinates in the world coordinate system are then obtained from the reference point's coordinates in the image coordinate system and in the world coordinate system. The positional relationship between the cameras can thus be obtained quickly in every frame, further improving the accuracy of shooting.
  • FIG. 1 is a schematic structural diagram of an embodiment of a camera calibration device provided by this application.
  • FIG. 2 is a schematic structural plan view of an embodiment of a transparent glass substrate provided with a marking layer in FIG. 1;
  • FIG. 3 is a schematic flowchart of a camera calibration method provided by this application.
  • FIG. 4 is a schematic structural diagram of an embodiment of a camera calibration system provided by this application.
  • FIG. 5 is a schematic structural diagram of an embodiment of a device with a storage function provided by this application.
  • the calibration of a camera includes the calibration of internal parameters and external parameters.
  • the internal parameters mainly include the principal-point coordinates, the focal length, and the radial and tangential (lateral) distortion coefficients.
  • the external parameters mainly include the rotation matrix and the translation matrix.
  • in actual use, the positional relationship between cameras changes easily; that is, the external parameters of the cameras change while the internal parameters remain unchanged. Therefore, to maintain the accuracy of shooting, the positional relationship between the cameras must be reacquired.
  • in order to obtain the positional relationship between the cameras quickly in every frame, this application places a substrate provided with a marking layer within the shooting range of the cameras, and obtains the coordinates of each camera in the world coordinate system through the reference points on the marking layer, thereby obtaining the positional relationship between the cameras.
  • in the following, this application takes obtaining the positional relationship between two cameras as an example.
  • FIG. 1 is a schematic structural diagram of an embodiment of a camera calibration device provided by the present application.
  • the camera calibration device includes a substrate 103; a first camera 101 and a second camera 102 on one side of the substrate 103; and a processor (not shown) coupled to the first camera 101 and the second camera 102. The substrate 103 lies within the shooting ranges of both cameras, and the first camera 101 and the second camera 102 constitute a binocular camera.
  • the substrate 103 is a transparent glass substrate and, in the present application, a planar substrate.
  • a marking layer 1031 is provided on the substrate 103; the marking layer 1031 includes at least one reference point and divides the transparent glass substrate 103 into a plurality of rectangular unit grids, with the at least one reference point located at a cross point of the grids.
  • the marking layer 1031 is disposed on the surface of the transparent glass substrate 103. In other embodiments, the marking layer 1031 may also be disposed inside the transparent glass substrate, or partly on the surface and partly inside the substrate. Furthermore, the marking layer 1031 is a transflective reflective strip of the kind used in segment-display technology; that is, light incident on the strip is partly reflected and partly refracted.
  • the transparent substrate is a glass substrate.
  • the material of the transparent substrate may also be sapphire, silicon carbide, or an organic transparent body, which is not limited herein.
  • FIG. 2 is a schematic structural plan view of an embodiment of a transparent glass substrate provided with a marking layer in FIG. 1.
  • a marking layer 1031 is plated on the surface of the transparent glass substrate 103, and the marking layer 1031 divides the glass substrate 103 into a plurality of rectangular unit grids.
  • dividing the glass substrate 103 into a plurality of rectangular unit grids is only one specific embodiment of the marking layer 1031; in other embodiments, the marking layer 1031 may form other specific patterns.
  • a marking layer 1031 on the surface of the transparent glass substrate 103 divides the glass substrate 103 into a plurality of rectangular unit grids, and the physical size of each rectangular unit grid is the same.
  • the reference points Q1 and Q2 are selected at cross points of the rectangular unit grids, with the reference point Q1 located within the shooting range of the first camera 101 and the reference point Q2 located within the shooting range of the second camera 102.
  • the plane of the reference coordinate system is the plane in which the transparent glass substrate 103 lies, and the reference coordinate system is the world coordinate system.
  • since the physical size of each rectangular unit grid is known, the coordinates of the reference points Q1 and Q2 in the world coordinate system are known.
  • the reference point Q1 can be captured by the first camera 101 to obtain the coordinates of the reference point Q1 in the image coordinate system.
  • the coordinates of the reference point Q1 in the camera coordinate system of the first camera 101 can then be obtained from the coordinates of the reference point Q1 in the image coordinate system and the internal parameters of the first camera 101.
  • the optical center of the first camera 101 is selected as the origin of its camera coordinate system; from the coordinates of the reference point Q1 in the world coordinate system and in the camera coordinate system of the first camera 101, the rotation and translation relationship between the world coordinate system and the camera coordinate system of the first camera 101 can be obtained, and hence the coordinates of the first camera 101 in the world coordinate system.
  • similarly, photographing the reference point Q2 with the second camera 102 yields the coordinates of the reference point Q2 in the image coordinate system; from these coordinates and the internal parameters of the second camera 102, the coordinates of the reference point Q2 in the camera coordinate system of the second camera 102 are obtained. With the optical center of the second camera 102 as the origin of its camera coordinate system, the rotation and translation relationship between the world coordinate system and the camera coordinate system of the second camera 102 can be obtained, and hence the coordinates of the second camera 102 in the world coordinate system.
  • from the coordinates of the first camera 101 and the second camera 102 in the world coordinate system, the positional relationship between the two cameras is obtained.
  • the optical center of the first camera 101 is used as the origin to establish its camera coordinate system.
  • the coordinates of the reference point Q1 in the camera coordinate system of the first camera 101 are set to (X1, Y1, Z1), and the coordinate point (X1, Y1, Z1) is projected by light onto the point q1 (x1, y1, f1) in the image coordinate system, where f1 is the focal length (an internal parameter) of the first camera 101. The image coordinate system plane is perpendicular to the optical axis of the first camera 101 and lies at a distance f1 from its optical center, so by similar triangles: x1/f1 = X1/Z1, y1/f1 = Y1/Z1.
  • the above proportional relationship can be expressed in matrix form as: Z1 · (x1, y1, f1)^T = f1 · (X1, Y1, Z1)^T.
  • the first camera 101 photographs the reference point Q1 to obtain the coordinates (x1, y1, f1) of the reference point Q1 in the image coordinate system, from which the coordinates (X1, Y1, Z1) of the reference point Q1 in the camera coordinate system of the first camera 101 can be obtained. The coordinates of the reference point Q1 in the world coordinate system are known quantities, denoted (X1', Y1', Z1'). If the rotation matrix between the world coordinate system and the camera coordinate system of the first camera 101 is R1 (a 3×3 matrix) and the translation vector is t1, then the following relationship exists between the two coordinate systems: (X1, Y1, Z1, 1)^T = [R1 t1; O^T 1] · (X1', Y1', Z1', 1)^T = L_W1 · (X1', Y1', Z1', 1)^T.
  • here L_W1 denotes the rotation and translation relationship between the world coordinate system and the camera coordinate system of the first camera 101, O is the vector (0, 0, 0), and O^T is its transpose. That is, the rotation and translation relationship L_W1 can be obtained from the coordinates (X1', Y1', Z1') of the reference point Q1 in the world coordinate system and its coordinates (X1, Y1, Z1) in the camera coordinate system.
  • the first camera 101 is located at the origin of its camera coordinate system, i.e., its coordinates in its own camera coordinate system are known; transforming these camera coordinates by the rotation and translation relationship L_W1 then yields its coordinates in the world coordinate system.
  • the optical center of the second camera 102 is used as the origin to establish its camera coordinate system.
  • the coordinates of the reference point Q2 in the camera coordinate system of the second camera 102 are set to (X2, Y2, Z2), and the coordinate point (X2, Y2, Z2) is projected by light onto the point q2 (x2, y2, f2) in the image coordinate system, where f2 is the focal length (an internal parameter) of the second camera 102. The image coordinate system plane is perpendicular to the optical axis of the second camera 102 and lies at a distance f2 from its optical center, so by similar triangles: x2/f2 = X2/Z2, y2/f2 = Y2/Z2.
  • the above proportional relationship can be expressed in matrix form as: Z2 · (x2, y2, f2)^T = f2 · (X2, Y2, Z2)^T.
  • the reference point Q2 can be photographed by the second camera 102 to obtain the coordinates (x2, y2, f2) of the reference point Q2 in the image coordinate system, from which the coordinates (X2, Y2, Z2) of the reference point Q2 in the camera coordinate system of the second camera 102 can be obtained.
  • the coordinates of the reference point Q2 in the world coordinate system are known quantities, denoted (X2', Y2', Z2'). If the rotation matrix between the world coordinate system and the camera coordinate system of the second camera 102 is R2 (a 3×3 matrix) and the translation vector is t2, then: (X2, Y2, Z2, 1)^T = [R2 t2; O^T 1] · (X2', Y2', Z2', 1)^T = L_W2 · (X2', Y2', Z2', 1)^T.
  • here L_W2 denotes the rotation and translation relationship between the world coordinate system and the camera coordinate system of the second camera 102. That is, L_W2 can be obtained from the coordinates (X2', Y2', Z2') of the reference point Q2 in the world coordinate system and its coordinates (X2, Y2, Z2) in the camera coordinate system.
  • the second camera 102 is located at the origin of its camera coordinate system, i.e., its coordinates in its own camera coordinate system are known; transforming these coordinates by the rotation and translation relationship L_W2 then yields its coordinates in the world coordinate system.
  • the coordinates of the first camera 101 and the second camera 102 in the same world coordinate system are thus obtained respectively, from which the positional relationship between the first camera 101 and the second camera 102 can be derived, so that the external parameters of the binocular camera can be quickly recalibrated in every frame.
  • the time interval between two adjacent calibrations of the binocular camera can be set.
  • the specific setting of the time interval depends on the actual situation and is not specifically limited here.
  • the camera calibration device includes two cameras. In other embodiments, it may also include three, four, or more cameras, which is not specifically limited herein.
  • two reference points Q1 and Q2 are selected, where the reference point Q1 is within the shooting range of the first camera 101 and the reference point Q2 is within the shooting range of the second camera 102.
  • alternatively, only one reference point may be selected on the marking layer, with that reference point located within the shooting ranges of both the first camera and the second camera; or more than two reference points may be selected on the marking layer, with at least one reference point within the shooting range of the first camera and at least one within the shooting range of the second camera.
  • the processor is disposed outside the first camera 101 and the second camera 102 and is coupled to both cameras 101 and 102. In other embodiments, the processor may instead be installed inside the first camera 101 or the second camera 102; this is not specifically limited.
  • the camera calibration device of the present application includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera; the substrate is provided with a marking layer that includes at least one reference point. The camera photographs a reference point within its shooting range to obtain the reference point's coordinates in the image coordinate system, and the camera's coordinates in the world coordinate system are then obtained from the reference point's coordinates in the image coordinate system and in the world coordinate system. The positional relationship between the cameras can thus be obtained quickly in every frame, further improving the accuracy of shooting.
  • FIG. 3 is a schematic flowchart of a camera calibration method provided by the present application. The specific steps of this method will be described in detail below.
  • Step 301 The camera shoots the reference point within its shooting range to obtain the coordinates of the reference point in the image coordinate system.
  • this embodiment is described using two cameras, each photographing a reference point within its shooting range; other embodiments may use a larger number of cameras, which is not limited herein.
  • the plane of the reference coordinate system is selected as the plane in which the transparent glass substrate 103 lies, the reference coordinate system is the world coordinate system, and the marking-layer pattern on the transparent glass substrate 103 is a plurality of rectangular unit grids. Two cross points of the rectangular unit grids are selected as the two reference points: the reference point Q1 lies within the shooting range of the first camera 101, and the reference point Q2 lies within the shooting range of the second camera 102.
  • since the physical size of each rectangular unit grid is known, the coordinates of the reference points Q1 and Q2 in the world coordinate system are known, denoted (X1', Y1', Z1') and (X2', Y2', Z2') respectively.
  • the first camera 101 photographs the reference point Q1 to obtain its coordinates (x1, y1, f1) in the image coordinate system, and the second camera 102 photographs the reference point Q2 to obtain its coordinates (x2, y2, f2) in the image coordinate system; the two cameras 101 and 102 transmit the obtained image-coordinate information of the two reference points to the processor.
  • Step 302 Obtain the coordinates of the camera in the reference coordinate system through the coordinates of the reference point in the image coordinate system and the reference coordinate system to obtain the positional relationship between the cameras.
  • in step 301, the coordinates of the two reference points Q1 and Q2 in the image coordinate system were obtained. The processor calculates the coordinates of the reference point Q1 in the camera coordinate system of the first camera 101 from the coordinates of the reference point Q1 in the image coordinate system and the internal parameters of the first camera 101; from the coordinates of the reference point Q1 in the world coordinate system and in the camera coordinate system of the first camera 101, it obtains the rotation and translation relationship between the world coordinate system and the camera coordinate system of the first camera 101, and hence the coordinates of the first camera 101 in the world coordinate system. Similarly, the processor calculates the coordinates of the reference point Q2 in the camera coordinate system of the second camera 102 from the coordinates of the reference point Q2 in the image coordinate system and the internal parameters of the second camera 102, obtains the rotation and translation relationship between the world coordinate system and the camera coordinate system of the second camera 102, and hence the coordinates of the second camera 102 in the world coordinate system; the positional relationship between the first camera 101 and the second camera 102 then follows from their respective world coordinates.
  • from the coordinates (x1, y1, f1) of the reference point Q1 in the image coordinate system, its coordinates (X1, Y1, Z1) in the camera coordinate system of the first camera 101 can be obtained.
  • the coordinates of the reference point Q1 in the world coordinate system are known quantities, and the coordinates of the reference point Q1 in the world coordinate system are represented by (X1 ', Y1', Z1 ').
  • if the rotation matrix between the world coordinate system and the camera coordinate system of the first camera 101 is R1 and the translation vector is t1, then (X1, Y1, Z1, 1)^T = L_W1 · (X1', Y1', Z1', 1)^T; that is, the rotation and translation relationship L_W1 between the world coordinate system and the camera coordinate system of the first camera 101 can be obtained from the coordinates (X1', Y1', Z1') of the reference point Q1 in the world coordinate system and its coordinates (X1, Y1, Z1) in the camera coordinate system.
  • the first camera 101 is located at the origin of its camera coordinate system, i.e., its coordinates in its own camera coordinate system are known; transforming these camera coordinates by the rotation and translation relationship L_W1 then yields its coordinates in the world coordinate system.
  • the processor calculates the world coordinates of the second camera 102 by the same steps used above to obtain the world coordinates of the first camera 101; from the coordinates of the two cameras 101 and 102 in the same world coordinate system, the positional relationship between them can be obtained, so that the external parameters of the binocular camera can be quickly recalibrated in every frame.
  • the camera calibration device of the present application includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera; the substrate is provided with a marking layer that includes at least one reference point. The camera photographs a reference point within its shooting range to obtain the reference point's coordinates in the image coordinate system, and the camera's coordinates in the world coordinate system are then obtained from the reference point's coordinates in the image coordinate system and in the world coordinate system. The positional relationship between the cameras can thus be obtained quickly in every frame, further improving the accuracy of shooting.
  • FIG. 4 is a schematic structural diagram of an embodiment of a camera calibration system provided by this application.
  • the camera calibration system 40 includes a camera calibration device 401, and the camera calibration device 401 includes a camera, a substrate within a shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with an identification layer, and the identification layer includes at least one Reference point.
  • FIG. 5 is a schematic structural diagram of an embodiment of a device with a storage function provided by this application.
  • the device 50 with a storage function stores program data 501, which can be executed to implement the following method: a camera photographs a reference point within its shooting range to obtain the coordinates of the reference point in the image coordinate system; the coordinates of the camera in the reference coordinate system are obtained from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras.
  • the camera calibration device of the present application includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera; the substrate is provided with a marking layer that includes at least one reference point. The camera photographs a reference point within its shooting range to obtain the reference point's coordinates in the image coordinate system, and the camera's coordinates in the world coordinate system are then obtained from the reference point's coordinates in the image coordinate system and in the world coordinate system. The positional relationship between the cameras can thus be obtained quickly in every frame, further improving the accuracy of shooting.
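The projection relation this description relies on throughout (x1/f1 = X1/Z1, y1/f1 = Y1/Z1) can be checked with a short numerical sketch. The code below is an illustrative aid, not part of the patent: the function name, the numeric values, and the distortion-free pinhole model are our assumptions.

```python
import numpy as np

def project_to_image_plane(point_cam: np.ndarray, f: float) -> np.ndarray:
    """Project a camera-frame point (X, Y, Z) onto the image plane at
    distance f from the optical center, per x/f = X/Z and y/f = Y/Z."""
    X, Y, Z = point_cam
    return np.array([f * X / Z, f * Y / Z, f])

# A camera-frame reference point and a focal length (illustrative values).
Q_cam = np.array([0.2, -0.1, 2.0])
f = 0.05  # 50 mm, expressed in meters

q = project_to_image_plane(Q_cam, f)

# The matrix form Z * (x, y, f)^T = f * (X, Y, Z)^T holds exactly:
assert np.allclose(Q_cam[2] * q, f * Q_cam)

# Back-projection recovers the camera-frame point only up to scale;
# a known depth Z (here from the known grid geometry) fixes it.
Q_rec = q * (Q_cam[2] / f)
assert np.allclose(Q_rec, Q_cam)
```

Note that the image coordinates alone determine only a viewing ray; the known physical size of the grid on the substrate is what supplies the missing scale.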

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed are a camera calibration device, system and method, and a device with a storage function. The camera calibration device comprises a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marking layer, and the marking layer includes at least one reference point. The camera photographs a reference point located within its shooting range to obtain the coordinates of the reference point in the image coordinate system; the coordinates of the camera in the world coordinate system are then obtained from the reference point's coordinates in the image coordinate system and in the world coordinate system, so that the positional relationship between the cameras can be obtained quickly in every frame, thereby improving shooting accuracy.

Description

Camera calibration device, system and method, and device with storage function [Technical Field]
This application relates to the field of vision technology, and in particular to a camera calibration device, system and method, and a device with a storage function.
[Background Art]
A binocular camera is a device of growing interest that can provide stereo vision. Based on the principle of binocular parallax, it uses imaging equipment to acquire two different images of the same measured object from different positions, and then computes the three-dimensional position, relative to the camera, of the object captured by the binocular camera. Generally, the two lenses of a binocular camera have optical axes parallel to each other; the lenses are arranged side by side in a direction perpendicular to the optical axes, have separate viewing windows, and view the same imaged object from different angles. The binocular stereo vision measurement method has the advantages of high efficiency, suitable accuracy, simple system structure and low cost, and is well suited to online, non-contact product inspection and quality control on the manufacturing floor.
In practical binocular applications, however, if the relative position between the two lenses changes, the external parameters between the cameras, such as the translation matrix and the rotation matrix, change as well; since recalibration cannot be performed immediately when the relative positional relationship between the cameras changes, the matched data acquired cannot be used effectively. In most cases, a binocular system requires a very stable fixed bracket, which both increases cost and still cannot guarantee that changes in the relative positional relationship between the cameras are handled effectively.
It is therefore necessary to propose a camera calibration device, system and method, and a device with a storage function, to solve the above problems.
[Summary of the Invention]
The technical problem mainly solved by this application is to provide a camera calibration device, system and method, and a device with a storage function, capable of quickly obtaining the positional relationship between cameras in every frame when the relative positional relationship between the cameras changes.
To solve the above technical problem, the first technical solution adopted by this application is to provide a camera calibration device. The device includes a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marking layer, and the marking layer includes at least one reference point.
To solve the above technical problem, the second technical solution adopted by this application is to provide a camera calibration system. The system includes a camera calibration device, which includes a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marking layer, and the marking layer includes at least one reference point.
To solve the above technical problem, the third technical solution adopted by this application is to provide a camera calibration method. The method includes: photographing, by a camera, a reference point located within its shooting range to obtain the coordinates of the reference point in the image coordinate system; and obtaining the coordinates of the camera in the reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between cameras.
To solve the above technical problem, the fourth technical solution adopted by this application is to provide a device with a storage function. The device stores program data that can be executed to implement the following method: photographing, by a camera, a reference point located within its shooting range to obtain the coordinates of the reference point in the image coordinate system; and obtaining the coordinates of the camera in the reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between cameras.
The beneficial effects of this application are as follows. In contrast with the prior art, the camera calibration device of this application includes a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera; the substrate is provided with a marking layer that includes at least one reference point. The camera photographs a reference point located within its shooting range to obtain the reference point's coordinates in the image coordinate system, and the camera's coordinates in the world coordinate system are then obtained from the reference point's coordinates in the image coordinate system and in the world coordinate system, so that the positional relationship between cameras can be obtained quickly in every frame, thereby improving shooting accuracy.
[Brief Description of the Drawings]
FIG. 1 is a schematic structural diagram of an embodiment of the camera calibration device provided by this application;
FIG. 2 is a schematic top-view structural diagram of an embodiment of the transparent glass substrate provided with a marking layer in FIG. 1;
FIG. 3 is a schematic flowchart of the camera calibration method provided by this application;
FIG. 4 is a schematic structural diagram of an embodiment of the camera calibration system provided by this application;
FIG. 5 is a schematic structural diagram of an embodiment of the device with a storage function provided by this application.
[Detailed Description of the Embodiments]
The technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
Camera calibration covers internal and external parameters. The internal parameters mainly include the principal-point coordinates, the focal length, and the radial and tangential (lateral) distortion coefficients; the external parameters mainly include the rotation matrix and the translation matrix. In actual use, the positional relationship between cameras changes easily, meaning the external parameters change while the internal parameters remain unchanged, so the positional relationship between the cameras must be reacquired to maintain shooting accuracy. To obtain this relationship quickly in every frame, this application places a substrate carrying a marking layer within the shooting range of the cameras and uses reference points on the marking layer to obtain each camera's coordinates in the world coordinate system, and hence the positional relationship between the cameras. In the following, this application takes obtaining the positional relationship between two cameras as an example.
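The split between internal and external parameters described above can be made concrete with a generic pinhole-model sketch. The intrinsic matrix K, the pose values, and the function name below are illustrative assumptions, not values from the patent; distortion coefficients are omitted for brevity.

```python
import numpy as np

# Internal parameters: focal lengths fx, fy and principal point (cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# External parameters: rotation matrix R and translation vector t
# mapping world coordinates into the camera frame.
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.1, -0.05, 1.5])

def project(K, R, t, X_world):
    """Full pinhole projection: world point -> pixel coordinates."""
    X_cam = R @ X_world + t
    uvw = K @ X_cam
    return uvw[:2] / uvw[2]

p = project(K, R, t, np.array([0.3, 0.2, 2.0]))

# Moving the camera (changing R, t) moves the pixel even though K is
# fixed, which is why only the external parameters need recalibration.
p_moved = project(K, R, t + np.array([0.05, 0.0, 0.0]), np.array([0.3, 0.2, 2.0]))
assert not np.allclose(p, p_moved)
```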
Referring to FIG. 1, a schematic structural diagram of an embodiment of the camera calibration device provided by this application: the device includes a substrate 103; a first camera 101 and a second camera 102 located on one side of the substrate 103; and a processor (not shown) coupled to the first camera 101 and the second camera 102. The substrate 103 lies within the shooting ranges of the first camera 101 and the second camera 102, and the two cameras constitute a binocular camera. The substrate 103 is a transparent glass substrate and, in this application, a planar substrate. A marking layer 1031 is provided on the substrate 103; the marking layer 1031 includes at least one reference point and divides the transparent glass substrate 103 into a plurality of rectangular unit grids, with the at least one reference point located at a cross point of the grids.
Further, the marking layer 1031 is disposed on the surface of the transparent glass substrate 103; in other embodiments it may also be disposed inside the transparent glass substrate, or partly on the surface and partly inside the substrate. Furthermore, the marking layer 1031 is a transflective reflective strip of the kind used in segment-display technology; that is, light incident on the strip is partly reflected and partly refracted.
In this embodiment the transparent substrate is a glass substrate; in other embodiments its material may also be sapphire, silicon carbide or an organic transparent body, which is not limited here. In an optional embodiment, referring to FIG. 2, a schematic top view of an embodiment of the transparent glass substrate provided with a marking layer in FIG. 1: a marking layer 1031 is plated on the surface of the transparent glass substrate 103 and divides the glass substrate 103 into a plurality of rectangular unit grids. Dividing the glass substrate 103 into rectangular unit grids is only one specific embodiment; in other embodiments the marking layer 1031 may form other specific patterns.
In a specific embodiment, with reference to FIGS. 1 and 2, in order to quickly obtain the positional relationship between the cameras in every frame, the marking layer 1031 on the surface of the transparent glass substrate 103 divides the glass substrate 103 into a plurality of rectangular unit grids of identical physical size, and reference points Q1 and Q2 are selected at cross points of the grids, with the reference point Q1 within the shooting range of the first camera 101 and the reference point Q2 within the shooting range of the second camera 102. The plane of the reference coordinate system is chosen as the plane in which the transparent glass substrate 103 lies, and the reference coordinate system is the world coordinate system; since the physical size of each rectangular unit grid is known, the coordinates of the reference points Q1 and Q2 in the world coordinate system are known. Photographing the reference point Q1 with the first camera 101 yields the coordinates of Q1 in the image coordinate system, from which, together with the internal parameters of the first camera 101, the coordinates of Q1 in the camera coordinate system of the first camera 101 are obtained. Taking the optical center of the first camera 101 as the origin of its camera coordinate system, the rotation and translation relationship between the world coordinate system and the camera coordinate system of the first camera 101 is obtained from the coordinates of Q1 in the two systems, and hence the coordinates of the first camera 101 in the world coordinate system. Likewise, photographing the reference point Q2 with the second camera 102 yields the coordinates of Q2 in the image coordinate system, from which, together with the internal parameters of the second camera 102, the coordinates of Q2 in the camera coordinate system of the second camera 102 are obtained; taking the optical center of the second camera 102 as the origin of its camera coordinate system, the rotation and translation relationship between the world coordinate system and the camera coordinate system of the second camera 102 is obtained, and hence the coordinates of the second camera 102 in the world coordinate system. The positional relationship between the first camera 101 and the second camera 102 then follows from their respective coordinates in the world coordinate system.
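The embodiment above recovers each camera's rotation and translation from reference-point correspondences. A single point cannot determine a rotation, so a practical implementation would use several grid intersections; the sketch below assumes at least three non-collinear world-camera point pairs and solves for R and t with an SVD-based (Kabsch) fit. The function and variable names are ours, not the patent's.

```python
import numpy as np

def fit_rotation_translation(world_pts, cam_pts):
    """Find R, t with cam = R @ world + t from point correspondences
    (Kabsch/Procrustes via SVD). Needs >= 3 non-collinear points."""
    cw = world_pts.mean(axis=0)
    cc = cam_pts.mean(axis=0)
    H = (world_pts - cw).T @ (cam_pts - cc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cc - R @ cw
    return R, t

# Known world coordinates of grid intersections on the substrate (z = 0 plane).
world = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                  [0.0, 0.1, 0.0], [0.1, 0.1, 0.0]])

# Simulate a ground-truth pose and the camera-frame coordinates it produces.
ang = np.deg2rad(10.0)
R_true = np.array([[ np.cos(ang), 0.0, np.sin(ang)],
                   [ 0.0,         1.0, 0.0        ],
                   [-np.sin(ang), 0.0, np.cos(ang)]])
t_true = np.array([0.02, 0.01, 1.0])
cam = (R_true @ world.T).T + t_true

R_est, t_est = fit_rotation_translation(world, cam)
assert np.allclose(R_est, R_true) and np.allclose(t_est, t_true)
```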
Specifically, a camera coordinate system is established with the optical center of the first camera 101 as its origin. Let the coordinates of the reference point Q1 in the camera coordinate system of the first camera 101 be (X1, Y1, Z1); the coordinate point (X1, Y1, Z1) is projected by light onto the point q1 (x1, y1, f1) in the image coordinate system, where f1 is the focal length (an internal parameter) of the first camera 101. The image-coordinate-system plane is perpendicular to the optical axis of the first camera 101 and lies at a distance f1 from its optical center, so by similar triangles: x1/f1 = X1/Z1, y1/f1 = Y1/Z1. The above proportional relationship can be expressed in matrix form as:
Z1 · (x1, y1, f1)^T = f1 · (X1, Y1, Z1)^T
That is, photographing the reference point Q1 with the first camera 101 yields the coordinates (x1, y1, f1) of Q1 in the image coordinate system, from which the coordinates (X1, Y1, Z1) of Q1 in the camera coordinate system of the first camera 101 are obtained. The coordinates of Q1 in the world coordinate system are known quantities, denoted (X1', Y1', Z1'). Let the rotation matrix between the world coordinate system and the camera coordinate system of the first camera 101 be R1, a 3×3 matrix, and the translation vector be t1; then the following relationship holds between the world coordinate system and the camera coordinate system of the first camera 101:
(X1, Y1, Z1, 1)^T = [R1 t1; O^T 1] · (X1', Y1', Z1', 1)^T = L_W1 · (X1', Y1', Z1', 1)^T
Here L_W1 denotes the rotation and translation relationship between the world coordinate system and the camera coordinate system of the first camera 101, O = (0, 0, 0), and O^T is the transpose of (0, 0, 0). That is, the rotation and translation relationship L_W1 between the world coordinate system and the camera coordinate system of the first camera 101 can be obtained from the coordinates (X1', Y1', Z1') of the reference point Q1 in the world coordinate system and its coordinates (X1, Y1, Z1) in the camera coordinate system. The first camera 101 is located at the origin of its camera coordinate system, i.e., its coordinates in its own camera coordinate system are known; applying the rotation-translation transform associated with L_W1 to the camera coordinates of the first camera 101 then yields its coordinates in the world coordinate system.
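The homogeneous relation and the camera-position step above can be written directly in code. One subtlety the text glosses over: since L_W1 maps world coordinates to camera coordinates, the camera's world position is obtained by applying the inverse transform to the camera-frame origin, which reduces to -R1^T · t1. Everything below is an illustrative sketch with assumed values.

```python
import numpy as np

def make_L(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Homogeneous transform [[R, t], [O^T, 1]] mapping world -> camera."""
    L = np.eye(4)
    L[:3, :3] = R
    L[:3, 3] = t
    return L

def camera_center_in_world(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """The camera sits at the origin of its own frame; solving
    0 = R @ C + t for C gives its world coordinates C = -R^T @ t."""
    return -R.T @ t

# Illustrative pose of the first camera (rotation about z, then translation).
a = np.deg2rad(30.0)
R1 = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])
t1 = np.array([0.3, -0.2, 1.2])
L_W1 = make_L(R1, t1)

C1 = camera_center_in_world(R1, t1)
# Mapping the recovered center through L_W1 lands on the camera-frame origin.
assert np.allclose(L_W1 @ np.append(C1, 1.0), [0.0, 0.0, 0.0, 1.0])
```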
Similarly, a camera coordinate system is established with the optical center of the second camera 102 as its origin. Let the coordinates of the reference point Q2 in the camera coordinate system of the second camera 102 be (X2, Y2, Z2); the coordinate point (X2, Y2, Z2) is projected by light onto the point q2 (x2, y2, f2) in the image coordinate system, where f2 is the focal length (an internal parameter) of the second camera 102. The image-coordinate-system plane is perpendicular to the optical axis of the second camera 102 and lies at a distance f2 from its optical center, so by similar triangles: x2/f2 = X2/Z2, y2/f2 = Y2/Z2. The above proportional relationship can be expressed in matrix form as:
Z2 · (x2, y2, f2)^T = f2 · (X2, Y2, Z2)^T
That is, photographing the reference point Q2 with the second camera 102 yields the coordinates (x2, y2, f2) of Q2 in the image coordinate system, from which the coordinates (X2, Y2, Z2) of Q2 in the camera coordinate system of the second camera 102 are obtained. The coordinates of Q2 in the world coordinate system are known quantities, denoted (X2', Y2', Z2'). Let the rotation matrix between the world coordinate system and the camera coordinate system of the second camera 102 be R2, a 3×3 matrix, and the translation vector be t2; then the following relationship holds between the world coordinate system and the camera coordinate system of the second camera 102:
(X2, Y2, Z2, 1)^T = [R2 t2; O^T 1] · (X2', Y2', Z2', 1)^T = L_W2 · (X2', Y2', Z2', 1)^T
Here L_W2 denotes the rotation and translation relationship between the world coordinate system and the camera coordinate system of the second camera 102. That is, L_W2 can be obtained from the coordinates (X2', Y2', Z2') of the reference point Q2 in the world coordinate system and its coordinates (X2, Y2, Z2) in the camera coordinate system. The second camera 102 is located at the origin of its camera coordinate system, i.e., its coordinates in its own camera coordinate system are known; applying the rotation-translation transform associated with L_W2 to the coordinates of the second camera 102 then yields its coordinates in the world coordinate system.
The above yields the coordinates of the first camera 101 and the second camera 102 in the same world coordinate system, from which the positional relationship between the first camera 101 and the second camera 102 is obtained, so that the external parameters of the binocular camera can be quickly recalibrated in every frame.
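Once both cameras' poses relative to the same world frame are known, the binocular external parameters, i.e. the transform from the first camera's frame to the second's, follow by composition; this is what "the positional relationship between the cameras" amounts to. A sketch with assumed poses (the roughly 12 cm baseline and 4° convergence are illustrative only):

```python
import numpy as np

def relative_pose(R1, t1, R2, t2):
    """Given world->camera transforms (R1, t1) and (R2, t2), return the
    camera1->camera2 transform: Xc2 = R_rel @ Xc1 + t_rel."""
    R_rel = R2 @ R1.T
    t_rel = t2 - R_rel @ t1
    return R_rel, t_rel

# Illustrative poses for the two cameras of a stereo pair.
R1, t1 = np.eye(3), np.array([0.0, 0.0, 1.0])
b = np.deg2rad(4.0)  # slight convergence of the second camera
R2 = np.array([[ np.cos(b), 0.0, np.sin(b)],
               [ 0.0,       1.0, 0.0      ],
               [-np.sin(b), 0.0, np.cos(b)]])
t2 = np.array([-0.12, 0.0, 1.0])

R_rel, t_rel = relative_pose(R1, t1, R2, t2)

# Consistency check: routing a world point through camera 1 and then the
# relative transform matches mapping it into camera 2 directly.
Xw = np.array([0.05, -0.03, 2.5])
Xc1 = R1 @ Xw + t1
Xc2_direct = R2 @ Xw + t2
assert np.allclose(R_rel @ Xc1 + t_rel, Xc2_direct)
```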
本实施例中,可以设置双目相机相邻两次标定的时间间隔,时间间隔的具体设置根据实际情况而定,在此不做具体限定。
本实施例中,相机标定的装置包括两个相机,在其他实施例中,还可以包括三个,四个或更多数量的相机,此处不做具体限定。
本实施例中选取了两个参考点Q1和Q2,其中参考点Q1在第一相机101的拍摄范围内,参考点Q2在第二相机102的拍摄范围内。在其他实施例中也可只在标识层上选取一个参考点,且该一个参考点既位于第一相机的拍摄范围内也位于第二相机的拍摄范围内;或者在标识层上选取两个以上的参考点,且其中至少一个参考点位于第一相机的拍摄范围内,以及至少一个参考点位于第二相机的拍摄范围内。
本实施例中,处理器设置在第一相机101和第二102的外部,并与两个相机101和102耦接,在其他实施例中,也可以直接将处理器安装在第一相机101或第二相机102的内部,具体不做限定。
由上述可知,本申请的相机标定装置包括相机、位于相机拍摄范围内的基板以及与相机耦接的处理器,且基板上设有标识层,标识层至少包括一个参考点。通过相机对位于其拍摄范围内的参考点进行拍摄获取参考点的图像坐标系坐标,再通过参考点的图像坐标系坐标和世界坐标系坐标获取相机的世界坐标系坐标,从而能够在每帧快速获取相机间的位置关系,进而提升拍摄的精度。
请参阅图3,图3是本申请提供的相机标定方法的流程示意图。以下,详细说明该方法的具体步骤。
步骤301:相机对位于其拍摄范围内的参考点进行拍摄,以获取参考点在图像坐标系中的坐标。
本实施方式中,以两个相机分别对位于其拍摄范围内的参考点进行拍摄为例子进行说明,其他实施例中也可以是更多数量的相机,在此不 做限定。
在一具体实施方式中,请参阅图1和图2,选取参考坐标系平面为透明玻璃基板103所在的平面,参考坐标系为世界坐标系,透明玻璃基板103上的标识层图案为多个矩形单元网格,选取矩形单元网格上的两个十字交叉点为两个参考点,其中一个参考点Q1位于第一相机101的拍摄范围内,另一参考点Q2位于第二相机102的拍摄范围内,由于每个矩形单元网格的物理尺寸是已知的,所以参考点Q1和Q2在世界坐标系中的坐标是已知的,用(X1’,Y1’,Z1’)和(X2’,Y2’,Z2’)分别表示参考点Q1和Q2在世界坐标系中的坐标,用第一相机101对参考点Q1进行拍摄获取参考点Q1在图像坐标系中的坐标(x1,y1,f1),同时用第二相机102对参考点Q2进行拍摄获取参考点Q2在图像坐标系中的坐标(x2,y2,f2),且两个相机101和102将获得的两个参考点Q1和Q2的图像坐标信息传输给处理器。
Step 302: obtain the coordinates of the camera in the reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras.
Step 301 yields the coordinates of the two reference points Q1 and Q2 in the image coordinate system. From Q1's image coordinates and the intrinsic parameters of the first camera 101, the processor computes Q1's coordinates in the camera coordinate system of the first camera 101; from Q1's coordinates in the world coordinate system and in that camera coordinate system, it obtains the rotation-translation relation between the world coordinate system and the camera coordinate system of the first camera 101, and hence the coordinates of the first camera 101 in the world coordinate system. Likewise, from Q2's image coordinates and the intrinsic parameters of the second camera 102, the processor computes Q2's coordinates in the camera coordinate system of the second camera 102, obtains the rotation-translation relation between the world coordinate system and that camera coordinate system, and hence the coordinates of the second camera 102 in the world coordinate system. The positional relationship between the first camera 101 and the second camera 102 then follows from their respective world coordinates.
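Note that the ratio relations alone fix only the direction of the reference point from the optical center; recovering the full camera-frame coordinates additionally needs the depth Z (or an equivalent constraint, such as the known substrate plane). A sketch under that assumption, not spelled out in the text (numpy assumed; the values are made up):

```python
import numpy as np

def back_project(x, y, f, Z):
    # Invert x/f = X/Z and y/f = Y/Z for a known depth Z: (X, Y) = (Z / f) * (x, y).
    return np.array([x * Z / f, y * Z / f, Z])

P = back_project(4.0, 2.0, f=8.0, Z=4.0)  # hypothetical image point and depth
print(P)  # recovers the camera-frame point (2, 1, 4)
```

In practice the depth would come from the known geometry of the substrate relative to the rig, or from triangulation across the two views.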
Specifically, still referring to FIG. 1 and FIG. 2, a camera coordinate system is set up with the optical center of the first camera 101 as the origin. Let the coordinates of reference point Q1 in this camera coordinate system be (X1, Y1, Z1); the point (X1, Y1, Z1) is projected by a light ray onto the point q1(x1, y1, f1) in the image coordinate system. By similar triangles, x1/f1 = X1/Z1 and y1/f1 = Y1/Z1. In matrix form:
$$\begin{pmatrix} x_1 \\ y_1 \\ f_1 \end{pmatrix} = \frac{f_1}{Z_1} \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \end{pmatrix}$$
That is, from the coordinates (x1, y1, f1) of reference point Q1 in the image coordinate system, its coordinates (X1, Y1, Z1) in the camera coordinate system of the first camera 101 can be obtained. The coordinates of Q1 in the world coordinate system are known and denoted (X1', Y1', Z1'). Let R1 be the rotation matrix and t1 the translation vector between the world coordinate system and the camera coordinate system of the first camera 101; the two coordinate systems are then related as follows:
$$\begin{pmatrix} X_1' \\ Y_1' \\ Z_1' \\ 1 \end{pmatrix} = \begin{pmatrix} R_1 & t_1 \\ O^T & 1 \end{pmatrix} \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{pmatrix} = L_{W1} \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{pmatrix}$$
That is, from the coordinates (X1', Y1', Z1') of reference point Q1 in the world coordinate system and its coordinates (X1, Y1, Z1) in the camera coordinate system, the rotation-translation relation L_W1 between the world coordinate system and the camera coordinate system of the first camera 101 can be obtained. The first camera 101 lies at the origin of its own camera coordinate system, so its coordinates in that system are known; applying the rotation-translation transform L_W1 to the camera coordinates of the first camera 101 then yields its coordinates in the world coordinate system.
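A single correspondence does not determine R1 and t1 uniquely; with several reference points, the rotation-translation relation can be estimated by a least-squares rigid fit. The sketch below uses the Kabsch algorithm, one common choice that the patent itself does not prescribe (numpy assumed; the poses and points are synthetic):

```python
import numpy as np

def fit_rigid_transform(cam_pts, world_pts):
    # Least-squares R, t such that world ≈ R @ cam + t (Kabsch algorithm).
    cam_c = cam_pts - cam_pts.mean(axis=0)
    world_c = world_pts - world_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(cam_c.T @ world_c)
    # Correct for a possible reflection so that det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = world_pts.mean(axis=0) - R @ cam_pts.mean(axis=0)
    return R, t

# Synthetic check: generate points, transform them with a known pose, recover it.
rng = np.random.default_rng(0)
cam_pts = rng.normal(size=(6, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
world_pts = cam_pts @ R_true.T + t_true
R_est, t_est = fit_rigid_transform(cam_pts, world_pts)
```

With noise-free synthetic data the fit recovers the pose exactly (up to floating-point error); with real detections it returns the pose minimizing the squared residuals.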
Likewise, the processor computes the world-coordinate-system coordinates of the second camera 102 by the same steps used above for the first camera 101. The positional relationship between the two cameras 101 and 102 then follows from their coordinates in the same world coordinate system, allowing the extrinsic parameters of the binocular camera to be recalibrated quickly at every frame.
As can be seen from the above, the camera calibration device of the present application includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera; the substrate carries a marker layer that includes at least one reference point. The camera photographs a reference point within its shooting range to obtain the reference point's image-coordinate-system coordinates, from which, together with the reference point's world-coordinate-system coordinates, the camera's world-coordinate-system coordinates are obtained. The positional relationship between the cameras can thus be obtained quickly at every frame, improving shooting accuracy.
Referring to FIG. 4, FIG. 4 is a schematic structural diagram of an embodiment of the camera calibration system provided by the present application. The camera calibration system 40 includes a camera calibration device 401, which includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marker layer that includes at least one reference point.
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of an embodiment of the device with a storage function provided by the present application. The device 50 with a storage function stores program data 501 executable to implement the following method: a camera photographs a reference point within its shooting range to obtain the reference point's coordinates in an image coordinate system; the camera's coordinates in a reference coordinate system are obtained from the reference point's coordinates in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras.
The beneficial effects of the present application are as follows. In contrast to the prior art, the camera calibration device of the present application includes a camera, a substrate within the shooting range of the camera, and a processor coupled to the camera; the substrate carries a marker layer that includes at least one reference point. The camera photographs a reference point within its shooting range to obtain the reference point's image-coordinate-system coordinates, from which, together with the reference point's world-coordinate-system coordinates, the camera's world-coordinate-system coordinates are obtained. The positional relationship between the cameras can thus be obtained quickly at every frame, improving shooting accuracy.
The above is merely an embodiment of the present application and does not thereby limit its patent scope; any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present application.

Claims (16)

  1. A camera calibration device, characterized in that the device comprises a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marker layer, and the marker layer comprises at least one reference point.
  2. The device according to claim 1, characterized in that the substrate is a transparent glass substrate.
  3. The device according to claim 2, characterized in that the marker layer is located on the surface of and/or inside the transparent glass substrate.
  4. The device according to claim 1, characterized in that the number of the cameras is two, and both cameras are coupled to the processor.
  5. The device according to claim 4, characterized in that the substrate is a planar substrate, two reference points are provided on the marker layer of the planar substrate, and the two reference points are respectively located within the shooting ranges of the two cameras.
  6. The device according to claim 1, characterized in that the pattern of the marker layer on the substrate is a grid of rectangular cells, and the reference point is a cross-intersection of the grid.
  7. The device according to claim 1, characterized in that the marker layer is a semi-transparent, semi-reflective strip.
  8. A camera calibration system, the system comprising a camera calibration device, characterized in that the device comprises a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marker layer, and the marker layer comprises at least one reference point.
  9. A camera calibration method, characterized in that the method comprises:
    photographing, by a camera, a reference point located within its shooting range, to obtain coordinates of the reference point in an image coordinate system;
    obtaining coordinates of the camera in a reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain a positional relationship between the cameras.
  10. The method according to claim 9, characterized in that the substrate is a transparent glass substrate.
  11. The method according to claim 9, characterized in that the substrate is a planar substrate.
  12. The method according to claim 9, characterized in that the pattern of the marker layer is a grid of rectangular cells, and the reference point is a cross-intersection of the grid.
  13. The method according to claim 9, characterized in that the marker layer is a semi-transparent, semi-reflective strip.
  14. The method according to claim 9, characterized in that the number of the cameras is two, the number of the reference points is two, the two reference points are respectively located within the shooting ranges of the two cameras, and the step of photographing, by the camera, a reference point located within its shooting range to obtain the coordinates of the reference point in the image coordinate system comprises:
    photographing, by the two cameras respectively, the two reference points located within their shooting ranges, to respectively obtain the coordinates of the two reference points in the image coordinate system.
  15. The method according to claim 14, characterized in that the step of obtaining the coordinates of the camera in the reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras, comprises:
    obtaining, from the coordinates of the two reference points in the image coordinate system, the respective coordinates of the two reference points in the camera coordinate systems;
    obtaining, from the coordinates of the two reference points in the camera coordinate systems and their respective coordinates in the world coordinate system, the coordinates of the two cameras in the world coordinate system, thereby obtaining the positional relationship between the two cameras.
  16. A device with a storage function, characterized in that the device stores program data, the program data being executable to implement the following method:
    photographing, by a camera, a reference point located within its shooting range, to obtain coordinates of the reference point in an image coordinate system;
    obtaining coordinates of the camera in a reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain a positional relationship between the cameras.
PCT/CN2018/109737 2018-10-10 2018-10-10 Camera calibration device, system and method, and device with storage function WO2020073261A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880087316.5A CN111630851A (zh) 2018-10-10 2018-10-10 Camera calibration device, system and method, and device with storage function
PCT/CN2018/109737 WO2020073261A1 (zh) 2018-10-10 2018-10-10 Camera calibration device, system and method, and device with storage function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/109737 WO2020073261A1 (zh) 2018-10-10 2018-10-10 Camera calibration device, system and method, and device with storage function

Publications (1)

Publication Number Publication Date
WO2020073261A1 true WO2020073261A1 (zh) 2020-04-16

Family

ID=70164291

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/109737 WO2020073261A1 (zh) 2018-10-10 2018-10-10 相机标定的装置、***、方法及具有存储功能的装置

Country Status (2)

Country Link
CN (1) CN111630851A (zh)
WO (1) WO2020073261A1 (zh)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105631853A * 2015-11-06 2016-06-01 Hubei University of Technology Vehicle-mounted binocular camera calibration and parameter verification method
CN106713897A * 2017-02-27 2017-05-24 UISEE Technologies (Beijing) Co., Ltd. Binocular camera and self-calibration method for a binocular camera
CN108416791A * 2018-03-01 2018-08-17 Yanshan University Binocular-vision-based pose monitoring and tracking method for the moving platform of a parallel mechanism

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US7429999B2 (en) * 2004-05-24 2008-09-30 CENTRE DE RECHERCHE INDUSTRIELLE DU QUÉBEC Camera calibrating apparatus and method
CN100470590C (zh) * 2007-02-05 2009-03-18 Wuhan University Camera calibration method and calibration device used therefor
CN202057299U (zh) * 2010-12-14 2011-11-30 Li Jun Image calibration template capable of automatically establishing an image coordinate system
EP2808645B1 (en) * 2012-01-23 2019-02-20 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN102788552B (zh) * 2012-02-28 2016-04-06 Wang Jinfeng Linear coordinate correction method
CN202814413U (zh) * 2012-09-28 2013-03-20 Beijing Aerospace Institute for Metrology and Measurement Technology High-precision calibration plate for camera parameter calibration
CN103679693B (zh) * 2013-01-25 2017-06-16 Hangzhou Shining 3D Technology Co., Ltd. Multi-camera single-view calibration device and calibration method thereof
CN103632364A (zh) * 2013-11-06 2014-03-12 Tongji University Device for calibrating spatial position relationships of cameras in a multi-camera photogrammetry system
CN104375375A (zh) * 2014-11-17 2015-02-25 State Grid Corporation of China Method and device for calibrating visible-light and infrared thermal imaging cameras using a checkerboard
CN104680535A (zh) * 2015-03-06 2015-06-03 Nanjing University Calibration target, calibration system and calibration method for a binocular direct-view camera
CN105469418B (zh) * 2016-01-04 2018-04-20 CRRC Qingdao Sifang Co., Ltd. Photogrammetry-based large-field-of-view binocular vision calibration device and method
CN107146254A (zh) * 2017-04-05 2017-09-08 Xidian University Camera extrinsic parameter calibration method for a multi-camera system
CN206863818U (zh) * 2017-06-20 2018-01-09 Chengdu Tongjia Youbo Technology Co., Ltd. Calibration template
CN108171758B (zh) * 2018-01-16 2022-02-11 Chongqing University of Posts and Telecommunications Multi-camera calibration method based on the minimum optical path principle and a transparent glass calibration plate


Also Published As

Publication number Publication date
CN111630851A (zh) 2020-09-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18936700; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18936700; Country of ref document: EP; Kind code of ref document: A1)