WO2021148050A1 - Three-dimensional space camera and photographing method therefor - Google Patents

Three-dimensional space camera and photographing method therefor

Info

Publication number
WO2021148050A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
dimensional
point
optical
Prior art date
Application number
PCT/CN2021/075868
Other languages
English (en)
French (fr)
Inventor
丁勇
丁大威
丁大路
江蓉芝
江小海
寄勤福
Original Assignee
上海爱德赞医疗科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海爱德赞医疗科技有限公司
Priority to US17/794,264 (US11997247B2)
Publication of WO2021148050A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189 Recording image signals; Reproducing recorded image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/128 Adjusting depth or disparity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074 Stereoscopic image analysis
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals

Definitions

  • the invention belongs to the technical field of three-dimensional space cameras, and in particular relates to a three-dimensional space camera and a photographing method thereof.
  • monocular structured light ranging can only be used for short-distance measurement, such as the face-distance structured light ranging developed to improve the accuracy of face recognition
  • binocular stereo vision can more accurately restore the three-dimensional information of the field of view through the parallax information of the two images provided by the left and right cameras.
  • however, obtaining spatial distance information with binocular vision requires matching analysis of corresponding points in the left and right images.
  • the calculation workload is large, and binocular vision measurement is also easily affected by mismatched feature points, making it difficult to meet real-time requirements.
  • the purpose of the present invention is to provide a three-dimensional space camera and a photographing method thereof.
  • a three-dimensional space camera includes an image one photographing unit and an image two photographing unit under the same optical system, and also includes a processing system for performing data processing on the image one photographing unit and the image two photographing unit, the processing system including a control unit for image capture by the image one photographing unit and the image two photographing unit, a record storage unit for storing the captured images, and a three-dimensional coordinate calculation unit for performing calculations on the data in the record storage unit.
  • the processing system is connected to the image one photographing unit and the image two photographing unit by signal connection and control lines.
  • the same optical system is a coaxial spherical system, and the distance between the optical principal planes of the image one photographing unit and the image two photographing unit is D, and D is not less than zero.
  • an image one conversion unit is further provided between the image one photographing unit and the image two photographing unit; the image one conversion unit is connected to the processing system through a signal connection and control line, and there is at least one image one conversion unit.
  • the distance between the optical principal planes of the image one conversion unit and the image one photographing unit is E, and E is not less than zero.
  • the image one photographing unit is also provided with at least one image one parallel correlated imaging unit, which is connected to the processing system through a signal connection and control line; the optical path of the image one parallel correlated imaging unit is the same as or correlated with the optical path of the image one photographing unit, and the distance between the optical principal planes of the image one parallel correlated imaging unit and the image one photographing unit is G, and G is not less than zero.
  • at least one image one parallel correlated imaging unit conversion unit is also provided between the image one parallel correlated imaging unit and the image two photographing unit, and it is connected to the processing system through signal and control lines.
  • the distance between the image one parallel correlated imaging unit and the conversion unit is K, and K is not less than zero.
  • a photographing method of a three-dimensional space camera: when the optical system and its optical subsystems are both coaxial spherical systems, the three-dimensional coordinates of an object point are characterized by object height H0, object distance U0, and optical meridian plane angle α, and the method includes the following steps:
  • the image one photographing unit takes a picture of an object point in three-dimensional space and photosensitively records a two-dimensional image one; on this two-dimensional image one there are the image height of image point one of the object point and its optical meridian plane angle coordinate;
  • the image two photographing unit takes a picture of image point one on the optical path of the image one photographing unit and photosensitively records a two-dimensional image two.
  • this two-dimensional image two carries the image height and the optical meridian plane angle coordinate of image point two, derived indirectly from the object point.
  • the image capture control unit sets a group of optical system parameters; with this group of parameters, image point one and image point two of an object point within a certain depth of field can be clearly recorded on image one and image two, respectively.
  • the record storage unit records and stores the data of image one and image two on the photoreceptors of the image one photographing unit and the image two photographing unit.
  • the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining and storing the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point;
  • repeat step 3) to obtain and store the three-dimensional coordinate information of three-dimensional object points within another depth of field;
  • in step 2), at least one image one conversion unit is provided between the image one photographing unit and the image two photographing unit, and the image one conversion unit optically converts image point one to image point i on the optical path of the image one conversion unit.
  • the image two photographing unit takes a picture of image point i and photosensitively records a two-dimensional image three; this two-dimensional image three carries the image height and the optical meridian plane angle coordinate of image point two, derived indirectly from the object point.
  • the image one photographing unit is also provided with at least one image one parallel correlated imaging unit, and the image one parallel correlated imaging unit also images the object point in three-dimensional space to obtain image point one'.
  • the image two photographing unit takes a picture of image point one' and photosensitively records a two-dimensional image four, which carries the image height and the optical meridian plane angle coordinate of image point two, derived indirectly from the object point.
  • the image one parallel correlated imaging unit also images the object point to obtain image point one'; the image one parallel correlated conversion unit optically converts image point one' to image point i' on the optical path of the image one parallel correlated conversion unit; the image two photographing unit takes a picture of image point i' and photosensitively records a two-dimensional image five, which carries the image height and the optical meridian plane angle coordinate of image point two, derived indirectly from the object point.
  • the light or wave used to form image one and image two, or image three, or image four, or image five is one of visible light, infrared light, ultraviolet light, X-ray electromagnetic waves, and ultrasonic waves.
  • the light emitted from the object point is either the light of the object point itself or the light generated by a light source irradiating the object point.
  • the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point.
  • the specific calculation process is as follows:
  • Oi is the minimal or independent set of N optical system parameters describing the characteristics of the entire optical system;
  • the optical meridian plane angle of the object point, α0:
  • α0 = Φ(Oi, α1, α2, H1, H2)
  • the present invention has the advantage that: since the camera simultaneously takes one or more pairs of two images (a dual photo) of a three-dimensional object point or of all the spatial object points constituting a three-dimensional object, and calculates the height-ratio relationships of these object points in the dual photo, the three-dimensional coordinates, relative to the camera, of the three-dimensional object point or of all the three-dimensional object points constituting the three-dimensional object are computed (corresponding to the capture, recording, and storage of three-dimensional photographs).
  • with the three-dimensional coordinates of spatial objects obtained by the dual-photo method, three-dimensional objects can be displayed in three-dimensional space using a computer or other three-dimensional display technology.
  • FIG. 1 is a schematic diagram of the structure of the three-dimensional space stereo camera of the present invention.
  • Fig. 2 is a schematic diagram of the structure of a three-dimensional space stereo camera with an image conversion unit behind the image one photographing unit.
  • Fig. 3 is a schematic structural diagram of a three-dimensional space stereo camera with an image one parallel correlated imaging unit.
  • Fig. 4 is a schematic structural diagram of a three-dimensional space stereo camera having both a parallel correlated imaging unit and an image conversion unit.
  • FIG. 11 is a schematic diagram of the optical path configuration of Experimental Example 2.
  • FIG. 12 is a schematic diagram of the optical path configuration of Experimental Example 3.
  • FIG. 13 is a schematic diagram of the optical path configuration of Experimental Example 4.
  • the present invention discloses a three-dimensional space camera, which includes an image one photographing unit 2 and an image two photographing unit 3 under the same optical system, and also includes a processing system 4 for performing data processing on the image one photographing unit 2 and the image two photographing unit 3.
  • the processing system 4 includes a control unit for image capture by the image one photographing unit and the image two photographing unit, a record storage unit for storing the captured images, and a three-dimensional coordinate calculation unit for performing calculations on the data in the record storage unit; the same optical system is a coaxial spherical system.
  • the distance between the optical principal planes of the image one photographing unit 2 and the image two photographing unit 3 is D, and D is not less than zero.
  • an image one conversion unit 7 is also provided between the image one photographing unit 2 and the image two photographing unit 3; there is at least one image one conversion unit 7.
  • the distance between the optical principal planes of the image one conversion unit 7 and the image one photographing unit 2 is E, and E is greater than or equal to zero.
  • the image one photographing unit 2 is also provided with at least one image one parallel correlated imaging unit 9.
  • the optical path of the image one parallel correlated imaging unit 9 is the same as or correlated with the optical path of the image one photographing unit 2.
  • optical path correlation means that the imaging characteristics of the two optical paths can be expressed through certain calculation parameters.
  • the distance between the optical principal planes of the image one parallel correlated imaging unit 9 and the image one photographing unit 2 is G, and G is greater than or equal to zero.
  • at least one image one parallel correlated imaging unit conversion unit 12 is further provided between the image one parallel correlated imaging unit 9 and the image two photographing unit 3.
  • the distance between the optical principal planes of the image one parallel correlated imaging unit 9 and the conversion unit 12 is K, and K is greater than or equal to zero.
  • the processing system 4 is connected through signal connection and control lines 11 to the image one photographing unit 2, the image two photographing unit 3, the image one conversion unit 7, the image one parallel correlated imaging unit 9, and the image one parallel correlated imaging unit conversion unit 12, respectively.
  • the image one photographing unit 2, the image two photographing unit, the image one conversion unit 7, the image one parallel correlated imaging unit 9, and the image one parallel correlated imaging unit conversion unit 12 are assembled from imaging optical components, such as convex lenses, concave lenses, concave mirrors, convex mirrors, plane reflectors, volumetric diffusers, volumetric luminous bodies, and CCD or CMOS photoreceptor chips.
  • for example, the image one photographing unit can use a convex lens and a CCD photoreceptor.
  • the invention also discloses a photographing method of the three-dimensional space camera: under a coaxial spherical system, the three-dimensional coordinates of an object point are characterized by object height H0, object distance U0, and optical meridian plane angle α, and the method includes the following steps:
  • the image one photographing unit 2 takes a picture of a luminous object point 1 in three-dimensional space and photosensitively records a two-dimensional image one.
  • on this two-dimensional image one there are the image height of image point one of the luminous object point 1 and its optical meridian plane angle coordinate.
  • the image two photographing unit 3 takes a picture of image point one 5 on the optical path of the image one photographing unit 2 and photosensitively records a two-dimensional image two; this two-dimensional image two carries the image height and the optical meridian plane angle coordinate of image point two 6, derived indirectly from the luminous object point 1.
  • the image one conversion unit 7 optically converts image point one 5 to image point i (8) on the optical path of the image one conversion unit.
  • the image two photographing unit 3 takes a picture of image point i (8) and photosensitively records a two-dimensional image three; this two-dimensional image three carries the image height and the optical meridian plane angle coordinate of image point two 6, derived indirectly from object point 1.
  • the image one parallel correlated imaging unit 9 also images the luminous object point 1 in three-dimensional space to obtain image point one' 10.
  • the image two photographing unit 3 takes a picture of image point one' 10 and photosensitively records a two-dimensional image four, which carries the image height and the optical meridian plane angle coordinate of image point two 6, derived indirectly from the luminous object point 1.
  • the image one parallel correlated imaging unit 9 also images the luminous object point 1 in three-dimensional space to obtain image point one' 10.
  • the image one parallel correlated conversion unit 12 optically converts image point one' 10 to image point i' 19 on the optical path of the image one parallel correlated conversion unit 12.
  • the image two photographing unit 3 takes a picture of image point i' 19 and photosensitively records a two-dimensional image five.
  • this two-dimensional image five carries the image height and the optical meridian plane angle coordinate of image point two, derived indirectly from object point 1.
  • the image capture control unit sets a group of optical system parameters.
  • with this group of parameters, image point one and image point two of an object point within a certain depth of field can be clearly recorded on image one and image two, respectively, and the record storage unit records and stores the data of image one and image two on the photoreceptors of the image one photographing unit and the image two photographing unit.
  • the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one and image two photographing units to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining and storing the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point;
  • repeat step 3) to obtain and store the three-dimensional coordinate information of three-dimensional object points within another depth of field;
  • the light or wave used to form image one and image two is one of visible light, infrared light, ultraviolet light, X-rays, electromagnetic waves, and ultrasonic waves.
  • the light emitted from the object point is either the light of the object point itself or the light generated by a light source irradiating the object point.
  • FIG. 5 is a schematic diagram of an example optical path at a certain optical meridian plane angle, corresponding to the structure of the three-dimensional space stereo camera system of FIG. 1.
  • the light path setting and photography process are described as follows:
  • the optical path system is a coaxial spherical system.
  • the photographic imaging lenses of the image one photographing unit and the image two photographing unit both use convex lenses, convex lens 13 and convex lens 14, with focal lengths f1 and f2, respectively.
  • the distance between the principal optical planes of the two convex lenses is D.
  • the image one photographing unit uses a semi-transmissive, semi-reflective reflector 15 (as shown in Figure 5, the upward-facing part of the reflector is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens 13.
  • the semi-transmissive, semi-reflective 45° reflector 15 reflects part of the light condensed by convex lens 13 and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit to obtain two-dimensional image one.
  • the above two-dimensional image one carries the image height and the optical meridian plane angle coordinate of image point one 5 (or image point group one of an object) from the luminous object point 1.
  • H0: object height (the height of the object point from the optical principal axis);
  • U0: object distance (the axial distance between the object point and the optical principal point);
  • α: optical meridian plane angle (the plane formed by the object point and the optical principal axis is a meridian plane of the optical system; when one meridian plane is defined as the principal meridian plane, α is the angle between the meridian plane containing the object point and the principal meridian plane).
  • the object point and the two image points lie in the same optical meridian plane, so the optical meridian plane angle α0 of the object point is equal to the optical meridian plane angles α1, α2 of the two image points, that is:
  • the four parameters H1, H2, α1, α2 are obtained through two photosensitive capture tests.
  • in a photosensitive capture test, the shutter is opened, the CCD or CMOS is exposed and records the image, and the shutter is closed a certain (very short) time after it is opened.
  • the six unknown parameters H0, U0, α0, V1, U2, V2 can be obtained by solving the above six formulas; for example, the calculated results for U0, H0, V1, and V2 are:
  • formulas 7 and 8 show that by taking a dual photo as shown in Figure 5 and measuring the two image heights, the three-dimensional coordinates of the object point can be calculated: object distance, object height, and optical meridian plane angle (the object and its images lie in the same meridian plane).
  • with the three optical system parameters f1, f2, and D fixed, the object distance depends only on the image height ratio (H2/H1); the object height depends on both the image height (H1) and the image height ratio (H2/H1).
  • formulas 9 and 10 show that by taking a dual photo as shown in Figure 5 and measuring the two image heights, the image distances of the two photographing units can also be calculated; with the three fixed optical system parameters f1, f2, and D, the two image distances V1, V2 likewise depend only on the image height ratio (H2/H1). generally, the image distance is not measured, and a correct image distance corresponds to a sharp image.
  • an object of finite size in three-dimensional space can be regarded as a group of object points within the object-distance range, in the depth direction, corresponding to the size of the object.
  • multiple objects in three-dimensional space can be regarded as groups of object points within a certain object-distance range. if multiple objects are within a certain depth of field of the optical system, a set of dual photos can be taken within this depth of field, and the optical meridian plane angles of the image point groups corresponding to the object point groups can be measured on the photos to obtain the optical meridian plane angle coordinates of the object point groups. the object distance and object height coordinates of the object point groups are then calculated by formulas 7 and 8. this corresponds to taking a three-dimensional photo within this depth of field. by changing the optical system parameter configuration, changing the depth-of-field coverage, and scanning with dual photos, a three-dimensional photo of three-dimensional objects within a larger object distance/depth range can be obtained.
  • Figure 9 shows that, beyond 60 meters, the image height ratio still decreases monotonically as the object distance increases.
  • Figure 10 shows that for a 1-meter-tall object, the two image heights are still measurable from 60 meters to 100 meters.
  • Figures 7, 8, 9, and 10 illustrate that the optical system of the present invention can take a set of dual photos within the depth of field of the system, and the three-dimensional coordinates of objects within the depth of field can be measured and calculated.
  • the depth-of-field range of the system can be changed by changing the focal lengths (f1, f2) of the system and the distance (D) between the optical principal planes. by taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of three-dimensional objects can be measured over a larger object-distance range, or a three-dimensional photo of a group of three-dimensional objects over a larger object-distance range can be taken.
  • the three-dimensional coordinate measurement and calculation principle of the above three-dimensional space stereo camera can be further extended to the universally applicable non-paraxial approximation and to non-coaxial optical systems.
  • the three-dimensional coordinate calculation formulas of an object point can then be unified as:
  • Oi is the minimal or independent set of N optical system parameters that can describe the characteristics of the entire optical system, for example the object-side/image-side focal length of each optical component, the object-side/image-side principal plane spacing, the aspheric geometric parameters of the components, and the principal plane spacings or relative positions between the components.
  • the optical meridian plane angle of the object point, α0:
  • α0 = Φ(Oi, α1, α2, H1, H2)
  • formulas 11, 12, and 13 can be applied to the calculation of the three-dimensional coordinates of object points or object point groups in the following implementation examples.
  • FIG. 11 is a schematic diagram of an example optical path at a certain optical meridian plane angle, corresponding to the structure of the three-dimensional space stereo camera system of Fig. 2.
  • the light path setting and photography process are described as follows:
  • the optical path system is a coaxial spherical system.
  • the photographic imaging lenses of the image one and image two photographing units both use convex lenses, convex lens 13 and convex lens 14, with focal lengths f1 and f2, respectively.
  • the distance between the optical principal planes of convex lenses 13 and 14 is D.
  • the image one conversion unit also uses a convex lens, convex lens 16 (convex lens Fi), with focal length fi.
  • the distance between the optical principal planes of convex lens 16 and convex lens 14 is E.
  • the image one photographing unit uses a semi-transmissive, semi-reflective reflector 15 (as shown in Figure 11, the upward-facing part of reflector 15 is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens 13.
  • the semi-transmissive, semi-reflective 45° reflector 15 reflects part of the light condensed by convex lens 13 and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit to obtain two-dimensional image one.
  • the above two-dimensional image one carries the image height and the optical meridian plane angle coordinate of image point one from the object point (or image point group one of an object).
  • the light from the above image point H1 on the optical path passes through convex lens Fi (16) and forms an image on the optical path of the image one conversion unit (at the image distance Vi and image height Hi of lens Fi shown in Figure 11).
  • the image point shown in the figure is a virtual image.
  • the three-dimensional coordinates of object points or object point groups within the depth of field of the optical system are calculated by formulas 11, 12, and 13.
  • the depth-of-field range of the system can be changed by changing the focal lengths (f1, f2, fi) of the system and the distances between the optical principal planes (D, E). by taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of three-dimensional objects can be measured over a larger object-distance range, or a three-dimensional photo of a group of three-dimensional objects over a larger object-distance range can be taken.
  • Fig. 12 is a schematic diagram of an example optical path at a certain optical meridian plane angle, corresponding to the structure of the three-dimensional space stereo camera system of Fig. 2.
  • the light path setting and photography process are described as follows:
  • the optical path system is a coaxial spherical system.
  • the image one photographing unit uses convex lens 13 (convex lens F1), with focal length f1.
  • the image two photographing unit uses a concave mirror 20, with focal length f2.
  • the concave mirror 20 is a concave mirror whose convex surface is fully transmissive and whose concave surface is partially reflective.
  • the distance between the optical principal planes of convex lens 13 and concave mirror 20 is D.
  • the image one conversion unit uses a flat mirror Fi (17) whose left side, as shown in FIG. 12, is reflective.
  • the image one photographing unit uses a semi-transmissive, semi-reflective reflector 15 (as shown in Figure 12, the upward-facing part of the reflector is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens 13.
  • this semi-transmissive, semi-reflective 45° reflector 15 reflects part of the light that has been condensed by convex lens 13 and transmitted through concave mirror 20, and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit to obtain two-dimensional image one.
  • the above two-dimensional image one carries the image height and the optical meridian plane angle coordinate of image point one from the object point (or image point group one of an object).
  • the light from the above image point H1 on the optical path is reflected by mirror Fi (17) and imaged at Hi to the right of mirror Fi (17); this image point Hi is a virtual image.
  • the light from the above image point Hi on the optical path passes through concave mirror 20 and is imaged on the photoreceptor in the image two photographing unit to obtain two-dimensional image two.
  • the three-dimensional coordinates of object points or object point groups within the depth of field of the optical system are calculated by formulas 11, 12, and 13.
  • the depth-of-field range of the system can be changed by changing the focal lengths (f1, f2) of the system and the distances between the optical principal planes (D, E). by taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of three-dimensional objects can be measured over a larger object-distance range, or a three-dimensional photo of a group of three-dimensional objects over a larger object-distance range can be taken.
  • FIG. 13 is a schematic diagram of an example optical path at a certain optical meridian plane angle, corresponding to the structure of the three-dimensional space stereo camera system of Fig. 2.
  • the light path setting and photography process are described as follows:
  • the optical path system is a coaxial spherical system.
  • the image one photographing unit uses convex lens 13 (convex lens F1), with focal length f1.
  • the image two photographing unit uses a concave mirror 20, with focal length f2.
  • the concave mirror 20 is a concave mirror whose convex surface is fully transmissive and whose concave surface is partially reflective.
  • the distance between the optical principal planes of convex lens 13 and concave mirror 20 is D.
  • the image one conversion unit uses the volumetric diffuser or volumetric luminous body Fi (18) shown in FIG. 13.
  • the image one photographing unit uses a semi-transmissive, semi-reflective reflector (as shown in Figure 13, the upward-facing part of the reflector is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens F1.
  • this semi-transmissive, semi-reflective 45° reflector reflects part of the light that has been condensed by convex lens F1 and transmitted through concave mirror 20, and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit to obtain two-dimensional image one.
  • the above two-dimensional image one carries the image height and the optical meridian plane angle coordinate of image point one from the object point (or image point group one of an object).
  • the light from the above image point H1 on the optical path, scattered by the volumetric diffuser, can be regarded as being imaged in place at the same location as image point H1.
  • this image point Hi is a real image.
  • the light from the above image point Hi on the optical path passes through concave mirror 20 and is imaged on the photoreceptor in the image two photographing unit to obtain two-dimensional image two.
  • the three-dimensional coordinates of object points or object point groups within the depth of field of the optical system are calculated by formulas 11, 12, and 13.
  • the depth-of-field range of the system can be changed by changing the focal lengths (f1, f2) of the system and the distances between the optical principal planes (D, E). by taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of three-dimensional objects can be measured over a larger object-distance range, or a three-dimensional photo of a group of three-dimensional objects over a larger object-distance range can be taken.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application belongs to the technical field of three-dimensional space cameras, and specifically discloses a three-dimensional space camera and a photographing method therefor. The camera comprises an image one photographing unit and an image two photographing unit under the same optical system, and further comprises a processing system that performs data processing for the image one photographing unit and the image two photographing unit; the processing system comprises a control unit that controls image capture by the image one photographing unit and the image two photographing unit, a record storage unit that records and stores the captured images, and a three-dimensional coordinate calculation unit that performs calculations on the data in the record storage unit. By simultaneously taking one or more pairs of two images (a dual photo) of an object point in three-dimensional space, or of all the spatial object points constituting a three-dimensional object, and calculating the height-ratio relationships of these object points in the dual photo, the three-dimensional coordinates, relative to the camera, of the object point or of all the spatial object points constituting the three-dimensional object are computed.

Description

Three-dimensional space camera and photographing method therefor

Technical Field

The present invention belongs to the technical field of three-dimensional space cameras, and in particular relates to a three-dimensional space camera and a photographing method therefor.

Background Art

Cameras have been with us for 180 years, but they record only the two-dimensional world, without distance (depth) information for the third dimension. To obtain depth information, auxiliary techniques are needed, such as ultrasonic ranging, lidar ranging, infrared ranging, monocular structured-light ranging, and binocular ranging. Ultrasonic, laser, and infrared devices calculate the distance between the measured object and the sensor by measuring the time difference between emission from the source and the return of the signal; these are called active methods. Active ranging is convenient and fast and its calculation is relatively simple, so it has been widely applied in real-time control; however, the transmitting and receiving equipment is expensive, environmental problems such as reflections, noise, and crosstalk are difficult to avoid, achieving a large pixel count is difficult, and universal applicability therefore remains hard to achieve. Monocular structured-light ranging can only be used for short-distance measurement, such as the face-distance structured-light ranging developed to improve the accuracy of face recognition. Binocular stereo vision can rather accurately recover the three-dimensional information of the field of view from the parallax between the two images provided by the left and right cameras; however, obtaining spatial distance information with binocular vision requires matching analysis of corresponding points in the left and right images, the computation is heavy, and binocular measurement is also easily affected by mismatched feature points, making real-time requirements hard to meet.
Summary of the Invention

The object of the present invention is to provide a three-dimensional space camera and a photographing method therefor.

To achieve the above object, the technical solution adopted by the present invention is:

A three-dimensional space camera comprises an image one photographing unit and an image two photographing unit under the same optical system, and further comprises a processing system that performs data processing for the image one photographing unit and the image two photographing unit. The processing system comprises a control unit that controls image capture by the image one photographing unit and the image two photographing unit, a record storage unit that stores the captured data, and a three-dimensional coordinate calculation unit that performs calculations on the data in the record storage unit; the processing system is connected to the image one photographing unit and the image two photographing unit through signal connection and control lines.

The same optical system is a coaxial spherical system, and the distance between the optical principal planes of the image one photographing unit and the image two photographing unit is D, where D is not less than zero.

Further, an image one conversion unit is provided between the image one photographing unit and the image two photographing unit; the image one conversion unit is connected to the processing system through signal connection and control lines; there is at least one image one conversion unit, and the distance between the optical principal planes of the image one conversion unit and the image one photographing unit is E, where E is not less than zero.

Further, the image one photographing unit is additionally provided with at least one image one parallel correlated imaging unit, which is connected to the processing system through signal connection and control lines; the optical path of the image one parallel correlated imaging unit is the same as, or correlated with, the optical path of the image one photographing unit, and the distance between the optical principal planes of the image one parallel correlated imaging unit and the image one photographing unit is G, where G is not less than zero.

Further, at least one image one parallel correlated imaging unit conversion unit is provided between the image one parallel correlated imaging unit and the image two photographing unit; it is connected to the processing system through signal connection and control lines, and the distance between the image one parallel correlated imaging unit and the conversion unit is K, where K is not less than zero.

A photographing method of the three-dimensional space camera: when the optical system and its optical subsystems are all coaxial spherical systems, the three-dimensional coordinates of an object point are characterized by object height H0, object distance U0, and optical meridian plane angle α, and the method comprises the following steps:

1) The image one photographing unit photographs an object point in three-dimensional space and photosensitively records a two-dimensional image one; on this two-dimensional image one there are the image height of image point one of the object point and its optical meridian plane angle coordinate;

2) The image two photographing unit photographs image point one on the optical path of the image one photographing unit and photosensitively records a two-dimensional image two; on this two-dimensional image two there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate;

3) The image capture control unit sets a group of optical system parameters with which image point one and image point two of an object point within a certain depth of field can be clearly recorded on image one and image two respectively; the record storage unit records and stores the data of image one and image two on the photoreceptors of the image one photographing unit and the image two photographing unit; the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining and storing the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point;

4) Repeat step 3) to obtain and store the three-dimensional coordinate information of three-dimensional object points within another depth of field;

5) For the object points whose three-dimensional coordinates are to be measured, divide the range into one or more depths of field and repeat steps 3) and 4) to obtain a set of paired images one and two of the object points over the entire required depth range, as well as the three-dimensional coordinate information (object distance, object height, and optical meridian plane angle) of the spatial object points over the entire depth range, store them, and then display a three-dimensional stereo photograph of the object through a display or projector (a schematic sketch of this capture-and-compute loop follows below).
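By way of illustration only, the following Python sketch shows one way steps 3) to 5) could be organized in software. The camera interface (set_optical_parameters, capture_pair, match_image_points) and the triangulate callback are hypothetical names assumed for this sketch; they are not part of the patent.

    def scan_and_store(camera, configs, triangulate):
        # Steps 3)-5): for each depth-of-field configuration, capture one dual
        # photo, triangulate every matched image-point pair, and collect the
        # resulting (alpha0, U0, H0) coordinates into one point cloud.
        cloud = []
        for cfg in configs:                               # one entry per depth of field
            camera.set_optical_parameters(cfg)            # step 3: control unit sets parameters
            image_one, image_two = camera.capture_pair()  # record storage unit stores the pair
            for (a1, h1), (a2, h2) in camera.match_image_points(image_one, image_two):
                # 3D coordinate calculation unit: image heights and angle -> (alpha0, U0, H0)
                cloud.append(triangulate(h1, h2, a1, a2, cfg))
            # steps 4)-5): the loop continues with the next depth of field
        return cloud                                      # ready for display on a 3D display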
Further, in step 2): at least one image one conversion unit is provided between the image one photographing unit and the image two photographing unit; the image one conversion unit optically converts image point one to image point i on the optical path of the image one conversion unit; the image two photographing unit photographs image point i and photosensitively records a two-dimensional image three, on which there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate.

Further, in step 2): the image one photographing unit is additionally provided with at least one image one parallel correlated imaging unit, which also images the object point in three-dimensional space to obtain image point one'; the image two photographing unit photographs image point one' and photosensitively records a two-dimensional image four, on which there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate.

Further, in step 2): at least one image one parallel correlated conversion unit is provided between the image one parallel correlated imaging unit and the image two photographing unit; the image one parallel correlated imaging unit also images the object point in three-dimensional space to obtain image point one'; the image one parallel correlated conversion unit optically converts image point one' to image point i' on the optical path of the image one parallel correlated conversion unit; the image two photographing unit photographs image point i' and photosensitively records a two-dimensional image five, on which there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate.

Further, the light or wave used to form image one and image two, or image three, or image four, or image five is one of visible light, infrared light, ultraviolet light, X-ray electromagnetic waves, and ultrasonic waves.

Further, the light emitted from the object point is either the light of the object point itself or the light produced by a light source illuminating the object point.

Further, in step 3): the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point. The specific calculation process is as follows:

The set of all optical system parameters of the optical system, Oi (i = 1, …, N);
Oi is the minimal or independent set of N optical system parameters describing the characteristics of the entire optical system;
The optical meridian plane angles of image point one and image point two, α1, α2;
The image heights of image point one and image point two, H1, H2;
That is:
The optical meridian plane angle of the object point, α0;
α0 = Φ(Oi, α1, α2, H1, H2)
i = 1, 2, … N     (Formula 11)
The object distance of the object point, U0;
U0 = Ω(Oi, α1, α2, H1, H2)
i = 1, 2, … N     (Formula 12)
The object height of the object point, H0;
H0 = Ψ(Oi, α1, α2, H1, H2)
i = 1, 2, … N     (Formula 13).

The advantage of the present invention is: since the camera simultaneously takes one or more pairs of two images (a dual photo) of a three-dimensional object point, or of all the spatial object points constituting a three-dimensional object, and calculates the height-ratio relationships of these object points in the dual photo, the three-dimensional coordinates of the object point(s) relative to the camera are computed (corresponding to the capture, recording, and storage of a stereo photograph); with the three-dimensional coordinates of these spatial objects obtained by the dual-photo method, three-dimensional spatial objects can be displayed using a computer or other stereoscopic display technology.
Brief Description of the Drawings

Fig. 1 is a schematic structural diagram of the three-dimensional space stereo camera of the present invention.
Fig. 2 is a schematic structural diagram of a three-dimensional space stereo camera with one image conversion unit behind the image one photographing unit.
Fig. 3 is a schematic structural diagram of a three-dimensional space stereo camera with an image one parallel correlated imaging unit.
Fig. 4 is a schematic structural diagram of a three-dimensional space stereo camera with both a parallel correlated imaging unit and an image conversion unit.
Fig. 5 is a schematic diagram of the optical path configuration of Experimental Example 1.
Fig. 6 shows the relationship between the image height ratio and the object distance of an object point in Experimental Example 1 (with the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm).
Fig. 7 shows the relationship between object distance U0 and image distance V1 in Experimental Example 1 (with the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm, for object distances greater than 50 m).
Fig. 8 shows the relationship between object distance U0 and image distance V2 in Experimental Example 1 (with the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm, for object distances greater than 50 m).
Fig. 9 shows the relationship between object distance U0 and image height ratio H2/H1 in Experimental Example 1 (with the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm, for object distances greater than 50 m).
Fig. 10 shows the relationship between object distance U0 and image heights H2, H1 in Experimental Example 1 (with the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm, for object distances greater than 50 m).
Fig. 11 is a schematic diagram of the optical path configuration of Experimental Example 2.
Fig. 12 is a schematic diagram of the optical path configuration of Experimental Example 3.
Fig. 13 is a schematic diagram of the optical path configuration of Experimental Example 4.

Reference numerals: 1. luminous object point; 2. image one photographing unit; 3. image two photographing unit; 4. processing system; 5. image point one (with photoreceptor); 6. image point two (with photoreceptor); 7. image one conversion unit; 8. image point i; 9. image one parallel correlated imaging unit; 10. image point one'; 11. signal connection and control line; 12. image one parallel correlated imaging unit conversion unit; 13. convex lens F1; 14. convex lens F2; 15. 45° reflector; 16. convex lens Fi; 17. reflector Fi; 18. volumetric luminous body Fi; 19. image point i'; 20. concave mirror.
Detailed Description of the Embodiments

As shown in Fig. 1, the present invention discloses a three-dimensional space camera, comprising an image one photographing unit 2 and an image two photographing unit 3 under the same optical system, and further comprising a processing system 4 that performs data processing for the image one photographing unit 2 and the image two photographing unit 3. The processing system 4 comprises a control unit that controls image capture by the image one photographing unit and the image two photographing unit, a record storage unit that stores the captured images, and a three-dimensional coordinate calculation unit that performs calculations on the data in the record storage unit. The same optical system is a coaxial spherical system. The distance between the optical principal planes of the image one photographing unit 2 and the image two photographing unit 3 is D, where D is not less than zero.

As shown in Fig. 2, an image one conversion unit 7 is further provided between the image one photographing unit 2 and the image two photographing unit 3; there is at least one image one conversion unit 7, and the distance between the optical principal planes of the image one conversion unit 7 and the image one photographing unit 2 is E, where E is greater than or equal to zero.

As shown in Fig. 3, on the basis of Fig. 1, the image one photographing unit 2 is additionally provided with at least one image one parallel correlated imaging unit 9. The optical path of the image one parallel correlated imaging unit 9 is the same as, or correlated with, the optical path of the image one photographing unit 2; optical path correlation means that the imaging characteristics of the two optical paths can be expressed through certain calculation parameters. The distance between the optical principal planes of the image one parallel correlated imaging unit 9 and the image one photographing unit 2 is G, where G is greater than or equal to zero.

Further, as shown in Fig. 4, on the basis of Fig. 3, at least one image one parallel correlated imaging unit conversion unit 12 is provided between the image one parallel correlated imaging unit 9 and the image two photographing unit 3; the distance between the optical principal planes of the image one parallel correlated imaging unit 9 and the conversion unit 12 is K, where K is greater than or equal to zero.

The processing system 4 is connected through signal connection and control lines 11 to the image one photographing unit 2, the image two photographing unit 3, the image one conversion unit 7, the image one parallel correlated imaging unit 9, and the image one parallel correlated imaging unit conversion unit 12, respectively. The image one photographing unit 2, the image two photographing unit, the image one conversion unit 7, the image one parallel correlated imaging unit 9, and the image one parallel correlated imaging unit conversion unit 12 are assembled from imaging optical components such as convex lenses, concave lenses, concave mirrors, convex mirrors, plane reflectors, volumetric diffusers, volumetric luminous bodies, and CCD or CMOS photoreceptor chips. For example, the image one photographing unit may use one convex lens and one CCD photoreceptor. The present invention also discloses a photographing method of the three-dimensional space camera: under a coaxial spherical system, the three-dimensional coordinates of an object point are characterized by object height H0, object distance U0, and optical meridian plane angle α, and the method comprises the following steps:
1) The image one photographing unit 2 photographs a luminous object point 1 in three-dimensional space and photosensitively records a two-dimensional image one; on this two-dimensional image one there are the image height of image point one of the luminous object point 1 and its optical meridian plane angle coordinate;

2) As shown in Fig. 1, when only the image one photographing unit 2 and the image two photographing unit 3 are present, the image two photographing unit 3 photographs image point one 5 on the optical path of the image one photographing unit 2 and photosensitively records a two-dimensional image two; on this two-dimensional image two there are the image height of image point two 6, derived indirectly from the luminous object point 1, and its optical meridian plane angle coordinate.

As shown in Fig. 2, when at least one image one conversion unit 7 is provided between the image one photographing unit 2 and the image two photographing unit 3, the image one conversion unit 7 optically converts image point one 5 to image point i (8) on the optical path of the image one conversion unit; the image two photographing unit 3 photographs image point i (8) and photosensitively records a two-dimensional image three, on which there are the image height of image point two 6, derived indirectly from object point 1, and its optical meridian plane angle coordinate.

As shown in Fig. 3, when the image one photographing unit 2 is additionally provided with at least one image one parallel correlated imaging unit 9, the image one parallel correlated imaging unit 9 also images the luminous object point 1 in three-dimensional space to obtain image point one' 10; the image two photographing unit 3 photographs image point one' 10 and photosensitively records a two-dimensional image four, on which there are the image height of image point two 6, derived indirectly from the luminous object point 1, and its optical meridian plane angle coordinate.

As shown in Fig. 4, when at least one image one parallel correlated conversion unit 12 is further provided between the image one parallel correlated imaging unit 9 and the image two photographing unit 3, the image one parallel correlated imaging unit 9 also images the luminous object point 1 in three-dimensional space to obtain image point one' 10; the image one parallel correlated conversion unit 12 optically converts image point one' 10 to image point i' 19 on the optical path of the image one parallel correlated conversion unit 12; the image two photographing unit 3 photographs image point i' 19 and photosensitively records a two-dimensional image five, on which there are the image height of image point two, derived indirectly from object point 1, and its optical meridian plane angle coordinate.

3) The image capture control unit sets a group of optical system parameters with which image point one and image point two of an object point within a certain depth of field can be clearly recorded on image one and image two respectively; the record storage unit records and stores the data of image one and image two on the photoreceptors of the image one photographing unit and the image two photographing unit; the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining and storing the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point;

4) Repeat step 3) to obtain and store the three-dimensional coordinate information of three-dimensional object points within another depth of field;

5) For the object points whose three-dimensional coordinates are to be measured, divide the range into one or more depths of field and repeat steps 3) and 4) to obtain a set of paired images one and two of the object points over the entire required depth range, as well as the three-dimensional coordinate information (object distance, object height, and optical meridian plane angle) of the spatial object points over the entire depth range, store them, and then display a three-dimensional stereo photograph of the object through a display.

Further, the light or wave used to form image one and image two is one of visible light, infrared light, ultraviolet light, X-rays, electromagnetic waves, and ultrasonic waves.

Further, the light emitted from the object point is either the light of the object point itself or the light produced by a light source illuminating the object point.

For specific optical path configurations of the three-dimensional stereo camera, see the following experimental examples.
Experimental Example 1

Fig. 5 is an example schematic diagram of an optical path at a certain optical meridian plane angle, corresponding to the three-dimensional space stereo camera system structure of Fig. 1. The optical path setup and the photographing process are described as follows:

1. The optical path system is a coaxial spherical system.
2. The photographic imaging lenses of the image one photographing unit and the image two photographing unit are both convex lenses, convex lens 13 and convex lens 14, with focal lengths f1 and f2 respectively.
3. The distance between the optical principal planes of the two convex lenses is D.
4. The image one photographing unit uses a semi-transmissive, semi-reflective reflector 15 (as shown in Fig. 5, the upward-facing side of the reflector is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens 13. This semi-transmissive, semi-reflective 45° reflector 15 reflects part of the light converged by convex lens 13 and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit, yielding two-dimensional image one.
5. On the above two-dimensional image one there are the image height of image point one 5 (or image point group one of an object) from the luminous object point 1 and its optical meridian plane angle coordinate.
6. The light transmitted through the semi-transmissive, semi-reflective 45° reflector 15 continues onward and forms an image on the optical path of the image one photographing unit (the image point at image distance V1 and image height H1 of convex lens 13 shown in Fig. 5).
7. The light (emitted) from the above image point H1 on the optical path passes through convex lens 14 and is imaged on the photoreceptor in the image two photographing unit, yielding two-dimensional image two.
8. On the above two-dimensional image two there are the image height of image point two 6, derived indirectly from the object point (or image point group two, derived indirectly from the object), and its optical meridian plane angle coordinate.

In the coaxial spherical system scenario, the three-dimensional coordinates of an object point can be described as:

H0: object height (the height of the object point from the optical principal axis);
U0: object distance (the axial distance from the object point to the optical principal point);
α: optical meridian plane angle (the plane formed by the object point and the optical principal axis is a meridian plane of the optical system; when a certain meridian plane is defined as the principal meridian plane, α is the angle between the meridian plane containing the object point and the principal meridian plane).

The imaging process of an object point, and the principle of calculating the three-dimensional coordinates of the object point from the above two correlated two-dimensional images, are stated as follows:

The object point and the two image points lie in the same optical meridian plane, so the optical meridian plane angle α0 of the object point equals the optical meridian plane angles α1, α2 of the two image points, that is:

α1 = α2 = α0  (Formula 1)

Under the paraxial approximation:
1/U0 + 1/V1 = 1/f1  (Formula 2)
H1/H0 = V1/U0  (Formula 3)
1/U2 + 1/V2 = 1/f2  (Formula 4)
H2/H1 = V2/U2  (Formula 5)
V1 + U2 = D  (Formula 6)
In the above six formulas there are thirteen parameters in total. Of these, the three parameters f1, f2, and D are known optical system parameter settings. The four parameters H1, H2, α1, α2 are obtained through two photosensitive capture tests (photosensitive capture test: the shutter opens, the CCD or CMOS is exposed and records the image, and after a certain (very short) time the shutter closes). The six unknown parameters H0, U0, α0, V1, U2, V2 can be obtained by solving the above six formulas; for example, the calculated results for U0, H0, V1, and V2 are:
The object distance U0 is obtained in closed form (Formula 7) in terms of the coefficients a and b below; the expression itself is reproduced only as an image in the original publication.
where:
a = D²H2/H1 - 2Df1H2/H1 + f1²H2/H1 - Df2H2/H1 - Df2 + f1f2H2/H1 + f1f2  (Formula 7a)
b = -2D²f1H2/H1 + 2Df1²H2/H1 + 2Df1f2H2/H1 - f1²f2H2/H1 + 2Df1f2 - f1²f2  (Formula 7b)
The corresponding closed-form expressions for the object height H0 (Formula 8) and the image distances V1 (Formula 9) and V2 (Formula 10) are likewise reproduced only as images in the original publication.
Formulas 7 and 8 show that by taking a dual photo as in Fig. 5 and measuring the two image heights, the three-dimensional coordinates of the object point can be calculated: object distance, object height, and optical meridian plane angle (the object and its images lie in the same meridian plane). With the three optical system parameters fixed (focal lengths f1, f2 and the optical principal plane spacing D of the two convex lenses), the object distance depends only on the image height ratio (H2/H1); the object height depends on both the image height (H1) and the image height ratio (H2/H1).

Formulas 9 and 10 show that by taking a dual photo as in Fig. 5 and measuring the two image heights, the image distances of the two photographing units can also be calculated. With the three optical system parameters f1, f2, and D fixed, the two image distances V1, V2 likewise depend only on the image height ratio (H2/H1). In general the image distances are not measured; a correct image distance corresponds to a sharp image.

When measuring the three-dimensional coordinates of an object point, the image one photographing unit first focuses on the object point (by changing the focal length f1 or the image distance V1) and captures image one. With the above optical system parameters of the image one photographing unit fixed (focal length f1 or image distance V1), the image two photographing unit focuses on image one on the optical path of the image one photographing unit (by changing the focal length f2, the image distance V2, or the optical principal plane spacing D) and captures image two. Measure the image heights H1, H2 on the two images and the optical meridian plane angle α1 = α2 = α0, and use Formulas 7 and 8 to calculate the object distance U0 and the object height H0 of the object point, thereby obtaining the three-dimensional coordinates of the object point: α0, U0, H0.
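To make the calculation concrete, the following Python sketch inverts the paraxial relations of Formulas 2 to 6 (as reconstructed above) directly, instead of quoting the closed forms of Formulas 7 to 10, which appear only as images in the original publication; the function name and the round-trip test values are illustrative assumptions, not part of the patent.

    def dual_photo_coordinates(H1, H2, alpha1, f1, f2, D):
        # Recover (alpha0, U0, H0) of one object point from one dual photo,
        # inverting Formulas 2-6 as reconstructed above (all lengths in mm).
        U2 = f2 * (H1 + H2) / H2   # Formulas 4 and 5: object distance seen by lens F2
        V1 = D - U2                # Formula 6
        U0 = f1 * V1 / (V1 - f1)   # Formula 2 solved for the object distance
        H0 = H1 * U0 / V1          # Formula 3 solved for the object height
        return alpha1, U0, H0      # Formula 1: alpha0 = alpha1 = alpha2

    # Round-trip check with the example configuration of Fig. 5 (units: mm).
    f1, f2, D = 35.0, 60.0, 145.0
    U0, H0 = 60_000.0, 1_000.0                 # a 1 m tall object at 60 m
    V1 = f1 * U0 / (U0 - f1)                   # forward model: Formula 2
    H1 = H0 * V1 / U0                          # Formula 3
    H2 = H1 * f2 / (D - V1 - f2)               # Formulas 4, 5 and 6 combined
    print(dual_photo_coordinates(H1, H2, 0.0, f1, f2, D))  # ~ (0.0, 60000.0, 1000.0)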
With the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm, the relationship between the image height ratio and the object distance of an object point is shown in Fig. 6; a smaller measured image height ratio corresponds to a farther object point.

An object of finite size in three-dimensional space can be regarded as a group of object points within the object-distance range, in the depth direction, corresponding to the size of the object. Multiple objects in three-dimensional space can be regarded as groups of object points within a certain object-distance range. If multiple objects lie within a certain depth of field of the optical system, one can take a set of dual photos within that depth of field, measure on the photos the optical meridian plane angles of the image point groups corresponding to the object point groups to obtain the optical meridian plane angle coordinates of the object point groups, and calculate the object distance and object height coordinates of the object point groups by Formulas 7 and 8. This corresponds to taking a stereo photograph within that depth of field. By changing the optical system parameter configuration, changing the depth-of-field coverage, and scanning with dual photos, a stereo photograph of three-dimensional objects within a larger object distance/depth range can be obtained.

The following figures further illustrate how taking dual photos determines the three-dimensional coordinates of groups of objects in three-dimensional space. With the optical system parameters of Fig. 5 set to f1 = 35 mm, f2 = 60 mm, D = 145 mm, and for object distances greater than 50 m, the relationships between the image distances of image one and image two and the object distance are shown in Figs. 7 and 8 respectively; the relationship between the image height ratio and the object distance is shown in Fig. 9; and with object height H0 = 1000 mm, the relationship between the object distance U0 and the image heights H2, H1 is shown in Fig. 10.

Figs. 7 and 8 show that when multiple objects are at object distances of 60 m or more, the variations of the image distances V1, V2 are both very small (less than ±10 µm), so all of them can be regarded as focused and imaged within the same set of dual photos. That is, in the optical system of Fig. 5 with the parameter configuration f1 = 35 mm, f2 = 60 mm, D = 145 mm, the depth of field of the system extends from 60 m outward (i.e., the depth-of-field range is 60 m to infinity). Fig. 9 shows that beyond 60 m the image height ratio still decreases monotonically as the object distance increases. Fig. 10 shows that for a 1 m tall object, both image heights remain measurable from 60 m to 100 m. Figs. 7, 8, 9, and 10 show that the optical system of the present invention can take a set of dual photos within the system's depth of field and measure and calculate the three-dimensional coordinates of objects within that depth of field. The measurement method is to measure the image heights H1, H2 of the image point groups on the dual photos and the optical meridian plane angle α1 = α2 = α0, and to use Formulas 7 and 8 to calculate the object distance U0 and object height H0 of the corresponding object point groups, thereby obtaining the three-dimensional coordinates of the object point groups within the system's depth of field: α0, U0, H0.
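As a numerical cross-check of the trends reported for Figs. 9 and 10, the short sketch below sweeps the same reconstructed forward model (Formulas 2 to 6) over object distances of 50 m to 100 m and prints both image heights and their ratio; the printed values follow from the reconstruction used here, not from the original figures.

    f1, f2, D, H0 = 35.0, 60.0, 145.0, 1000.0   # Fig. 5 example parameters, 1 m tall object
    print("U0[m]   H1[mm]   H2[mm]    H2/H1")
    for U0_m in (50, 60, 70, 80, 90, 100):
        U0 = 1000.0 * U0_m                      # object distance in mm
        V1 = f1 * U0 / (U0 - f1)                # Formula 2
        H1 = H0 * V1 / U0                       # Formula 3
        H2 = H1 * f2 / (D - V1 - f2)            # Formulas 4, 5 and 6
        print(f"{U0_m:5d} {H1:8.4f} {H2:8.4f} {H2 / H1:8.5f}")

In this sweep the height ratio H2/H1 decreases monotonically with object distance while both image heights stay in an easily measurable sub-millimetre range, consistent with the behaviour described above.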
By changing the focal lengths (f1, f2) of the system and the optical principal plane spacing (D), the depth-of-field range of the system can be changed. By taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of groups of objects in three-dimensional space can be measured over a larger object-distance range, or a stereo photograph of groups of three-dimensional objects over a larger object-distance range can be taken.

The above three-dimensional coordinate measurement and calculation principle of the three-dimensional space stereo camera can be further extended to the universally applicable non-paraxial approximation and to other, non-coaxial optical systems, in which the formulas for calculating the three-dimensional coordinates of an object point can be unified as follows.

They are all functions of the following three classes of parameters:

The set of all optical system parameters of the optical system, Oi (i = 1, …, N);
Oi is the minimal or independent set of N optical system parameters that can describe the characteristics of the entire optical system, such as the object-side/image-side focal lengths of each optical component, the object-side/image-side principal plane spacings, the aspheric geometric parameters of the components, the principal plane spacings or relative positions between components, and so on.
The optical meridian plane angles of image point one and image point two, α1, α2;
The image heights of image point one and image point two, H1, H2.
That is:
The optical meridian plane angle of the object point, α0;
α0 = Φ(Oi, α1, α2, H1, H2)
i = 1, 2, … N     (Formula 11)
The object distance of the object point, U0;
U0 = Ω(Oi, α1, α2, H1, H2)
i = 1, 2, … N     (Formula 12)
The object height of the object point, H0;
H0 = Ψ(Oi, α1, α2, H1, H2)
i = 1, 2, … N     (Formula 13)

Formulas 11, 12, and 13 can be applied to the calculation of the three-dimensional coordinates of the object points or object point groups in the following implementation examples.
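Formulas 11 to 13 only assert that (α0, U0, H0) are functions of the parameter set Oi and the four measurements. One generic way to realize such functions numerically, sketched below as an assumption rather than as the patent's own algorithm, is to fit the object coordinates to the measurements with a least-squares solver, given any forward model of the optical system; here the paraxial two-lens model of Experimental Example 1 stands in for the forward model, and a non-paraxial or non-coaxial system would substitute ray tracing with its own Oi.

    import numpy as np
    from scipy.optimize import least_squares

    def forward(point, Oi):
        # Predict the measurements (alpha1, alpha2, H1, H2) for an object
        # point (alpha0, U0, H0) under parameter set Oi = (f1, f2, D).
        alpha0, U0, H0 = point
        f1, f2, D = Oi
        V1 = f1 * U0 / (U0 - f1)
        H1 = H0 * V1 / U0
        H2 = H1 * f2 / (D - V1 - f2)
        return np.array([alpha0, alpha0, H1, H2])

    def solve_point(measured, Oi, guess=(0.0, 50_000.0, 500.0)):
        # Numerical realization of Formulas 11-13: recover (alpha0, U0, H0)
        # from measured = np.array([alpha1, alpha2, H1, H2]).
        fit = least_squares(lambda p: forward(p, Oi) - measured, guess)
        return fit.x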
Experimental Example 2

Specifically, see Fig. 11. Fig. 11 is an example schematic diagram of an optical path at a certain optical meridian plane angle, corresponding to the three-dimensional space stereo camera system structure of Fig. 2. The optical path setup and the photographing process are described as follows:

1. The optical path system is a coaxial spherical system.
2. The photographic imaging lenses of the image one and image two photographing units are both convex lenses, convex lens 13 and convex lens 14, with focal lengths f1 and f2 respectively.
3. The distance between the optical principal planes of convex lenses 13 and 14 is D.
4. Behind the image one photographing unit there is an image one conversion unit.
5. The image one conversion unit also uses a convex lens, convex lens 16 (convex lens Fi), with focal length fi.
6. The distance between the optical principal planes of convex lens 16 and convex lens 14 is E.
7. The image one photographing unit uses a semi-transmissive, semi-reflective reflector 15 (as shown in Fig. 11, the upward-facing side of reflector 15 is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens 13. This semi-transmissive, semi-reflective 45° reflector 15 reflects part of the light converged by convex lens 13 and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit, yielding two-dimensional image one.
8. On the above two-dimensional image one there are the image height of image point one from the (light of the) object point (or image point group one of an object) and its optical meridian plane angle coordinate.
9. The light transmitted through the semi-transmissive, semi-reflective 45° reflector 15 continues onward and forms an image on the optical path of the image one photographing unit (the image point at image distance V1 and image height H1 of lens F1 shown in Fig. 11).
10. The light (emitted) from the above image point H1 on the optical path passes through convex lens Fi (16) and forms an image on the optical path of the image one conversion unit (the image point at image distance Vi and image height Hi of lens Fi shown in Fig. 11). The image point shown in the figure is a virtual image.
11. The light (emitted) from the above image point Hi on the optical path passes through convex lens F2 and is imaged on the photoreceptor in the image two photographing unit, yielding two-dimensional image two.
12. On the above two-dimensional image two there are the image height of image point two, derived indirectly from the object point (or image point group two, derived indirectly from the object), and its optical meridian plane angle coordinate.

The three-dimensional coordinates of object points or object point groups within the depth of field of the optical system are calculated by Formulas 11, 12, and 13. By changing the focal lengths (f1, f2, fi) of the system and the optical principal plane spacings (D, E), the depth-of-field range of the system can be changed. By taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of groups of objects in three-dimensional space can be measured over a larger object-distance range, or a stereo photograph of groups of three-dimensional objects over a larger object-distance range can be taken.
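For this configuration the forward model is simply a chain of three paraxial stages with Oi = (f1, fi, f2, D, E). The sketch below, an illustration consistent with the reconstructed Formulas 2 to 6 rather than a formula from the patent itself, propagates the intermediate image through the conversion lens Fi, with a negative image distance Vi denoting the virtual image mentioned above.

    def forward_with_converter(U0, H0, f1, fi, f2, D, E):
        # Chain object -> lens F1 -> conversion lens Fi -> lens F2.
        # Lens Fi sits a distance D - E behind lens F1; signed heights
        # encode image inversion.
        V1 = f1 * U0 / (U0 - f1)     # image of F1 (image point H1)
        H1 = H0 * V1 / U0
        Ui = (D - E) - V1            # object distance seen by Fi
        Vi = fi * Ui / (Ui - fi)     # Vi < 0 marks the virtual image point Hi
        Hi = H1 * Vi / Ui
        U2 = E - Vi                  # object distance seen by F2
        V2 = f2 * U2 / (U2 - f2)
        H2 = Hi * V2 / U2            # image height recorded as image two
        return H1, H2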
Experimental Example 3

Specifically, see Fig. 12. Fig. 12 is an example schematic diagram of an optical path at a certain optical meridian plane angle, corresponding to the three-dimensional space stereo camera system structure of Fig. 2. The optical path setup and the photographing process are described as follows:

1. The optical path system is a coaxial spherical system.
2. The image one photographing unit uses convex lens 13 (convex lens F1), with focal length f1. The image two photographing unit uses concave mirror 20, with focal length f2.
3. Concave mirror 20 is a concave mirror whose convex surface is fully transmissive and whose concave surface is partially reflective.
4. The distance between the optical principal planes of convex lens 13 and concave mirror 20 is D.
5. Behind the image one photographing unit there is an image one conversion unit.
6. The image one conversion unit uses a plane reflector Fi (17) whose left side, as shown in Fig. 12, is reflective.
7. The distance between the optical principal planes of concave mirror 20 and reflector Fi (17) is E.
8. The image one photographing unit uses a semi-transmissive, semi-reflective reflector 15 (as shown in Fig. 12, the upward-facing side of the reflector is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens 13. This semi-transmissive, semi-reflective 45° reflector 15 reflects part of the light that has been converged by convex lens 13 and transmitted through concave mirror 20, and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit, yielding two-dimensional image one.
9. On the above two-dimensional image one there are the image height of image point one from the (light of the) object point (or image point group one of an object) and its optical meridian plane angle coordinate.
10. The light transmitted through the semi-transmissive, semi-reflective 45° reflector continues onward and forms an image on the optical path of the image one photographing unit (marked at image point H1 to the right of the semi-transmissive, semi-reflective 45° reflector shown in Fig. 12).
11. The light (emitted) from the above image point H1 on the optical path is reflected by reflector Fi (17) and imaged at Hi to the right of reflector Fi (17) as shown in Fig. 12. This image point Hi is a virtual image.
12. The light (emitted) from the above image point Hi on the optical path passes through concave mirror 20 and is imaged on the photoreceptor in the image two photographing unit, yielding two-dimensional image two.
13. On the above two-dimensional image two there are the image height of image point two, derived indirectly from the object point (or image point group two, derived indirectly from the object), and its optical meridian plane angle coordinate.

The three-dimensional coordinates of object points or object point groups within the depth of field of the optical system are calculated by Formulas 11, 12, and 13. By changing the focal lengths (f1, f2) of the system and the optical principal plane spacings (D, E), the depth-of-field range of the system can be changed. By taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of groups of objects in three-dimensional space can be measured over a larger object-distance range, or a stereo photograph of groups of three-dimensional objects over a larger object-distance range can be taken.
Experimental Example 4

Fig. 13 is an example schematic diagram of an optical path at a certain optical meridian plane angle, corresponding to the three-dimensional space stereo camera system structure of Fig. 2. The optical path setup and the photographing process are described as follows:

1. The optical path system is a coaxial spherical system.
2. The image one photographing unit uses convex lens 13 (convex lens F1), with focal length f1. The image two photographing unit uses concave mirror 20, with focal length f2.
3. Concave mirror 20 is a concave mirror whose convex surface is fully transmissive and whose concave surface is partially reflective.
4. The distance between the optical principal planes of convex lens 13 and concave mirror 20 is D.
5. Behind the image one photographing unit there is an image one conversion unit.
6. The image one conversion unit uses a volumetric diffuser or volumetric luminous body Fi (18), as shown in Fig. 13.
7. The distance between the optical principal planes of concave mirror 20 and the volumetric diffuser Fi (18) is E, as shown in Fig. 13.
8. The image one photographing unit uses a semi-transmissive, semi-reflective reflector (as shown in Fig. 13, the upward-facing side of the reflector is reflective and the other side is transparent), placed at 45° to the optical axis in front of the focal point of convex lens F1. This semi-transmissive, semi-reflective 45° reflector reflects part of the light that has been converged by convex lens F1 and transmitted through concave mirror 20, and images this reflected part on the photoreceptor (such as a CCD or CMOS) in the image one photographing unit, yielding two-dimensional image one.
9. On the above two-dimensional image one there are the image height of image point one from the (light of the) object point (or image point group one of an object) and its optical meridian plane angle coordinate.
10. The light transmitted through the semi-transmissive, semi-reflective 45° reflector continues onward and forms an image on the optical path of the image one photographing unit (marked at image point H1 to the right of the semi-transmissive, semi-reflective 45° reflector shown in Fig. 13).
11. The light (emitted) from the above image point H1 on the optical path is scattered by the volumetric diffuser and can be regarded as imaged in place, at the same location as image point H1. This image point Hi is a real image.
12. The light (emitted) from the above image point Hi on the optical path passes through concave mirror 20 and is imaged on the photoreceptor in the image two photographing unit, yielding two-dimensional image two.
13. On the above two-dimensional image two there are the image height of image point two, derived indirectly from the object point (or image point group two, derived indirectly from the object), and its optical meridian plane angle coordinate.

The three-dimensional coordinates of object points or object point groups within the depth of field of the optical system are calculated by Formulas 11, 12, and 13. By changing the focal lengths (f1, f2) of the system and the optical principal plane spacings (D, E), the depth-of-field range of the system can be changed. By taking dual photos with different depth-of-field ranges (scan photographing), the three-dimensional coordinates (optical meridian plane angle, object distance, object height) of groups of objects in three-dimensional space can be measured over a larger object-distance range, or a stereo photograph of groups of three-dimensional objects over a larger object-distance range can be taken.

Claims (12)

  1. A three-dimensional space camera, characterized in that it comprises an image one photographing unit and an image two photographing unit under the same optical system, and further comprises a processing system that performs data processing for the image one photographing unit and the image two photographing unit, the processing system comprising a control unit that controls image capture by the image one photographing unit and the image two photographing unit, a record storage unit that records and stores the captured data, and a three-dimensional coordinate calculation unit that performs calculations on the data in the record storage unit, the processing system being connected to the image one photographing unit and the image two photographing unit through signal connection and control lines.
  2. The three-dimensional space camera according to claim 1, characterized in that the same optical system is a coaxial spherical system, and the distance between the optical principal planes of the image one photographing unit and the image two photographing unit is D, where D is not less than zero.
  3. The three-dimensional space camera according to claim 2, characterized in that an image one conversion unit is further provided between the image one photographing unit and the image two photographing unit, the image one conversion unit being connected to the processing system through signal connection and control lines; there is at least one image one conversion unit, and the distance between the optical principal planes of the image one conversion unit and the image one photographing unit is E, where E is not less than zero.
  4. The three-dimensional space camera according to claim 2, characterized in that the image one photographing unit is additionally provided with at least one image one parallel correlated imaging unit, the image one parallel correlated imaging unit being connected to the processing system through signal connection and control lines; the optical path of the image one parallel correlated imaging unit is the same as, or correlated with, the optical path of the image one photographing unit, and the distance between the optical principal planes of the image one parallel correlated imaging unit and the image one photographing unit is G, where G is not less than zero.
  5. The three-dimensional space camera according to claim 4, characterized in that at least one image one parallel correlated imaging unit conversion unit is further provided between the image one parallel correlated imaging unit and the image two photographing unit, the image one parallel correlated imaging unit conversion unit being connected to the processing system through signal connection and control lines, and the distance between the image one parallel correlated imaging unit and the conversion unit is K, where K is not less than zero.
  6. A photographing method of the three-dimensional space camera according to claim 1, wherein, when the optical system and its optical subsystems are all coaxial spherical systems, the three-dimensional coordinates of an object point are characterized by object height H0, object distance U0, and optical meridian plane angle α, the method being characterized in that it comprises the following steps:
    1) the image one photographing unit photographs an object point in three-dimensional space and photosensitively records a two-dimensional image one; on this two-dimensional image one there are the image height of image point one of the object point and its optical meridian plane angle coordinate;
    2) the image two photographing unit photographs image point one on the optical path of the image one photographing unit and photosensitively records a two-dimensional image two; on this two-dimensional image two there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate;
    3) the image capture control unit sets a group of optical system parameters with which image point one and image point two of an object point within a certain depth of field can be clearly recorded on image one and image two respectively; the record storage unit records and stores the data of image one and image two on the photoreceptors of the image one photographing unit and the image two photographing unit; and the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining and storing the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point;
    4) repeating step 3) to obtain and store the three-dimensional coordinate information of three-dimensional object points within another depth of field;
    5) for the object points whose three-dimensional coordinates are to be measured, dividing the range into one or more depths of field and repeating steps 3) and 4) to obtain a set of paired images one and two of the object points over the entire required depth range, together with the three-dimensional coordinate information (object distance, object height, and optical meridian plane angle) of the spatial object points over the entire depth range, storing them, and then displaying a three-dimensional stereo photograph of the object through a display or projector.
  7. The photographing method of the three-dimensional space camera according to claim 6, characterized in that in step 2), at least one image one conversion unit is provided between the image one photographing unit and the image two photographing unit; the image one conversion unit optically converts image point one to an image point i on the optical path of the image one conversion unit; and the image two photographing unit photographs image point i and photosensitively records a two-dimensional image three, on which there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate.
  8. The photographing method of the three-dimensional space camera according to claim 6, characterized in that in step 2), the image one photographing unit is additionally provided with at least one image one parallel correlated imaging unit, which also images the object point in three-dimensional space to obtain an image point one'; and the image two photographing unit photographs image point one' and photosensitively records a two-dimensional image four, on which there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate.
  9. The photographing method of the three-dimensional space camera according to claim 8, characterized in that in step 2), at least one image one parallel correlated conversion unit is further provided between the image one parallel correlated imaging unit and the image two photographing unit; the image one parallel correlated imaging unit also images the object point in three-dimensional space to obtain an image point one'; the image one parallel correlated conversion unit optically converts image point one' to an image point i' on the optical path of the image one parallel correlated conversion unit; and the image two photographing unit photographs image point i' and photosensitively records a two-dimensional image five, on which there are the image height of image point two, derived indirectly from the object point, and its optical meridian plane angle coordinate.
  10. The photographing method of the three-dimensional space camera according to any one of claims 7 to 9, characterized in that the light or wave used to form image one and image two, or image three, or image four, or image five is one of visible light, infrared light, ultraviolet light, X-ray electromagnetic waves, and ultrasonic waves.
  11. The photographing method of the three-dimensional space camera according to any one of claims 7 to 9, characterized in that the light emitted from the object point is either the light of the object point itself or the light produced by a light source illuminating the object point.
  12. The photographing method of the three-dimensional space camera according to any one of claims 7 to 9, characterized in that in step 3), the three-dimensional coordinate calculation unit uses the data recorded by the record storage unit and the optical system parameters of the image one photographing unit and the image two photographing unit to calculate the two-dimensional coordinates of the image points on image one and image two, obtaining the three-dimensional coordinates (object distance, object height, and optical meridian plane angle) of the spatial three-dimensional object point, the specific calculation process being as follows:
    the set of all optical system parameters of the optical system, Oi (i = 1, …, N);
    Oi is the minimal or independent set of N optical system parameters describing the characteristics of the entire optical system;
    the optical meridian plane angles of image point one and image point two, α1, α2;
    the image heights of image point one and image point two, H1, H2;
    that is:
    the optical meridian plane angle of the object point, α0;
    α0 = Φ(Oi, α1, α2, H1, H2)
    i = 1, 2, … N  (Formula 11)
    the object distance of the object point, U0;
    U0 = Ω(Oi, α1, α2, H1, H2)
    i = 1, 2, … N  (Formula 12)
    the object height of the object point, H0;
    H0 = Ψ(Oi, α1, α2, H1, H2)
    i = 1, 2, … N  (Formula 13).
PCT/CN2021/075868 2020-01-22 2021-02-07 Three-dimensional space camera and photographing method therefor WO2021148050A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/794,264 US11997247B2 (en) 2020-01-22 2021-02-07 Three-dimensional space camera and photographing method therefor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010075630.0 2020-01-22
CN202010075630.0A CN111277811B (zh) 2020-01-22 2020-01-22 Three-dimensional space camera and photographing method therefor

Publications (1)

Publication Number Publication Date
WO2021148050A1 true WO2021148050A1 (zh) 2021-07-29

Family

ID=71001209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/075868 WO2021148050A1 (zh) Three-dimensional space camera and photographing method therefor

Country Status (3)

Country Link
US (1) US11997247B2 (zh)
CN (1) CN111277811B (zh)
WO (1) WO2021148050A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111277811B (zh) 2020-01-22 2021-11-09 上海爱德赞医疗科技有限公司 Three-dimensional space camera and photographing method therefor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107801017A * 2017-10-30 2018-03-13 北京都是科技有限公司 Multi-lens 3D camera and 3D image modeling method therefor
CN107864372A * 2017-09-22 2018-03-30 捷开通讯(深圳)有限公司 Stereoscopic photographing method and apparatus, and terminal
CN109068035A * 2018-07-13 2018-12-21 中科光电(北京)科学技术有限公司 Intelligent micro-camera array endoscopic imaging system
WO2019007180A1 * 2017-07-06 2019-01-10 杭州思看科技有限公司 Handheld large-scale three-dimensional measurement scanner system having both photogrammetry and three-dimensional scanning functions
CN111277811A * 2020-01-22 2020-06-12 上海爱德赞医疗科技有限公司 Three-dimensional space camera and photographing method therefor

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE48834E1 (en) * 2011-12-15 2021-11-30 Synthes Gmbh Method and a device for computer assisted surgery
US10848731B2 (en) * 2012-02-24 2020-11-24 Matterport, Inc. Capturing and aligning panoramic image and depth data
CN102999939B (zh) * 2012-09-21 2016-02-17 魏益群 Coordinate acquisition device, real-time three-dimensional reconstruction system and method, and stereoscopic interactive device
HK1221372A2 (zh) * 2016-03-29 2017-05-26 萬維數碼有限公司 Method, apparatus and device for obtaining a spatial audio directional vector
WO2017205841A1 (en) * 2016-05-27 2017-11-30 Craig Peterson Combining vr or ar with autostereoscopic usage in the same display device
HK1217870A2 (zh) * 2016-06-10 2017-01-20 Marvel Digital Ltd Lens for an extended light source and design method therefor
CN107749053A (zh) * 2017-10-24 2018-03-02 郑州布恩科技有限公司 Binocular image acquisition and preprocessing device and method for a visual prosthesis
CN107991838B (zh) * 2017-11-06 2020-10-23 万维科研有限公司 Adaptive three-dimensional stereoscopic imaging system

Also Published As

Publication number Publication date
CN111277811A (zh) 2020-06-12
CN111277811B (zh) 2021-11-09
US20230084212A1 (en) 2023-03-16
US11997247B2 (en) 2024-05-28

Similar Documents

Publication Publication Date Title
CN103471715B (zh) Common-optical-path combined light-field spectral imaging method and device
CN102494609B (zh) Three-dimensional photography method and device based on a laser probe array
US20190114796A1 (en) Distance estimation method based on handheld light field camera
CN108171758B (zh) Multi-camera calibration method based on the minimum optical path principle and a transparent glass calibration plate
US10715711B2 (en) Adaptive three-dimensional imaging system and methods and uses thereof
WO2006075528A1 (ja) Three-dimensional object measuring device
WO2020207172A1 (zh) Optical unmanned aerial vehicle monitoring method and system based on three-dimensional light field technology
She et al. Adjustment and calibration of dome port camera systems for underwater vision
CN112085793B (zh) Three-dimensional imaging scanning system based on a combined lens group, and point cloud registration method
WO2021148050A1 (zh) Three-dimensional space camera and photographing method therefor
JP2002152779A (ja) Three-dimensional image detection device
CN115151945A (zh) Converting the coordinate system of a three-dimensional camera to the incidence point of a two-dimensional camera
JP2006329897A (ja) Distance measurement method using a double image reflected in a transparent plate
WO2023065721A1 (en) Methods, devices and systems for transparent object three-dimensional reconstruction
CN114111626B (zh) Light field camera three-dimensional measurement device and system based on coaxial projection
JP5147055B2 (ja) Distance measuring device and distance measuring method
US20070002161A1 (en) Focus detecting apparatus and image pickup apparatus
WO2021093804A1 (zh) Camera configuration system and camera configuration method for omnidirectional stereo vision
JP7314659B2 (ja) Distance measuring device and imaging device
RU2790049C1 (ru) Method for anisotropic recording of a light field and device for implementing same
CN109470148A (zh) High-resolution stereo vision system with rotating cylindrical mirror, and measurement method
CN221302522U (zh) Dual-polarization receiving module, depth camera, and electronic device
RU2760845C1 (ru) Method for detecting and determining the characteristics of targets based on recording and processing ray paths from objects in the observed space, and device for implementing same
Darwish et al. Plenoptic camera calibration based on sub-aperture images
CN109470146B (zh) High-resolution stereo vision system and measurement method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21743891

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21743891

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/01/2023)