WO2018001252A1 - Projection unit, and photographing device, processor and imaging device including same - Google Patents

Projection unit, and photographing device, processor and imaging device including same

Info

Publication number
WO2018001252A1
WO2018001252A1 (PCT application PCT/CN2017/090394; related CN2017090394W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
matching
target object
imaging
Prior art date
Application number
PCT/CN2017/090394
Other languages
English (en)
French (fr)
Inventor
陈森淼
Original Assignee
鲁班嫡系机器人
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 鲁班嫡系机器人
Priority to US16/313,387 priority Critical patent/US20230199324A1/en
Priority to EP17819249.8A priority patent/EP3481062A4/en
Publication of WO2018001252A1 publication Critical patent/WO2018001252A1/zh

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
            • H04N 23/50: Constructional details
              • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
            • H04N 23/70: Circuitry for compensating brightness variation in the scene
              • H04N 23/75: by influencing optical camera components
              • H04N 23/741: by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
            • H04N 23/80: Camera processing pipelines; components thereof
              • H04N 23/82: for controlling camera response irrespective of the scene brightness, e.g. gamma correction
          • H04N 13/00: Stereoscopic video systems; multi-view video systems; details thereof
            • H04N 13/20: Image signal generators
              • H04N 13/204: using stereoscopic image cameras
                • H04N 13/254: in combination with electromagnetic radiation sources for illuminating objects
              • H04N 13/207: using a single 2D image sensor
                • H04N 13/218: using spatial multiplexing
              • H04N 13/261: with monoscopic-to-stereoscopic image conversion
          • H04N 17/00: Diagnosis, testing or measuring for television systems or their details
            • H04N 17/002: for television cameras
          • H04N 5/00: Details of television systems
            • H04N 5/222: Studio circuitry; studio devices; studio equipment
              • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects
                • H04N 5/265: Mixing
    • G: PHYSICS
      • G01: MEASURING; TESTING
        • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
          • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
            • G01B 11/24: for measuring contours or curvatures
              • G01B 11/25: by projecting a pattern, e.g. one or more lines or moiré fringes, on the object
                • G01B 11/2513: with several lines being projected in more than one direction, e.g. grids, patterns
      • G02: OPTICS
        • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
            • G02B 27/28: for polarising
              • G02B 27/281: used for attenuating light intensity, e.g. comprising rotatable polarising elements
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 7/00: Image analysis
            • G06T 7/50: Depth or shape recovery
              • G06T 7/521: from laser ranging, e.g. using interferometry; from the projection of structured light
              • G06T 7/55: from multiple images
                • G06T 7/593: from stereo images
          • G06T 2207/00: Indexing scheme for image analysis or image enhancement
            • G06T 2207/10: Image acquisition modality
              • G06T 2207/10028: Range image; depth image; 3D point clouds
              • G06T 2207/10141: Special mode during image acquisition
            • G06T 2207/20: Special algorithmic details
              • G06T 2207/20212: Image combination

Definitions

  • the present invention relates to the field of imaging technologies, and in particular, to a projection unit and a camera, a processor, and an imaging device including the same.
  • the prior art generally generates discrete spots by projecting discrete beams onto the target area (i.e., by emitting discrete light beams toward the target area); the resulting spot pattern is unique within a certain region, so that an image of the target object covered by the discrete spots can be captured and then matched against the unique features of the spot pattern within that spatial range.
  • the problem is that, with this discrete-spot matching method, the projected light lacks spatial continuity, so the final matching result of the back-end processor cannot reach very high (sub-pixel) accuracy.
  • the application with serial number US 13/907,426 requires continuously projecting multiple phase-shifted fringe images superimposed with speckle. Although this can improve imaging accuracy, it requires capturing multiple photographs of the same scene: on the one hand this is slow and makes good real-time performance difficult to achieve; on the other hand the point-cloud imaging accuracy for moving objects is insufficient.
  • the present invention provides a projection unit, and a photographing device, processor, and imaging device including the same. The projection unit projects an image that follows a certain gradation law and is unique within a certain spatial range; combining the relevant characteristics of this image improves the accuracy of point-cloud generation in the back-end processor while maintaining good real-time performance.
  • a first aspect of the present invention provides a projection unit for projecting, onto a target area including a target object, an image that follows a certain gradation law and is unique within a certain spatial range.
  • a second aspect of the present invention provides a photographing apparatus, comprising: at least one projection unit, and at least one image pickup unit.
  • the projection unit is configured to project an image to a target area including the target object, the image has a certain gradation law and is unique within a certain spatial range;
  • the image capturing unit is configured to acquire at least one image including at least a target object after the image is projected by the projection unit.
  • a third aspect of the invention provides an imaging device for three-dimensional imaging, comprising the photographing device described above.
  • a fourth aspect of the present invention provides a processor of an imaging device, the processor comprising at least one first matching unit, the first matching unit comprising at least first and second calculating units;
  • the first calculating unit is configured to match the first and second images;
  • the second calculating unit is configured to apply the gradation law of the image to the matched first and second images, thereby achieving higher-precision matching of the first and second images.
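The two-stage matching described for the first matching unit can be sketched in one dimension. This is an editor's illustration under stated assumptions, not the patent's implementation: `coarse_match` stands in for the first calculating unit (integer matching), and `subpixel_refine` for the second, assuming the projected gradation law is locally linear (the triangular-wave case the description discusses). All names are illustrative.

```python
def triangular_wave(t, period=16.0, amplitude=100.0):
    """Linearly varying ('triangular') intensity profile: the gradient is
    constant in magnitude everywhere, including near peaks and troughs."""
    phase = (t / period) % 1.0
    return amplitude * (2 * phase if phase < 0.5 else 2 * (1 - phase))

def sample(f, shift, n):
    """Sample profile f at integer pixel positions, shifted by `shift` pixels."""
    return [f(i - shift) for i in range(n)]

def coarse_match(a, b, max_shift):
    """First stage: pick the integer shift minimising the sum of absolute
    differences over the overlapping region (standard block matching)."""
    best, best_err = 0, float("inf")
    for k in range(max_shift + 1):
        err = sum(abs(b[i] - a[i - k]) for i in range(max_shift, len(a)))
        if err < best_err:
            best, best_err = k, err
    return best

def subpixel_refine(a, b, k, x):
    """Second stage: exploit the known linear gradation law to solve for the
    fractional part of the shift at sample position x."""
    slope = a[x - k + 1] - a[x - k]
    if slope == 0:          # flat region: no refinement possible
        return float(k)
    frac = (a[x - k] - b[x]) / slope
    return k + frac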
  • a fifth aspect of the invention provides a processor of an imaging device, the processor comprising a storage unit and at least one second matching unit, the second matching unit comprising at least fourth and fifth calculating units;
  • the storage unit is configured to acquire and store, in advance, images of a plurality of target areas as reference images;
  • the fourth calculating unit is configured to match the acquired image containing at least the target object with the plurality of pre-stored reference images;
  • the fifth calculating unit applies the gradation law to the image containing the target object and the matched reference image, achieving higher-precision matching between them.
  • by adopting the projection unit and the photographing device, processor, and imaging device including the same, the embodiments of the present invention achieve the following technical effects:
  • the present invention projects, via the projection unit, an image with a certain gradation law and uniqueness within a certain spatial range onto the target area, and captures the target object covered by this image with the camera unit, thereby avoiding repeated multiple shots. Combined with the relevant characteristics of this image, this improves the accuracy of subsequent point-cloud generation in the back-end processor while offering better real-time performance.
  • the imaging device used in the present invention projects, via the projection unit, an image having a certain gradation law and uniqueness within a certain spatial range, and the camera unit captures an image containing at least the target object after the image is projected. Combined with the relevant characteristics of this image, this improves the accuracy of subsequent point-cloud generation in the back-end processor while offering better real-time performance.
  • since reflective surfaces on the target strongly affect the scanning result during shooting (once the surface of the object is reflective, the 3D point-cloud image of that area cannot be obtained), polarizing units are disposed on the projection unit and the imaging unit respectively. Through the cooperation of the two polarizing units, the brightness of the reflective surface can be greatly reduced, so that an image can be acquired even from the reflective surface of the target object.
  • when the camera unit of the photographing device further includes at least one adjusting unit, or the back-end processor of the three-dimensional imaging device further includes at least one adjusting unit, the latitude of the camera unit or of the captured image with respect to light brightness can be increased.
  • the projection unit includes a laser emitting module and a diffraction sheet located in front of the laser emitting module, or a MEMS module; compared with a conventional projector, this yields a scanning range with a larger depth of field at a lower cost.
  • FIG. 1 is a structural block diagram of a photographing apparatus according to an embodiment of the present invention;
  • FIG. 2 is a structural block diagram of another imaging apparatus according to an embodiment of the present invention.
  • FIG. 3 is a structural block diagram of another imaging apparatus according to an embodiment of the present invention.
  • FIG. 4 is a structural block diagram of another imaging apparatus according to an embodiment of the present invention.
  • FIG. 5 is a structural block diagram of an image forming apparatus according to an embodiment of the present invention.
  • FIG. 6 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • FIG. 7 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • FIG. 8 is a schematic flowchart of a photographing method according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram showing the difference between the projected image and the prior art discrete spot according to the embodiment of the present invention, wherein the upper part is a discrete spot, and the lower part is a projected image provided by the embodiment of the present invention;
  • FIG. 10 is a schematic diagram showing the continuous light sinusoidal shape of the overall light intensity of the projected image with a certain gradation law according to an embodiment of the present invention.
  • FIG. 11 is a partial flowchart of an imaging method according to an embodiment of the present invention.
  • FIG. 12 is a partial flowchart of another imaging method according to an embodiment of the present invention.
  • FIG. 13 is a partial flowchart of another imaging method according to an embodiment of the present invention.
  • FIG. 14 is a partial flowchart of another imaging method according to an embodiment of the present invention.
  • FIG. 15 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • FIG. 16 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • An embodiment of the present invention provides a projection unit, and a camera, processor, and imaging device including the unit. The projection unit projects an image that follows a certain gradation law and is unique within a certain spatial range; combining the relevant characteristics of this image improves the accuracy of subsequent point-cloud generation in the back-end processor while offering better real-time performance.
  • the prior art forms a discrete spot pattern by projecting a discrete beam (e.g., a laser beam, as shown in the upper diagram of FIG. 9) onto the target area. An image of the target object covered by the discrete spots in the target area is then captured, and the image is matched using the discrete spot pattern.
  • the problem is that, with this method of matching discrete spots, the intensity of the projected pattern lacks spatial continuity, so the final matching result of the back-end processor cannot reach high precision (such as sub-pixel precision).
  • that patent (US 13/907,426) requires continuously projecting multiple phase-shifted fringe images superimposed with speckle. Although it can improve imaging accuracy, it needs to capture multiple photographs of the same scene, so on the one hand it is difficult to achieve good real-time performance, and on the other hand its point-cloud imaging accuracy for moving objects is insufficient.
  • the present invention provides a photographing apparatus for photographing a target object located in a target area.
  • FIG. 1 is a schematic structural diagram of a photographing apparatus according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram showing the difference between the projected image and the prior art discrete spot according to the embodiment of the present invention, wherein the upper part is a discrete spot, and the lower part is a projected image provided by the embodiment of the present invention.
  • FIG. 10 is a schematic diagram showing the continuous light sinusoidal shape of the overall light intensity of the projected image with a certain gradation law according to an embodiment of the present invention.
  • the apparatus 100 includes at least one projection unit 101 and at least one imaging unit 102.
  • the projection unit 101 is configured to project an image to a target area including the target object, the image has a certain gradation rule and is unique within a certain spatial range.
  • the projected image is a single image.
  • the image needs to cover the target object. If one image cannot cover the entire target object due to the limitation of the shooting range, multiple images may be captured and then stitched in a subsequent processor to form a single image.
  • the splicing technology of the back-end processor belongs to the prior art and will not be described here.
  • the intensity of the projected image, which is unique within a certain spatial range, varies continuously in a sinusoidal manner.
  • the sine wave need not fully conform to the standard sine form; it may also be close to a sine wave: an imperfectly regular sine wave, or a linearly varying "sine wave", also called a triangular wave.
  • the advantage of using a linearly varying (triangular) wave is the following: near the peaks and troughs of a true sine wave, the difference in pixel values between adjacent pixels is small, which may prevent the subsequent processor from performing higher-precision matching well. An image with a triangular-wave profile has a linearly changing light intensity, so no region of small adjacent-pixel differences appears near the peak and trough positions, which is advantageous for improving the accuracy of subsequent matching.
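This point can be checked numerically by comparing the adjacent-pixel brightness steps of the two gradation profiles. The sketch below is the editor's illustration, not code from the patent; the function names and the chosen period and amplitude are arbitrary assumptions.

```python
import math

def sine_intensity(x, period=32.0, amplitude=127.5):
    """Standard sinusoidal gradation: brightness follows a sine wave."""
    return amplitude * (1 + math.sin(2 * math.pi * x / period))

def triangular_intensity(x, period=32.0, peak=255.0):
    """Linearly varying ('triangular') gradation: brightness ramps up and
    down at a constant rate, so adjacent-pixel differences never collapse
    toward zero near peaks and troughs."""
    phase = (x / period) % 1.0
    return peak * (2 * phase if phase < 0.5 else 2 * (1 - phase))

def adjacent_differences(f, n):
    """Per-pixel brightness steps |f(x+1) - f(x)| for x = 0..n-2."""
    vals = [f(x) for x in range(n)]
    return [abs(b - a) for a, b in zip(vals, vals[1:])]
```

For the sine profile the smallest step (found at the peaks and troughs) is an order of magnitude smaller than its largest step, while the triangular profile keeps a constant step everywhere, which is exactly the property the matching stage relies on.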
  • the projection unit of the present invention projects an image having a certain gradation law and uniqueness within a certain spatial range onto the target area, and the imaging unit captures the target object covered by this image, thereby avoiding repeated shooting. Combined with the relevant characteristics of the image, this improves the accuracy of subsequent point-cloud generation in the back-end processor while offering better real-time performance.
  • a target object is placed in the target area to ensure that the edge of the target object is located within the target area. This ensures that the image can cover the target object when the image is projected onto the target area.
  • the image capturing unit 102 is configured to acquire at least one image including at least a target object after the image is projected by the projection unit.
  • when the projection unit projects the image onto the target area, the image overlays the target area and the target object located in it. Therefore, the image containing at least the target object obtained by the imaging unit is an image of the target object covered by the projected pattern.
  • acquiring an image containing at least the target object means that the obtained image may include only the target object, or may also include parts of the target area other than the target object.
  • an image having a certain gradation law and uniqueness within a certain spatial range is projected onto the target area by the projection unit, and the imaging unit captures an image of at least the target object after the image is projected. Combining the relevant features of this image improves the accuracy of point-cloud generation in subsequent processors, while offering good real-time performance.
  • the gradation may include repeated gradations of light and dark (as shown in the lower diagram of FIG. 9), and the light-and-dark stripes may run in various directions, such as horizontal, vertical, or oblique.
  • the certain spatial extent means that a frame of fixed size can be preset (the size of the frame can be set arbitrarily as needed) such that, no matter where the frame is moved on the image, the content inside each frame position is always unique.
  • the pattern may include a plurality of irregular points, points of different sizes, circles, or triangles; or a plurality of irregular patterns, or a two-dimensional code, etc. As long as the pattern is unique within a certain spatial range, it falls within the protection scope of the present invention.
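Whatever pattern is chosen, the "unique within a certain spatial range" property can be checked mechanically: slide the fixed-size frame over the pattern and verify that no two frame positions have identical content. A minimal sketch of such a check, with illustrative names (this is the editor's aid, not a procedure from the patent):

```python
def is_locally_unique(pattern, frame_w, frame_h):
    """Slide a fixed-size frame over a 2-D pattern (list of rows) and check
    that the content at every frame position is distinct, i.e. the pattern
    is unique within that spatial range."""
    rows, cols = len(pattern), len(pattern[0])
    seen = set()
    for y in range(rows - frame_h + 1):
        for x in range(cols - frame_w + 1):
            window = tuple(
                tuple(pattern[y + dy][x + dx] for dx in range(frame_w))
                for dy in range(frame_h)
            )
            if window in seen:   # this frame content already occurred
                return False
            seen.add(window)
    return True
```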
  • the projection unit may employ 1 projection unit, 2 projection units, 3 projection units, or more as needed.
  • the projection unit is preferably a projector, and an image to be projected is designed in advance as needed, and then projected onto the target area and the target object by the projector.
  • the projection unit may alternatively include a laser emitting module and a diffraction sheet located in front of the laser emitting module, or a MEMS (micro-electro-mechanical system) laser projection module.
  • the laser emitting module with a diffraction sheet in front of it works by fabricating a diffraction sheet with a specific three-dimensional microstructure; the diffraction of the laser passing through the sheet forms a predetermined diffraction pattern. This approach has the advantages of a large depth of field and low cost.
  • the MEMS module mainly comprises a laser emitting device and a mirror chip. Its working principle is to generate a light beam with the laser emitting device and scan the beam with the mirror chip to generate a picture; this module likewise has the advantage of a large depth of field.
  • the imaging unit may adopt one, two, three, or more camera units according to the needs of the method used by the subsequent processor to process the images; whatever number of camera units is used falls within the protection scope of the present invention.
  • the specific embodiment is described by taking one imaging unit and two imaging units as an example.
  • the imaging methods for one imaging unit and for two imaging units are described in detail in the second embodiment.
  • the structure of the one camera unit is shown in FIG. 1 and will not be described here.
  • two imaging units are further described as an example.
  • FIG. 2 is a schematic structural diagram of another imaging device according to an embodiment of the present invention.
  • the imaging unit 102 of the image capturing apparatus 100 includes: a first imaging unit 1021 and a second imaging unit 1022.
  • the first imaging unit 1021 and the second imaging unit 1022 are respectively configured to acquire the first and second two-dimensional images.
  • FIG. 2 shows only one of the preferred arrangements of these positions.
  • Each of the image capturing units may be any one or a combination of a black and white camera, a color camera, an infrared camera, and the like.
  • the imaging unit 102 may increase the variety of camera units (for example, when the first and second camera units are black-and-white cameras, a color camera may be added) or increase the number of camera units, for example by disposing a plurality of camera units around the target object; the imaging principle is the same as that of two camera units and is not repeated here.
  • FIG. 3 is a schematic structural diagram of a photographing apparatus for three-dimensional imaging of an object according to an embodiment of the present invention.
  • the apparatus may further include at least first and second polarization units 103, 104.
  • the first and second polarization units 103, 104 are respectively disposed on the imaging unit 102 and the projection unit 101.
  • reflective surfaces have a large impact on the scanning results: once the surface of the object is reflective, the 3D point-cloud image of that area is affected to some extent. Therefore, polarizing units are disposed on the projection unit and the imaging unit respectively, and the joint cooperation of the two polarizing units can greatly reduce the brightness of the reflective surface, so that a better image of the reflective surface of the target object can be obtained.
  • the polarizing unit may specifically include: a polarizing plate, a polarizer, and the like.
  • the polarizing unit is preferably a linear polarizing plate, which will be described in detail below.
  • the first polarizing unit on the projection unit and the second polarizing unit on the imaging unit are disposed with their polarization axes at 90 degrees to each other. In the light projected by the projection unit, specularly reflected light retains the polarization direction of the incident light, so such light can be eliminated and the reflection thereby reduced to a certain extent.
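The attenuation provided by the crossed polarizers follows Malus's law, I = I0·cos²θ: with the two polarization axes at 90 degrees, polarization-preserving specular glare is suppressed while depolarized diffuse light still partially passes. The toy model below illustrates this; the 50% diffuse transmission factor and the function names are simplifying assumptions by the editor, not values from the patent.

```python
import math

def malus_transmission(i0, theta_deg):
    """Malus's law: intensity of polarized light passing an analyser whose
    axis is rotated theta degrees from the light's polarization plane."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

def camera_intensity(specular, diffuse, analyser_deg):
    """Toy model: specular glare keeps the projector's polarization and is
    attenuated by Malus's law; diffuse light is depolarized, so the
    analyser passes roughly half of it regardless of orientation."""
    return malus_transmission(specular, analyser_deg) + 0.5 * diffuse
```

With the analyser parallel (0 degrees) the glare passes in full; crossed at 90 degrees the specular term vanishes while the diffuse signal, which carries the projected pattern, survives.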
  • the number of the polarization units varies according to the number of imaging units and projection units.
  • the device may include three polarization units.
  • the number of polarization units can also be increased to match the newly added camera unit.
  • FIG. 4 is a schematic structural diagram of another imaging apparatus according to an embodiment of the present invention.
  • the camera unit further includes at least one first adjusting unit 1021 for increasing the latitude of the camera unit with respect to light brightness.
  • the first adjusting unit may be disposed in the image capturing unit such that the captured image itself has a wider range of brightness latitude.
  • the first adjusting unit 1021 includes, but is not limited to, a dual-sensitivity (DUAL-ISO) unit or a high-dynamic-range (HDR) imaging unit; such units enable the camera unit to better handle scenes in which dark objects appear together with bright objects.
  • DUAL-ISO and HDR are prior art and will not be described here.
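As a rough illustration of how such an adjusting unit increases brightness latitude, the sketch below merges a short and a long exposure of the same scene: pixels saturated in the long exposure are recovered from the short one, and each value is divided by its exposure time to put both on a common relative-radiance scale. This is a heavily simplified stand-in for real DUAL-ISO/HDR processing, with hypothetical names and thresholds.

```python
def merge_exposures(short_img, long_img, short_t, long_t, saturation=255):
    """Minimal HDR-style merge of two exposures (flat pixel lists):
    saturated long-exposure pixels fall back to the short exposure;
    values are normalised by exposure time onto one radiance scale."""
    merged = []
    for s, l in zip(short_img, long_img):
        if l >= saturation:              # blown out in the long exposure
            merged.append(s / short_t)
        else:                            # better SNR in the long exposure
            merged.append(l / long_t)
    return merged
```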
  • an adjustment unit may be provided in the image forming apparatus, which will be further described in the second embodiment.
  • the above-mentioned photographing device can be applied to a three-dimensional imaging device, but it should be noted that it is not limited to application in three-dimensional imaging devices; any imaging device to which the photographing device can be applied falls within the protection scope of the present invention.
  • FIG. 10 is a schematic diagram of a continuous near sinusoidal wave of the overall light intensity of a projected image according to an embodiment of the present invention.
  • the present invention also provides an image forming apparatus (not shown), the apparatus comprising at least the photographing apparatus as described in the first embodiment.
  • the following description takes a three-dimensional imaging device as an example. It should be noted that the photographing device is not limited to application in three-dimensional imaging devices; any imaging device to which the photographing device can be applied falls within the scope of the present invention.
  • the three-dimensional imaging apparatus includes a photographing device at the front end and an image processor at the back end. The front-end photographing device transmits the captured images to the back-end processor, and the back-end processor achieves three-dimensional imaging with a method corresponding to the image data from the front-end camera.
  • the method adopted by the processor differs according to the number of camera units included in the imaging unit; even with the same number of camera units, three-dimensional imaging can be realized in different ways in the processor.
  • an imaging method using one imaging unit is different from an imaging method using two or more imaging units, and the following imaging devices are respectively described in detail by taking one imaging unit and two imaging units as examples.
  • taking an imaging unit with two camera units, as mentioned in the above embodiment, as an example, the structure and working process of the three-dimensional imaging device are explained below.
  • the three-dimensional imaging apparatus includes the photographing device at the front end and the image processor at the back end. The front-end photographing device transmits the captured first and second image data to the back-end processor; the matching unit of the back-end processor matches the first and second images, and, based on the matched images, the calculating unit uses a triangulation algorithm to draw a point-cloud diagram of the three-dimensional object.
  • the two imaging units need to be calibrated in advance, that is, by arranging a calibration plate in the target area, the processor includes a calibration calculation unit, and the calibration plate in the target area is captured by the camera unit of the front end.
  • the image is transmitted to the calibration calculation unit of the back-end processor, thereby performing global calibration of the parameters of the camera unit.
  • the calibration belongs to the prior art and will not be described herein.
  • FIG. 6 is a structural block diagram of another imaging device according to an embodiment of the present invention
  • FIG. 9 is a schematic diagram showing that the overall light intensity of the projected image is continuous and approximately sinusoidal according to an embodiment of the present invention.
  • as shown in FIG. 6, in addition to the photographing apparatus, the imaging device includes a first matching unit located in the processor, the first matching unit comprising a first calculating unit and a second calculating unit.
  • the first calculating unit is configured to match the first and second images.
  • a block-match method can be used; the specific matching process is as follows:
  • while the photographing apparatus captures images, the image projection device projects the preset pattern onto the scanned object, and the first and second camera units respectively take two pictures with a field-of-view difference, namely the first image and the second image, which are sent to the back-end processor.
  • apart from their position information, the first and second camera units are generally identical in hardware and software settings, so where the two pictures overlap, the pixel brightness values of the first and second images are highly close.
  • when computing the correspondence between the pixels of the two pictures, a fixed-size image frame is usually set around the pixel being matched, and block matching is performed on the image within that frame. An n*n image block in the first image is compared, along the epipolar line of the two camera units, with N image blocks of the same size in the second image, where N is the search range of the disparity between the two pictures.
  • the comparison computes the absolute value of the brightness difference between corresponding pixels of the two image blocks and sums these absolute values to obtain a matching score. N matching scores are thus obtained; the minimum of the N scores is found, and the pixel in the second image corresponding to this minimum corresponds to the matched pixel in the first image. The accuracy of this correspondence is usually plus or minus 0.5 pixels.
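The sum-of-absolute-differences score described above can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation: the function name `sad_match`, the block size `n`, and the search range are assumptions, and the images are taken to be rectified so that the epipolar line is the horizontal scanline.

```python
import numpy as np

def sad_match(left, right, y, x, n=7, search=32):
    """Find, among N candidate positions along the scanline of `right`,
    the block that best matches the n*n block centred at (y, x) in
    `left`, scoring by the sum of absolute brightness differences.
    Returns (disparity, best_score). Hypothetical helper."""
    h = n // 2
    ref = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_d, best_score = 0, None
    for d in range(search):        # N candidate shifts along the epipolar line
        xs = x - d                 # candidate block centre in the second image
        if xs - h < 0:
            break
        cand = right[y - h:y + h + 1, xs - h:xs + h + 1].astype(np.int32)
        score = np.abs(ref - cand).sum()   # SAD matching score
        if best_score is None or score < best_score:
            best_d, best_score = d, score
    return best_d, best_score
```

The minimum of the N scores picks the integer-pixel correspondence, which is then refined to sub-pixel precision as described next.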
  • the second calculating unit is configured to apply the certain gradient pattern of the first and second images to the matched first and second images, thereby achieving higher-precision matching of the two.
  • because the brightness of the projected image varies continuously as a whole, the brightness of the first and second images also varies continuously overall, and this property can be used to improve the precision of the matching relationship.
  • one feasible method is as follows: the method of the preceding paragraph finds the matched point A in the first image and its matching pixel B in the second image. The brightness of every pixel in the image frame around point A is summed to obtain the overall brightness value AL of that frame; correspondingly, the overall brightness value BL2 of the image frame around point B and the overall brightness values BL1 and BL3 of the frames adjacent to its left and right are computed. The three values BL1, BL2, BL3 usually have a nearly linear monotonic relationship (the continuous variation shown in FIG. 10 is close to sinusoidal), and AL falls either between BL1 and BL2 or between BL2 and BL3.
  • finally, linear interpolation over AL, BL1, BL2 or over AL, BL2, BL3 yields the sub-pixel matching relationship.
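The linear interpolation over AL, BL1, BL2 (or AL, BL2, BL3) can be written as a small helper. This is a sketch under the stated assumption that the three brightness sums change monotonically and nearly linearly; the function name and the sign convention of the returned offset are hypothetical.

```python
def subpixel_offset(AL, BL1, BL2, BL3):
    """Refine an integer match to sub-pixel precision by linearly
    interpolating the block brightness sums, as described above.
    Returns an offset in pixels to add to the integer match position:
    negative if AL lies between BL1 (left neighbour) and BL2 (centre),
    positive if it lies between BL2 and BL3 (right neighbour)."""
    if BL1 <= AL <= BL2 or BL2 <= AL <= BL1:      # AL between BL1 and BL2
        return -1.0 + (AL - BL1) / (BL2 - BL1)
    return (AL - BL2) / (BL3 - BL2)               # AL between BL2 and BL3
```

For example, if AL sits exactly halfway between BL1 and BL2, the match is refined by half a pixel toward the left neighbour.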
  • in addition, the imaging device further includes a third calculating unit (not shown) for obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method.
  • FIG. 15 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • as shown in FIG. 15, in some preferred embodiments the processor further includes a first flat field correction unit 400; this unit performs flat-field correction on the first and second images sent from the front-end camera units 102, so that the brightness of the pictures taken by the camera units 102 is closer to the real situation, and then re-sends the corrected images to the first matching unit 200, which helps improve the accuracy of subsequent image matching.
  • FIG. 7 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • as shown in FIG. 7, this imaging device includes one camera unit, and in addition to the photographing apparatus it includes:
  • a storage unit configured to acquire and store, in advance, images of a plurality of target areas as reference images.
  • before the target object is placed in the target area, images of a plurality of target areas are acquired in advance as reference images. One feasible method is as follows: a flat plate (this is the target area) is placed parallel to and in front of the camera unit at the minimum working distance of the three-dimensional scanning device; the projection unit then projects an image as described in Embodiment 1 onto the plate, a reference picture is taken, and the distance corresponding to this reference image is recorded. The plate is then moved away from the camera in equal steps (each step of distance s), several pictures (K in total) are taken and the corresponding distances are recorded; the farthest picture corresponds to the farthest working distance of the three-dimensional scanning system.
  • the device further comprises a second matching unit located in the processor, the second matching unit comprising a fourth calculating unit and a fifth calculating unit.
  • the fourth calculating unit is configured to match the acquired image containing at least the target object against the plurality of pre-stored reference images.
  • after the photographing apparatus acquires the image containing the target object, it sends the image to the back-end processor through a communication port; the processor first performs pixel block matching between the captured image and the series of pre-stored reference images using the block-match method.
  • the specific method is as follows: when computing the correspondence between pixels of the target picture and a reference picture, a fixed-size image frame is generally set around the pixel being matched and image blocks are compared.
  • an n*n image block in the target picture is compared, along the epipolar line between the camera unit and the projection unit (the horizontal direction of the picture), with N image blocks of the same size in one of the reference pictures (N being a search-range value computed from the translation distance s; the larger s is, the larger N is).
  • the comparison first normalizes the brightness values of the two image blocks, then computes the absolute value of the brightness difference between corresponding pixels and sums these absolute values to obtain a matching score. N matching scores are thus obtained; performing similar matching against all the reference pictures gives K*N matching scores in total. The minimum among these K*N scores identifies which pixel of which reference image (say, the Mth) the target pixel corresponds to; the accuracy of this correspondence is plus or minus 0.5 pixels.
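The K*N search with brightness normalization described above can be sketched as follows. The helper name `match_reference_stack` and its argument layout are assumptions, and min-max normalization stands in for whatever normalization the device actually uses; the sketch only shows the search structure, not the full pipeline.

```python
import numpy as np

def match_reference_stack(target_block, ref_images, y, x0, n, N):
    """Normalise brightness, then score the target block against N
    horizontally shifted n*n blocks in each of the K reference images,
    keeping the global minimum over all K*N scores.
    Returns (best_score, reference index M, shift)."""
    def norm(b):                       # min-max brightness normalization
        b = b.astype(np.float64)
        rng = b.max() - b.min()
        return (b - b.min()) / rng if rng else b * 0.0
    t = norm(target_block)
    best = (None, None, None)          # (score, M, shift)
    for m, ref in enumerate(ref_images):       # K reference pictures
        for d in range(N):                     # N shifts along the epipolar line
            cand = ref[y:y + n, x0 + d:x0 + d + n]
            if cand.shape != (n, n):
                break
            score = np.abs(t - norm(cand)).sum()
            if best[0] is None or score < best[0]:
                best = (score, m, d)
    return best
```

Because each block is normalized before comparison, a target block whose brightness is globally scaled or offset still matches the right reference block exactly.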
  • the fifth calculating unit uses the gradient pattern of the image containing the target object and the matched reference image to achieve higher-precision matching between them.
  • because the brightness of the projected image varies continuously as a whole, the brightness of the target picture and the reference pictures also varies continuously overall, and this property can be used to improve the precision of the matching relationship.
  • one feasible method is as follows: first, the method of the preceding paragraph finds the matched point A in the target image and its matching pixel B in the Mth reference image. Centred on these two pixels, a slightly larger image frame, for example (n+2)*(n+2), is set on the target picture and on the Mth reference picture respectively, and the brightness values of the two frames are each normalized, after which the pixel brightness values of the two frames are highly close.
  • the brightness of every pixel in the n*n image frame around point A on the target image is then summed to obtain the overall brightness value AL of that frame; correspondingly, the overall brightness value BL2 of the n*n frame around point B on the Mth reference image and the overall brightness values BL1 and BL3 of the adjacent n*n frames to its left and right are computed. The three values BL1, BL2, BL3 usually have a nearly linear monotonic relationship (the continuous variation shown in FIG. 8 is close to sinusoidal), and AL falls either between BL1 and BL2 or between BL2 and BL3.
  • finally, linear interpolation over AL, BL1, BL2 or over AL, BL2, BL3 yields the sub-pixel matching relationship.
  • in addition, the imaging device further includes a sixth calculating unit (not shown) for obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method.
  • FIG. 5 is a structural block diagram of a three-dimensional imaging device according to an embodiment of the present invention.
  • as shown in FIG. 5, in some embodiments the device 1 further includes at least one second adjustment unit 103 for increasing the latitude to light brightness of the images captured by the camera unit.
  • this can be realized by taking several photographs with different exposure values, or by imaging with several exposure values within the same photograph.
  • the second adjustment unit may be disposed in the camera unit so that the captured image itself has a wider range of brightness latitude. It may also be disposed in the three-dimensional imaging device, for example in the storage unit; specifically, the processor may call an adjustment-unit program pre-stored by the storage unit and then process the image after matching is completed, so as to better handle scenes in which dark objects appear together with bright objects; alternatively, the DUAL-ISO or HDR program may be written into the software in advance or burned into the hardware.
  • the adjustment unit may include, but is not limited to, a dual-sensitivity (DUAL-ISO) or high-dynamic-range (HDR) imaging unit; such a unit enables the camera unit to better handle dark objects appearing together with bright objects.
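As a hedged illustration of merging several exposure values into an image of wider brightness latitude — one common way to realize the HDR technique the passage names, not the patent's specific DUAL-ISO circuit — each pixel's radiance can be estimated as a weighted average of (pixel value / exposure time), weighting mid-range, unsaturated pixels most.

```python
import numpy as np

def fuse_exposures(images, exposure_times):
    """Merge photos taken at different exposure values into one
    radiance estimate: weight each 8-bit pixel by a 'hat' function
    (1 at mid-grey, 0 at 0 and 255) and average pixel/exposure_time.
    Illustrative sketch; function name and weighting are assumptions."""
    acc = np.zeros(images[0].shape, dtype=np.float64)
    wsum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        v = img.astype(np.float64)
        w = 1.0 - np.abs(v / 255.0 - 0.5) * 2.0   # favour unsaturated pixels
        acc += w * v / t
        wsum += w
    return acc / np.maximum(wsum, 1e-9)
```

Two exposures of the same scene then agree on the recovered radiance wherever neither is clipped.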
  • the first and second adjustment units may be used alternatively or disposed together in the same imaging device, that is, a first adjustment unit is set in the camera unit while a second adjustment unit is also set in the device, which increases the latitude over a wider range.
  • FIG. 16 is a structural block diagram of another imaging device according to an embodiment of the present invention.
  • as shown in FIG. 16, in some preferred embodiments the processor further includes a second flat field correction unit 500; this unit performs flat-field correction on the image sent from the front-end camera unit 102, so that the brightness of the pictures taken by the camera unit 102 is closer to the real situation, and then re-sends the corrected image to the second matching unit 300, which helps improve the accuracy of subsequent image matching.
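Flat-field correction, in its textbook form, is a per-pixel gain correction using a flat reference frame and a dark frame. The sketch below assumes such calibration frames are available; it shows the standard formula, not the patent's particular unit.

```python
import numpy as np

def flat_field_correct(raw, flat, dark):
    """Remove fixed-pattern gain (lens vignetting, per-pixel gain)
    from a raw frame: corrected = (raw - dark) * m / (flat - dark),
    where m = mean(flat - dark) preserves the overall brightness.
    Calibration frames `flat` and `dark` are assumed available."""
    gain = flat.astype(np.float64) - dark
    m = gain.mean()
    return (raw.astype(np.float64) - dark) * m / np.maximum(gain, 1e-9)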
  • FIG. 8 is a schematic flowchart diagram of a photographing method according to an embodiment of the present invention.
  • an embodiment of the present invention further provides a three-dimensional image photographing method corresponding to the three-dimensional photographing apparatus of the first embodiment, the method comprising:
  • S201: projecting an image onto a target area containing the target object, the image following a certain gradient pattern and being unique within a certain spatial range.
  • in some embodiments, the method further includes: reducing, by means of a polarizing unit, the brightness of reflective surfaces in the target area as captured by the camera unit.
  • in some embodiments, the method further includes: increasing the latitude of the camera unit to light brightness.
  • the photographing method can be applied in a three-dimensional imaging method, but it should be noted that it is not limited to such application; any imaging method to which the photographing method can be applied falls within the protection scope of the present invention.
  • the fourth embodiment is an imaging method including the photographing method described in the third embodiment.
  • the present invention also provides an imaging method (not shown), comprising at least the photographing method described in the third embodiment.
  • a three-dimensional imaging method is further described below as an example. It should be noted that the photographing method is not limited to application in a three-dimensional imaging method; any imaging method to which it can be applied falls within the protection scope of the present invention.
  • the image photographing apparatus at the front end transmits the captured images to the back-end processor, which achieves three-dimensional imaging by a corresponding method according to the image data sent from the front-end apparatus.
  • the method adopted by the processor varies with the number of camera units, and even with the same number of camera units, three-dimensional imaging can be realized in the processor in different ways.
  • FIG. 11 is a partial flowchart of an imaging method according to an embodiment of the present invention.
  • taking two camera units as an example: the two camera units obtain two images, the two images need to be matched, and the relevant three-dimensional point cloud is then drawn from the matched result.
  • the method and its steps are described in detail below.
  • during image capture, the image projection device projects the preset pattern onto the scanned object, and the first and second camera units respectively take two pictures with a field-of-view difference, namely the first image and the second image, which are sent to the back-end processor.
  • after the above steps, the back-end processor matches the first and second images; the image matching method includes the following steps:
  • S303: matching the first and second images; S304: using the gradient pattern of the first and second images to achieve higher-precision matching of the two.
  • after the above matching, the imaging method further includes: S305, obtaining a three-dimensional point cloud of the target object from the matched result by the relevant calculation method (not shown).
  • specifically, the three-dimensional point cloud of the target object can be obtained by triangulation.
  • the method of triangulation belongs to the prior art and is not described here again.
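For two calibrated, rectified cameras, the triangulation the text refers to reduces to the standard stereo depth formula. The sketch below shows only that formula, not a full point-cloud pipeline; the parameter names are assumptions.

```python
def disparity_to_depth(disparity_px, focal_px, baseline):
    """Textbook rectified-stereo triangulation: depth Z = f * B / d,
    where f is the focal length in pixels, B the baseline between the
    two camera units, and d the matched (sub-pixel) disparity obtained
    from the steps above."""
    return focal_px * baseline / disparity_px
```

Applying this per matched pixel, together with the calibrated camera geometry, yields the three-dimensional point cloud of the target object; sub-pixel disparities from the interpolation step translate directly into finer depth resolution.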
  • as shown in FIG. 13, in some preferred embodiments the method further includes S306: performing flat-field correction, by the first flat field correction unit, on the first and second images sent from the front-end camera units, and then sending them to the first matching unit for matching.
  • performing flat-field correction on the images before matching brings the brightness of the pictures taken by the camera units closer to the real situation and improves the accuracy of picture matching.
  • as shown in FIG. 12, taking one camera unit as an example, the imaging method using one camera unit includes the following steps:
  • S401: acquiring and storing, in advance, images of a plurality of target areas as reference images.
  • after the above steps, the back-end processor matches the stored plurality of reference images against the image containing at least the target object; the image matching method includes the following steps:
  • after the above matching, the three-dimensional imaging method further includes S406: obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method (not shown).
  • in some embodiments, the method further includes: increasing the latitude to light brightness of the images captured by the camera unit.
  • as shown in FIG. 14, in some preferred embodiments the method further includes S407: performing flat-field correction, by the second flat field correction unit, on the image containing the target object sent from the front-end camera unit, and then sending it to the second matching unit for matching.
  • performing flat-field correction on the image before matching brings the brightness of the pictures taken by the camera unit closer to the real situation and improves the accuracy of picture matching.
  • the photographing method and the three-dimensional imaging method, apparatus, and device provided by the embodiments of the present invention have been described in detail above; however, the description of the embodiments is only intended to help understand the method of the present invention and its core idea and should not be construed as limiting the invention. Variations and substitutions that those skilled in the art can readily conceive, in light of the idea of the present invention and within the technical scope disclosed herein, shall all fall within the protection scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides a projection unit for projecting an image onto a target area containing a target object, the image following a certain gradient pattern and being unique within a certain spatial range. The present invention also provides a corresponding photographing apparatus, imaging device, and processor. In the adopted technical solution, the projection unit projects onto the target area a single image that follows a certain gradient pattern and is unique within a certain spatial range, and the camera unit photographs the target object onto which such an image is projected, avoiding repeated capture of multiple images; the relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor, while offering good real-time performance.

Description

Projection unit and photographing apparatus, processor, and imaging device comprising the unit — Technical Field
The present invention relates to the field of imaging technology, and in particular to a projection unit and a photographing apparatus, processor, and imaging device comprising the unit.
Background
In three-dimensional imaging of an object, a two-dimensional image of the object must first be captured by a camera unit, and a three-dimensional image is then formed by the relevant algorithms. The method of acquiring the two-dimensional image directly affects the subsequent computation and the final precision of the three-dimensional imaging.
In the prior art, discrete spots are usually projected onto the target area (that is, discrete light beams are emitted toward the target area so as to form discrete light spots there, the spots being unique within a certain region); an image of the target object within the spot-covered target area is captured, and the images are matched by exploiting the uniqueness of the discrete spots within a certain spatial range. The problem is that with this spot-based matching method, because the discrete light lacks spatial continuity, the final matching result of the back-end processor cannot reach high (sub-pixel) precision.
Alternatively, as in the patent filed by Microsoft (application No. US13/907426), multiple speckled phase-shifted fringe scene images must be projected in succession. Although this improves imaging precision, the need to capture multiple photographs makes the method slow and hard to run in real time, and the point-cloud imaging precision for moving objects is insufficient.
Summary
To solve the above problems, the present invention provides a projection unit and a photographing apparatus, processor, and imaging device comprising the unit. The projection unit projects onto the target area a single image that follows a certain gradient pattern and is unique within a certain spatial range; the relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor while offering good real-time performance.
A first aspect of the present invention provides a projection unit for projecting an image onto a target area containing a target object, the image following a certain gradient pattern and being unique within a certain spatial range.
A second aspect of the present invention provides a photographing apparatus comprising at least one projection unit and at least one camera unit.
The projection unit is configured to project an image onto a target area containing a target object, the image following a certain gradient pattern and being unique within a certain spatial range;
the camera unit is configured to acquire, after the image is projected by the projection unit, at least one image containing at least the target object.
A third aspect of the present invention provides an imaging device comprising the three-dimensional photographing apparatus described above.
A fourth aspect of the present invention provides a processor of an imaging device, the processor comprising at least a first matching unit, which comprises at least first and second calculating units;
the first calculating unit is configured to match the first and second images;
the second calculating unit is configured to apply the certain gradient pattern of the images to the matched first and second images so as to achieve higher-precision matching of the first and second images.
A fifth aspect of the present invention provides a processor of an imaging device, the processor comprising a storage unit and at least a second matching unit, which comprises at least fourth and fifth calculating units;
the storage unit is configured to acquire and store images of a plurality of target areas in advance as reference images;
the fourth calculating unit is configured to match the acquired image containing at least the target object against the plurality of pre-stored reference images;
the fifth calculating unit uses the gradient pattern of the image containing the target object and the matched reference image so as to achieve higher-precision matching between them.
As can be seen from the above, the embodiments of the present invention adopt a projection unit and a photographing apparatus, processor, and imaging device comprising the unit, achieving the following technical effects:
1. Because the projection unit projects onto the target area a single image that follows a certain gradient pattern and is unique within a certain spatial range, and the camera unit photographs the target object covered by such an image, repeated capture of multiple images is avoided; the relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor while offering good real-time performance.
2. Because the adopted photographing apparatus projects, via the projection unit, a single image following a certain gradient pattern and unique within a certain spatial range onto the target area, and the camera unit captures an image containing at least the target object after projection, the relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor while offering good real-time performance.
3. During photographing, reflective surfaces of the target object strongly affect the scanning result; once the object surface reflects light, no three-dimensional point cloud can be obtained for that region. Polarizing units are therefore arranged on the projection unit and the camera unit respectively; working together, the two polarizing units greatly reduce the brightness of reflective surfaces, so that images can also be acquired from the reflective surfaces of the target object.
4. Because the camera unit of the photographing apparatus further comprises at least one adjustment unit, or the back-end processor of the three-dimensional imaging device comprises at least one adjustment unit, the latitude to light brightness of the camera unit, or of the images it captures, can be increased.
5. Because a linearly varying, sine-like (triangle-wave) image is used, the light intensity varies linearly, so the small pixel-value differences between neighbouring pixels that occur near the peaks and troughs of a sine wave do not arise.
6. Because the projection unit comprises a laser emitting module with a diffraction plate in front of it, or a MEMS module, a scanning range with greater depth of field and lower cost can be obtained compared with a projector.
7. Because a flat-field correction unit is added to the processor, the brightness of the pictures taken by the camera unit is closer to the real situation, improving the accuracy of picture matching.
Brief Description of the Drawings
In order to explain the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments and the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a structural block diagram of a photographing apparatus according to an embodiment of the present invention;
FIG. 2 is a structural block diagram of another photographing apparatus according to an embodiment of the present invention;
FIG. 3 is a structural block diagram of another photographing apparatus according to an embodiment of the present invention;
FIG. 4 is a structural block diagram of another photographing apparatus according to an embodiment of the present invention;
FIG. 5 is a structural block diagram of an imaging device according to an embodiment of the present invention;
FIG. 6 is a structural block diagram of another imaging device according to an embodiment of the present invention;
FIG. 7 is a structural block diagram of another imaging device according to an embodiment of the present invention;
FIG. 8 is a schematic flowchart of a photographing method according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of the difference between the projected image according to an embodiment of the present invention and prior-art discrete light spots, the upper part showing the discrete spots and the lower part the projected image provided by an embodiment of the present invention;
FIG. 10 is a schematic diagram showing that the overall light intensity of the projected gradient image according to an embodiment of the present invention is continuous and approximately sinusoidal;
FIG. 11 is a partial flowchart of an imaging method according to an embodiment of the present invention;
FIG. 12 is a partial flowchart of another imaging method according to an embodiment of the present invention;
FIG. 13 is a partial flowchart of another imaging method according to an embodiment of the present invention;
FIG. 14 is a partial flowchart of another imaging method according to an embodiment of the present invention;
FIG. 15 is a structural block diagram of another imaging device according to an embodiment of the present invention;
FIG. 16 is a structural block diagram of another imaging device according to an embodiment of the present invention.
Detailed Description of the Embodiments
Embodiments of the present invention provide a projection unit and a photographing apparatus, processor, and imaging device comprising the unit. The projection unit projects onto the target area a single image that follows a certain gradient pattern and is unique within a certain spatial range; the relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor while offering good real-time performance.
In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present invention.
Embodiment 1
In three-dimensional imaging of an object, a two-dimensional image of the object must first be captured by a camera unit, and a three-dimensional image is then formed by the relevant algorithms. The capture of the two-dimensional image affects the subsequent three-dimensional imaging computation and the final imaging precision.
In the prior art, discrete light spots can be projected onto the target area (that is, discrete light beams — for example laser beams, as shown in the upper part of FIG. 8 — are emitted toward the target area so as to form discrete spots there); an image of the target object within the spot-covered target area is then captured, and the images are matched by means of the discrete spots. The problem is that with this matching method, because the light intensity of the projected pattern lacks spatial continuity, the final matching result of the back-end processor cannot reach high precision (for example, sub-pixel precision). Alternatively, as in the patent filed by Microsoft (application No. US13/907426), multiple speckled phase-shifted fringe scene images must be projected in succession; although this improves imaging precision, the need to capture multiple photographs makes the method slow and hard to run in real time, and the point-cloud imaging precision for moving objects is insufficient.
To solve the above problems, the present invention proposes a photographing apparatus for photographing a target object located within a target area.
FIG. 1 is a structural schematic diagram of a photographing apparatus according to an embodiment of the present invention. FIG. 9 is a schematic diagram of the difference between the projected image and prior-art discrete spots, the upper part showing the discrete spots and the lower part the projected image provided by an embodiment of the present invention. FIG. 10 is a schematic diagram showing that the overall light intensity of the projected gradient image is continuous and approximately sinusoidal.
As shown in FIG. 1, the apparatus 100 comprises at least one projection unit 101 and at least one camera unit 102.
The projection unit 101 is configured to project an image onto a target area containing a target object, the image following a certain gradient pattern and being unique within a certain spatial range.
The projected image is a single image, and it must include the target object. If, owing to the limits of the shooting range, one image cannot cover the whole target object, several images may need to be taken and stitched in the subsequent processor into one image; the stitching technique of the back-end processor belongs to the prior art and is not described here.
Because the image follows a certain gradient pattern and is unique within a certain spatial range, the light intensity of the projected image, which is unique within a certain spatial range, is, as a whole, a continuous sine wave.
The sine wave need not strictly conform to the standard sine; it may also be approximately sinusoidal, as shown in FIG. 10 — for example, a not entirely regular sine wave, or a linearly varying sine wave (also called a triangle wave). The advantage of the linearly varying wave is that near the peaks and troughs of a sine wave the pixel-value differences between neighbouring pixels change little, which may prevent the processor from completing the subsequent higher-precision matching; with a linearly varying image the light intensity changes linearly, so small pixel-value differences between neighbouring pixels do not arise near peaks and troughs, which helps improve the precision of subsequent matching.
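The linearly varying ("triangle wave") intensity profile preferred above can be illustrated by generating such a pattern. This sketch only shows the constant-gradient profile along one axis and omits the spatial-uniqueness component the patented image also carries; all names are hypothetical.

```python
import numpy as np

def triangle_wave_pattern(width, height, period):
    """8-bit pattern whose intensity rises and falls linearly along x
    (a triangle wave): the gradient is constant, so neighbouring
    pixels never share nearly identical values the way they do near
    a true sine's peaks and troughs. Illustrative only."""
    x = np.arange(width)
    phase = (x % period) / period            # 0..1 within each period
    tri = 1.0 - 2.0 * np.abs(phase - 0.5)    # 0 -> 1 -> 0 linearly
    row = (tri * 255).astype(np.uint8)
    return np.tile(row, (height, 1))
```

Within each half-period the brightness steps are equal, which is exactly the property that keeps neighbouring-pixel differences usable for the sub-pixel interpolation described later.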
With the projection unit of the present invention, a single image following a certain gradient pattern and unique within a certain spatial range is projected onto the target area, and the camera unit photographs the target object onto which such an image is projected, avoiding repeated capture of multiple images; the relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor while offering good real-time performance.
The target object is placed in the target area so that all edges of the target object lie within the target area. This ensures that when the image is projected onto the target area, the image can cover the target object.
The camera unit 102 is configured to acquire, after the image is projected by the projection unit, at least one image containing at least the target object.
After the projection unit has projected the image onto the target area, the target area and the target object within it are covered by the image. The image containing at least the target object obtained by the camera unit is therefore an image of the target object covered by the projected image.
Acquiring an image containing at least the target object means that the obtained image may contain only the target object, or may also contain parts of the target area beyond the target object.
With the photographing apparatus of the present invention, the projection unit projects onto the target area a single image following a certain gradient pattern and unique within a certain spatial range, and the camera unit captures, after projection, an image containing at least the target object. The relevant characteristics of this image improve the precision of the point cloud subsequently generated in the back-end processor while offering good real-time performance.
It should further be noted that the gradient may include repeated light-dark gradations (as shown in the lower part of FIG. 9), and the light-dark alternation may run in various directions, for example horizontal, vertical, or oblique gradient arrangements.
"Unique within a certain spatial range" means that a frame of fixed size can be set in advance (the size of the frame can be set arbitrarily as needed) such that, however the frame is moved over the image, the image inside each frame is always unique. Specifically, the pattern may include a plurality of irregular dots, dots of different sizes, circles, triangles, a plurality of irregular figures, a two-dimensional code, and so on; anything that is unique within a certain spatial range falls within the protection scope of the present invention.
In some embodiments, one, two, three, or more projection units may be used as needed.
The projection unit is preferably a projector: the image to be projected is designed in advance as needed and is then projected by the projector onto the target area and the target object.
Besides a projector, the projection unit may also include a laser emitting module with a diffraction plate located in front of it, or a micro laser projection (MEMS) module.
For the laser emitting module with a diffraction plate in front of it, the working principle is that a diffraction plate with a specific three-dimensional structure is fabricated, and the diffraction of the laser is used so that the laser forms a preset diffraction pattern after passing through the plate; this has the advantages of large depth of field and low cost.
The MEMS module mainly comprises a laser emitting device and a mirror chip; its working principle is that the laser emitting device generates a beam, and the beam is scanned by the mirror chip to generate a picture. This module has the advantage of a large depth of field.
One, two, three, or more camera units may be used according to the needs of the method by which the subsequent processor processes the images; whatever the number of camera units, it falls within the protection scope of the present invention.
To aid understanding of the technical solution of the present invention, this embodiment takes one camera unit and two camera units respectively as examples; the imaging methods for one and two camera units are further described in detail in Embodiment 2.
For the structure with one camera unit, see FIG. 1; it is not described again here.
This embodiment is now described in detail taking two camera units as an example.
As shown in FIGS. 1 and 2, FIG. 2 is a structural schematic diagram of another photographing apparatus according to an embodiment of the present invention.
For further understanding, two camera units are taken as an example for detailed description below. The camera unit 102 of the image photographing apparatus 100 comprises a first camera unit 1021 and a second camera unit 1022.
The first and second camera units are respectively configured to acquire the first and second two-dimensional images.
The correspondence between the first and second camera units is found, the acquired first and second two-dimensional images are sent to the back-end processor, depth data are computed by means of this correspondence, and a three-dimensional point cloud of the target object is drawn; the imaging method and principle are further described in detail in Embodiment 2.
It should be noted that the positions of the first and second camera units and the projection unit relative to one another can be set arbitrarily as needed; FIG. 2 only shows one preferred arrangement.
Each camera unit may be any one or a combination of a black-and-white camera, a colour camera, an infrared camera, and so on.
Besides the first and second camera units 1021 and 1022, the number of camera units may be increased: for example, if the first and second camera units are black-and-white cameras, a colour camera may be added; or the number of camera units may simply be increased, for example by arranging several camera units around the target object. The imaging principle is the same as that for two camera units and is not described again here.
FIG. 3 is a structural schematic diagram of a photographing apparatus for three-dimensional imaging of an object according to an embodiment of the present invention.
In some preferred embodiments, the apparatus may further comprise at least first and second polarizing units 103, 104.
The first and second polarizing units 103, 104 are respectively arranged on the camera unit 102 and the projection unit 101.
Because reflective surfaces of the target object strongly affect the scanning result during photographing — once the object surface reflects light, the three-dimensional point cloud of that region is affected to some extent — polarizing units are arranged on the projection unit and the camera unit respectively. Working together, the two polarizing units can greatly reduce the brightness of reflective surfaces, so that images of the reflective surfaces of the target object can also be acquired better.
The polarizing unit may specifically include a polarizing plate, a polarizer, and the like.
In this embodiment, the polarizing unit is preferably a linear polarizer, as described in detail below.
In some preferred embodiments, the first polarizing unit on the projection unit and the second polarizing unit on the camera unit are preferably set at 90 degrees to each other. This arrangement eliminates, from the light projected by the projection unit, the light whose reflected polarization direction is the same as the incident polarization direction, and can therefore reduce reflections to some extent.
It should be noted that the number of polarizing units varies with the number of camera units and projection units; when the camera unit includes two camera units as in the above embodiment, the apparatus may include three polarizing units.
If the camera unit includes other camera units besides the first and second ones, the number of polarizing units can likewise be increased to match the newly added camera units.
As shown in FIG. 4, FIG. 4 is a structural schematic diagram of another photographing apparatus according to an embodiment of the present invention.
In some embodiments, the camera unit further includes at least one first adjustment unit 1021 for increasing the latitude of the camera unit to light brightness.
The first adjustment unit may be disposed in the camera unit so that the captured image itself has a wider range of brightness latitude. The first adjustment unit 1021 includes, but is not limited to, a dual-sensitivity (DUAL-ISO) or high-dynamic-range (HDR) imaging unit; such a unit enables the camera unit to better handle dark objects appearing together with bright objects. DUAL-ISO and HDR belong to the prior art and are not described here. In addition, the adjustment unit may be disposed in the imaging device, as further described in Embodiment 2.
The above photographing apparatus can be applied in a three-dimensional imaging device, but it should be noted that the photographing apparatus is not limited to such application; any imaging device to which the photographing apparatus can be applied falls within the protection scope of the present invention.
Embodiment 2
FIG. 10 is a schematic diagram showing that the overall light intensity of the projected image is continuous and approximately sinusoidal according to an embodiment of the present invention.
The present invention also provides an imaging device (not shown), the device comprising at least the photographing apparatus described in Embodiment 1. A three-dimensional imaging device is taken as an example for further description below. It should be noted that the photographing apparatus is not limited to application in a three-dimensional imaging device; any imaging device to which it can be applied falls within the protection scope of the present invention.
The three-dimensional imaging device includes a photographing apparatus at the front end and an image processor at the back end; the front-end image photographing apparatus transmits the captured images to the back-end processor, which achieves three-dimensional imaging by a corresponding method according to the image data sent from the front-end apparatus.
The method adopted by the processor differs according to the number of camera units the camera unit contains, and even with the same number of camera units, three-dimensional imaging can be realized in the processor in different ways. The imaging method using one camera unit generally differs from that using two or more camera units; imaging devices using one camera unit and two camera units are described in detail below as respective examples.
This embodiment takes the case mentioned in the above embodiment in which the camera unit includes two camera units as an example to outline the structure and working process of the three-dimensional imaging device. In the description of the device structure, because the other components of the device are not the inventive point of the present invention, only part of the structure is described briefly.
The three-dimensional imaging device includes an image photographing apparatus at the front end and an image processor at the back end. The front-end apparatus transmits the captured images to the back-end processor; the matching unit of the back-end processor matches the first and second images according to the first and second image data sent from the front-end apparatus, and, based on the matched images, the calculating unit uses a triangulation algorithm to draw a point cloud of the three-dimensional object. The working process of the other parts of the three-dimensional imaging device is described in further detail below:
Before the above photographing apparatus captures images, the two camera units need to be calibrated in advance: a calibration plate is arranged in the target area, the processor includes a calibration calculating unit, and images of the calibration plate in the target area captured by the front-end camera units are transmitted to the calibration calculating unit of the back-end processor, thereby performing global calibration of the intrinsic and extrinsic parameters of the camera units. Such calibration belongs to the prior art and is not described again here.
FIG. 6 is a structural block diagram of another imaging device according to an embodiment of the present invention; FIG. 9 is a schematic diagram showing that the overall light intensity of the projected image is continuous and approximately sinusoidal.
As shown in FIG. 6, in addition to the photographing apparatus, the imaging device includes a first matching unit located in the processor, the first matching unit comprising a first calculating unit and a second calculating unit.
The first calculating unit is configured to match the first and second images.
Specifically, a block-match method can be used. The matching process is as follows:
While the above image photographing apparatus captures images, the image projection device projects the preset pattern onto the scanned object, and the first and second camera units respectively take two pictures with a field-of-view difference, namely the first image and the second image, which are sent to the back-end processor. Apart from their position information, the first and second camera units are generally identical in hardware and software settings, so where the two pictures overlap, the pixel brightness values of the first and second images are highly close. When computing the correspondence between the pixels of the two pictures, a fixed-size image frame is usually set around the pixel being matched, and block matching is performed within that frame. An n*n image block in the first image is compared, along the epipolar line of the two camera units, with N image blocks of the same size in the second image (N being the search range of the disparity between the two pictures). The comparison computes the absolute value of the brightness difference between corresponding pixels of the two image blocks and sums these absolute values to obtain a matching score. N matching scores are thus obtained; the minimum of the N scores is found, and the pixel in the second image corresponding to this minimum corresponds to the matched pixel in the first image. The accuracy of this correspondence is usually plus or minus 0.5 pixels.
The second calculating unit is configured to apply the certain gradient pattern of the first and second images to the matched first and second images, thereby achieving higher-precision matching of the two.
Because the brightness of the projected image varies continuously as a whole, the brightness of the first and second images also varies continuously overall, and this property can be used to improve the precision of the matching relationship. One feasible method is as follows: first, the method of the preceding paragraph finds the matched point A in the first image and its matching pixel B in the second image; then the brightness of every pixel in the image frame around point A is summed to obtain the overall brightness value AL of that frame; correspondingly, the overall brightness value BL2 of the image frame around point B and the overall brightness values BL1 and BL3 of the frames adjacent to its left and right are computed. The three values BL1, BL2, BL3 usually have a nearly linear monotonic relationship (the continuous variation shown in FIG. 10 is close to sinusoidal), and AL falls either between BL1 and BL2 or between BL2 and BL3. Finally, linear interpolation over AL, BL1, BL2 or over AL, BL2, BL3 yields the sub-pixel matching relationship.
In addition, the imaging device further includes a third calculating unit (not shown) for obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method.
FIG. 15 is a structural block diagram of another imaging device according to an embodiment of the present invention.
As shown in FIG. 15, in some preferred embodiments the processor further includes a first flat field correction unit 400; this unit performs flat-field correction on the first and second images sent from the front-end camera units 102, so that the brightness of the pictures taken by the camera units 102 is closer to the real situation, and then re-sends the corrected images to the first matching unit 200, which helps improve the accuracy of subsequent image matching.
FIG. 7 is a structural block diagram of another imaging device according to an embodiment of the present invention.
As shown in FIG. 7, this imaging device includes one camera unit, and in addition to the photographing apparatus it includes:
a storage unit configured to acquire and store, in advance, images of a plurality of target areas as reference images.
Before the target object is placed in the target area, images of a plurality of target areas are acquired in advance as reference images. One feasible method is as follows: a flat plate (this is the target area) is placed parallel to and in front of the camera unit at the minimum working distance of the three-dimensional scanning device; the projection unit then projects an image as described in Embodiment 1 onto the plate, a reference picture is taken, and the distance corresponding to this reference image is recorded. The plate is then moved away from the camera in equal steps (each step of distance s), several pictures (K in total) are taken and the corresponding distances are recorded; the farthest picture corresponds to the farthest working distance of the three-dimensional scanning system.
In addition, the device further comprises a second matching unit located in the processor, the second matching unit comprising a fourth calculating unit and a fifth calculating unit.
The fourth calculating unit is configured to match the acquired image containing at least the target object against the plurality of pre-stored reference images.
After the photographing apparatus acquires the image containing the target object, it sends the image to the back-end processor through a communication port; the processor first performs pixel block matching between the captured image and the series of pre-stored reference images using the block-match method. The specific method is as follows: when computing the correspondence between pixels of the target picture and a reference picture, a fixed-size image frame is generally set around the pixel being matched and image blocks are compared. An n*n image block in the target picture is compared, along the epipolar line between the camera unit and the projection unit (the horizontal direction of the picture), with N image blocks of the same size in one of the reference pictures (N being a search-range value computed from the translation distance s; the larger s is, the larger N is). The comparison first normalizes the brightness values of the two image blocks, then computes the absolute value of the brightness difference between corresponding pixels and sums these absolute values to obtain a matching score. N matching scores are thus obtained; performing similar matching against all the reference pictures gives K*N matching scores in total. The minimum among these K*N scores identifies which pixel of which reference image (say, the Mth) the target pixel corresponds to; the accuracy of this correspondence is plus or minus 0.5 pixels.
The fifth calculating unit uses the gradient pattern of the image containing the target object and the matched reference image to achieve higher-precision matching between them.
Because the brightness of the projected image varies continuously as a whole, the brightness of the target picture and the reference pictures also varies continuously overall, and this property can be used to improve the precision of the matching relationship. One feasible method is as follows: first, the method of the preceding paragraph finds the matched point A in the target image and its matching pixel B in the Mth reference image; then, centred on these two pixels, a slightly larger image frame, for example (n+2)*(n+2), is set on the target picture and on the Mth reference picture respectively, and the brightness values of the two frames are each normalized, after which the pixel brightness values of the two frames are highly close. The brightness of every pixel in the n*n image frame around point A on the target image is then summed to obtain the overall brightness value AL of that frame; correspondingly, the overall brightness value BL2 of the n*n frame around point B on the Mth reference image and the overall brightness values BL1 and BL3 of the adjacent n*n frames to its left and right are computed. The three values BL1, BL2, BL3 usually have a nearly linear monotonic relationship (the continuous variation shown in FIG. 8 is close to sinusoidal), and AL falls either between BL1 and BL2 or between BL2 and BL3. Finally, linear interpolation over AL, BL1, BL2 or over AL, BL2, BL3 yields the sub-pixel matching relationship.
In addition, the imaging device further includes a sixth calculating unit (not shown) for obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method.
As shown in FIG. 5, FIG. 5 is a structural block diagram of a three-dimensional imaging device according to an embodiment of the present invention.
In some embodiments, the device 1 further includes at least one second adjustment unit 103 for increasing the latitude to light brightness of the images captured by the camera unit.
This can be realized by taking several photographs with different exposure values, or by imaging with several exposure values within the same photograph.
The second adjustment unit may be disposed in the camera unit so that the captured image itself has a wider range of brightness latitude. It may also be disposed in the three-dimensional imaging device, for example in the storage unit; specifically, the processor may call an adjustment-unit program pre-stored by the storage unit and then process the image after matching is completed, so as to better handle scenes in which dark objects appear together with bright objects; alternatively, the DUAL-ISO or HDR program may be written into the software in advance or burned into the hardware.
The adjustment unit may include, but is not limited to, a dual-sensitivity (DUAL-ISO) or high-dynamic-range (HDR) imaging unit; such a unit enables the camera unit to better handle dark objects appearing together with bright objects. DUAL-ISO and HDR belong to the prior art and are not described here.
The first and second adjustment units may be used alternatively or disposed together in the same imaging device, that is, a first adjustment unit is set in the camera unit while a second adjustment unit is also set in the device, which increases the latitude over a wider range.
FIG. 16 is a structural block diagram of another imaging device according to an embodiment of the present invention.
As shown in FIG. 16, in some preferred embodiments the processor further includes a second flat field correction unit 500; this unit performs flat-field correction on the image sent from the front-end camera unit 102, so that the brightness of the pictures taken by the camera unit 102 is closer to the real situation, and then re-sends the corrected image to the second matching unit 300, which helps improve the accuracy of subsequent image matching.
For the photographing apparatus, see the detailed description in Embodiment 1; it is not repeated here.
Embodiment 3
As shown in FIG. 8, FIG. 8 is a schematic flowchart of a photographing method according to an embodiment of the present invention.
To solve the above problems, an embodiment of the present invention further provides a three-dimensional image photographing method corresponding to the three-dimensional image photographing apparatus of Embodiment 1, the method comprising:
S201: projecting an image onto a target area containing a target object, the image following a certain gradient pattern and being unique within a certain spatial range.
S202: acquiring, after the image is projected by the projection unit, an image containing at least the target object.
In some embodiments, the method further includes: reducing, by means of a polarizing unit, the brightness of reflective surfaces in the target area as captured by the camera unit.
For the apparatus or components involved in the method and the related detailed description, see Embodiment 1; they are not repeated here.
In some embodiments, the method further includes: increasing the latitude of the camera unit to light brightness.
The photographing method can be applied in a three-dimensional imaging method, but it should be noted that the photographing method is not limited to such application; any imaging method to which the photographing method can be applied falls within the protection scope of the present invention.
Embodiment 4
Embodiment 4 is an imaging method including the photographing method described in Embodiment 3.
The present invention also provides an imaging method (not shown), the method comprising at least the photographing method described in Embodiment 3. A three-dimensional imaging method is taken as an example for further description below; it should be noted that the photographing method is not limited to application in a three-dimensional imaging method, and any imaging method to which it can be applied falls within the protection scope of the present invention.
The image photographing apparatus at the front end transmits the captured images to the back-end processor, which achieves three-dimensional imaging by a corresponding method according to the image data sent from the front-end apparatus.
The method adopted by the processor varies with the number of camera units, and even with the same number of camera units, three-dimensional imaging can be realized in the processor in different ways.
FIG. 11 is a partial flowchart of an imaging method according to an embodiment of the present invention.
As shown in FIG. 11, two camera units are taken as an example for description below. The two camera units obtain two images, the two images need to be matched, and the relevant three-dimensional point cloud is then drawn from the matched result. The method and its steps are described in detail below.
S301: projecting an image onto a target area containing a target object, the image following a certain gradient pattern and being unique within a certain spatial range.
S302: acquiring at least first and second images containing at least the target object after the image is projected by the projection unit.
During image capture, the image projection device projects the preset pattern onto the scanned object, and the first and second camera units respectively take two pictures with a field-of-view difference, namely the first image and the second image, which are sent to the back-end processor.
After the above steps, the back-end processor matches the first and second images; the image matching method includes the following steps:
S303: matching the first and second images.
S304: using the gradient pattern of the first and second images to achieve higher-precision matching of the two.
For a detailed description of the steps of the above matching method, see the detailed description of the first and second calculating units in Embodiment 2; it is not repeated here.
Finally, after the above matching, the imaging method further includes: S305, obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method (not shown).
Specifically, the three-dimensional point cloud of the target object can be obtained by triangulation. The method of triangulation belongs to the prior art and is not described again here.
As shown in FIG. 13, in some preferred embodiments the method further includes S306: performing flat-field correction, by the first flat field correction unit, on the first and second images sent from the front-end camera units, and then sending them to the first matching unit for matching.
Performing flat-field correction on the images before matching brings the brightness of the pictures taken by the camera units closer to the real situation and improves the accuracy of picture matching.
As shown in FIG. 12, one camera unit is taken as an example for description below. The imaging method using one camera unit includes the following steps:
S401: acquiring and storing, in advance, images of a plurality of target areas as reference images.
S402: placing the target object in the target area and projecting onto the target area containing the target object an image that follows a certain gradient pattern and is unique within a certain spatial range;
S403: acquiring an image containing at least the target object after the image is projected by the projection unit.
After the above steps, the back-end processor matches the stored plurality of reference images against the image containing at least the target object; the image matching method includes the following steps:
S404: matching the acquired image containing at least the target object against the plurality of pre-stored reference images.
S405: using the gradient pattern of the image containing the target object and the matched reference image to achieve higher-precision matching between them.
Finally, after the above matching, the three-dimensional imaging method further includes S406: obtaining a three-dimensional point cloud of the target object from the matched image result by the relevant calculation method (not shown).
In some embodiments, the method further includes: increasing the latitude to light brightness of the images captured by the camera unit.
As shown in FIG. 14, in some preferred embodiments the method further includes S407: performing flat-field correction, by the second flat field correction unit, on the image containing the target object sent from the front-end camera unit, and then sending it to the second matching unit for matching.
Performing flat-field correction on the image before matching brings the brightness of the pictures taken by the camera unit closer to the real situation and improves the accuracy of picture matching.
For the image photographing method and other related introductions, see Embodiments 1, 2, and 3; they are not repeated here. The method by which the back-end processor performs the triangulation computation to obtain the three-dimensional point cloud belongs to the prior art and is not described again here.
It should be noted that, for simplicity of description, each of the foregoing method embodiments is expressed as a series of action combinations; however, those skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously.
Those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the relevant descriptions of other embodiments.
The photographing method and the three-dimensional imaging method, apparatus, and device provided by the embodiments of the present invention have been described in detail above; however, the description of the embodiments is only intended to help understand the method of the present invention and its core idea and should not be construed as limiting the present invention. Variations and substitutions that those skilled in the art can readily conceive, in light of the idea of the present invention and within the technical scope disclosed herein, shall all fall within the protection scope of the present invention.

Claims (16)

  1. 一种投射单元,用于向包含目标物体的目标区域投射图像,其特征在于,所述图像成一定渐变规律且在一定空间范围内具有唯一性。
  2. 根据权利要求1所述的投射单元,其特征在于,所述图像成一定渐变规律包括图像的光强整体上成连续的正弦波、不规则的正弦波、或线性变化的正弦波。
  3. 根据权利要求1所述的投射单元,其特征在于,所述投射单元包括投影仪、激光发射模块和位于激光发射模块前方的衍射片、或MEMS模块。
  4. 一种拍摄装置,其特征在于,所述装置包括:至少一投射单元、至少一个摄像单元;
    所述投射单元,用于向包含目标物体的目标区域投射图像,所述图像成一定渐变规律且在一定空间范围内具有唯一性;
    所述摄像单元,用于获取通过所述投射单元投射图像后的至少包含目标物体的至少一幅图像。
  5. 根据权利要求5所述的装置,其特征在于,所述装置还包括:至少第一偏振单元、第二偏振单元;
    所述第一偏振单元设置在所述投射单元上;
    所述第二偏振单元设置在所述摄像单元上;
    所述第一偏振单元、第二偏振单元相配合使得所述摄像单元拍摄到的所述目标物体的反光面的光亮度降低。
  6. 根据权利要求5或6所述的装置,其特征在于,所述摄像单元包括至少一第一调整单元,用于增加所述摄像单元对光亮度的宽容度。
  7. The photographing apparatus according to claim 5 or 6, wherein the image following a certain gradation pattern comprises the light intensity of the image as a whole forming a continuous sine wave, an irregular sine wave, or a linearly varying sine wave.
  8. The photographing apparatus according to claim 5 or 6, wherein the projection unit comprises a projector; a laser emitting module with a diffraction sheet located in front of the laser emitting module; or a MEMS module.
  9. An imaging device, wherein the device comprises the photographing apparatus according to any one of claims 4 to 8.
  10. The imaging device according to claim 9, wherein, when the camera unit comprises at least a first camera unit and a second camera unit, the imaging device further comprises at least one first matching unit, the first matching unit comprising at least a first calculation unit and a second calculation unit;
    the first calculation unit is configured to match the first and second images;
    the second calculation unit is configured to use the certain gradation pattern of the matched first and second images to achieve higher-precision matching of the first and second images.
  11. The imaging device according to claim 10, wherein the device further comprises at least one first flat-field correction unit configured to perform flat-field correction on the first and second images sent by the first and second camera units, and to send the flat-field-corrected first and second images to the first matching unit.
  12. The imaging device according to claim 9, wherein, when the camera unit consists of one camera unit, the imaging device further comprises a storage unit and at least one second matching unit, the second matching unit comprising at least a fourth calculation unit and a fifth calculation unit;
    the storage unit is configured to acquire and store, in advance, images of a plurality of target regions as reference images;
    the fourth calculation unit is configured to match the acquired image containing at least the target object with the plurality of pre-stored reference images;
    the fifth calculation unit is configured to use the gradation patterns of the image containing the target object and of the matched reference image to achieve higher-precision matching between the image containing the target object and the matched reference image.
  13. The imaging device according to claim 12, wherein the device further comprises at least one second flat-field correction unit configured to perform flat-field correction on the image sent by the one camera unit and to send the flat-field-corrected image to the second matching unit.
  14. The imaging device according to any one of claims 9 to 13, wherein the device further comprises at least one second adjustment unit configured to increase the latitude, with respect to light brightness, of the images taken by the camera unit.
  15. A processor of an imaging device, wherein the processor comprises at least one first matching unit, the first matching unit comprising at least a first calculation unit and a second calculation unit;
    the first calculation unit is configured to match the first and second images;
    the second calculation unit is configured to use the certain gradation pattern of the matched first and second images to achieve higher-precision matching of the first and second images.
  16. A processor of an imaging device for use with the apparatus according to claim 4, wherein the processor comprises a storage unit and at least one second matching unit, the second matching unit comprising at least a fourth calculation unit and a fifth calculation unit;
    the storage unit is configured to acquire and store, in advance, images of a plurality of target regions as reference images;
    the fourth calculation unit is configured to match the acquired image containing at least the target object with the plurality of pre-stored reference images;
    the fifth calculation unit is configured to use the gradation patterns of the image containing the target object and of the matched reference image to achieve higher-precision matching between the image containing the target object and the matched reference image.
PCT/CN2017/090394 2016-06-29 2017-06-27 Projection unit and photographing apparatus comprising same projection unit, processor, and imaging device WO2018001252A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/313,387 US20230199324A1 (en) 2016-06-29 2017-06-27 Projection unit and photographing apparatus comprising same projection unit, processor, and imaging device
EP17819249.8A EP3481062A4 (en) 2016-06-29 2017-06-27 PROJECTION UNIT AND PHOTOGRAPHY APPARATUS COMPRISING SAID PROJECTION UNIT, PROCESSOR, AND IMAGING DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610509224.4A CN106878697A (zh) 2016-06-29 2016-06-29 Photographing method and imaging method, apparatus and device thereof
CN201610509224.4 2016-06-29

Publications (1)

Publication Number Publication Date
WO2018001252A1 true WO2018001252A1 (zh) 2018-01-04

Family

ID=59239415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/090394 WO2018001252A1 (zh) 2016-06-29 2017-06-27 Projection unit and photographing apparatus comprising same projection unit, processor, and imaging device

Country Status (4)

Country Link
US (1) US20230199324A1 (zh)
EP (1) EP3481062A4 (zh)
CN (3) CN106878697A (zh)
WO (1) WO2018001252A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3803268A4 (en) * 2018-06-07 2022-03-09 LaDiMo Oy MODELING THE TOPOGRAPHY OF A THREE-DIMENSIONAL SURFACE
CN114697623A (zh) * 2020-12-29 2022-07-01 Chengdu XGIMI Technology Co., Ltd. Projection surface selection and projected image correction method and apparatus, projector, and medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878697A (zh) 2016-06-29 2017-06-20 鲁班嫡系机器人 Photographing method and imaging method, apparatus and device thereof
CN109434844B (zh) * 2018-09-17 2022-06-28 鲁班嫡系机器人(深圳)有限公司 Food material processing robot control method, apparatus, system, storage medium and device
CN112752088B (zh) * 2020-07-28 2023-03-28 Tencent Technology (Shenzhen) Co., Ltd. Depth image generation method and apparatus, reference image generation method, and electronic device
CN112954153B (zh) * 2021-01-26 2022-09-02 Vivo Mobile Communication Co., Ltd. Camera apparatus, electronic device, depth-of-field detection method and apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101539595A (zh) * 2009-04-30 2009-09-23 Tsinghua University Machine vision illumination system
CN101608908A (zh) * 2009-07-20 2009-12-23 Hangzhou Shining 3D Tech Co., Ltd. Three-dimensional digital imaging method combining digital speckle projection with phase measuring profilometry
CN201429412Y (zh) * 2009-06-26 2010-03-24 徐州泰诺仕视觉科技有限公司 Endoscope depth measuring apparatus
US20100265316A1 (en) * 2009-04-16 2010-10-21 Primesense Ltd. Three-dimensional mapping and imaging
CN102316355A (zh) * 2011-09-15 2012-01-11 丁少华 Method for generating a 3D machine vision signal and 3D machine vision sensor
CN105701809A (zh) * 2016-01-11 2016-06-22 宁波江丰生物信息技术有限公司 Flat-field correction method based on line-scan camera scanning
CN106878697A (zh) * 2016-06-29 2017-06-20 鲁班嫡系机器人 Photographing method and imaging method, apparatus and device thereof

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6603103B1 (en) * 1998-07-08 2003-08-05 Ppt Vision, Inc. Circuit for machine-vision system
US6956963B2 (en) * 1998-07-08 2005-10-18 Ismeca Europe Semiconductor Sa Imaging for a machine-vision system
CN100480625C (zh) * 2005-11-18 2009-04-22 Beihang University Stereo vision detection system based on adaptive sinusoidal fringe projection
US8090194B2 (en) * 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US8384997B2 (en) * 2008-01-21 2013-02-26 Primesense Ltd Optical pattern projection
US10182223B2 (en) * 2010-09-03 2019-01-15 California Institute Of Technology Three-dimensional imaging system
US8334985B2 (en) * 2010-10-08 2012-12-18 Omron Corporation Shape measuring apparatus and shape measuring method
JP2013124938A (ja) * 2011-12-15 2013-06-24 Ckd Corp Three-dimensional measuring apparatus
US9349174B2 (en) * 2013-05-31 2016-05-24 Microsoft Technology Licensing, Llc Absolute phase measurement with secondary pattern-embedded fringe
WO2015184308A1 (en) * 2014-05-29 2015-12-03 Northwestern University Motion contrast depth scanning
JP6566768B2 (ja) * 2015-07-30 2019-08-28 Canon Inc. Information processing apparatus, information processing method, and program
JP2019059004A (ja) * 2017-09-28 2019-04-18 Seiko Epson Corp. Robot system
JP6784280B2 (ja) * 2018-06-27 2020-11-11 Seiko Epson Corp. Projector and projector control method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3481062A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3803268A4 (en) * 2018-06-07 2022-03-09 LaDiMo Oy MODELING THE TOPOGRAPHY OF A THREE-DIMENSIONAL SURFACE
US11561088B2 (en) 2018-06-07 2023-01-24 Pibond Oy Modeling the topography of a three-dimensional surface
CN114697623A (zh) * 2020-12-29 2022-07-01 Chengdu XGIMI Technology Co., Ltd. Projection surface selection and projected image correction method and apparatus, projector, and medium
CN114697623B (zh) * 2020-12-29 2023-08-15 XGIMI Technology Co., Ltd. Projection surface selection and projected image correction method and apparatus, projector, and medium

Also Published As

Publication number Publication date
CN207766424U (zh) 2018-08-24
CN107241592B (zh) 2020-08-11
CN106878697A (zh) 2017-06-20
US20230199324A1 (en) 2023-06-22
EP3481062A4 (en) 2020-01-08
CN107241592A (zh) 2017-10-10
EP3481062A1 (en) 2019-05-08

Similar Documents

Publication Publication Date Title
WO2018001252A1 (zh) 2018-01-04 Projection unit and photographing apparatus comprising same projection unit, processor, and imaging device
US10309770B2 (en) Three-dimensional sensor system and three-dimensional data acquisition method
KR102073205B1 (ko) Three-dimensional scanning method and scanner including a plurality of lasers of different wavelengths
CN110689581B (zh) Structured light module calibration method, electronic device, and computer-readable storage medium
TWI555379B (zh) Panoramic fisheye camera image correction, synthesis and depth-of-field reconstruction method and system
US9774837B2 (en) System for performing distortion correction and calibration using pattern projection, and method using the same
US9807372B2 (en) Focused image generation single depth information from multiple images from multiple sensors
JP2019510234A (ja) Depth information acquisition method and apparatus, and image acquisition device
CN110009672A (zh) Improved ToF depth image processing method, 3D imaging method, and electronic device
US20160117820A1 (en) Image registration method
JP2010113720A (ja) Method and apparatus for combining distance information with an optical image
JP2012088114A (ja) Optical information processing apparatus, optical information processing method, optical information processing system, and optical information processing program
CN107808398B (zh) Camera parameter calculation apparatus, calculation method, program, and recording medium
JP2009524849A (ja) System, method, and medium for capturing a scene image and depth geometry to generate a compensation image
WO2016155110A1 (zh) Method and system for correcting image perspective distortion
JP2016100698A (ja) Calibration apparatus, calibration method, and program
WO2019232793A1 (zh) Dual-camera calibration method, electronic device, and computer-readable storage medium
JP7378219B2 (ja) Imaging apparatus, image processing apparatus, control method, and program
Wilm et al. Accurate and simple calibration of DLP projector systems
US11348271B2 (en) Image processing device and three-dimensional measuring system
JP2015046019A (ja) Image processing apparatus, imaging apparatus, imaging system, image processing method, program, and storage medium
JP7489253B2 (ja) Depth map generation apparatus, program therefor, and depth map generation system
WO2020146965A1 (zh) Control method and system for image refocusing
TWI638566B (zh) Photographing method and imaging method, apparatus and device thereof
WO2021093804A1 (zh) Camera configuration system and camera configuration method for omnidirectional stereo vision

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17819249

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017819249

Country of ref document: EP

Effective date: 20190129