CN112712583A - Three-dimensional scanner, three-dimensional scanning system and three-dimensional scanning method - Google Patents


Info

Publication number
CN112712583A
Authority
CN
China
Prior art keywords
light
stripe
camera
preset
image
Prior art date
Legal status
Granted
Application number
CN201911018772.7A
Other languages
Chinese (zh)
Other versions
CN112712583B (en)
Inventor
赵晓波
马超
Current Assignee
Shining 3D Technology Co Ltd
Original Assignee
Shining 3D Technology Co Ltd
Priority date
Filing date
Publication date
Priority to CN201911018772.7A priority Critical patent/CN112712583B/en
Application filed by Shining 3D Technology Co Ltd filed Critical Shining 3D Technology Co Ltd
Priority to US17/771,470 priority patent/US12007224B2/en
Priority to AU2020371142A priority patent/AU2020371142B2/en
Priority to PCT/CN2020/123684 priority patent/WO2021078300A1/en
Priority to KR1020227017511A priority patent/KR20220084402A/en
Priority to EP20878731.7A priority patent/EP4050302A4/en
Priority to CA3158933A priority patent/CA3158933A1/en
Priority to JP2022524057A priority patent/JP7298025B2/en
Publication of CN112712583A publication Critical patent/CN112712583A/en
Application granted granted Critical
Publication of CN112712583B publication Critical patent/CN112712583B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/254: Projection of a pattern, viewing through a pattern, e.g. moiré

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a three-dimensional scanner, a three-dimensional scanning system, and a three-dimensional scanning method. The three-dimensional scanner includes: an image projection device for projecting, in each preset period, the preset stripe pattern corresponding to that period onto a target object; and an image acquisition device for capturing the light modulated by the target object while a preset stripe pattern is projected onto it, so as to obtain a plurality of stripe images, where the captured stripe images serve as coding patterns for determining each stripe sequence and as reconstruction patterns for three-dimensional reconstruction of the target object. The application thereby solves the technical problems in the related art that three-dimensional reconstruction methods require costly hardware, which hinders the popularization of three-dimensional scanning devices.

Description

Three-dimensional scanner, three-dimensional scanning system and three-dimensional scanning method
Technical Field
The application relates to the field of three-dimensional scanning, in particular to a three-dimensional scanner, a three-dimensional scanning system and a three-dimensional scanning method.
Background
In the field of intraoral three-dimensional scanning, existing three-dimensional scanners generally perform three-dimensional reconstruction in one of the following ways: first, phase unwrapping of time-coded sinusoidal fringes, followed by three-dimensional reconstruction and stitching fusion to obtain the three-dimensional topography of the object; second, algorithms based on time-coded stripe centerline extraction, three-dimensional reconstruction, and stitching fusion; and third, microscopic confocal three-dimensional imaging.
However, each of these methods has drawbacks that make it unsuitable for popularizing intraoral three-dimensional scanning devices:
first, time-coded three-dimensional reconstruction methods are difficult to realize in a small handheld scanner, so they cannot be applied to intraoral three-dimensional scanning;
second, three-dimensional reconstruction based on the microscopic confocal imaging principle requires costly hardware, which likewise hinders the popularization of three-dimensional scanning equipment.
No effective solution has yet been proposed for the technical problems in the related art that three-dimensional reconstruction methods require costly hardware and hinder the popularization of three-dimensional scanning devices.
Disclosure of Invention
The application provides a three-dimensional scanner, a three-dimensional scanning system, and a three-dimensional scanning method to solve the technical problems in the related art that three-dimensional reconstruction methods require costly hardware, which hinders the popularization of three-dimensional scanning devices.
According to one aspect of the present application, a three-dimensional scanner is provided. The three-dimensional scanner includes: an image projection device for projecting, in each preset period, the preset stripe pattern corresponding to that period onto a target object, where the stripes of each preset stripe pattern are arranged according to preset color-coded stripes, each preset stripe pattern includes stripes of at least one color of the preset color-coded stripes, the plurality of preset stripe patterns together include stripes of at least two colors of the preset color-coded stripes, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes; and an image acquisition device for capturing the light modulated by the target object while a preset stripe pattern is projected onto it, so as to obtain a plurality of stripe images, where the captured stripe images serve as coding patterns for determining each stripe sequence and as reconstruction patterns for three-dimensional reconstruction of the target object.
Optionally, the image projection device further includes a DLP projection part, through which the image projection device projects, in each preset period, the preset stripe pattern corresponding to that period onto the target object.
Optionally, the image projection device further includes: a light emitting part for emitting, in each preset period, the initial light corresponding to that period, where each initial light is composed of light of at least one stripe color, a stripe color being the color of a stripe in the preset color-coded stripes; and a light transmission part arranged on the transmission path of the initial light, where each initial light, after passing through the pattern of preset color-coded stripes arranged on the light transmission part, generates the corresponding preset color stripes, which are projected onto the target object.
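The patent does not specify how the colors in the preset color-coded stripes are arranged. Spatially coded structured-light systems of this kind often use a De Bruijn sequence, in which every window of n consecutive stripe colors occurs exactly once, so a stripe can be identified from its local neighborhood alone. A minimal sketch, assuming a three-color alphabet and window length 3 (both assumptions, not taken from the patent):

```python
def de_bruijn(colors, n):
    """Generate a cyclic De Bruijn sequence over the given color alphabet
    with window length n: every length-n run of colors appears exactly
    once, so a window of n adjacent stripes pinpoints its position."""
    k = len(colors)
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        # Standard recursive construction via Lyndon words.
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return [colors[i] for i in seq]

# 3 colors, window 3 -> 3^3 = 27 stripes, each locatable by its 3-window.
stripes = de_bruijn(["R", "G", "B"], 3)
```

Projecting the stripes in this order lets a decoder later recover each stripe's index from the colors of its neighbors, which is the role the coding patterns play in this application.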
Optionally, the light emitting part further includes a plurality of light source units, the wavelength band emitted by each light source unit being different from the others, where the light emitting part emits the initial light through the plurality of light source units.
Optionally, the light emitting part further includes a light converging unit arranged on the transmission path of the light emitted by the plurality of light source units, where the light emitted by the light source units is converged by the light converging unit and then projected toward the light transmission part along the same transmission path.
Optionally, each light source unit includes at least one of: an LED light source, a laser emitter.
Optionally, the light transmission part further includes a diffraction grating, through which the light transmission part generates the preset stripe pattern projected onto the target object.
Optionally, the three-dimensional scanner further includes a timing control unit connected to the image projection device and the image acquisition device, configured to control the image projection device to emit, in each preset period, the preset stripe pattern corresponding to that period, and to control the image acquisition device to capture the light modulated by the target object over the plurality of preset periods, so as to obtain the stripe image corresponding to each preset stripe pattern.
Optionally, the three-dimensional scanner further includes a timing control unit connected to the light source units and the image acquisition device, configured to control the light source units to emit light in different preset periods, so as to generate, in each preset period, the initial light corresponding to that period, and to control the image acquisition device to capture the light modulated by the target object over the plurality of preset periods, so as to obtain the stripe image corresponding to each initial light.
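The timing control described above amounts to a lockstep loop: in each preset period one light source fires while the camera exposes, so captured frames line up one-to-one with projected patterns. A hedged sketch of that loop; the `on`/`off`/`trigger`/`read` interfaces and the period length are hypothetical, not from the patent:

```python
import time

def scan_cycle(light_sources, camera, period_s=0.005):
    """One scan cycle: for each light source (one stripe pattern), switch
    it on, expose the camera for the preset period, switch it off, and
    collect the frame. Returns one stripe image per preset period."""
    frames = []
    for source in light_sources:      # e.g. [red_led, blue_led, green_led]
        source.on()
        camera.trigger()              # expose while this pattern is projected
        time.sleep(period_s)          # hold for the preset period
        source.off()
        frames.append(camera.read())
    return frames
```

Repeating `scan_cycle` continuously yields the stream of stripe-image groups from which each stripe sequence is decoded and the surface reconstructed.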
Optionally, the three-dimensional scanner further includes an illuminating element for illuminating the target object, where the image acquisition device is further configured to capture a texture map of the target object while the illuminating element projects illumination light onto it.
Optionally, the image acquisition device further includes a plurality of cameras, at least one of which is a black-and-white camera, where the image acquisition device captures the light modulated by the target object through the plurality of cameras to obtain a plurality of stripe images; the stripe image obtained by at least one black-and-white camera serves as a reconstruction pattern for three-dimensional reconstruction of the target object, and the stripe images obtained by a plurality of black-and-white cameras serve as coding patterns for determining each stripe sequence, and/or the stripe image obtained by at least one color camera serves as a coding pattern for determining each stripe sequence.
Optionally, the image acquisition device further includes a light beam processing device having a light inlet portion and at least two light outlet portions, where each camera is arranged to correspond to a different light outlet portion, and the image acquisition device captures the light modulated by the target object through the light beam processing device.
Optionally, the light beam processing device further includes at least one first beam splitting unit for splitting the light entering through the light inlet portion, so that the light exits from the at least two light outlet portions toward the cameras arranged at those portions.
Optionally, the light beam processing device further includes at least one second beam splitting unit for separating the light to be received by a designated camera, so that the designated camera receives light of a designated wavelength band, where the designated wavelength band at least includes a wavelength band contained in at least one initial light.
Optionally, the light beam processing device includes a right-angle two-channel dichroic prism having a third light outlet portion and a fourth light outlet portion, through which the light beam processing device splits the light entering through the light inlet portion so that it exits from the third and fourth light outlet portions toward the cameras arranged at those portions. The image acquisition device includes a third camera arranged at the third light outlet portion and a fourth camera arranged at the fourth light outlet portion; the third camera generates a third stripe image and the fourth camera a fourth stripe image from the captured light, each stripe image containing identifiable stripes of at least two colors. Through the right-angle two-channel dichroic prism, the light beam processing device also separates the light received by each designated camera so that it receives light of a designated wavelength band: the third camera receives light of a third designated band, and the fourth camera receives light of a fourth designated band.
Optionally, the light beam processing device includes a three-channel dichroic prism having a fifth, a sixth, and a seventh light outlet portion, through which the light beam processing device splits the light entering through the light inlet portion so that it exits from the fifth, sixth, and seventh light outlet portions toward the cameras arranged at those portions. The image acquisition device includes a fifth camera arranged at the fifth light outlet portion, a sixth camera arranged at the sixth light outlet portion, and a seventh camera arranged at the seventh light outlet portion; the fifth, sixth, and seventh cameras generate a fifth, sixth, and seventh stripe image respectively from the captured light, each stripe image containing identifiable stripes of at least two colors. Through the three-channel dichroic prism, the light beam processing device also separates the light received by each designated camera so that it receives light of a designated wavelength band, which at least includes: the fifth camera receives light of a fifth designated band, the sixth camera receives light of a sixth designated band, and the fifth designated band differs from the sixth.
Optionally, the light beam processing device includes a half-mirror prism (semi-reflective, semi-transmissive) having a first light outlet portion and a second light outlet portion, through which the light beam processing device splits the light entering through the light inlet portion so that it exits from the first and second light outlet portions toward the cameras arranged at those portions. The image acquisition device includes a first camera arranged at the first light outlet portion and a second camera arranged at the second light outlet portion; the first camera generates a first stripe image and the second camera a second stripe image from the captured light, each stripe image containing identifiable stripes of at least two colors.
Optionally, the light beam processing device further includes an optical filter, through which the device separates the light received by a designated camera so that the designated camera receives light of a designated wavelength band, at least one of the plurality of cameras being a designated camera.
Optionally, the three-dimensional scanner further includes an illuminator, where the image acquisition device is further configured to capture the illumination light reflected by the target object while the illuminator illuminates it, so as to acquire texture data of the target object.
Optionally, the image acquisition device can identify and distinguish red light, green light, and blue light.
According to one aspect of the present application, a three-dimensional scanning system is provided. The three-dimensional scanning system includes: a three-dimensional scanner for projecting, in each preset period, the preset stripe pattern corresponding to that period onto a target object and capturing the light modulated by the target object while the pattern is projected, so as to obtain a plurality of stripe images, where the stripes of each preset stripe pattern are arranged according to preset color-coded stripes, each preset stripe pattern includes stripes of at least one color of the preset color-coded stripes, the plurality of preset stripe patterns together include stripes of at least two colors, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes; and an image processor connected to the three-dimensional scanner for acquiring the plurality of stripe images, determining each stripe sequence from them as coding patterns, and performing three-dimensional reconstruction of the target object from them as reconstruction patterns.
Optionally, where the three-dimensional scanner captures the light modulated by the target object through a plurality of cameras to obtain the stripe images and at least one of the cameras is a black-and-white camera, the image processor is further configured to: use the stripe image obtained by at least one black-and-white camera as a reconstruction pattern for three-dimensional reconstruction of the target object; and use the stripe images obtained by a plurality of black-and-white cameras as coding patterns for determining each stripe sequence, and/or use the stripe image obtained by at least one color camera as a coding pattern for determining each stripe sequence.
According to one aspect of the present application, a three-dimensional scanning method is provided. The three-dimensional scanning method includes: emitting, in each preset period, the initial light corresponding to that period, where each initial light is composed of light of at least one color of the preset color-coded stripes, and each initial light, after passing through the pattern of preset color-coded stripes arranged on the light transmission part, generates the corresponding preset color stripes projected onto a target object; capturing the light modulated by the target object in each preset period and obtaining a plurality of stripe images from it, where the obtained stripe images serve as coding patterns for determining each stripe sequence and as reconstruction patterns for three-dimensional reconstruction of the target object; determining the sequence of the stripes in the plurality of stripe images from the coding patterns; and performing three-dimensional reconstruction on the reconstruction patterns based on the sequence to obtain three-dimensional data of the target object.
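The decoding step above can be sketched as a window lookup: each observed stripe is identified by the colors of its neighbors, matched against the known projected code. A minimal sketch; the toy code, the window length of 3, and the idea of returning `None` for occluded or ambiguous windows are illustrative assumptions:

```python
def build_window_index(code, n):
    """Map each length-n color window of the projected code to the index
    of the window's first stripe."""
    index = {}
    for i in range(len(code) - n + 1):
        index[tuple(code[i:i + n])] = i
    return index

def decode_stripes(observed_colors, code, n=3):
    """For each observed stripe that starts a full window, return its
    index in the projected code, or None where the window is absent
    (e.g. stripes lost to occlusion or misclassified colors)."""
    index = build_window_index(code, n)
    ids = []
    for i in range(len(observed_colors) - n + 1):
        ids.append(index.get(tuple(observed_colors[i:i + n])))
    return ids

# Toy projected code whose 3-windows are all distinct.
code = ["R", "G", "B", "R", "B", "G", "R", "R", "G"]
print(decode_stripes(["B", "R", "B", "G"], code))  # → [2, 3]
```

Once each stripe's index is known, the reconstruction pattern's stripes carry absolute positions, which is what makes single-shot-style triangulation possible.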
Optionally, the three-dimensional scanning method further includes: projecting illumination light onto the target object and acquiring texture data of the target object from it; and obtaining color three-dimensional data of the target object from the three-dimensional data and the texture data.
According to one aspect of the present application, a three-dimensional scanning method is provided. The three-dimensional scanning method includes: acquiring a first image and a second image, the first and second images being stripe images captured through the same light transmission part; determining a coding sequence of the stripes from the first image; and matching the stripes of the second image against the coding sequence to perform three-dimensional reconstruction and obtain three-dimensional data of the target object.
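After stripe matching, each stripe's code index fixes the projector plane it came from, and depth follows from triangulation against the camera ray. A hedged sketch under a simplified rectified pinhole geometry; the parallel-axis model and all parameter values are illustrative assumptions, not the patent's calibration method:

```python
def depth_from_disparity(x_cam, x_proj, focal_px, baseline_mm):
    """Rectified camera-projector triangulation: the column disparity
    between where a stripe is observed (x_cam) and where its decoded
    index says it was projected (x_proj) gives depth along the axis.
    Assumes parallel optical axes and a known baseline."""
    disparity = x_cam - x_proj
    if disparity <= 0:
        return None  # invalid match (behind the baseline or mismatched)
    return focal_px * baseline_mm / disparity

# A stripe decoded as index k has a known projector column x_proj; its
# observed column in the reconstruction image yields a depth in mm.
print(depth_from_disparity(x_cam=420.0, x_proj=400.0,
                           focal_px=800.0, baseline_mm=40.0))  # → 1600.0
```

Real systems replace this with a full calibrated ray-plane intersection, but the principle that a decoded stripe index plus an observed position determines depth is the same.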
Optionally, the three-dimensional scanning method further includes: acquiring texture data, and obtaining color three-dimensional data of the target object from the three-dimensional data and the texture data.
In summary, the spatially coded stripe extraction algorithm of the present application removes the need for dynamic projection and achieves three-dimensional reconstruction of the target object from only a small number of two-dimensional images, thereby solving the technical problems in the related art that three-dimensional reconstruction methods require costly hardware, which hinders the popularization of three-dimensional scanning devices.
In addition, by using color as spatial coding information, the three-dimensional scanner also improves three-dimensional identification accuracy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
FIG. 1 is a first schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 2 is a second schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 3 is a schematic view of a positional relationship between an illuminating member and a reflector according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a beam path in a beam processing apparatus according to an embodiment of the present application;
FIG. 5 is a third schematic view of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 6 is a fourth schematic view of an alternative three-dimensional scanner provided in accordance with embodiments of the present application;
FIG. 7 is a schematic diagram of an alternative three-dimensional scanning system provided in accordance with an embodiment of the present application;
FIG. 8 is a first flowchart of an alternative three-dimensional scanning method provided in an embodiment of the present application.
Wherein the figures include the following reference numerals:
10. an image projection device; 20. an image acquisition device; 30. an illuminating member; 40. a reflective mirror; 11. a DLP projection section; 12. a light emitting section; 13. a light transmitting part; 14. a first imaging lens; 121. a light source unit; 21. a camera; 22. a light beam processing device; 22a, a right-angle two-channel dichroic prism; 22b, a three-channel dichroic prism; 22c, a semi-reflecting and semi-transmitting prism; 22d, a filter; 23. and a second imaging lens.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings; it is evident that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, claims, and drawings of this application are used to distinguish between similar elements and are not necessarily intended to describe a particular sequence or chronological order. It should be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can be implemented in orders other than those illustrated or described.
Furthermore, the terms "first," "second," and the like, used independently in different embodiments, do not necessarily refer to the same elements across those embodiments.
Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present application, a three-dimensional scanner is provided.
Fig. 1 is a schematic diagram of a three-dimensional scanner according to an embodiment of the present application. As shown in fig. 1, the three-dimensional scanner includes: an image projection apparatus 10 and an image acquisition apparatus 20.
The image projection device 10 is configured to project, in each preset period, the preset stripe pattern corresponding to that period onto the target object, where the stripes of each preset stripe pattern are arranged according to preset color-coded stripes, each preset stripe pattern includes stripes of at least one color of the preset color-coded stripes, the plurality of preset stripe patterns together include stripes of at least two colors, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes.
It should be noted that projecting, in each preset period, the preset stripe pattern corresponding to that period may mean: the image projection device 10 projects preset stripe patterns periodically, projecting the plurality of patterns in a time-shared manner across the preset periods. For example, the image projection device 10 projects a first preset stripe pattern in a first period and a second preset stripe pattern in a second period; the image acquisition device 20 captures the first pattern in the first period and the second pattern in the second period, and this process repeats until scanning of the target object is complete.
In an alternative example, as shown in fig. 2, the image projection device 10 further includes a DLP projection part 11, through which the image projection device 10 projects, in each preset period, the preset stripe patterns corresponding to that period onto the target object.
That is, the image projection device 10 can realize its function through the DLP projection part 11.
Specifically, the DLP projection part 11 projects, in each preset period, the preset stripe patterns corresponding to that period onto the target object, where each preset stripe pattern is arranged according to the preset color-coded stripes and includes stripes of at least one of their colors, the plurality of patterns together include stripes of at least two colors, and the stripes in each pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes.
In an alternative example, the image projection apparatus 10 further includes: a light emitting unit 12, configured to emit a plurality of initial lights corresponding to each preset period in each preset period, where each of the initial lights is composed of lights of at least one stripe color, and the stripe color is a color of a stripe in the preset color coding stripes; and the light transmission part 13 is arranged on the transmission path of the initial light, wherein each initial light generates a corresponding preset color stripe after being transmitted by a pattern of the preset color coding stripe arranged on the light transmission part 13, namely the preset stripe pattern is projected onto a target object, and the stripes in the preset stripe pattern are arranged in the same way as the stripes with the same color in the preset color coding stripe.
It should be noted that the preset color coding stripes are predetermined by a preset color stripe arrangement standard. In the present application, a preset stripe pattern meeting this standard can be projected directly by the DLP projection portion 11; alternatively, the light transmission portion 13 can serve as the carrier of the arrangement standard, that is, the light transmission portion 13 embodies the preset color stripe arrangement standard, and the initial light, passing through the light transmission portion 13, generates a preset stripe pattern arranged according to that standard.
That is, the image projection apparatus 10 can realize its function by the light emitting part 12 and the light transmitting part 13.
Specifically, the three-dimensional scanner may form different preset stripe patterns for projection onto the target object by transmission projection. The stripes of each generated preset stripe pattern are arranged according to the preset color coding stripes arranged on the light transmission portion 13; each preset stripe pattern includes stripes of at least one color of the preset color coding stripes, the plurality of preset stripe patterns together include stripes of at least two colors of the preset color coding stripes, and the stripes in the preset stripe patterns are arranged consistently with the stripes of the same color in the preset color coding stripes.
Optionally, the light emitting unit 12 further includes a plurality of light source units 121, and the light wave bands emitted by each of the light source units 121 are different, wherein the light emitting unit 12 emits the initial light through the plurality of light source units 121, and the initial light may be light of a single wave band emitted by only a single light source unit 121, or light of a plurality of wave bands emitted by the plurality of light source units 121 simultaneously.
By way of example: as shown in fig. 1, the light emitting unit 12 includes three light source units 121, each emitting light in a different wavelength band, for example: the first light source unit 121 emits light in the 605-700 nm band, i.e., red light; the second light source unit 121 emits light in the 435-480 nm band, i.e., blue light; and the third light source unit 121 emits light in the 500-560 nm band, i.e., green light.
In period A of the preset period, the first light source unit 121 emits light in the 605-700 nm band; in period B, the second light source unit 121 emits light in the 435-480 nm band; in period C, the first light source unit 121 emits light in the 605-700 nm band.
Alternatively, in period A of the preset period, the first light source unit 121 emits light in the 605-700 nm band; in period B, the second light source unit 121 emits light in the 435-480 nm band; in period C, the third light source unit 121 emits light in the 500-560 nm band.
It should be noted that: the settings of the first light source unit 121, the second light source unit 121, and the third light source unit 121 are schematic examples, and the wavelength band of the light emitted by the light source unit 121 is not particularly limited. In addition to the above illustrative examples, the wavelength band of the light emitted by the light source unit 121 may be arbitrarily selected, and the present application is not particularly limited thereto.
It should also be noted that: the above-mentioned setting of the light source units 121 operated in the preset periods a, B, and C is an illustrative example, and the light source units 121 capable of emitting light in each preset period are not specifically limited. In addition to the above-mentioned exemplary embodiment, the light source unit 121 that can be activated in each preset period may be arbitrarily selected, and the present application is not particularly limited thereto.
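As a minimal sketch of the wavelength-band examples above (the band limits are the illustrative nm ranges from the text; the helper names and the per-period schedule are assumptions):

```python
# Hedged sketch: classify a light source unit's emission wavelength into a
# stripe color, using the illustrative nm ranges from the text
# (605-700 red, 435-480 blue, 500-560 green). Helper names are hypothetical.

BANDS = {"red": (605, 700), "blue": (435, 480), "green": (500, 560)}

def stripe_color(wavelength_nm):
    """Return the stripe color whose band contains the given wavelength,
    or None if no band matches."""
    for color, (lo, hi) in BANDS.items():
        if lo <= wavelength_nm <= hi:
            return color
    return None

# Per-period schedule matching the second example: period A -> red source,
# period B -> blue source, period C -> green source (one unit active each).
schedule = {"A": [650], "B": [460], "C": [530]}
colors = {p: [stripe_color(w) for w in ws] for p, ws in schedule.items()}
```

A period with several wavelengths in its list would model the case, mentioned earlier, where multiple light source units emit simultaneously.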
Alternatively, the light source unit 121 may include at least one of: LED light source, laser emitter.
That is, the light source unit 121 may implement its function by a laser emitter, and may also implement its function by an LED light source. The laser has the advantages of directional light emission, extremely high brightness, extremely pure color and good coherence.
Specifically, the light emitting portion 12 further includes a plurality of LED light sources, and light bands emitted by each of the LED light sources are different from each other, wherein the light emitting portion 12 emits the initial light through the plurality of LED light sources.
Specifically, the light emitting unit 12 further includes a plurality of laser emitters, and each of the laser emitters emits light in a different wavelength band, wherein the light emitting unit 12 emits the initial light through the plurality of laser emitters.
Optionally, the light emitting portion 12 further includes a light converging unit disposed on a transmission path of the light emitted from the plurality of light source units 121, wherein the light emitted from the plurality of light source units 121 is converged by the light converging unit and then projected to the light transmitting portion 13 through the same transmission path.
That is, the initial light is a combination of light rays that, after being converged by the light converging unit, is projected to the light transmission part 13 along the same transmission path; the light converging unit may realize its function through half-reflecting, half-transmitting prisms 22c.

By way of example: as shown in fig. 1, the light emitting unit 12 includes three light source units 121, each emitting a different wavelength band. A first half-reflecting, half-transmitting prism 22c is disposed on the light path of the first and second light source units 121 and combines their light, projecting it onto a second half-reflecting, half-transmitting prism 22c. The third light source unit 121 is disposed on the side of the second prism 22c facing away from the combined light; its light and the combined light are merged by the second prism 22c into a light combination projected to the light transmission part 13 along the same transmission path.
Optionally, the light transmitting portion 13 further includes a grating, and specifically, the light transmitting portion 13 generates a preset stripe pattern through the grating so as to project the preset stripe pattern onto the target object.
Specifically, different regions are arranged on the grating, each corresponding to a different wavelength band; that is, each region transmits light of its own band, and together the regions on the grating determine the preset color coding stripes. Equivalently, each region on the grating matches the arrangement of a stripe in the preset color coding stripes, and the band corresponding to the region matches the color of that stripe. For example, if the grating includes a first region transmitting light of a first band and a second region transmitting light of a second band, then light of the first band forms stripes of the first band after passing through the grating, arranged consistently with the first region, and light of the second band forms stripes of the second band, arranged consistently with the second region.
That is, the light emitting part 12 emits different initial lights in different periods of the preset cycle; when a given initial light is projected onto the grating, the light of each of its colors is transmitted through the corresponding regions, forming a preset stripe pattern.
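The region-to-stripe behavior of the grating described above can be sketched as follows (an assumption-laden model, not the patent's optics: the grating is reduced to an ordered list of region colors, and a stripe forms wherever the incident initial light contains that region's color):

```python
# Illustrative sketch: a grating modeled as an ordered list of regions, each
# transmitting one wavelength band. Only the colors present in the incident
# initial light pass their matching regions, so the resulting stripe
# arrangement follows the region arrangement. The region order is invented.

GRATING = ["red", "blue", "green", "red", "green", "blue"]  # region colors, in order

def transmit(grating, initial_light_colors):
    """Return the stripe pattern formed when initial light (a set of stripe
    colors) passes the grating: matching regions become stripes of their
    color, non-matching regions stay dark (None)."""
    return [c if c in initial_light_colors else None for c in grating]

# Red-plus-blue initial light forms red and blue stripes where the
# corresponding regions sit; green regions transmit nothing.
pattern = transmit(GRATING, {"red", "blue"})
```

Projecting a different initial light through the same grating yields a differently colored stripe pattern, which is exactly the time-sharing behavior described in the text.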
It should be noted that: in the case where the light emitting part 12 emits the initial light by the plurality of laser emitters, the light emitting part 12 may further include a phase modulation unit, wherein the phase modulation unit is disposed on a transmission path of the initial light, so that the initial light is projected to the light transmitting part 13 after the diffraction spots are eliminated by the phase modulation unit.
Specifically, the phase modulation unit may include: the phase modulation element is arranged on a transmission path of initial light and rotates around a preset axis, wherein the transmission path of the initial light is parallel to the preset axis of the phase modulation element; the light beam coupling element is arranged on a transmission path of the initial light ray and used for collimating and adjusting the initial light ray and reducing the divergence angle of the initial light ray.
The phase modulation element may take any of the following forms: a transparent optical material sheet, a micro-optical element, or a random phase plate. The phase modulation unit also includes a driving motor, which drives the phase modulation element to rotate about its rotating shaft at a certain speed.

The beam coupling element may be composed of a collimating system and a converging lens, or an optical system of equivalent function. The phase modulation element may be located either before or after the beam coupling element.
It should be noted that: in the case where the light emitting unit 12 emits the initial light through the plurality of light source units 121, the light emitting unit 12 may further include a solid medium element disposed on a transmission path of the initial light, and the initial light is reflected and mixed by the solid medium element for a plurality of times and then projected to the light transmitting unit 13 in a form of uniform light field intensity.
In particular, the solid medium element may take any of the following forms: an elongated hexahedral prism, a cylindrical prism, or a pyramidal prism. The solid medium element may be a hollow rod, which reflects light multiple times within the space enclosed by its solid interface, or a solid rod, which reflects light multiple times within a solid transparent medium; the input and output end faces of the solid rod are coated with antireflection films, while the inner surface of the hollow rod is coated with a reflection-enhancing film. In addition, the exit and entrance end faces of the solid medium element are arranged parallel to each other.
Optionally, the three-dimensional scanner further includes a timing control unit, where the timing control unit is connected to the image projection device 10 and the image acquisition device 20, and is configured to control the image projection device 10 to emit the preset fringe patterns corresponding to each preset period in each preset period, and control the image acquisition device 20 to acquire the light modulated by the target object in a plurality of preset periods, so as to acquire the fringe image corresponding to each preset fringe pattern.
That is, the three-dimensional scanner controls the image projection device 10 to emit the preset fringe pattern corresponding to each preset period in each preset period through the timing control unit, and controls the image capture device 20 to capture the light modulated by the target object in a plurality of preset periods, respectively, so as to obtain the fringe image corresponding to each preset fringe pattern.
That is, through the timing control unit, the three-dimensional scanner keeps the image projection apparatus 10 and the image acquisition apparatus 20 operating in synchronization.
Optionally, the three-dimensional scanner further includes a timing control unit, where the timing control unit is connected to the light source units 121 and the image acquisition device 20, and is configured to control the light source units 121 to emit light rays in different preset periods respectively, so as to generate initial light rays corresponding to the preset periods in each preset period respectively; and controlling the image capturing device 20 to capture the light modulated by the target object in a plurality of preset periods respectively, so as to obtain a fringe image corresponding to each of the initial light.
That is, the three-dimensional scanner controls, through the timing control unit, the plurality of light source units 121 to emit light beams in different preset periods, respectively, so as to generate preset fringe patterns projected onto the target object corresponding to the preset periods, and controls the image capturing device 20 to capture the light beams modulated by the target object in the preset periods, respectively, so as to obtain a fringe image corresponding to each of the initial light beams.
That is, through the timing control unit, the three-dimensional scanner keeps the plurality of light source units 121 and the image acquisition device 20 operating in coordination.
It should be noted that: the two kinds of timing control portions may be selectable examples of the present application, that is, the three-dimensional scanning apparatus includes: a first timing control unit or a second timing control unit, where the first timing control unit is connected to the image projection apparatus 10 and the image acquisition apparatus 20, and is configured to control the image projection apparatus 10 to emit a preset fringe pattern corresponding to each preset period in each preset period, and control the image acquisition apparatus 20 to acquire light modulated by the target object in a plurality of preset periods, respectively, so as to obtain a fringe image corresponding to each preset fringe pattern; the second timing control unit is connected to the light source units 121 and the image capturing device 20, and configured to control the light source units 121 to emit light rays in different preset periods, so as to generate initial light rays corresponding to the preset periods in each preset period; and controlling the image capturing device 20 to capture the light modulated by the target object in a plurality of preset periods respectively, so as to obtain a fringe image corresponding to each of the initial light.
Optionally, the three-dimensional scanner further includes an illuminator 30, wherein the image acquisition device 20 is further configured to acquire the illumination light reflected by the target object, so as to acquire texture data of the target object while the target object is illuminated by the illuminator 30.
Further, where the three-dimensional scanner includes the illuminating member 30, the image acquisition device 20 can identify and distinguish red, blue, and green light, so that it captures a texture image of the target object while the illuminating member 30 projects illumination light onto it, and a three-dimensional model with a color identical (or substantially identical) to that of the target object is generated from the texture image and the three-dimensional data; that is, true-color scanning is performed.
For example, the following steps are carried out: the illuminating member 30 may be an LED lamp emitting white light, and if the image projection apparatus 10 includes the DLP projection unit 11, the image projection apparatus 10 may be an integrated apparatus of the illuminating member 30 and the DLP projection unit 11 projecting illuminating light.
Further, in the case that the three-dimensional scanner further includes an illuminating member 30, the timing control unit is further connected to the illuminating member 30, and is configured to control the illuminating member 30 to project the illuminating light onto the target object, and control the image capturing device 20 to capture the texture map of the target object in the case that the illuminating member 30 projects the illuminating light onto the target object.
Further, where the three-dimensional scanner includes an illuminating member 30 connected to the timing control unit, the timing control unit controls the image projection device 10 and the illuminating member 30 to alternately project the preset stripe pattern and the illumination light onto the target object, and controls the image acquisition device 20 to capture the preset stripe pattern synchronously with the image projection device 10 and to capture the texture image synchronously with the illuminating member 30. Alternatively, the timing control unit controls the plurality of light source units 121 and the illuminating member 30 to alternately project the preset stripe pattern and the illumination light onto the target object, with the image acquisition device 20 capturing synchronously in the same manner.
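The alternating projection sequence described above can be sketched as follows (a minimal model with hypothetical labels: stripe frames feed geometry reconstruction, texture frames feed color mapping):

```python
# Hedged sketch of alternating true-color scanning: the timing control unit
# interleaves stripe-pattern projection with white-light illumination,
# tagging each capture by its purpose.

def alternate_sequence(n_pairs):
    """Yield (frame_kind, source) tuples for n alternating pairs:
    'stripe' frames are fringe images for 3D reconstruction,
    'texture' frames are color images for texture mapping."""
    seq = []
    for _ in range(n_pairs):
        seq.append(("stripe", "projector"))     # preset stripe pattern period
        seq.append(("texture", "illuminator"))  # white-light texture period
    return seq

frames = alternate_sequence(2)
```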
Optionally, the three-dimensional scanner further includes a mirror 40, and the mirror 40 is configured to change a transmission path of light.
For example, the following steps are carried out: the reflective mirror 40 is disposed on a transmission path of the preset stripe pattern, specifically, the preset stripe pattern is reflected to the target object by the reflective mirror 40, and then is modulated by the target object and then is reflected to the image acquisition device 20; in this case, the installation constraints of the image projection apparatus 10 and the image capture apparatus 20 can be reduced, and the size of the space required for the image projection apparatus 10 and the image capture apparatus 20 can be reduced.
For example, the following steps are carried out: the reflective mirror 40 is disposed on a transmission path of the light emitted from the light source units 121, and specifically, the reflective mirror 40 is used to change the transmission path of the light emitted from the light source units 121, so as to reduce installation constraints of the light source units 121 and reduce the size of the space required by the light source units 121.
Alternatively, where the three-dimensional scanner includes both an illuminating member 30 and a reflective mirror 40, and the reflective mirror 40 is disposed on the transmission path of the preset stripe pattern, the illuminating member 30 may be disposed on the outer circumference of the reflective mirror 40, as shown in fig. 3. The illuminating member 30 may also be disposed elsewhere in the scanner and cooperate with the reflective mirror 40 so that the illumination light is reflected onto the target object through the reflective mirror 40; for example, the illuminating member 30 may be disposed on the side of the first imaging lens 14 close to the light source unit 121, so that both the illumination light and the light projected by the light source unit 121 pass through the first imaging lens 14 and are reflected onto the target object by the reflective mirror 40.
For example, the following steps are carried out: the three-dimensional scanner includes a holding portion and an entrance portion provided at a front end of the holding portion, the image projection device 10 and the image capture device 20 are both mounted to the holding portion, the reflective mirror 40 is mounted to the entrance portion, and the illuminating member 30 may be mounted to the entrance portion or the holding portion.
The image acquisition device 20 is configured to acquire the light modulated by the target object while the target object is projected with the preset stripe patterns, so as to obtain a plurality of stripe images, wherein the acquired stripe images serve as code patterns for determining each stripe sequence and as reconstruction maps for three-dimensionally reconstructing the target object, generating three-dimensional data of the target object.
That is, in the case that the target object is projected with a preset stripe pattern, the projected preset stripe pattern is mapped on the target object, and the preset stripe pattern is deformed (i.e., modulated) based on the shape of the target object, at this time, the image acquisition device 20 acquires the deformed preset stripe pattern, and further acquires a stripe image, where the stripe image is used to determine each stripe sequence and perform three-dimensional reconstruction on the target object.
In an optional example, the image acquisition apparatus 20 further includes a plurality of cameras 21, including at least one black-and-white camera 21. The image acquisition apparatus 20 acquires the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images, wherein the stripe image obtained by at least one black-and-white camera 21 serves as a reconstruction map for three-dimensionally reconstructing the target object; and the stripe images obtained by at least a plurality of black-and-white cameras 21 serve as code patterns for determining each stripe sequence, and/or the stripe image obtained by at least one color camera 21 serves as a code pattern for determining each stripe sequence.
That is, the image capturing device 20 captures the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images, and the plurality of cameras 21 at least include one black-and-white camera 21, wherein the stripe image obtained by at least one black-and-white camera 21 is used as a reconstruction map to reconstruct the target object in three dimensions.
It should be noted that: the imaging resolution of the black-and-white camera 21 is higher than that of the color camera 21, so that the plurality of cameras 21 at least include one black-and-white camera 21, and the stripe image generated by the black-and-white camera 21 is used for three-dimensional reconstruction, thereby improving the accuracy of three-dimensional reconstruction of the target object.
Specifically, the three-dimensional reconstruction of the target object by using the fringe image obtained by at least one black-and-white camera 21 as a reconstruction map includes: a fringe image obtained by a black and white camera 21 is used as a reconstruction image to carry out three-dimensional reconstruction on the target object; the stripe images obtained by the plurality of black and white cameras 21 are used as reconstruction images to perform three-dimensional reconstruction on the target object; the stripe images obtained by one black and white camera 21 and at least one color camera 21 are used as reconstruction images to carry out three-dimensional reconstruction on the target object; the streak images obtained by the plurality of black and white cameras 21 and the at least one color camera 21 are used as reconstruction images to reconstruct the target object in three dimensions.
Specifically, the determining each stripe sequence by using the stripe images obtained by at least a plurality of black-and-white cameras 21 as the code pattern, and/or the determining each stripe sequence by using the stripe image obtained by at least one color camera 21 as the code pattern includes: the stripe images obtained by the black and white cameras 21 are used as encoding patterns to determine each stripe sequence; the fringe image obtained by the at least one color camera 21 is used as a code pattern to determine each fringe sequence; the fringe images obtained by the at least one color camera 21 and the at least one black and white camera 21 serve as coding patterns to determine the respective fringe sequences.
That is, the stripe information contained in the stripe image or images used as the code pattern must be sufficient to determine the coding sequence of each stripe; in other words, the code pattern is composed of stripe images from which the coding sequence of each stripe can be determined.
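A sketch of how a color-coded stripe image can determine stripe sequence numbers (an assumption, not the patent's algorithm: the preset color coding is taken to be a sequence in which every window of two consecutive stripe colors is unique, so the colors observed around a stripe identify its position):

```python
# Illustrative sketch: decode stripe sequence numbers from a color-coded
# stripe image. The specific coding sequence below is invented; its only
# required property is that every 2-color window occurs once.

CODE = ["red", "green", "blue", "red", "blue", "green"]  # preset color coding stripes

def stripe_index(window, code=CODE):
    """Return the position of a 2-color window within the preset coding,
    i.e. the sequence number of its first stripe, or None if not found."""
    for i in range(len(code) - 1):
        if (code[i], code[i + 1]) == tuple(window):
            return i
    return None

# A camera observing adjacent stripes colored (blue, red) can label the
# first of them as stripe number 2 of the coding sequence.
idx = stripe_index(("blue", "red"))
```

With stripe indices determined from the code pattern, corresponding stripes in the reconstruction map can then be matched between views for triangulation.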
Optionally, the camera 21 may be a CCD camera or a CMOS camera. The camera form is not specifically limited in the present application, and those skilled in the art may substitute accordingly based on technical requirements.
It should be noted that: the CCD camera has the characteristics of small volume, light weight, no influence of a magnetic field and vibration and impact resistance, so that the volume of the three-dimensional scanner can be correspondingly reduced under the condition that the three-dimensional scanner adopts the 2CCD camera to acquire a fringe image, the three-dimensional scanner is convenient to use by hands, and the three-dimensional scanner is applied to an environment to be scanned (such as an oral cavity) with a small space.
For example, the following steps are carried out: projecting a preset stripe image A designed in advance to a target object in a period a of a preset period, projecting a preset stripe image B designed in advance to the target object in a period B of the preset period, and controlling an image acquisition device 20 to quickly acquire an image of the target object with the preset stripe image, wherein cameras 21 included in the image acquisition device 20 respectively acquire different stripe images, for example, a camera 211 is a color camera 21 for acquiring a color stripe image in the case that the target object is projected with the preset stripe pattern A; the camera 212 is a black and white camera 21 for acquiring a black and white stripe image in a case where the target object is projected with a preset stripe pattern B.
The color stripe image and the black-and-white stripe image are then transmitted to a computer terminal, where the color stripe image is used as coding information and the black-and-white stripe image as a reconstruction map, so as to acquire the three-dimensional topography of the target object.
In an optional example, the image acquisition device 20 further includes a light beam processing device 22, which has a light inlet portion and at least two light outlet portions, each camera 21 being disposed to correspond to a different light outlet portion; the image acquisition device 20 acquires the light modulated by the target object through the light beam processing device 22.
The image capturing device 20 further includes a second imaging lens 23, the second imaging lens 23 is disposed corresponding to the light entering portion of the light beam processing device 22, wherein the light collected by the image capturing device 20 is emitted to the light entering portion of the light beam processing device 22 through the second imaging lens 23 to reach different light exiting portions of the light beam processing device 22.
That is, by providing the light beam processing device 22, the plurality of cameras 21 can each image based on coaxial light incident through the same second imaging lens 23, so that the stripe patterns acquired by the cameras share a uniform field of view and viewing angle. Specifically, the second imaging lens 23 is disposed at the light inlet of the light beam processing device 22, whose light outlet portions correspond one-to-one with the cameras 21; the light beam processing device 22 adjusts the direction and/or separates the wavelength bands of the incident light, so that each camera 21 images based on light from the same incident direction and, where required, on light of a designated wavelength band.
By way of example: as shown in fig. 4, light from the target object enters through the light inlet portion of the light beam processing device 22; the light beam processing device 22 splits the image light of the target object so that it exits from at least two light outlet portions and is projected onto the plurality of cameras 21. In this case, the stripe images collected by the plurality of cameras 21 are all acquired from the same viewing angle.
Optionally, the light beam processing device 22 further includes at least one first light beam splitting unit, where the first light beam splitting unit is configured to perform light splitting processing on the light beams projected from the light inlet portion, so that the light beams are respectively projected from the at least two light outlet portions to the camera 21 correspondingly disposed on the light outlet portion.
That is, the light beam processing device 22 separates the received light into beams projected in a plurality of directions through the first beam splitting unit. By way of example: after processing by the first light beam separation unit, a single beam containing red and blue light forms two beams, each containing red and blue light, emitted in different directions.
Optionally, the light beam processing apparatus 22 further includes at least one second light beam splitting unit, configured to split the light to be acquired by a designated camera 21 so that the designated camera 21 acquires light of a designated wavelength band, the designated wavelength band including at least one of the wavelength bands contained in an initial light.
That is, the light beam processing device 22 can separate out part of the wavelength bands of the received light through the second beam splitting unit. By way of example: a beam containing red and blue light, after processing by the second light beam separation unit, forms a beam containing only blue light.
It should be noted that: the first beam splitting unit and the second beam splitting unit in the present application may be integrated into one physical unit, or each unit may exist separately and physically.
For example, the following steps are carried out: the first beam splitting unit may be a half-reflecting and half-transmitting prism 22 c; the second beam splitting unit may be a filter 22 d; the first beam splitting unit and the second beam splitting unit may be integrated in a right-angle two-channel dichroic prism 22 a; the first beam splitting unit and the second beam splitting unit may be integrated in the three-channel dichroic prism 22 b.
By way of example: in period a of the preset period, the image projection apparatus 10 projects a pre-designed preset stripe pattern A, formed by combining blue and green stripes, onto the target object. When camera 211 of the image acquisition apparatus 20 captures the light modulated by the target object, the second light beam separation unit corresponding to camera 211 separates the light so that camera 211 obtains the green and blue light; preferably, camera 211 can acquire only green and blue light.
Preferably, the plurality of cameras 21 included in the image capturing device 20 correspond one-to-one to the plurality of preset stripe patterns; that is, each camera 21 can identify light of the colors of the stripes contained in its corresponding preset stripe pattern.
Optionally, the number of stripe colors in the reconstructed image is smaller than the number of stripe colors in the preset color-coded stripes, so that the distance between adjacent stripes is not too small; this avoids the problem that stripes cannot be accurately matched during stripe matching because the spacing is too small. Preferably, the reconstructed image consists of stripes of only one color. Preferably, the reconstructed image is acquired by a black-and-white camera 21. Preferably, the reconstructed image is a black-and-white stripe image generated by blue light only, since blue light has higher interference resistance and higher stability than light of other colors.
It should be noted that the three-dimensional scanner may further include a heat dissipation system, a heating anti-fogging system, a software algorithm system, and the like. The heat dissipation system prevents the inside of the three-dimensional scanner from overheating and damaging the scanner; the heating anti-fogging system prevents the optical instruments in the three-dimensional scanner from fogging, which would otherwise make it impossible to acquire accurate stripe images; the software algorithm system is configured to perform three-dimensional reconstruction of the target object according to the at least one fringe image acquired by the image acquisition device 20.
In summary, the three-dimensional scanner provided in the embodiments of the present application, based on a stripe extraction algorithm using spatial coding, eliminates the need for dynamic projection and achieves three-dimensional reconstruction of the target object with only a small number of two-dimensional images, thereby solving the technical problems in the related art that three-dimensional reconstruction methods require high hardware cost, which hinders the popularization of three-dimensional scanners.
In addition, the three-dimensional scanner improves three-dimensional identification accuracy by using color as the spatial coding information.
In order to make the technical solutions of the present application more clearly understood by those skilled in the art, the following description will be given with reference to specific embodiments:
Embodiment one:
taking fig. 1 as an example, the light beam processing device 22 includes a right-angle two-channel dichroic prism 22a, and the right-angle two-channel dichroic prism 22a includes a third light emitting portion and a fourth light emitting portion, wherein the light beam processing device 22 implements a light splitting process on the light projected from the light entering portion through the right-angle two-channel dichroic prism 22a, so that the light is projected from the third light emitting portion and the fourth light emitting portion to the cameras 21 corresponding to the respective light emitting portions.
Correspondingly, the image capturing device 20 includes a third camera 21 disposed corresponding to the third light-emitting portion and a fourth camera 21 disposed corresponding to the fourth light-emitting portion. The third camera 21 generates a third fringe image based on the collected light, and the fourth camera 21 generates a fourth fringe image based on the collected light, where the third fringe image and the fourth fringe image each include fringes of at least two colors, and the fringes of at least two colors are identifiable.
It should be noted that requiring the third fringe image and the fourth fringe image to each include fringes of at least two colors serves to distinguish the fringes from one another by color, and is not a limitation on which colors are used.
In addition, the light beam processing device 22 implements, through the right-angle two-channel dichroic prism 22a, separation processing on light rays obtained by the designated camera 21, so that the designated camera 21 obtains light rays containing a designated wavelength band, where the obtaining of light rays containing a designated wavelength band by the designated camera 21 includes: the third camera 21 acquires light of a third specified wavelength band, and the fourth camera 21 acquires light of a fourth specified wavelength band.
The following examples illustrate:
preferably, the third camera 21 is a black and white camera 21, and the fourth camera 21 is a color camera 21.
The light emitting part 12 emits red light toward the light transmitting part 13 in a first time period, and the red light passes through the preset pattern arranged on the light transmitting part 13 to generate a first preset stripe pattern. The first preset stripe pattern is projected onto the target object in the form of red coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. In this embodiment, the right-angle two-channel dichroic prism 22a separates red light from green and blue light, so that the red light exits from the third light emitting portion while the green light and blue light exit from the fourth light emitting portion. The red coded stripes therefore pass through the right-angle two-channel dichroic prism 22a, exit from the third light-emitting portion, and are collected by the black-and-white camera 21, which generates a third stripe image containing the red stripes;
the light emitting part 12 emits green light and blue light toward the light transmitting part 13 in a second time period, and the green and blue light pass through the preset pattern arranged on the light transmitting part 13 to generate a second preset stripe pattern. The second preset stripe pattern is projected onto the target object in the form of green-blue coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. The green-blue coded stripes pass through the right-angle two-channel dichroic prism 22a, exit from the fourth light-emitting portion, and are collected by the color camera 21, which generates a fourth stripe image containing the green stripes and the blue stripes.
The illuminating part 30 projects illumination light onto the target object in an eighth time period; the illumination light is reflected by the target object and transmitted to the image processing device. The blue light and green light in the illumination light are collected by the color camera 21 to generate a fourth texture map, the red light is collected by the black-and-white camera 21 to generate a third texture map, and the third texture map and the fourth texture map are combined into the texture map of the target object. It can be seen that, to obtain the texture map of the target object, red, green and blue light must all be collected and recognized, either all by the color camera 21, or jointly by the color camera 21 and the black-and-white camera 21, i.e., part of the colored light is collected and recognized by the color camera 21 and part by the black-and-white camera 21.
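The channel-merging step described above can be sketched as follows. This is a toy per-pixel illustration with an assumed function name, not the patent's algorithm:

```python
# Toy sketch: the black-and-white camera contributes the red channel of a
# texture pixel, the color camera contributes the green and blue channels;
# merging the two yields one full-color texture pixel.
def merge_texture_pixel(red_from_mono, green_blue_from_color):
    g, b = green_blue_from_color
    return (red_from_mono, g, b)

print(merge_texture_pixel(120, (80, 60)))  # -> (120, 80, 60)
```

Applied over every pixel of the third and fourth texture maps, this yields the combined texture map of the target object.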
Further, since the third fringe image and the fourth fringe image both correspond to the same light transmitting portion 13, the fringes in the third fringe image and the fourth fringe image correspond to each other; specifically, after the two images are merged in the same coordinate system, their fringes correspond to the preset color-coded stripes on the light transmitting portion 13.
Specifically, the third fringe image serves as the reconstruction image and the fourth fringe image serves as the coding image. The fourth fringe image is collected by the color camera 21, and both the green fringes and the blue fringes in it can be identified, so the coding sequence of each fringe in the fourth fringe image can be determined. Based on the fringe correspondence between the third and fourth fringe images, every fringe of the third fringe image can then be identified and matched through the coding sequence of the fourth fringe image, thereby realizing three-dimensional reconstruction.
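The matching logic of this paragraph can be sketched as below. The stripe representation and function name are illustrative assumptions; the point is only that, because both images come from one projected pattern, codes identified in the coding image transfer to the reconstruction image by stripe order:

```python
# Sketch: the color camera identifies each stripe's color in the coding
# image, giving a code sequence; the i-th stripe of the reconstruction
# image (known only by its position) inherits the i-th code.
def transfer_codes(coding_stripe_colors, recon_stripe_positions):
    assert len(coding_stripe_colors) == len(recon_stripe_positions)
    return list(zip(recon_stripe_positions, coding_stripe_colors))

# Three stripes coded G, B, G; positions measured in the mono image:
print(transfer_codes(["G", "B", "G"], [12.5, 40.0, 71.3]))
# -> [(12.5, 'G'), (40.0, 'B'), (71.3, 'G')]
```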
Preferably, the black-and-white camera 21 obtains only monochromatic light, so that the fringes in the third fringe image can be identified and, combined with the fourth fringe image, used to determine the coding sequence of each fringe; that is, the third fringe image and the fourth fringe image are used together as the coding pattern.
In addition, the optical filter 22d may or may not be provided in this embodiment; when provided, the filter 22d operates in cooperation with the right-angle two-channel dichroic prism 22a.
It is worth emphasizing that: in this embodiment, the light beam processing device 22 implements a light splitting process on the light projected from the light inlet portion through the right-angle two-channel dichroic prism 22a, so that the light is projected from the third light outlet portion and the fourth light outlet portion to the cameras 21 respectively disposed corresponding to the light outlet portions; that is, the light beam processing device 22 realizes the function corresponding to the first light beam splitting unit by the right-angle two-channel dichroic prism 22 a.
For the same reason, it is also worth emphasizing: in this embodiment, the light beam processing device 22 further performs separation processing on the light rays obtained by the designated camera 21 through the right-angle two-channel dichroic prism 22a, so that the designated camera 21 obtains the light rays containing the designated wavelength band; that is, the light beam processing device 22 realizes the function corresponding to the second light beam splitting unit by the right-angle two-channel dichroic prism 22 a.
For example, suppose the right-angle two-channel dichroic prism 22a passes red and green light to the third light emitting portion and blue light to the fourth light emitting portion. When a light beam containing red, green and blue light passes through the right-angle two-channel dichroic prism 22a, the red and green light are separated from the blue light: the red and green light exit through the third light emitting portion, and the blue light exits through the fourth light emitting portion.
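This routing can be modeled as a fixed band-to-port map. The sketch below is a hedged illustration (the dictionary representation and port names are assumptions for this example only):

```python
# Assumed model of the right-angle two-channel dichroic prism: red and
# green light exit at the third light emitting portion, blue at the fourth.
PORT_OF = {"r": "third", "g": "third", "b": "fourth"}

def route(beam_channels):
    """Distribute the channels of an incoming beam across the two ports."""
    out = {"third": set(), "fourth": set()}
    for ch in beam_channels:
        out[PORT_OF[ch]].add(ch)
    return out

# An RGB beam splits into red+green at the third port, blue at the fourth:
assert route({"r", "g", "b"}) == {"third": {"r", "g"}, "fourth": {"b"}}
```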
Embodiment two:
taking fig. 5 as an example, the light beam processing device 22 includes a three-channel dichroic prism 22b, and the three-channel dichroic prism 22b includes a fifth light emitting portion, a sixth light emitting portion, and a seventh light emitting portion, wherein the light beam processing device 22 implements a light splitting process on the light rays projected from the light entering portion through the three-channel dichroic prism 22b, so that the light rays are projected to the cameras 21 respectively disposed corresponding to the light emitting portions from the fifth light emitting portion, the sixth light emitting portion, and the seventh light emitting portion.
Correspondingly, the image capturing device 20 includes a fifth camera 21 disposed corresponding to the fifth light-emitting portion, a sixth camera 21 disposed corresponding to the sixth light-emitting portion, and a seventh camera 21 disposed corresponding to the seventh light-emitting portion, the fifth camera 21 generates a fifth stripe image based on the collected light, the sixth camera 21 generates a sixth stripe image based on the collected light, the seventh camera 21 generates a seventh stripe image based on the collected light, and the fifth stripe image, the sixth stripe image, and the seventh stripe image each include stripes of at least two colors and the stripes of at least two colors are identifiable.
It should be noted that requiring the fifth, sixth and seventh stripe images to each include stripes of at least two colors serves to distinguish the stripes from one another by color, and is not a limitation on which colors are used.
At this time, the light beam processing device 22 implements separation processing on the light rays obtained by the designated camera 21 through the three-channel dichroic prism 22b, so that the designated camera 21 can obtain the light rays containing the designated wavelength band, where the obtaining of the light rays containing the designated wavelength band by the designated camera 21 at least includes: the fifth camera 21 acquires light of a fifth specified wavelength band, and the sixth camera 21 acquires light of a sixth specified wavelength band, where the fifth specified wavelength band is different from the sixth specified wavelength band.
Preferably, at least one of the fifth camera 21, the sixth camera 21 and the seventh camera 21 is a black-and-white camera 21, specifically, the fifth camera 21 is a black-and-white camera 21, and the sixth camera 21 and the seventh camera 21 are color cameras 21; or, the fifth camera 21 and the sixth camera 21 are black and white cameras 21, and the seventh camera 21 is a color camera 21; alternatively, the fifth camera 21, the sixth camera 21, and the seventh camera 21 are all black-and-white cameras 21.
The following examples illustrate:
preferably, the fifth camera 21, the sixth camera 21 and the seventh camera 21 are all black and white cameras 21.
The light emitting part 12 emits red light toward the light transmitting part 13 in a third time period, and the red light passes through the preset color-coded stripes arranged on the light transmitting part 13 to generate a third preset stripe pattern. The third preset stripe pattern is projected onto the target object in the form of red coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. In this embodiment, the light beam processing device is a three-channel dichroic prism 22b that separates the three colors red, green and blue, so that the red light exits from the fifth light emitting portion, the blue light exits from the sixth light emitting portion, and the green light exits from the seventh light emitting portion. The red coded stripes are thus separated by the three-channel dichroic prism 22b, exit through the fifth light-emitting portion, and are collected by the fifth camera 21, which generates a fifth stripe image containing the red stripes;
the light emitting part 12 emits blue light toward the light transmitting part 13 in a fourth time period, and the blue light passes through the preset pattern arranged on the light transmitting part 13 to generate a fourth preset stripe pattern. The fourth preset stripe pattern is projected onto the target object in the form of blue coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. The blue coded stripes are separated by the three-channel dichroic prism 22b, exit through the sixth light-emitting portion, and are collected by the sixth camera 21, which generates a sixth stripe image containing the blue stripes.
The light emitting part 12 emits green light toward the light transmitting part 13 in a fifth time period, and the green light passes through the preset pattern arranged on the light transmitting part 13 to generate a fifth preset stripe pattern. The fifth preset stripe pattern is projected onto the target object in the form of green coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. The green coded stripes are separated by the three-channel dichroic prism 22b, exit through the seventh light-emitting portion, and are collected by the seventh camera 21, which generates a seventh stripe image containing the green stripes.
The illuminating part 30 projects illumination light onto the target object in a ninth time period; the illumination light is reflected by the target object and transmitted to the image processing device. The red light in the illumination light is collected by the fifth camera 21 to generate a fifth texture map, the blue light is collected by the sixth camera 21 to generate a sixth texture map, and the green light is collected by the seventh camera 21 to generate a seventh texture map; the fifth, sixth and seventh texture maps are combined into the texture map of the target object. It can be seen that, to obtain the texture map of the target object, red, green and blue light must all be collected and recognized: all by a color camera 21; jointly by a color camera 21 and a black-and-white camera 21, with part of the colored light collected and recognized by each; or entirely by black-and-white cameras 21, with each colored light collected and recognized separately by one black-and-white camera 21.
Further, since the fifth, sixth and seventh stripe images all correspond to the same light transmitting portion 13, the stripes in these three images correspond to one another; specifically, after the fifth, sixth and seventh stripe images are merged, they correspond to the preset pattern on the light transmitting portion 13.
Specifically, any combination of the fifth, sixth and seventh stripe images may serve as the reconstruction image, and any combination of them may serve as the coding image. Preferably, the fifth, sixth and seventh stripe images are used together as the coding pattern to determine the coding sequence of each stripe, and are also used together as the reconstruction image to realize three-dimensional reconstruction.
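The preferred scheme above — three black-and-white cameras, each observing one color — can be sketched like this. The binary-mask representation is an assumption made for illustration, not the patent's data format:

```python
# Sketch: each black-and-white camera yields a binary mask marking where
# its color's stripes appear along a scanline; overlaying the three masks
# recovers the color order of the preset color-coded stripes.
def recover_code(red_mask, green_mask, blue_mask):
    out = []
    for r, g, b in zip(red_mask, green_mask, blue_mask):
        out.append("R" if r else "G" if g else "B" if b else "-")
    return "".join(out)

# Scanline where stripe 1 is red, stripe 2 blue, stripe 3 green:
print(recover_code([1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0]))  # -> "RBG-"
```

The recovered sequence then plays the role of the coding pattern for stripe matching.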
In addition, the optical filter 22d may or may not be provided in this embodiment; when provided, the filter 22d operates in cooperation with the three-channel dichroic prism 22b.
It is worth emphasizing that: in this embodiment, the light beam processing device 22 implements, through the three-channel dichroic prism 22b, a light splitting process on the light projected from the light inlet portion, so that the light is projected from the fifth light outlet portion, the sixth light outlet portion, and the seventh light outlet portion to the cameras 21 respectively disposed corresponding to the light outlet portions; that is, the light beam processing device 22 realizes the function corresponding to the first light beam splitting unit through the three-channel dichroic prism 22 b.
Similarly, in this embodiment, the light beam processing device 22 further performs separation processing on the light beam obtained by the designated camera 21 through the three-channel dichroic prism 22b, so that the designated camera 21 obtains the light beam containing the designated wavelength band; that is, the light beam processing device 22 realizes the function corresponding to the second light beam splitting unit through the three-channel dichroic prism 22 b.
Embodiment three:
taking fig. 6 as an example, the light beam processing device 22 includes a half-reflecting and half-transmitting prism 22c, and the half-reflecting and half-transmitting prism 22c includes a first light emitting portion and a second light emitting portion, wherein the light beam processing device 22 implements a light splitting process on the light projected from the light inlet portion through the half-reflecting and half-transmitting prism 22c, so that the light is projected from the first light emitting portion and the second light emitting portion to the cameras 21 corresponding to the respective light emitting portions;
correspondingly, the image capturing device 20 includes a first camera 21 disposed corresponding to the first light emitting portion, and a second camera 21 disposed corresponding to the second light emitting portion, where the first camera 21 generates a first stripe image based on the collected light, and the second camera 21 generates a second stripe image based on the collected light, where the first stripe image and the second stripe image each include stripes of at least two colors and the stripes of at least two colors are identifiable.
It should be noted that requiring the first stripe image and the second stripe image to each include stripes of at least two colors serves to distinguish the stripes from one another by color, and is not a limitation on which colors are used.
In addition, in an embodiment, the light beam processing apparatus 22 further includes a filter 22d, wherein the light beam processing apparatus 22 performs separation processing on the light rays to be acquired by the designated camera 21 through the filter 22d, so that the designated camera 21 acquires the light rays containing the designated wavelength band, and at least one of the plurality of cameras 21 is the designated camera 21.
In an alternative example, the optical filter 22d is disposed between the first light emitting portion and the first camera 21 so that the first camera 21 acquires light of the first specified wavelength band, and/or disposed between the second light emitting portion and the second camera 21 so that the second camera 21 acquires light of the second specified wavelength band.
The following examples illustrate:
preferably, the first camera 21 is a black-and-white camera 21, the second camera 21 is a color camera 21, and the black-and-white camera 21 is disposed corresponding to the filter 22 d.
The light emitting part 12 emits red light toward the light transmitting part 13 in a sixth time period, and the red light passes through the preset pattern (i.e., the preset coded stripes) arranged on the light transmitting part 13 to generate a sixth preset stripe pattern. The sixth preset stripe pattern is projected onto the target object in the form of red coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. The red coded stripes are split by the half-reflecting and half-transmitting prism 22c into two red light beams, at least one of which is collected by the black-and-white camera 21 to generate a first stripe image.
In addition, the light is filtered by a red filter 22d before being collected by the black-and-white camera 21. That is, the pass color of a filter 22d disposed in front of a camera 21 corresponds to the color of the light beam that camera 21 is to collect.
The light emitting part 12 emits red light and blue light toward the light transmitting part 13 in a seventh time period, and the red and blue light pass through the preset pattern arranged on the light transmitting part 13 to generate a seventh preset stripe pattern. The seventh preset stripe pattern is projected onto the target object in the form of red-blue coded stripes, and the light is modulated by the target object and then transmitted to the image processing device. The red-blue coded stripes are split by the half-reflecting and half-transmitting prism 22c into two red-blue light beams, at least one of which is collected by the color camera 21 to generate a second stripe image.
The illuminating part 30 projects illumination light onto the target object in a tenth time period; the illumination light is reflected by the target object and transmitted to the image processing device, and the red, blue and green light in the illumination light is collected by the second camera 21 to generate the texture map. In this embodiment, if an optical filter 22d is disposed in front of the color camera 21, then to obtain the texture map of the target object, the red, green and blue light must be collected and recognized jointly by the color camera 21 and the black-and-white camera 21, i.e., part of the colored light is collected and recognized by the color camera 21 and part by the black-and-white camera 21.
Further, since the first stripe image and the second stripe image both correspond to the same light transmitting portion 13, the stripes in the first stripe image and the second stripe image correspond to each other; specifically, after the two images are merged, they correspond to the preset pattern on the light transmitting portion 13.
Specifically, the first stripe image serves as the reconstruction image and the second stripe image serves as the coding image. The second stripe image is collected by the color camera 21, and both the red stripes and the blue stripes in it can be identified, so the coding sequence of each stripe in the second stripe image can be determined. Based on the stripe correspondence between the first and second stripe images, every stripe of the first stripe image can be identified and matched through the coding sequence of the second stripe image, thereby realizing three-dimensional reconstruction.
It should be noted that disposing the filter 22d in front of the black-and-white camera 21 is only an optional example; whether a filter 22d is disposed in front of a given camera 21 is not specifically limited in the present application, as long as at least two colors of stripes remain identifiable across the stripe images acquired by the cameras 21.
Specifically: if no optical filter 22d is arranged in front of the black-and-white camera 21, the first stripe image acquired by the black-and-white camera 21 includes red stripes. If a blue filter 22d is disposed in front of the color camera 21, the second stripe image acquired by the color camera 21 includes blue stripes. Because the light emitting part 12 emits red light in the sixth time period and red and blue light in the seventh time period, a red filter 22d cannot be disposed in front of the color camera 21: the stripe images acquired by the black-and-white camera 21 and the color camera 21 would then contain only red stripes, and at least two identifiable stripe colors could not be guaranteed. Alternatively, a two-color filter 22d may be disposed in front of the color camera 21, in which case the second stripe image acquired by the color camera 21 includes both red stripes and blue stripes.
It should be noted that, within each period, the preset fringe pattern and the illumination light are projected at a very small time interval, ensuring that the three-dimensional scanner remains stationary or substantially stationary during the period and that the preset fringe pattern and the illumination light are projected onto (substantially) the same region of the target object.
It is worth emphasizing that: in this embodiment, the light beam processing device 22 transmits and reflects light through the half-reflecting and half-transmitting prism 22c to realize light splitting processing on the light projected from the light inlet portion, so that the light is projected from the first light outlet portion and the second light outlet portion to the cameras 21 respectively arranged corresponding to the light outlet portions; that is, the light beam processing device 22 realizes the function corresponding to the first light beam splitting unit through the half-reflecting and half-transmitting prism 22 c.
At the same time, it is also worth emphasizing: in this embodiment, the light beam processing device 22 performs separation processing on the light rays to be acquired by the designated camera 21 through the optical filter 22d, so that the designated camera 21 acquires the light rays containing the designated wavelength band; that is, the light beam processing device 22 realizes the function corresponding to the second light beam splitting unit through the filter 22 d.
It should be noted that embodiments one, two and three are listed in the present application as exemplary illustrations so that those skilled in the art can more clearly understand the technical solution; the present application is not specifically limited thereto. Other specific devices may also be used to implement the present application, provided they satisfy the functional description of the beam processing device 22 given herein.
Further, it should be noted that, for example in embodiments one and two, after the light beam processing device 22 realizes the function of the second light beam splitting unit through the right-angle two-channel dichroic prism 22a or the three-channel dichroic prism 22b, it may additionally realize the function of the second light beam splitting unit again through an optical filter 22d.
In summary, compared with the prior art, the invention has the following beneficial effects:
1. The stripe extraction algorithm based on spatial coding achieves three-dimensional reconstruction of the target object with only a small number of two-dimensional images, reducing the required frame rate of the camera 21 and the computational cost of the algorithm;
2. Using color as the spatial coding information makes the coded information easy to identify, improving identification accuracy;
3. Based on its technical principle, the three-dimensional scanner can perform pattern projection by simple transmissive projection; furthermore, when the three-dimensional scanner performs pattern projection by transmissive projection, hardware cost is greatly reduced;
4. When the three-dimensional scanner performs pattern projection using a laser as the light source, the brightness and depth of field of the projection device (i.e., the combination of the light emitting part 12 and the light transmitting part 13) can be improved, achieving low cost, high brightness and large depth of field.
That is, the three-dimensional scanner provided by the present application offers low hardware cost, a low real-time frame-rate requirement, an optical system with high brightness and large depth of field, and a compact device; moreover, it can directly perform dynamic, real-time, color-textured three-dimensional scanning of reflective, translucent and light-diffusing materials such as teeth and gums in the mouth.
According to an embodiment of the application, a three-dimensional scanning system is also provided.
FIG. 7 is a schematic diagram of a three-dimensional scanning system according to an embodiment of the present application. As shown in fig. 7, the three-dimensional scanning system includes: a three-dimensional scanner 71 and an image processor 73.
The three-dimensional scanner 71 is configured to project a preset stripe pattern corresponding to each preset period to a target object in each preset period, and to collect light modulated by the target object while the preset stripe pattern is projected onto it, so as to obtain a plurality of stripe images, where the stripes of each preset stripe pattern are arranged according to a preset color coding stripe, each preset stripe pattern includes stripes of at least one color of the preset color coding stripe, the plurality of preset stripe patterns include stripes of at least two colors of the preset color coding stripe, and the stripes in the preset stripe patterns are arranged consistently with the stripes of the same color in the preset color coding stripe;
the image processor 73 is connected to the three-dimensional scanner 71 and configured to acquire the plurality of stripe images collected by the three-dimensional scanner 71, to use the stripe images as a coding map to determine each stripe sequence, and to use them as a reconstruction map to perform three-dimensional reconstruction of the target object.
it should be noted that: the three-dimensional scanner 71 is any one of the three-dimensional scanners provided in the above embodiments.
It should also be noted that the three-dimensional scanning system, based on the stripe extraction algorithm of spatial coding, enables the three-dimensional scanner 71 to perform pattern projection processing in a simple transmission projection mode and to achieve three-dimensional reconstruction of the target object with only a small number of two-dimensional images, thereby solving the technical problem that three-dimensional reconstruction methods in the related art require high hardware cost, which is not conducive to the popularization and use of three-dimensional scanning devices.
In addition, the three-dimensional scanning system also improves the three-dimensional identification accuracy by using the color as the space coding information.
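As an illustration of how a preset color coding stripe pattern of this kind might be represented in software, the following minimal sketch builds a pattern as a grid of RGB pixels. The stripe colors, the stripe width, and the `make_stripe_pattern` helper are all hypothetical choices for illustration; the application only requires that the stripes follow the preset color coding sequence.

```python
# Hypothetical sketch: a preset color-coded stripe pattern as a 2-D grid of
# RGB tuples. Vertical stripes follow the given color coding sequence.
RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

def make_stripe_pattern(color_sequence, stripe_width, height):
    """Return a height x (len(color_sequence) * stripe_width) pixel grid
    whose vertical stripes follow the color coding sequence."""
    row = []
    for color in color_sequence:
        row.extend([color] * stripe_width)  # each stripe spans stripe_width columns
    return [list(row) for _ in range(height)]

# e.g. the red/blue coding used in one of the optional examples below
pattern = make_stripe_pattern([RED, BLUE, RED, BLUE], stripe_width=4, height=2)
```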
In an optional example, in a case where the three-dimensional scanner 71 collects the light modulated by the target object through a plurality of cameras 21 to obtain a plurality of stripe images, and the plurality of cameras 21 includes at least one black-and-white camera 21, the image processor 73 is further configured to: use the stripe image obtained by at least one black-and-white camera 21 as a reconstruction map to perform three-dimensional reconstruction of the target object; and use the stripe images obtained by at least one black-and-white camera 21 as coding maps to determine each stripe sequence, and/or use the stripe images obtained by at least one color camera 21 as coding maps to determine each stripe sequence.
According to the embodiment of the application, a three-dimensional scanning method is further provided.
It should be noted that the three-dimensional scanning method provided in the embodiment of the present application is applied to the three-dimensional scanner provided in the embodiments of the present application. The three-dimensional scanning method is described below.
Fig. 8 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 8, the three-dimensional scanning method includes:
step S801, respectively emitting initial light corresponding to each preset period in each preset period, where each initial light is composed of light of at least one color in a preset color coding stripe, and after each initial light is transmitted through a pattern of the preset color coding stripe arranged on the light transmission portion 13, each initial light generates a corresponding preset color stripe and projects the corresponding preset color stripe onto a target object;
step S803, respectively collecting light modulated by the target object in the preset periods, and obtaining a plurality of fringe images based on the light, where the obtained fringe images are used as a code pattern to determine fringe sequences, and are used as a reconstruction pattern to perform three-dimensional reconstruction on the target object;
step S805, determining a sequence of each stripe in the plurality of stripe images based on the coding pattern;
step S807, three-dimensional reconstruction is performed on the reconstructed map based on the sequence, and three-dimensional data of the target object is acquired.
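The control flow of steps S801 through S807 can be outlined as follows. The `projector` and `camera` interfaces and the `decode_sequences` and `reconstruct` callbacks are hypothetical stand-ins for the hardware and for the algorithms of steps S805 and S807; this is an illustrative sketch of the sequence of steps, not the patented implementation.

```python
# Hedged sketch of steps S801-S807: project one preset pattern per preset
# period, capture one fringe image per period, decode the stripe sequences
# from the coding map, then reconstruct the target object.
def scan(projector, camera, patterns, decode_sequences, reconstruct):
    images = []
    for pattern in patterns:             # S801: one preset pattern per period
        projector.project(pattern)
        images.append(camera.capture())  # S803: fringe image for that period
    sequences = decode_sequences(images)    # S805: stripe sequences from the coding map
    return reconstruct(images, sequences)   # S807: three-dimensional data
```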
In summary, the three-dimensional scanning method provided in the embodiment of the present application, based on the stripe extraction algorithm of spatial coding, enables the three-dimensional scanner to perform pattern projection processing in a simple transmission projection mode and to achieve three-dimensional reconstruction of the target object with only a small number of two-dimensional images, thereby solving the technical problem that three-dimensional reconstruction methods in the related art require high hardware cost, which is not conducive to the popularization and use of three-dimensional scanning devices.
In addition, the three-dimensional scanning method also achieves the technical effect of improving the three-dimensional identification accuracy by taking the color as the information of the space coding.
In an optional example, the three-dimensional scanning method further comprises: projecting illumination light onto a target object and acquiring texture data of the target object based on the illumination light; and acquiring the color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
Alternatively, the texture data may be acquired by a single camera 21, or may be synthesized by data acquired by a plurality of cameras 21.
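One common way to combine the three-dimensional data with the texture data into color three-dimensional data is to project each reconstructed point into the texture image and sample its color. The sketch below assumes an idealized pinhole camera; `fx`, `fy`, `cx`, `cy` are hypothetical intrinsic parameters not given in the application, and a real system would also account for lens distortion and camera-to-camera calibration.

```python
# Hedged sketch: color each 3-D point by projecting it into the texture
# image with an idealized pinhole model (hypothetical intrinsics).
def colorize(points, texture, fx, fy, cx, cy):
    colored = []
    for x, y, z in points:
        u = int(fx * x / z + cx)   # pixel column of the projected point
        v = int(fy * y / z + cy)   # pixel row of the projected point
        if 0 <= v < len(texture) and 0 <= u < len(texture[0]):
            colored.append(((x, y, z), texture[v][u]))
    return colored
```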
Preferably, in step S803, light modulated by the target object is collected, and at least two stripe images are acquired based on the light, wherein at least one stripe image is acquired by the black-and-white camera 21, and the stripe image acquired by the black-and-white camera 21 is used as a reconstructed image.
Specifically, in step S805, the sequence of each stripe in the plurality of stripe images is determined based on the coding map; that is, a coding sequence is determined based on the arrangement information and the color information of each stripe in the coding map. For example, if four stripes arranged as red, green, green and red are coded with red as (1,0) and green as (0,1), the coding sequence after decoding is (1,0), (0,1), (0,1), (1,0); and if five stripes arranged as red, blue, green, blue and red are coded with red as (1,0,0), green as (0,1,0) and blue as (0,0,1), the coding sequence after decoding is (1,0,0), (0,0,1), (0,1,0), (0,0,1), (1,0,0).
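The decoding described in step S805 reduces to a lookup from each identified stripe color to its code word, concatenated in stripe order. A minimal sketch, using the red/green code of the four-stripe example; the `CODES` table and function name are illustrative only:

```python
# Hedged sketch of step S805: map each stripe color to its code word
# and concatenate the code words into the coding sequence.
CODES = {"red": (1, 0), "green": (0, 1)}

def coding_sequence(stripe_colors):
    return [CODES[c] for c in stripe_colors]

seq = coding_sequence(["red", "green", "green", "red"])
# seq == [(1, 0), (0, 1), (0, 1), (1, 0)]
```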
specifically, in step S807, stripe matching is performed on each stripe of the reconstructed image based on the coding sequence, for binocular reconstruction, two image acquisition devices 20 are provided in combination with the present embodiment, stripe matching is performed on the reconstructed images of the two image acquisition devices 20, point cloud reconstruction is performed after matching, three-dimensional data of the target object is obtained, for monocular reconstruction, one image acquisition device 20 is provided in combination with the present embodiment, stripe matching is performed on the reconstructed image of the image acquisition device 20 and a preset color coding stripe provided on the light transmission portion 13, point cloud reconstruction is performed after matching, and three-dimensional data of the target object is obtained.
The following is explained through specific examples:
in an optional example, the light emitting part 12 and the light transmitting part 13 project red and blue color coding stripes onto the target object in a first period. The red and blue color coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the red and blue color coding stripes is separated by the semi-reflective and semi-transparent prism 22c into at least one beam of light of the red and blue color coding stripes; one such beam is collected by the color camera 21, and the color camera 21 generates a corresponding red and blue color coding stripe image. The light emitting part 12 and the light transmitting part 13 then project blue coding stripes onto the target object in a second period. The blue coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the blue coding stripes is separated by the semi-reflective and semi-transparent prism 22c into at least one beam of light of the blue coding stripes; one such beam passes through the blue filter 22d and is collected by the black-and-white camera 21, and the black-and-white camera 21 generates a corresponding blue stripe image.
In addition, the illuminating part 30 irradiates the target object with white light in a third period. The white light is reflected by the target object and collected by the color camera 21, and the color camera 21 generates a texture map. A coding sequence of each stripe is determined based on the red and blue color coding stripe image, stripe matching is performed on each stripe of the blue stripe image based on the coding sequence to achieve three-dimensional reconstruction and obtain the three-dimensional data of the target object, and the true color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture map.
In another optional example, the light emitting part 12 and the light transmitting part 13 project red and green color coding stripes onto the target object in a first period. The red and green color coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the red and green color coding stripes is decomposed by the right-angle two-channel dichroic prism 22a into a beam of light of the red and green color coding stripes; this beam is collected by the color camera 21, and the color camera 21 generates a corresponding red and green color coding stripe image. The light emitting part 12 and the light transmitting part 13 then project blue coding stripes onto the target object in a second period. The blue coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the blue coding stripes is decomposed by the right-angle two-channel dichroic prism 22a into a beam of light of the blue coding stripes; this beam is collected by the black-and-white camera 21, and the black-and-white camera 21 generates a corresponding blue stripe image.
In addition, the lighting device 30 irradiates the target object with white light in a third period. The white light is reflected by the target object and collected by the color camera 21 and the black-and-white camera 21; the color camera 21 generates a texture map based on the red light and the green light, and the black-and-white camera 21 generates a texture map based on the blue light. A coding sequence of each stripe is determined based on the red and green color coding stripe image, and stripe matching is performed on each stripe of the blue stripe image based on the coding sequence to achieve three-dimensional reconstruction and obtain the three-dimensional data of the target object. The texture map based on the white light is synthesized from the texture map of the color camera 21 and the texture map of the black-and-white camera 21, and the true color three-dimensional data of the target object is obtained based on the three-dimensional data and the synthesized texture map.
In another optional example, the light emitting unit 12 and the light transmitting unit 13 project red coding stripes onto the target object in a first period. The red coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the red coding stripes is decomposed by the three-channel dichroic prism 22b into a beam of light of the red coding stripes; this beam is collected by the first black-and-white camera 21, and the first black-and-white camera 21 generates a corresponding red coding stripe image. The light emitting part 12 and the light transmitting part 13 project green coding stripes onto the target object in a second period. The green coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the green coding stripes is decomposed by the three-channel dichroic prism 22b into a beam of light of the green coding stripes; this beam is collected by the second black-and-white camera 21, and the second black-and-white camera 21 generates a corresponding green coding stripe image. The light emitting part 12 and the light transmitting part 13 project blue coding stripes onto the target object in a third period. The blue coding stripes are modulated by the target object and then transmitted to the image acquisition device, where the light of the blue coding stripes is decomposed by the three-channel dichroic prism 22b into a beam of light of the blue coding stripes; this beam is collected by the third black-and-white camera 21, and the third black-and-white camera 21 generates a corresponding blue coding stripe image.
In addition, the lighting device 30 irradiates the target object with white light in a fourth period. The white light is reflected by the target object and collected by the three black-and-white cameras 21; the first black-and-white camera 21 generates a texture map based on the red light, the second black-and-white camera 21 generates a texture map based on the green light, and the third black-and-white camera 21 generates a texture map based on the blue light. A coding sequence of each stripe is determined based on the combination of the red stripe image, the green stripe image and the blue stripe image, and stripe matching is performed on each stripe of the red, green and blue stripe images based on the coding sequence to achieve three-dimensional reconstruction and obtain the three-dimensional data of the target object. The texture map based on the white light is synthesized from the texture maps of the three black-and-white cameras 21, and the true color three-dimensional data of the target object is obtained based on the three-dimensional data and the synthesized texture map.
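The synthesis of a white-light texture map from the three monochrome texture maps can be sketched as stacking them as the red, green and blue channels of one image, assuming the three camera images are pixel-aligned (an alignment the shared prism light path is intended to provide; otherwise a calibration step would be needed first):

```python
# Hedged sketch: merge red, green and blue texture maps (grayscale pixel
# grids from the three black-and-white cameras) into one RGB texture map.
def merge_textures(red, green, blue):
    return [[(r, g, b) for r, g, b in zip(rr, gr, br)]
            for rr, gr, br in zip(red, green, blue)]

rgb = merge_textures([[200, 10]], [[40, 220]], [[0, 90]])
# rgb == [[(200, 40, 0), (10, 220, 90)]]
```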
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
An embodiment of the present invention provides a storage medium on which a program is stored, the program implementing the three-dimensional scanning method when executed by a processor.
The embodiment of the invention provides a processor, which is used for running a program, wherein the three-dimensional scanning method is executed when the program runs.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
In addition, in the above embodiments of the present invention, the description of each embodiment has a respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related description of other embodiments.

Claims (18)

1. A three-dimensional scanner, comprising:
the image projection device (10) is used for projecting preset stripe patterns corresponding to each preset period to the target object in each preset period respectively, wherein the stripes of each preset stripe pattern are arranged according to the preset color coding stripes, each preset stripe pattern comprises stripes of at least one color of the preset color coding stripes, the preset stripe patterns comprise stripes of at least two colors of the preset color coding stripes, and the stripes in the preset stripe patterns are arranged consistently with the stripes of the same color in the preset color coding stripes;
and the image acquisition device (20) is used for acquiring light modulated by the target object to acquire a plurality of stripe images under the condition that the target object is projected with a preset stripe pattern, wherein the acquired stripe images are used as an encoding graph to determine each stripe sequence and used as a reconstruction graph to carry out three-dimensional reconstruction on the target object.
2. The three-dimensional scanner according to claim 1, wherein the image projection device (10) further comprises:
a DLP projection part (11), wherein the image projection device (10) projects the preset fringe pattern corresponding to each preset period to the target object through the DLP projection part (11) in each preset period.
3. The three-dimensional scanner according to claim 1, wherein the image projection device (10) further comprises:
a light emitting part (12) for emitting initial light corresponding to each preset period in each preset period, wherein each initial light is composed of light of at least one stripe color, and the stripe color is the color of the stripes in the preset color coding stripes;
and the light transmission part (13) is arranged on the transmission path of the initial light, wherein each initial light generates a corresponding preset color stripe to be projected onto a target object after being transmitted by a pattern of the preset color coding stripe arranged on the light transmission part (13).
4. The three-dimensional scanner according to claim 3, wherein the light emitting portion (12) further comprises a plurality of light source units (121), each of the light source units (121) emitting light of a different wavelength band, wherein the light emitting portion (12) emits the initial light through the plurality of light source units (121).
5. The three-dimensional scanner according to claim 1, further comprising a timing control unit, connected to the image projection device (10) and the image capturing device (20), for controlling the image projection device (10) to emit the preset fringe pattern corresponding to each preset period respectively in each preset period, and controlling the image capturing device (20) to capture the light modulated by the target object respectively in a plurality of preset periods, so as to obtain the fringe image corresponding to each preset fringe pattern.
6. The three-dimensional scanner according to claim 4, further comprising a timing control unit, connected to the plurality of light source units (121) and the image capturing device (20), for controlling the plurality of light source units (121) to emit light rays in different preset periods respectively, so as to generate initial light rays corresponding to the preset periods respectively in each preset period; and controlling the image acquisition device (20) to acquire the light modulated by the target object in a plurality of preset periods respectively so as to acquire a fringe image corresponding to each initial light.
7. The three-dimensional scanner according to claim 1, further comprising an illuminator (30), the illuminator (30) being configured to illuminate a target object, wherein the image acquisition device (20) is further configured to acquire a texture map of the target object in a case where the target object is projected with illumination light by the illuminator (30).
8. The three-dimensional scanner according to claim 1, wherein the image capturing device (20) further comprises a plurality of cameras (21), and the plurality of cameras (21) comprises at least one black and white camera (21), wherein the image capturing device (20) captures the light modulated by the target object through the plurality of cameras (21) to obtain a plurality of stripe images, and wherein the stripe images obtained by at least one black and white camera (21) are used as a reconstruction map to perform three-dimensional reconstruction on the target object; and the stripe images obtained by at least one black and white camera (21) are used as coding patterns to determine each stripe sequence, and/or the stripe images obtained by at least one color camera (21) are used as coding patterns to determine each stripe sequence.
9. The three-dimensional scanner according to claim 8, wherein the image capturing device (20) further comprises a light beam processing device (22), the light beam processing device (22) comprising a light entrance portion and at least two light exit portions, wherein each camera (21) is respectively arranged corresponding to a different light exit portion, and the image capturing device (20) is configured to capture the light modulated by the target object through the light beam processing device (22).
10. The three-dimensional scanner according to claim 9, wherein the light beam processing device (22) further comprises at least one first light beam splitting unit, and the first light beam splitting unit is configured to split the light projected from the light inlet portion, so that the light is projected from the at least two light outlet portions to the camera (21) corresponding to the light outlet portion.
11. The three-dimensional scanner according to claim 10, wherein the light beam processing device (22) further comprises at least one second light beam splitting unit, and the second light beam splitting unit is configured to split the light beam to be acquired by the designated camera (21) so that the designated camera (21) acquires the light beam with the designated wavelength band, wherein the designated wavelength band at least comprises: at least one initial light ray comprises a light ray band.
12. The three-dimensional scanner according to claim 9, wherein the three-dimensional scanner is configured to include any one of:
the light beam processing device (22) comprises a right-angle two-channel dichroic prism (22a), and the right-angle two-channel dichroic prism (22a) comprises a third light-emitting part and a fourth light-emitting part, wherein the light beam processing device (22) realizes light splitting processing on light rays projected from the light-entering part through the right-angle two-channel dichroic prism (22a), so that the light rays are projected to cameras (21) which are correspondingly arranged on the light-emitting parts respectively from the third light-emitting part and the fourth light-emitting part; the image acquisition device (20) comprises a third camera (21) arranged corresponding to the third light-emitting part and a fourth camera (21) arranged corresponding to the fourth light-emitting part, the third camera (21) generates a third fringe image based on the acquired light, the fourth camera (21) generates a fourth fringe image based on the acquired light, and the third fringe image and the fourth fringe image respectively comprise fringes with at least two colors, and the fringes with at least two colors can be identified; the light beam processing device (22) separates the light rays obtained by the specified camera (21) through the right-angle two-channel color separation prism (22a), so that the specified camera (21) obtains the light rays containing the specified wavelength band, wherein the step of obtaining the light rays containing the specified wavelength band by the specified camera (21) comprises the following steps: the third camera (21) acquires light rays of a third specified waveband, and the fourth camera (21) acquires light rays of a fourth specified waveband;
the light beam processing device (22) comprises a three-channel dichroic prism (22b), the three-channel dichroic prism (22b) comprises a fifth light-emitting part, a sixth light-emitting part and a seventh light-emitting part, and the light beam processing device (22) realizes light splitting processing on light rays projected from the light-entering part through the three-channel dichroic prism (22b) so that the light rays are projected to cameras (21) which are arranged corresponding to the light-emitting parts respectively from the fifth light-emitting part, the sixth light-emitting part and the seventh light-emitting part; the image acquisition device (20) comprises a fifth camera (21) arranged corresponding to the fifth light-emitting part, a sixth camera (21) arranged corresponding to the sixth light-emitting part, and a seventh camera (21) arranged corresponding to the seventh light-emitting part, wherein the fifth camera (21) generates a fifth stripe image based on the acquired light, the sixth camera (21) generates a sixth stripe image based on the acquired light, the seventh camera (21) generates a seventh stripe image based on the acquired light, and the fifth stripe image, the sixth stripe image and the seventh stripe image respectively comprise stripes with at least two colors and the stripes with at least two colors are identifiable; the light beam processing device (22) separates the acquired light by the designated camera (21) through the three-channel dichroic prism (22b), so that the designated camera (21) acquires the light containing the designated wavelength band, wherein the acquiring of the light containing the designated wavelength band by the designated camera (21) at least comprises: the fifth camera (21) acquires light of a fifth specified waveband, the sixth camera (21) acquires light of a sixth specified waveband, and the fifth specified waveband is different from the sixth specified waveband;
the light beam processing device (22) comprises a semi-reflective and semi-transparent prism (22c), and the semi-reflective and semi-transparent prism (22c) comprises a first light-emitting part and a second light-emitting part, wherein the light beam processing device (22) realizes the light splitting processing of the light projected from the light-inlet part through the semi-reflective and semi-transparent prism (22c), so that the light is projected to the cameras (21) which are correspondingly arranged on the light-emitting parts from the first light-emitting part and the second light-emitting part respectively; the image acquisition device (20) comprises a first camera (21) arranged corresponding to the first light-emitting part and a second camera (21) arranged corresponding to the second light-emitting part, the first camera (21) generates a first stripe image based on acquired light, the second camera (21) generates a second stripe image based on acquired light, and the first stripe image and the second stripe image respectively comprise stripes of at least two colors, and the stripes of the at least two colors can be identified.
13. The three-dimensional scanner according to claim 12, wherein said beam processing means (22) further comprises a filter (22d),
the light beam processing device (22) performs separation processing on the light rays acquired by the appointed camera (21) through the optical filter (22d) so that the appointed camera (21) acquires the light rays containing the appointed wave band, and at least one camera (21) of the cameras (21) is the appointed camera (21).
14. The three-dimensional scanner according to any of claims 1-13, wherein the image acquisition device (20) is configured to identify red, green and blue light.
15. A three-dimensional scanning system, comprising:
the three-dimensional scanner is used for projecting preset stripe patterns corresponding to each preset period to a target object in each preset period and acquiring light modulated by the target object under the condition that the target object is projected with the preset stripe patterns to obtain a plurality of stripe images, wherein the stripes of each preset stripe pattern are arranged according to preset color coding stripes, each preset stripe pattern comprises stripes of at least one color of the preset color coding stripes, the preset stripe patterns comprise stripes of at least two colors of the preset color coding stripes, and the stripes in the preset stripe patterns are arranged consistently with the stripes of the same color in the preset color coding stripes;
the image processor is connected with the three-dimensional scanner and used for acquiring a plurality of stripe images acquired by the three-dimensional scanner, determining each stripe sequence according to the stripe images as an encoding image and performing three-dimensional reconstruction on the target object as a reconstruction image;
wherein the three-dimensional scanner is the three-dimensional scanner of any one of claims 1-14.
16. The three-dimensional scanning system of claim 15, wherein in the case where the three-dimensional scanner acquires light modulated by the target object through a plurality of cameras to obtain a plurality of fringe images, and at least one of the plurality of cameras comprises a black and white camera, the image processor is further configured to:
taking a stripe image obtained by at least one black-and-white camera as a reconstruction image to carry out three-dimensional reconstruction on the target object;
the stripe images obtained by at least a plurality of black and white cameras are used as coding patterns to determine each stripe sequence, and/or the stripe images obtained by at least one color camera are used as coding patterns to determine each stripe sequence.
17. A three-dimensional scanning method applied to the three-dimensional scanner according to any one of claims 3 to 19, the three-dimensional scanning method comprising:
emitting, in each preset period, the initial light corresponding to that period, wherein each initial light consists of light of at least one color of the preset color-coded stripes, and after each initial light passes through the pattern of the preset color-coded stripes arranged on the light transmission part, the corresponding preset stripe pattern is generated and projected onto a target object;
collecting, in each preset period, the light modulated by the target object, and obtaining a plurality of stripe images based on the collected light, wherein the obtained stripe images are used as encoded images to determine each stripe sequence and as reconstruction images to perform three-dimensional reconstruction of the target object;
determining the sequence of the stripes in the plurality of stripe images based on the encoded images; and
performing three-dimensional reconstruction based on the reconstruction images and the determined sequence to obtain three-dimensional data of the target object.
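The sequence-determination step of claim 17 can be illustrated with a minimal sketch: classify each pixel of a captured color row by its dominant channel, collapse runs into a stripe sequence, and locate that sequence in a known color code to give each stripe a unique index. The three-color code, the dominant-channel classifier, and the function names below are all illustrative assumptions; the patent does not fix a particular code or decoder:

```python
import numpy as np

# Hypothetical De Bruijn-style color code; the patent does not specify one.
COLOR_CODE = "RGBRRGGBBRGB"

def classify_stripe_colors(row):
    """Label each RGB pixel of an image row 'R', 'G' or 'B' by its
    dominant channel (a simplifying assumption about the stripe colors)."""
    return ["RGB"[int(np.argmax(px))] for px in row]

def stripe_sequence(labels):
    """Collapse per-pixel labels into a run-length stripe sequence."""
    seq = [labels[0]]
    for c in labels[1:]:
        if c != seq[-1]:
            seq.append(c)
    return "".join(seq)

def locate(seq, code=COLOR_CODE):
    """Find the observed subsequence in the code, assigning each
    observed stripe its absolute index; None if no match."""
    pos = code.find(seq)
    return None if pos < 0 else list(range(pos, pos + len(seq)))

row = np.array([[255, 0, 0], [250, 5, 0], [0, 255, 0], [0, 0, 255]])
indices = locate(stripe_sequence(classify_stripe_colors(row)))
```

Because every short window of a De Bruijn code occurs only once, a partial view of the pattern still yields unambiguous stripe indices, which is the point of the color coding.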
18. The three-dimensional scanning method of claim 17, further comprising:
projecting illumination light onto the target object and acquiring texture data of the target object based on the illumination light; and
obtaining color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
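The fusion step of claim 18 amounts to mapping each reconstructed 3D point into the texture image and attaching the sampled color. A minimal sketch under a pinhole-camera assumption — the 3x4 projection matrix `proj` and the function name are illustrative, not taken from the patent:

```python
import numpy as np

def colorize_points(points, texture, proj):
    """Attach texture colors to reconstructed 3D points by projecting
    each point into the texture image with an assumed 3x4 projection
    matrix; points that fall outside the image are dropped."""
    colored = []
    for p in points:
        u, v, w = proj @ np.append(p, 1.0)
        x, y = int(round(u / w)), int(round(v / w))
        h, wd = texture.shape[:2]
        if 0 <= x < wd and 0 <= y < h:
            colored.append((*p, *texture[y, x]))  # (X, Y, Z, R, G, B)
    return np.array(colored)

# Toy data: one point projecting inside a 5x5 texture, one outside.
proj = np.hstack([np.eye(3), np.zeros((3, 1))])
texture = np.zeros((5, 5, 3), dtype=int)
texture[3, 2] = [10, 20, 30]
cloud = colorize_points(np.array([[2.0, 3.0, 1.0], [10.0, 10.0, 1.0]]),
                        texture, proj)
```

The result is a colored point set, i.e. the "color three-dimensional data" the claim refers to, here in a simple (X, Y, Z, R, G, B) row layout.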
CN201911018772.7A 2019-10-24 2019-10-24 Three-dimensional scanner, three-dimensional scanning system, and three-dimensional scanning method Active CN112712583B (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
CN201911018772.7A CN112712583B (en) 2019-10-24 2019-10-24 Three-dimensional scanner, three-dimensional scanning system, and three-dimensional scanning method
AU2020371142A AU2020371142B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
PCT/CN2020/123684 WO2021078300A1 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
KR1020227017511A KR20220084402A (en) 2019-10-24 2020-10-26 3D Scanners and 3D Scanning Methods
US17/771,470 US12007224B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
EP20878731.7A EP4050302A4 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
CA3158933A CA3158933A1 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
JP2022524057A JP7298025B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911018772.7A CN112712583B (en) 2019-10-24 2019-10-24 Three-dimensional scanner, three-dimensional scanning system, and three-dimensional scanning method

Publications (2)

Publication Number Publication Date
CN112712583A true CN112712583A (en) 2021-04-27
CN112712583B CN112712583B (en) 2024-06-11

Family

ID=75540322

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911018772.7A Active CN112712583B (en) 2019-10-24 2019-10-24 Three-dimensional scanner, three-dimensional scanning system, and three-dimensional scanning method

Country Status (1)

Country Link
CN (1) CN112712583B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100311005A1 (en) * 2009-06-03 2010-12-09 Carestream Health, Inc. Apparatus for dental surface shape and shade imaging
US20110058031A1 (en) * 2009-09-04 2011-03-10 Mitutoyo Corporation Image processing measuring apparatus and image processing measurement method
CN102980526A (en) * 2012-08-23 2013-03-20 杭州先临三维科技股份有限公司 Three-dimensional scanister using black and white camera to obtain color image and scan method thereof
US20140022356A1 (en) * 2010-12-21 2014-01-23 3Shape A/S Optical system in 3d focus scanner
CN104677308A (en) * 2015-01-30 2015-06-03 宋展 Three-dimensional scanning method for high-frequency two-value strip
CN206132003U (en) * 2016-10-19 2017-04-26 杭州思看科技有限公司 Spatial digitizer who contains a plurality of different wavelength laser
CN109489583A (en) * 2018-11-19 2019-03-19 先临三维科技股份有限公司 Projection arrangement, acquisition device and the 3 D scanning system with it
CN109584352A (en) * 2018-08-21 2019-04-05 先临三维科技股份有限公司 Image acquisition, processing method, device and the three-dimensional scanning device of 3-D scanning

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116408575B (en) * 2021-12-31 2024-06-04 广东美的白色家电技术创新中心有限公司 Method, device and system for locally scanning and eliminating workpiece reflection interference
CN114521982A (en) * 2022-02-21 2022-05-24 资阳联耀医疗器械有限责任公司 Intraoral scanner, intraoral scanning implementation method and storage medium
CN115401527A (en) * 2022-10-08 2022-11-29 瑞安市博业激光应用技术有限公司 Visual teaching system
CN115401527B (en) * 2022-10-08 2024-04-23 瑞安市博业激光应用技术有限公司 Visual teaching system

Also Published As

Publication number Publication date
CN112712583B (en) 2024-06-11

Similar Documents

Publication Publication Date Title
CN109489583B (en) Projection device, acquisition device and three-dimensional scanning system with same
CN112710253B (en) Three-dimensional scanner and three-dimensional scanning method
US11528463B2 (en) Method and apparatus for colour imaging a three-dimensional structure
CN112985307B (en) Three-dimensional scanner, system and three-dimensional reconstruction method
CN112712583B (en) Three-dimensional scanner, three-dimensional scanning system, and three-dimensional scanning method
KR101691156B1 (en) Optical system having integrated illumination and imaging systems and 3D image acquisition apparatus including the optical system
JP2001523827A (en) Three-dimensional imaging by triangulation using dual-wavelength light
WO2021078300A1 (en) Three-dimensional scanner and three-dimensional scanning method
JP3818028B2 (en) 3D image capturing apparatus and 3D image capturing method
US20230320825A1 (en) Method and intraoral scanner for detecting the topography of the surface of a translucent object, in particular a dental object
KR100902176B1 (en) 3d scanner using the polygon mirror
CN117804342A (en) Three-dimensional scanner and three-dimensional scanning method
CN118317029A (en) Three-dimensional scanning device and three-dimensional scanning system
RU2543688C2 (en) Camera and optical system for obtaining 3d images (versions)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant