CN110619617B - Three-dimensional imaging method, device, equipment and computer readable storage medium - Google Patents


Info

Publication number
CN110619617B
CN110619617B (application CN201910927761.4A)
Authority
CN
China
Prior art keywords
dimensional
image data
area array
frequency
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910927761.4A
Other languages
Chinese (zh)
Other versions
CN110619617A (en)
Inventor
孙海江 (Sun Haijiang)
王宇庆 (Wang Yuqing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN201910927761.4A priority Critical patent/CN110619617B/en
Publication of CN110619617A publication Critical patent/CN110619617A/en
Application granted granted Critical
Publication of CN110619617B publication Critical patent/CN110619617B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10141Special mode during image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the invention discloses a three-dimensional imaging method, a three-dimensional imaging device and a three-dimensional imaging system. The three-dimensional imaging system comprises an image acquisition device, a synchronous driving circuit and an information processor. The image acquisition device comprises a visible light sensor, used for acquiring two-dimensional image data of a measured point, and an area array three-dimensional sensor, used for acquiring depth image data of the measured point and outputting it in area array form. The synchronous driving circuit is used for synchronizing the acquisition of the two-dimensional image data and the depth image data so as to ensure the consistency of time domain information acquisition. The information processor is used for fusing the two-dimensional image data and the depth image data in real time and generating point cloud data output in area array form. The technical scheme provided by the application realizes the output of high-resolution, high-frame-frequency point cloud data in area array form, and meets the requirements of high-resolution and high-precision three-dimensional imaging in the technical field of depth vision.

Description

Three-dimensional imaging method, device, equipment and computer readable storage medium
Technical Field
The embodiment of the invention relates to the technical field of optical imaging, in particular to a three-dimensional imaging method, a three-dimensional imaging device and a three-dimensional imaging system.
Background
Depth vision technology is an important development direction of the future machine vision and optoelectronics industries. Compared with traditional two-dimensional machine-vision information processing, a depth image contains both the three-dimensional depth information and the two-dimensional grey-scale information of a scene, which is consistent with the visual imaging mechanism of the human eye. As the machine vision industry develops from ordinary two-dimensional image vision towards depth vision, the technology promises a revolutionary change in almost all vision- and optoelectronics-related fields, such as intelligent robots, unmanned driving, AR/VR and security monitoring.
The heart of depth vision is high-performance three-dimensional depth imaging. In the related art, three-dimensional imaging can be realized by laser scanning, structured light, binocular vision and Time-of-Flight (TOF) technology. Scanning three-dimensional imaging, structured light and binocular vision cannot meet high-performance depth imaging requirements because of defects such as complex mechanisms, high cost and poor stability. Although TOF technology can acquire the depth information of a scene without scanning, the limitations of sensor processes and optical design theory mean that large amounts of point cloud data cannot be acquired in real time: the TOF imaging resolution is too low, the imaging quality is poor, and the working distance cannot meet the requirements of high-end fields, so TOF has so far served only as a distance measurement method rather than as a visual imaging technology.
Disclosure of Invention
The embodiment of the disclosure provides a three-dimensional imaging method, a three-dimensional imaging device and a three-dimensional imaging system, which realize the output of high-resolution, high-frame-frequency point cloud data in area array form and meet the requirements of high-resolution and high-precision three-dimensional imaging in the technical field of depth vision.
In order to solve the above technical problems, embodiments of the present invention provide the following technical solutions:
in one aspect, an embodiment of the present invention provides a three-dimensional imaging system, including an image acquisition device, a synchronous driving circuit, and an information processor;
the image acquisition device comprises a visible light sensor for acquiring two-dimensional image data of a measured point and an area array three-dimensional sensor for acquiring depth image data of the measured point and outputting the depth image data in an area array form;
the synchronous driving circuit is used for controlling the acquisition synchronism of the two-dimensional image data and the depth image data so as to ensure the consistency of time domain information acquisition;
the information processor is used for fusing the two-dimensional image data and the depth image data in real time and generating point cloud data output in an area array form.
Optionally, the image acquisition device further includes a single-point laser sensor, and the information processor further includes a data self-calibration module and a light source control module;
the light source control module is used for modulating the working frequency of a laser lighting source of the single-point laser sensor according to the working frequency of the area array three-dimensional sensor so as to synchronously trigger the area array three-dimensional sensor and the single-point laser sensor to acquire data;
the data self-calibration module is used for calibrating data of the three-dimensional imaging system and self-calibrating point cloud data according to the distance information of the measured point acquired by the single-point laser sensor.
Optionally, the device further comprises an energy converging optical device;
the energy converging optical device is used for carrying out remote energy converging of the laser illumination light source in the working area of the single-point laser sensor so as to increase the illumination distance of the laser illumination light source.
Optionally, the energy converging optical device is an aspheric discontinuous arc-shaped reflecting light cup.
Optionally, the aspheric discontinuous arc-shaped reflective light cup is further provided with a reflective film covering the surface of the light cup.
Optionally, the information processor further includes an operating frequency calculation module;
the working frequency calculation module is used for calculating the working frequency of the area array three-dimensional sensor according to a first formula, wherein the first formula is as follows:
$$N_0 = \frac{1}{n}\sum_{i=1}^{n}\left(\frac{k_i c}{2 f_i} + d_i\right)$$
where N_0 is the actual measured distance, n is the total number of operating frequencies of the area array three-dimensional sensor, k_i is the wavelength multiple, d_i is the distance obtained at the i-th modulation frequency, c is the speed of light, and f_i is the i-th operating frequency.
Optionally, the information processor further includes an operating frequency calculation module;
the working frequency calculation module is used for calculating the working frequency of the area array three-dimensional sensor according to a second formula, wherein the second formula is as follows:
$$N_0' = \frac{\sigma_v^2\left(\frac{k_u c}{2 f_u} + \mu_u\right) + \sigma_u^2\left(\frac{k_v c}{2 f_v} + \mu_v\right)}{\sigma_u^2 + \sigma_v^2}$$
where N_0' is the weighted actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, m is the number of weighted frequencies, u and v are frequency indices, k_u and k_v are wavelength multiples, c is the speed of light, f_u and f_v are the u-th and v-th operating frequencies, \mu_u and \sigma_u^2 are the mean and variance corresponding to f_u, and \mu_v and \sigma_v^2 are the mean and variance corresponding to f_v.
Another aspect of the embodiments of the present invention provides a three-dimensional imaging method, including:
simultaneously acquiring two-dimensional visible light image data and depth image data of a measured point at a first moment;
and fusing the two-dimensional visible light image data and the depth image data in real time to generate point cloud data output in an area array form.
Optionally, after the two-dimensional visible light image data and the depth image data are fused in real time, the method further includes:
self-calibration is carried out on the fused data according to the laser distance data information of the measured point so as to serve as point cloud data of the measured point to be output;
the laser distance data information is the laser distance data of the measured point acquired at the first moment.
An embodiment of the present invention further provides a three-dimensional imaging apparatus, including:
the data acquisition module is used for simultaneously acquiring two-dimensional visible light image data of a measured point at a first moment and depth image data output in an area array form;
and the data fusion module is used for fusing the two-dimensional visible light image data and the depth image data in real time to generate point cloud data output in an area array form.
The technical scheme provided by the application has the following advantages: multiple image sensors are used jointly for high-density image data sampling; a synchronous driving circuit ensures that the high-density visible-light data and the depth image data are sampled synchronously, guaranteeing the time-domain consistency of the acquired image data; finally, the time-domain-consistent image data are fused in real time, which effectively enriches the detail information of the depth image, improves the quality and resolution of three-dimensional imaging, realizes the output of high-resolution, high-frame-frequency point cloud data in area array form, and meets the requirements of the depth vision field for high-resolution, high-precision three-dimensional imaging.
In addition, the embodiments of the present invention also provide a corresponding implementation method and a virtual apparatus for the three-dimensional imaging system, making the method more practicable; the apparatus, the device and the computer-readable storage medium have corresponding advantages.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the related arts, the drawings used in the description of the embodiments or the related arts will be briefly described below, it is obvious that the drawings in the description below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a block diagram of an embodiment of a three-dimensional imaging system according to an embodiment of the present invention;
fig. 2 is a block diagram of another embodiment of a three-dimensional imaging system according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a three-dimensional imaging method according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of another three-dimensional imaging method provided by the embodiment of the invention;
fig. 5 is a structural diagram of an embodiment of a three-dimensional imaging device according to an embodiment of the present invention;
fig. 6 is a structural diagram of another specific implementation of the three-dimensional imaging device according to the embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the disclosure, the invention will be described in further detail with reference to the accompanying drawings and specific embodiments. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may include other steps or elements not expressly listed.
Having described the technical solutions of the embodiments of the present invention, various non-limiting embodiments of the present application are described in detail below.
Referring to fig. 1, fig. 1 is a schematic structural frame diagram of a three-dimensional imaging system according to an embodiment of the present invention, where the embodiment of the present invention includes the following:
the three-dimensional imaging system can comprise an image acquisition device 1, a synchronous driving circuit 2 and an information processor 3, wherein the image acquisition device 1 is respectively connected with the synchronous driving circuit 2 and the information processor 3.
The image acquisition device 1 includes a visible light sensor 11 and an area array three-dimensional sensor 12. The visible light sensor 11 is used for collecting two-dimensional grey-scale image data; the area array three-dimensional sensor 12 is used for collecting depth image data, which are output in area array form. The number, types and hardware parameters of the visible light sensors 11 and area array three-dimensional sensors 12 included in the image acquisition device 1 can be determined according to the actual application scenario, which is not limited in this application.
In this application, the synchronous driving circuit 2 is used to synchronize the acquisition of the two-dimensional image data and the depth image data; that is, the synchronous driving circuit 2 triggers each image sensor in the image acquisition device 1, such as the visible light sensor 11 and the area array three-dimensional sensor 12, to acquire data from the measured point simultaneously, ensuring that the time domain information of the data acquired by each image sensor is consistent. A person skilled in the art can determine the composition of the synchronous driving circuit 2 and the circuit components it includes according to the actual application scenario, which is not limited in this application, as long as the image sensors in the image acquisition device 1 can be controlled to sample data simultaneously. In addition, the synchronous driving circuit 2 can keep noise no higher than a preset noise value through noise-reduction means such as filtering, and can raise the level of circuit integration above a preset reference threshold, while its voltage adaptation range is made wide. A synchronous driving circuit 2 with low noise, high integration and a wide voltage adaptation range can realize the various complex control functions of the front-end image acquisition device 1, so that each image sensor stably outputs data according to its rated specifications.
In this embodiment, the information processor 3 is configured to fuse the two-dimensional image data and the depth image data in real time and generate point cloud data output in area array form. Any image processing algorithm in the related art capable of fusing two-dimensional and three-dimensional data may be used, which is not limited in this application. Fusing the depth image output by the TOF sensor with the visible-light image further enriches the detail information of the depth image; in theory, the point cloud data generated after fusion can improve resolution by a factor of eight compared with the raw point cloud of the area array three-dimensional sensor 12. Three-dimensional imaging quality and resolution can therefore be effectively improved.
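As a minimal sketch of such a fusion step (the patent does not prescribe a specific fusion algorithm, so the nearest-neighbour upsampling, the pinhole camera model and all function names and parameters below are illustrative assumptions):

```python
import numpy as np

def fuse_to_point_cloud(depth, gray, fx, fy, cx, cy, scale):
    """Upsample the low-resolution area-array depth map to the visible-light
    resolution and back-project every pixel to a 3-D point with its
    grey-level intensity attached, giving area-array point cloud output."""
    # Nearest-neighbour upsampling of the depth map by an integer factor
    depth_up = np.repeat(np.repeat(depth, scale, axis=0), scale, axis=1)
    h, w = depth_up.shape
    assert gray.shape == (h, w), "sensors must be registered on the same grid"
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_up
    x = (u - cx) * z / fx  # pinhole back-projection
    y = (v - cy) * z / fy
    # One (x, y, z, intensity) record per pixel, kept in area-array layout
    return np.stack([x, y, z, gray], axis=-1)
```

In practice the two sensors would first be registered with a calibrated extrinsic transform, and an edge-aware (e.g. joint-bilateral) upsampling filter would replace the nearest-neighbour step.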
In the technical scheme provided by the embodiment of the invention, a plurality of image sensors are comprehensively utilized for sampling dense and high-density image data, and a synchronous driving circuit is arranged to ensure the sampling synchronism of high-density visible light data and dense depth image data, so that the time domain consistency of the acquired image data is ensured, and finally the image data with the time domain consistency is fused in real time, thereby effectively increasing the detail information of the depth image, improving the quality and resolution of three-dimensional imaging, realizing the output of high-resolution and high-frame frequency point cloud data output in an area array form, and meeting the requirements of high-resolution and high-precision three-dimensional imaging in the technical field of depth vision.
In another embodiment, in order to further improve the imaging accuracy and resolution of the three-dimensional imaging system, the image capturing device 1 may further include a single-point laser sensor 13, which is used for capturing distance data of the measured point. The single-point laser sensor 13 may be any single-point scanning laser displacement sensor, and this does not affect the implementation of the present application. The single-point laser sensor 13, the visible light sensor 11 and the area array three-dimensional sensor 12 acquire data synchronously under the control of the synchronous driving circuit 2. Correspondingly, the information processor 3 may further include a light source control module, configured to modulate the working frequency of the laser illumination source of the single-point laser sensor according to the working frequency of the area array three-dimensional sensor, so as to synchronously trigger the area array three-dimensional sensor and the single-point laser sensor to acquire data. By relating the laser distance data of the measured point acquired by the single-point laser sensor 13 to the depth information in the point cloud data generated after real-time fusion by the information processor 3, self-calibration of the point cloud data can be realized, and the calibrated point cloud data are output as the three-dimensional imaging data of the measured point, improving the precision and resolution of three-dimensional imaging.
In this embodiment, in order to avoid the complicated operation of manual calibration and improve the adaptability and degree of automation of the three-dimensional imaging system, the application can also calibrate or correct the system data by comparing the distance data collected by the single-point laser sensor at the measured point with the actual distance. This automatic calibration is simple to implement, avoids the complex operations of manual calibration, and raises the degree of automation of the whole system. That is to say, the information processor 3 may further include a data self-calibration module, configured to perform data calibration of the three-dimensional imaging system and self-calibration of the point cloud data according to the distance information of the measured point acquired by the single-point laser sensor.
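A minimal sketch of such a self-calibration step, assuming a simple constant-offset correction between the fused depth at the laser spot and the laser-measured distance (the patent does not specify the correction model; the function name and the offset model are illustrative):

```python
import numpy as np

def self_calibrate(cloud, ref_pixel, laser_distance):
    """Shift the depth channel of an (H, W, 4) point cloud so that the fused
    depth at ref_pixel matches the single-point laser distance captured at
    the same instant by the synchronously triggered laser sensor."""
    v, u = ref_pixel
    offset = laser_distance - cloud[v, u, 2]  # residual at the laser spot
    calibrated = cloud.copy()
    calibrated[..., 2] += offset              # apply the same correction everywhere
    return calibrated
```

A scale (gain) correction, or a per-region model, could be fitted the same way if several reference distances were available.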
Therefore, the embodiment of the invention can realize self-corrected full-automatic point cloud data output, further improve the precision and the resolution of the three-dimensional imaging data, and also improve the automation degree of the three-dimensional imaging system.
In another embodiment, in order to realize long-range energy convergence of the laser illumination and improve the detection performance of the single-point laser sensor 13, an energy converging optical device may further be provided, configured to converge the energy of the laser illumination source at long range within the working area of the single-point laser sensor 13 so as to increase the illumination distance of the laser illumination source. That is, the position and configuration parameters of the energy converging optics within the overall system are such that they cover the entire working area of the single-point laser sensor 13. Optionally, to achieve a better energy convergence effect, the energy converging optical device may adopt an aspheric discontinuous arc-shaped reflective light cup; of course, a person skilled in the art may adopt other devices capable of converging energy according to the specific application scenario, which does not affect the implementation of the present application. In addition, in order to further improve the energy convergence effect, a reflective film covering the surface of the light cup may be provided on the aspheric discontinuous arc-shaped reflective light cup.
In some other embodiments, as shown in fig. 2, in order to improve the measurement accuracy of the system, the information processor 3 may further modulate the working frequency of the area array three-dimensional sensor, and accordingly the information processor may further include a working frequency calculation module. The area array three-dimensional sensor is set to a plurality of working frequencies f_1, f_2, …, f_n, and the distance information obtained at each working frequency is d_1, d_2, …, d_n. The distance value at working frequency f_i can be calculated by the following formula:
$$N = \frac{k_i c}{2 f_i} + d_i$$
where N is the true distance, d_i is the measured distance, k_i is the wavelength multiple, c is the speed of light, and f_i is the i-th operating frequency.
The distance information closest to the actual value can then be calculated by the following formula; that is, the working frequency calculation module can calculate the working frequency of the area array three-dimensional sensor according to:
$$N_0 = \frac{1}{n}\sum_{i=1}^{n}\left(\frac{k_i c}{2 f_i} + d_i\right)$$
where N_0 is the actual measured distance, n is the total number of operating frequencies of the area array three-dimensional sensor, k_i is the wavelength multiple, d_i is the distance obtained at the i-th modulation frequency, c is the speed of light, and f_i is the i-th operating frequency.
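Assuming the wrapped distances d_i and wavelength multiples k_i for each modulation frequency are already known (resolving the k_i ambiguity is a separate phase-unwrapping step not shown here), the multi-frequency averaging can be sketched as:

```python
C = 299_792_458.0  # speed of light, m/s

def fused_distance(freqs, k, d):
    """Average of the unwrapped distance estimates from n modulation
    frequencies: N0 = (1/n) * sum_i (k_i * c / (2 * f_i) + d_i)."""
    n = len(freqs)
    return sum(ki * C / (2 * fi) + di for fi, ki, di in zip(freqs, k, d)) / n
```

Each term k_i·c/(2f_i) is a whole number of unambiguous ranges at frequency f_i, so every frequency contributes an independent estimate of the same true distance.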
In order to further improve the calculation accuracy, the frequencies can be weighted. Taking the influence of factors such as error into account, the frequencies can be assumed to follow a Gaussian distribution and combined by a weighted-average method. With the actual working frequencies of the area array three-dimensional sensor being f_1, f_2, …, f_m (m ≤ n), the working frequency calculation module can also calculate the working frequency of the area array three-dimensional sensor according to the following formula:
$$N_0' = \frac{\sigma_v^2\left(\frac{k_u c}{2 f_u} + \mu_u\right) + \sigma_u^2\left(\frac{k_v c}{2 f_v} + \mu_v\right)}{\sigma_u^2 + \sigma_v^2}$$
where N_0' is the weighted actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, m is the number of weighted frequencies, u and v are frequency indices, k_u and k_v are wavelength multiples, c is the speed of light, f_u and f_v are the u-th and v-th operating frequencies, \mu_u and \sigma_u^2 are the mean and variance corresponding to f_u, and \mu_v and \sigma_v^2 are the mean and variance corresponding to f_v.
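A sketch of the Gaussian weighting for one frequency pair, assuming the per-frequency means μ_u, μ_v and variances σ_u², σ_v² have already been estimated from repeated measurements (the patent's exact weighted formula is given only as an image, so this is the standard inverse-variance fusion of two Gaussian estimates, not a verbatim reproduction):

```python
def weighted_distance(mu_u, var_u, mu_v, var_v):
    """Inverse-variance fusion of the distance estimates obtained at two
    operating frequencies f_u and f_v:
        N0' = (var_v * mu_u + var_u * mu_v) / (var_u + var_v)
    The lower-variance (more reliable) estimate dominates the result."""
    return (var_v * mu_u + var_u * mu_v) / (var_u + var_v)
```

With equal variances this reduces to a plain average, recovering the unweighted formula above.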
In summary, the depth information detected by the various sensors is comprehensively utilized, and high-resolution, high-precision three-dimensional depth information is obtained through data fusion processing. The three-dimensional imaging system applies multi-sensor information fusion and is a highly integrated three-dimensional imaging device. Research on visual physiology and visual psychology shows that human eyes perceive the three-dimensional depth information of a scene, whereas traditional three-dimensional measuring devices cannot perceive and acquire this information in area array form. Addressing the shortcomings of traditional time-of-flight measuring devices, namely short detection distance and a low degree of intelligence, this application combines high-precision synchronous-circuit drive control and light source design to realize high-precision, intelligent, high-resolution time-of-flight measurement, thereby further promoting the wide application of this technique in fields such as security monitoring, industrial robots and industrial inspection. It breaks through the current technical barrier in the field of time-of-flight measurement, develops a time-of-flight point cloud imaging and intelligent signal analysis and processing system integrating three-dimensional imaging, data processing and data analysis, surpasses current mature products at home and abroad in the three indexes of detection distance, detection precision and resolution, and realizes high-resolution area array three-dimensional imaging.
In addition, the present application also provides a three-dimensional imaging method for a three-dimensional imaging system, please refer to fig. 3, where fig. 3 is a schematic flow diagram of a three-dimensional imaging method according to an embodiment of the present invention, and the embodiment of the present invention may include the following contents:
s301: and simultaneously acquiring two-dimensional visible light image data and depth image data of the measured point at the first moment.
The two-dimensional visible light image data and the depth image data are data acquired at the same time by using different types of sensors, and the depth image data is three-dimensional point cloud data output in an area array form.
S302: and fusing the two-dimensional visible light image data and the depth image data in real time to generate point cloud data output in an area array form.
In another implementation manner, referring to fig. 4, based on the foregoing embodiment, the method may further include:
s303: and self-calibrating the fused data according to the laser distance data information of the measured point to be output as point cloud data of the measured point.
The specific implementation process of each step of the three-dimensional imaging method according to the embodiment of the present invention may refer to the related description of the system embodiment, and is not described herein again.
Therefore, the embodiment of the invention realizes the output of high-resolution, high-frame-frequency point cloud data in area array form, and meets the requirements of high-resolution and high-precision three-dimensional imaging in the technical field of depth vision.
The embodiment of the invention also provides a corresponding implementation device for the three-dimensional imaging method, so that the method has higher practicability. In the following, the three-dimensional imaging apparatus provided by the embodiment of the present invention is introduced, and the three-dimensional imaging apparatus described below and the three-dimensional imaging method described above may be referred to in correspondence with each other.
Referring to fig. 5, fig. 5 is a structural diagram of a three-dimensional imaging apparatus according to an embodiment of the present invention, where the apparatus may include:
the data acquiring module 501 is configured to acquire two-dimensional visible light image data of a measured point at a first time and depth image data output in an area array form at the same time.
And the data fusion module 502 is configured to fuse the two-dimensional visible light image data and the depth image data in real time to generate point cloud data output in an area array form.
Optionally, in some implementations of this embodiment, referring to fig. 6, the apparatus may further include a point cloud data self-calibration module 503 configured to self-calibrate the fused data according to the laser distance data information of the measured point and output the result as the point cloud data of the measured point, where the laser distance data information is the laser distance data of the measured point collected at the first moment.
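The self-calibration performed by module 503 can be illustrated as rescaling the fused point-cloud ranges so that a reference point agrees with the single-point laser distance acquired at the same moment. The uniform global scale factor is an assumption made for illustration, not the patent's stated correction model:

```python
import numpy as np

def self_calibrate(cloud, laser_distance, ref_index=0):
    """Rescale fused point-cloud ranges against a single-point laser reading.

    cloud          : (N, 6) array of [X, Y, Z, R, G, B] fused points
    laser_distance : distance (m) reported by the single-point laser sensor
    ref_index      : index of the point the laser was aimed at
    """
    ranges = np.linalg.norm(cloud[:, :3], axis=1)
    scale = laser_distance / ranges[ref_index]   # global range correction
    out = cloud.copy()
    out[:, :3] *= scale                          # XYZ rescaled, color untouched
    return out

cloud = np.array([[0.0, 0.0, 2.0, 128.0, 128.0, 128.0],
                  [0.1, 0.0, 2.0, 200.0, 10.0, 10.0]])
calibrated = self_calibrate(cloud, laser_distance=2.1, ref_index=0)
print(calibrated[0, 2])  # 2.1: the reference point now matches the laser range
```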
The functions of the functional modules of the three-dimensional imaging device according to the embodiments of the present invention may be specifically implemented according to the method in the above method embodiments, and the specific implementation process may refer to the related description of the above method embodiments, which is not described herein again.
Therefore, the embodiment of the invention realizes output of high-resolution, high-frame-rate point cloud data in an area array form, meeting the requirements of high-resolution, high-precision three-dimensional imaging in the technical field of depth vision.
The embodiment of the present invention further provides a three-dimensional imaging device, which specifically includes:
a memory for storing a computer program;
a processor for executing the computer program to implement the steps of the three-dimensional imaging method according to any one of the above embodiments.
The functions of the functional modules of the three-dimensional imaging device according to the embodiments of the present invention may be specifically implemented according to the method in the above method embodiments, and the specific implementation process may refer to the related description of the above method embodiments, which is not described herein again.
Therefore, the embodiment of the invention realizes output of high-resolution, high-frame-rate point cloud data in an area array form, meeting the requirements of high-resolution, high-precision three-dimensional imaging in the technical field of depth vision.
The embodiment of the present invention further provides a computer-readable storage medium having stored thereon a three-dimensional imaging program which, when executed by a processor, performs the three-dimensional imaging method according to any one of the above embodiments.
The functions of the functional modules of the computer-readable storage medium according to the embodiment of the present invention may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the related description of the foregoing method embodiment, which is not described herein again.
Therefore, the embodiment of the invention realizes output of high-resolution, high-frame-rate point cloud data in an area array form, meeting the requirements of high-resolution, high-precision three-dimensional imaging in the technical field of depth vision.
The embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another. Since the device disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and reference may be made to the method description for relevant details.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The three-dimensional imaging method, device and system provided by the invention are described in detail above. The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.
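As a numeric illustration of combining per-frequency distance estimates by inverse-variance weighting (a weighting scheme suggested by the mean μ and variance σ² terms in the claims, not a formula reproduced verbatim from them; the sample numbers are invented):

```python
import numpy as np

def weighted_distance(dists, variances):
    """Combine per-frequency distance estimates d_u with weights 1/sigma_u^2.

    dists     : per-frequency distance estimates (m)
    variances : per-frequency range variances; smaller variance = larger weight
    """
    w = 1.0 / np.asarray(variances)            # inverse-variance weights
    return float(np.sum(w * np.asarray(dists)) / np.sum(w))

d = [10.02, 9.98, 10.01]   # meters, one estimate per modulation frequency
var = [0.04, 0.01, 0.04]   # the middle frequency is the most reliable
print(round(weighted_distance(d, var), 3))  # 9.992, pulled toward the low-variance estimate
```

A plain average of the three estimates would give 10.003 m; the inverse-variance weighting trusts the low-noise frequency more, which is the usual rationale for variance-based combination in multi-frequency ranging.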

Claims (8)

1. A three-dimensional imaging system is characterized by comprising an image acquisition device, a synchronous drive circuit and an information processor;
the image acquisition device comprises a visible light sensor for acquiring two-dimensional image data of a measured point and an area array three-dimensional sensor for acquiring depth image data of the measured point and outputting the depth image data in an area array form;
the synchronous driving circuit is used for controlling the synchronization of acquisition of the two-dimensional image data and the depth image data, so as to ensure consistency of the acquired time-domain information;
the information processor is used for fusing the two-dimensional image data and the depth image data in real time and generating point cloud data output in an area array form;
the information processor further comprises a working frequency calculation module, wherein the working frequency calculation module is used for calculating the working frequency of the area array three-dimensional sensor according to a first formula or a second formula, and the first formula is:

N0 = (1/n) · Σ_{i=1}^{n} ( k_i·c/(2f_i) + d_i )

in the formula, N0 is the actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, k_i is the wavelength multiple at the i-th frequency, d_i is the distance obtained at the i-th modulation frequency, c is the speed of light, and f_i is the i-th operating frequency;

the second formula is:

N0' = Σ_{u=1}^{m} [ (1/σ_u²) / Σ_{v=1}^{n} (1/σ_v²) ] · ( k_u·c/(2f_u) + μ_u )

in the formula, N0' is the weighted actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, m is the number of frequencies being combined, u is the first frequency index, v is the second frequency index, k_u and k_v are the wavelength multiples at the u-th and v-th frequencies, c is the speed of light, f_u is the u-th operating frequency, f_v is the v-th operating frequency, μ_u and σ_u² are the mean and variance corresponding to f_u, and μ_v and σ_v² are the mean and variance corresponding to f_v.
2. The three-dimensional imaging system of claim 1, wherein the image acquisition device further comprises a single-point laser sensor, the information processor further comprises a data self-calibration module and a light source control module;
the light source control module is used for modulating the working frequency of a laser lighting source of the single-point laser sensor according to the working frequency of the area array three-dimensional sensor so as to synchronously trigger the area array three-dimensional sensor and the single-point laser sensor to acquire data;
the data self-calibration module is used for calibrating data of the three-dimensional imaging system and self-calibrating point cloud data according to the distance information of the measured point acquired by the single-point laser sensor.
3. The three-dimensional imaging system of claim 2, further comprising an energy-concentrating optical device;
the energy converging optical device is used for converging, at long range, the energy of the laser illumination light source within the working area of the single-point laser sensor, so as to increase the illumination distance of the laser illumination light source.
4. The three-dimensional imaging system of claim 3, wherein the energy concentrating optics are aspheric discontinuous arc reflective light cups.
5. The three-dimensional imaging system of claim 4, wherein the aspheric discontinuous arc reflective light cup is further provided with a reflective film covering the surface of the light cup.
6. A method of three-dimensional imaging, comprising:
simultaneously acquiring two-dimensional visible light image data of a measured point at a first moment and depth image data output in an area array form;
fusing the two-dimensional visible light image data and the depth image data in real time to generate point cloud data output in an area array form;
calculating the working frequency of the area array three-dimensional sensor according to a first formula or a second formula, wherein the first formula is:

N0 = (1/n) · Σ_{i=1}^{n} ( k_i·c/(2f_i) + d_i )

in the formula, N0 is the actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, k_i is the wavelength multiple at the i-th frequency, d_i is the distance obtained at the i-th modulation frequency, c is the speed of light, and f_i is the i-th operating frequency;

the second formula is:

N0' = Σ_{u=1}^{m} [ (1/σ_u²) / Σ_{v=1}^{n} (1/σ_v²) ] · ( k_u·c/(2f_u) + μ_u )

in the formula, N0' is the weighted actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, m is the number of frequencies being combined, u is the first frequency index, v is the second frequency index, k_u and k_v are the wavelength multiples at the u-th and v-th frequencies, c is the speed of light, f_u is the u-th operating frequency, f_v is the v-th operating frequency, μ_u and σ_u² are the mean and variance corresponding to f_u, and μ_v and σ_v² are the mean and variance corresponding to f_v.
7. The three-dimensional imaging method according to claim 6, wherein after the fusing the two-dimensional visible light image data and the depth image data in real time, further comprising:
self-calibrating the fused data according to the laser distance data information of the measured point, and outputting the result as the point cloud data of the measured point;
the laser distance data information is the laser distance data of the measured point acquired at the first moment.
8. A three-dimensional imaging apparatus, comprising:
the data acquisition module is used for simultaneously acquiring two-dimensional visible light image data of a measured point at a first moment and depth image data output in an area array form;
the data fusion module is used for fusing the two-dimensional visible light image data and the depth image data in real time to generate point cloud data output in an area array form;
the apparatus is further configured to calculate the working frequency of the area array three-dimensional sensor according to a first formula or a second formula, wherein the first formula is:

N0 = (1/n) · Σ_{i=1}^{n} ( k_i·c/(2f_i) + d_i )

in the formula, N0 is the actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, k_i is the wavelength multiple at the i-th frequency, d_i is the distance obtained at the i-th modulation frequency, c is the speed of light, and f_i is the i-th operating frequency;

the second formula is:

N0' = Σ_{u=1}^{m} [ (1/σ_u²) / Σ_{v=1}^{n} (1/σ_v²) ] · ( k_u·c/(2f_u) + μ_u )

in the formula, N0' is the weighted actual measured distance, n is the total number of frequencies of the area array three-dimensional sensor, m is the number of frequencies being combined, u is the first frequency index, v is the second frequency index, k_u and k_v are the wavelength multiples at the u-th and v-th frequencies, c is the speed of light, f_u is the u-th operating frequency, f_v is the v-th operating frequency, μ_u and σ_u² are the mean and variance corresponding to f_u, and μ_v and σ_v² are the mean and variance corresponding to f_v.
CN201910927761.4A 2019-09-27 2019-09-27 Three-dimensional imaging method, device, equipment and computer readable storage medium Active CN110619617B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910927761.4A CN110619617B (en) 2019-09-27 2019-09-27 Three-dimensional imaging method, device, equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN110619617A CN110619617A (en) 2019-12-27
CN110619617B true CN110619617B (en) 2022-05-27

Family

ID=68924767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910927761.4A Active CN110619617B (en) 2019-09-27 2019-09-27 Three-dimensional imaging method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110619617B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111667537B (en) * 2020-04-16 2023-04-07 奥比中光科技集团股份有限公司 Optical fiber calibration device and method
CN112598719B (en) * 2020-12-09 2024-04-09 上海芯翌智能科技有限公司 Depth imaging system, calibration method thereof, depth imaging method and storage medium
CN112509023B (en) * 2020-12-11 2022-11-22 国网浙江省电力有限公司衢州供电公司 Multi-source camera system and RGBD registration method
CN113367638B (en) * 2021-05-14 2023-01-03 广东欧谱曼迪科技有限公司 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106291512A (en) * 2016-07-29 2017-01-04 中国科学院光电研究院 A kind of method of array push-broom type laser radar range Nonuniformity Correction
CN106796728A (en) * 2016-11-16 2017-05-31 深圳市大疆创新科技有限公司 Generate method, device, computer system and the mobile device of three-dimensional point cloud
CN106898022A (en) * 2017-01-17 2017-06-27 徐渊 A kind of hand-held quick three-dimensional scanning system and method
CN108694731A (en) * 2018-05-11 2018-10-23 武汉环宇智行科技有限公司 Fusion and positioning method and equipment based on low line beam laser radar and binocular camera
CN109061648A (en) * 2018-07-27 2018-12-21 廖双珍 Speed based on frequency diversity/range ambiguity resolving radar waveform design method
CN109613558A (en) * 2018-12-12 2019-04-12 北京华科博创科技有限公司 A kind of the data fusion method for parallel processing and system of all-solid state laser radar system
CN209375823U (en) * 2018-12-20 2019-09-10 武汉万集信息技术有限公司 3D camera



Similar Documents

Publication Publication Date Title
CN110619617B (en) Three-dimensional imaging method, device, equipment and computer readable storage medium
CN109949372B (en) Laser radar and vision combined calibration method
EP3531066B1 (en) Three-dimensional scanning method including a plurality of lasers with different wavelengths, and scanner
CN106772431B (en) A kind of Depth Information Acquistion devices and methods therefor of combination TOF technology and binocular vision
GB2567353B (en) Handheld dimensioning system with feedback
EP2313737B1 (en) System for adaptive three-dimensional scanning of surface characteristics
US10728525B2 (en) Image capturing apparatus, image processing method, and recording medium
CN109215083A (en) The method and apparatus of the calibrating external parameters of onboard sensor
CN109211298A (en) A kind of transducer calibration method and device
CN105115445A (en) Three-dimensional imaging system and imaging method based on combination of depth camera and binocular vision
CN109444916B (en) Unmanned driving drivable area determining device and method
Wiedemann et al. Analysis and characterization of the PMD camera for application in mobile robotics
CN106225676B (en) Method for three-dimensional measurement, apparatus and system
CN113658241B (en) Monocular structured light depth recovery method, electronic device and storage medium
CN109410234A (en) A kind of control method and control system based on binocular vision avoidance
JP2006322853A (en) Distance measuring device, distance measuring method and distance measuring program
CN111798507A (en) Power transmission line safety distance measuring method, computer equipment and storage medium
CN102316355A (en) Generation method of 3D machine vision signal and 3D machine vision sensor
JP2020153797A (en) Detector, method for measuring distance, method for detection, program, and mobile body
CN104748721A (en) Monocular vision sensor with coaxial distance measuring function
EP2913999A1 (en) Disparity value deriving device, equipment control system, movable apparatus, robot, disparity value deriving method, and computer-readable storage medium
CN110992463A (en) Three-dimensional reconstruction method and system for sag of power transmission conductor based on trinocular vision
CN109799493A (en) Radar and Multisensor video fusion system and method
Rothbucher et al. Measuring anthropometric data for HRTF personalization
CN108195291B (en) Moving vehicle three-dimensional detection method and detection device based on differential light spots

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant