WO2021238213A1 - TOF-based depth measurement device and method, and electronic device (一种基于tof的深度测量装置、方法及电子设备) - Google Patents

TOF-based depth measurement device and method, and electronic device

Info

Publication number
WO2021238213A1
Authority
WO
WIPO (PCT)
Prior art keywords
tof
depth
target object
depth value
control
Prior art date
Application number
PCT/CN2020/141868
Other languages
English (en)
French (fr)
Inventor
杨鹏 (Yang Peng)
王兆民 (Wang Zhaomin)
Original Assignee
奥比中光科技集团股份有限公司 (Orbbec Inc.)
Priority date
Filing date
Publication date
Application filed by 奥比中光科技集团股份有限公司
Published as WO2021238213A1
Priority to US17/748,406 (published as US20220277467A1)

Classifications

    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S17/10: systems determining position data of a target, for measuring distance only, using transmission of interrupted, pulse-modulated waves
    • G01S17/36: systems determining position data of a target, for measuring distance only, using transmission of continuous waves with phase comparison between the received signal and the contemporaneously transmitted signal
    • G01S7/4802: analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/481: constructional features, e.g. arrangements of optical elements
    • G01S7/4816: constructional features of receivers alone
    • G01S7/4861: circuits for detection, sampling, integration or read-out (pulse systems)
    • G01S7/4914: circuits for detection, sampling, integration or read-out of detector arrays, e.g. charge-transfer gates (non-pulse systems)
    • G06T7/521: depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T7/55: depth or shape recovery from multiple images
    • H04N23/56: cameras or camera modules comprising electronic image sensors, provided with illuminating means
    • G06T2207/10028: range image; depth image; 3D point clouds
    • G06T2207/20081: training; learning
    • G06T2207/20084: artificial neural networks [ANN]
    • Y02A90/30: assessment of water resources

Definitions

  • This application relates to the field of optical measurement technology, and in particular to a TOF-based depth measurement device, method, and electronic equipment.
  • ToF ranging technology is a technology that achieves precise ranging by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object.
  • when the emitted light signal is periodically modulated and the phase delay of the reflected light signal relative to the emitted light signal is measured, the measurement technique that calculates the flight time from the phase delay is called iToF (Indirect TOF) technology.
  • according to the type of modulation and demodulation, iToF technology can be divided into the continuous-wave (Continuous Wave, CW) modulation and demodulation method and the pulse-modulated (Pulse Modulated, PM) modulation and demodulation method.
  • the phase of the returning beam can be used to obtain an accurate measurement within a given phase "wrap" (i.e., one wavelength), but once the actual distance exceeds the maximum measurement distance of the imaging system, a large error will occur in the measurement, which greatly affects the accuracy of the measurement data.
  • the purpose of this application is to provide a TOF-based depth measurement device, method, and electronic equipment to solve at least one of the above-mentioned background technical problems.
  • an embodiment of the application provides a TOF-based depth measurement device, including: a light emitting module for emitting a light beam to a target object; an imaging module for collecting the reflected light beam reflected by the target object, generating corresponding electrical signals, and collecting a two-dimensional image of the target object; and a control and processor, respectively connected to the light emitting module and the imaging module, for controlling the light emitting module to emit a modulated emission beam into the target space, and for synchronously triggering the imaging module to turn on, receiving the electrical signal and the two-dimensional image generated by the imaging module, calculating the electrical signal to obtain one or more TOF depth values of the target object, while using the two-dimensional image for deep learning to obtain the relative depth value of the target object, and then determining the actual depth value from the one or more TOF depth values based on the relative depth value.
  • the light emitting module is configured to emit a light beam whose amplitude is continuous-wave modulated in time sequence at one or more modulation frequencies under the control of the control and processor;
  • the imaging module is configured to collect at least part of the light beam and generate a corresponding electrical signal;
  • the control and processor is configured to calculate a phase difference based on the electrical signal, calculate the flight time required for the light beam from emission to collection based on the phase difference, and calculate the TOF depth value of each pixel based on the flight time.
  • the control and processor further includes a depth calculation unit; the depth calculation unit includes a convolutional neural network structure, and the two-dimensional image undergoes deep learning through the convolutional neural network structure to obtain the relative depth value of the target object.
  • the embodiment of the present application also provides a TOF-based depth measurement method, which includes the following steps:
  • S21 Control the imaging module to collect the reflected beam of the emitted light beam reflected by the target object and generate a corresponding electrical signal; and collect a two-dimensional image of the target object;
  • S22 Use the control and processor to receive the electrical signal generated by the imaging module and receive the two-dimensional image from the imaging module, and calculate the electrical signal to obtain one or more TOF depth values of the target object; at the same time, the two-dimensional image is used for deep learning to obtain the relative depth value of the target object, and the actual depth value is determined from the one or more TOF depth values based on the obtained relative depth value.
  • the light emitting module is configured to emit a light beam whose amplitude is continuous-wave modulated in time sequence at one or more modulation frequencies under the control of the control and processor;
  • the imaging module is configured to collect at least part of the light beam and generate a corresponding electrical signal;
  • the control and processor is configured to calculate the phase difference according to the electrical signal, calculate the flight time required for the light beam from emission to collection based on the phase difference, and calculate the TOF depth value of each pixel based on the flight time.
  • the control and processor includes a depth calculation unit; the depth calculation unit designs a convolutional neural network structure to perform deep learning on the two-dimensional image to obtain the relative depth value of the target object.
  • in step S22, the control and processor takes the difference between each of the multiple TOF depth values and the relative depth value, takes the absolute value, and selects the TOF depth value corresponding to the smallest absolute value as the actual depth value; or, using the continuity of the relative depth value obtained by the deep learning, guides the unwrapping of the TOF depth value to determine the actual depth value from the one or more TOF depth values.
  • control and processor generates a TOF depth map of the target object based on the TOF depth value, and generates a relative depth map of the target object based on the relative depth value.
  • it further includes the following steps:
  • the TOF depth map and the relative depth map are fused to generate a depth map of the target object.
  • An embodiment of the present application also provides an electronic device, including a housing, a screen, and a TOF-based depth measuring device; wherein the light emitting module and the imaging module of the TOF-based depth measuring device are arranged on the same surface of the electronic device, for emitting a light beam to the target object and receiving the light beam reflected back by the target object to form an electrical signal;
  • the TOF-based depth measuring device includes: a light emitting module for emitting a light beam to the target object; an imaging module for collecting the reflected light beam reflected by the target object, generating corresponding electrical signals, and collecting the two-dimensional image of the target object;
  • and a control and processor, respectively connected to the light emitting module and the imaging module, for controlling the light emitting module to emit the modulated emission beam into the target space, and for synchronously triggering the imaging module to turn on, receiving the electrical signal and the two-dimensional image generated by the imaging module, calculating the electrical signal to obtain one or more TOF depth values of the target object, while using the two-dimensional image for deep learning to obtain the relative depth value of the target object, and then determining the actual depth value from the one or more TOF depth values based on the relative depth value.
  • The beneficial effect of this application is that the two-dimensional image undergoes deep learning to obtain the relative depth value, and ToF unwrapping is assisted based on the relative depth value, so as to improve the accuracy of the measurement.
  • Fig. 1 is a schematic diagram of a TOF-based depth measuring device according to an embodiment of the present application.
  • Fig. 2 is a flowchart of a TOF-based depth measurement method according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of an electronic device using the measuring device shown in Fig. 1 according to an embodiment of the present application.
  • the term "connection" can mean either a fixed (mechanical) connection or a circuit connection.
  • the terms "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, features defined with "first" and "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, "a plurality of" means two or more, unless otherwise specifically defined.
  • the TOF depth measurement device includes a light emitting module, an imaging module, and a control and processor.
  • the light emitting module emits a light beam into the target space to illuminate the target object in the space; at least part of the emitted light beam is reflected by the target area to form a reflected light beam, and at least part of the reflected light beam is received by the imaging module;
  • the control and processor are respectively connected with the light emitting module and the imaging module and synchronize their trigger signals, so as to calculate the time required for the light beam to travel from emission until its reflection is received, that is, the flight time t between the emitted light beam and the reflected light beam; further, according to the flight time t, the distance D of the corresponding point on the target object can be calculated by the following formula: D = c·t/2, where c is the speed of light.
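The round-trip relation D = c·t/2 can be sketched as a small Python helper (the function name is illustrative, not from the patent):

```python
# Convert a measured round-trip flight time t (seconds) into a distance D = c * t / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(t_flight: float) -> float:
    """Distance (m) to the target point for a round-trip flight time (s)."""
    return C * t_flight / 2.0
```

For example, a round-trip flight time of 2/c seconds corresponds to a distance of exactly 1 m.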
  • FIG. 1 is a schematic diagram of a TOF-based depth measurement device provided by one of the embodiments of this application.
  • the TOF-based depth measurement device 10 includes a light emitting module 11, an imaging module 12, and a control and processor 13.
  • the light emitting module 11 is used to emit a light beam 30 to a target object; the imaging module 12 is used to collect the reflected light beam 40 formed when the emitted light beam 30 is reflected back by the target object, generate corresponding electrical signals, and collect a two-dimensional image of the target object; the control and processor 13 is respectively connected with the light emitting module 11 and the imaging module 12, and is used to control the light emitting module 11 to emit the modulated emission light beam into the target space 20 to illuminate the target object in the space, and to synchronously trigger the imaging module 12 to turn on and receive the electrical signal and the two-dimensional image generated by the imaging module 12;
  • the electrical signal is calculated to obtain one or more TOF depth values of the target object; at the same time, the two-dimensional image is used for deep learning to obtain the relative depth value of the target object, and then the actual depth value is determined from the obtained one or more TOF depth values based on the relative depth value.
  • the light emitting module 11 includes a light source, a light source driver (not shown in the figure), and the like.
  • the light source can be a dot-matrix light source or an area-matrix light source.
  • the dot matrix light source can be a combination of a dot matrix laser plus a liquid crystal switch/diffuser/diffractive optical device
  • the dot matrix laser can be a light emitting diode (LED), an edge emitting laser (EEL), a vertical cavity surface emitting laser (VCSEL), etc.
  • the combination of dot matrix laser plus liquid crystal switch/diffuser is equivalent to area array light source, which can output evenly distributed area array beam, which is convenient for subsequent charge integration.
  • the function of the liquid crystal switch is to make the light beam emitted by the light source irradiate the entire target space more uniformly;
  • the function of the diffuser is to shape the beam emitted by the dot matrix laser into an area array beam; and the combination of the dot matrix laser and the diffractive optical element
  • the emitted light beam is still a speckle laser; the diffractive optical element enhances the intensity of the emitted speckle laser, so that the energy is more concentrated, the energy per unit area is higher, and the working distance is longer.
  • the area array light source can be a light source array composed of multiple dot matrix lasers or a flood light source, such as an infrared flood light source, which can also output a uniformly distributed area array light beam to facilitate subsequent charge integration.
  • the light beam emitted by the light source can be visible light, infrared light, ultraviolet light, etc.; in consideration of ambient light, laser safety, and other issues, infrared light is generally used.
  • the light source emits a light beam whose amplitude is modulated in time sequence under the control of the light source driver.
  • the light source is driven by the light source driver to emit light beams such as pulse modulated light beams, square wave modulated light beams, sine wave modulated light beams, etc., at a modulation frequency f.
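As an illustration of such a drive, a sine amplitude modulation at frequency f can be sketched as follows (the exact drive waveform is device-specific; this form is only an assumption for illustration):

```python
import math

def drive_level(t: float, f_mod: float, amplitude: float = 1.0) -> float:
    """Illustrative sine amplitude modulation of the light source.

    Returns an optical power level in [0, amplitude] at time t (s) for
    modulation frequency f_mod (Hz).
    """
    return amplitude * (1.0 + math.sin(2.0 * math.pi * f_mod * t)) / 2.0
```

The output is periodic with period 1/f_mod, matching the period T described below.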
  • the light emitting module further includes a collimating lens, which is arranged above the light source for collimating the light beam emitted by the light source. It should be understood that the light source driver can be further controlled by the control and processor, and of course can also be integrated into the control and processor.
  • the imaging module 12 includes a TOF image sensor 121 and a lens unit (not shown). In some embodiments, it may also include a filter (not shown).
  • the lens unit is generally a condenser lens, which is used to focus and image at least part of the reflected light beam reflected by the target object on at least part of the TOF image sensor.
  • the filter should be a narrowband filter matched to the wavelength of the light source, so as to suppress background light noise in the remaining bands.
  • the TOF image sensor 121 can be an image sensor array composed of charge-coupled devices (CCD), complementary metal-oxide-semiconductor (CMOS) devices, avalanche diodes (AD), single-photon avalanche diodes (SPAD), etc.; the size of the array determines the resolution of the depth camera.
  • the TOF image sensor 121 is also connected with a data processing unit and a readout circuit (not shown in the figure). It is understandable that, since the scenes at different distances from the imaging module lie on concentric spheres of different diameters rather than on parallel planes, there will be errors in actual use, and this error can be corrected by the data processing unit.
  • the TOF image sensor includes at least one pixel. Compared with a traditional image sensor used only for taking pictures, each pixel here includes two or more taps (used to store and read out, under the control of the corresponding electrode, the charge signal generated by incident photons); for example, with 3 taps, the taps are switched in a certain order within a single frame period (or single exposure time) to collect the corresponding charge.
  • the control and processor also provides the demodulation signal (collection signal) for each tap in each pixel of the TOF image sensor, and under the control of the demodulation signal the taps collect the electrical signal (charge) generated by the reflected light beam reflected back from the target object.
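For a continuous-wave pixel, a common four-tap sampling scheme (an illustrative assumption here; the embodiment above mentions three taps) stores charges q0, q90, q180, q270 at four demodulation phases. The pairwise differences cancel the ambient-light offset, and the phase delay follows from an arctangent:

```python
import math

def phase_from_taps(q0: float, q90: float, q180: float, q270: float) -> float:
    """Recover the phase delay from four tap charges q_k = A*cos(phase - theta_k) + B.

    The differences q0 - q180 and q90 - q270 cancel the ambient offset B,
    leaving 2A*cos(phase) and 2A*sin(phase). Returns the phase in [0, 2*pi).
    """
    return math.atan2(q90 - q270, q0 - q180) % (2.0 * math.pi)
```

Simulating charges for a known phase delay and feeding them back through this function recovers that phase, independent of the ambient offset B.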
  • the control and processor are respectively connected to the light emitting module and the imaging module.
  • it controls the light emitting module to emit the light beam and triggers the imaging module to turn on, so as to collect the part of the reflected light beam reflected by the target object and convert it into an electrical signal; further, the distance between the target object and the measuring device is measured from the phase difference Δφ between the transmitted signal of the emitting module and the received signal of the imaging module, where the relationship between the phase difference Δφ and the distance value d is:
  • d = c·Δφ / (4π·f₁), where c is the speed of light and f₁ is the modulation frequency.
  • taking sine-wave modulation as an example, the expression of the emitted signal is s(t) = a·sin(2π·f₂·t), where the modulation frequency is f₂, the amplitude is a, the period T is 1/f₂, and the wavelength λ is c/f₂.
  • after the signal is delayed by the flight time, the reflected signal obtained is set to r(t) = A·sin(2π·f₂·t - Δφ) + B, where A is the attenuated amplitude and B is the offset caused by the ambient light intensity; the difference between tap samples is used to eliminate the ambient light offset, the largest noise interference source in the depth measurement process.
  • the phase difference of the final solution starts to repeat every 2π (for example, Δφ and Δφ + 2π produce the same samples); therefore, each single-frequency measurement provides multiple measured distance values, and adjacent distances are separated by one phase wrap, namely: d_n = c·(Δφ + 2π·n) / (4π·f₂),
  • where n is the number of phase wraps; the multiple distances measured at a single frequency are called ambiguity distances.
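The ambiguity distances for a wrapped phase can be enumerated directly; the cutoff `d_max` below is an illustrative parameter, not from the patent:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def ambiguity_distances(phase: float, f_mod: float, d_max: float) -> list:
    """All candidate distances d_n = c*(phase + 2*pi*n)/(4*pi*f_mod) up to d_max.

    Consecutive candidates are separated by the unambiguous range c/(2*f_mod).
    """
    out = []
    n = 0
    while True:
        d = C * (phase + 2.0 * math.pi * n) / (4.0 * math.pi * f_mod)
        if d > d_max:
            break
        out.append(d)
        n += 1
    return out
```

At a modulation frequency of 100 MHz, for example, the candidates are spaced about 1.5 m apart, so any target beyond 1.5 m is ambiguous at a single frequency.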
  • the modulation frequency will affect the measurement distance. Under certain circumstances, the measurement distance can be extended by reducing the frequency of the modulation signal (that is, increasing its wavelength), but reducing the modulation frequency will reduce the measurement accuracy. In order to satisfy the measurement distance while ensuring the measurement accuracy, the depth measurement device based on the TOF camera usually introduces the multi-frequency extension measurement technology, and the multi-frequency technology will be described later.
  • Multi-frequency technology realizes mixing by adding one or more continuous modulating waves of different frequencies to the light emitting module.
  • the modulating wave of each frequency has its corresponding ambiguity distances, and the distance at which the multiple modulating waves measure the same value is the true distance of the measured target object; its corresponding frequency is the greatest common divisor of the multiple modulating wave frequencies, referred to as the beat frequency.
  • the beat frequency is lower than the frequency of any modulating wave, which ensures that the measurement distance is extended without reducing the actual modulation frequency.
  • although the adopted multi-frequency ranging method can effectively de-alias the phase difference data, compared with single-frequency ranging, multi-frequency ranging must increase the exposure time of the pixels in order to obtain multiple depth maps; this not only increases the power consumption caused by data transmission, but also reduces the frame rate of the depth map.
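With integer modulation frequencies, the beat frequency is their greatest common divisor, and the extended unambiguous range follows directly; the 100 MHz / 80 MHz pair below is only an example, not a value from the patent:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def beat_frequency(f1_hz: int, f2_hz: int) -> int:
    """Greatest common divisor of two modulation frequencies (the beat frequency)."""
    return math.gcd(f1_hz, f2_hz)

def unambiguous_range(f_hz: float) -> float:
    """Maximum distance measurable without phase wrapping at frequency f_hz."""
    return C / (2.0 * f_hz)
```

For instance, 100 MHz and 80 MHz beat at 20 MHz, extending the unambiguous range from about 1.5 m to about 7.5 m without lowering either actual modulation frequency.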
  • for this reason, in this application, the two-dimensional image formed by the ambient light collected by the imaging module or by the flood beam emitted by the light source is subjected to deep learning to obtain the relative depth value of the target object, and the relative depth value is then used to guide the unwrapping of the TOF depth value.
  • the relative depth value obtained by deep learning from the two-dimensional image does not need to be very accurate: for the final depth value, it is sufficient to select from the multiple TOF depth values the one closest to the relative depth value. And since the two-dimensional image and the TOF depth map are both acquired by the TOF image sensor at the same viewing angle, each pixel in the two-dimensional image corresponds strictly one-to-one with each pixel in the TOF depth map, which omits the complicated image matching process and thus avoids increasing the power consumption of the device.
  • the depth measurement device in the embodiment of the present application performs deep learning on the two-dimensional image to obtain the relative depth value and assists ToF unwrapping based on the relative depth value, thereby improving the accuracy, completeness, and frame rate of the depth map without increasing the cost of the existing depth measurement device.
  • the control and processor includes a depth calculation unit, and the aforementioned deep learning is performed by the depth calculation unit in the control and processor.
  • the depth calculation unit may be an FPGA/NPU/GPU or the like. It should be understood that the depth map includes multiple depth values, and each depth value corresponds to a single pixel of the TOF image sensor.
  • the depth calculation unit may output the relative depth value of each pixel in the two-dimensional image to guide the TOF depth value of the corresponding pixel obtained by calculating the phase difference; that is, from the multiple TOF depth values corresponding to the phase difference, the one closest to the relative depth value obtained through deep learning is selected as the actual depth value, and then the final depth map is obtained.
  • specifically, the control and processor takes the difference between each of the multiple TOF depth values and the relative depth value, takes the absolute value, and selects the TOF depth value corresponding to the smallest absolute value as the actual depth value.
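The selection rule just described (smallest absolute difference from the deep-learning relative depth) reduces to an argmin over the candidates; the names below are illustrative:

```python
def select_actual_depth(tof_candidates: list, relative_depth: float) -> float:
    """Pick the TOF candidate whose absolute difference from the relative depth is smallest."""
    return min(tof_candidates, key=lambda d: abs(d - relative_depth))
```

For example, given ambiguity candidates of roughly 0.75 m, 2.25 m, and 3.75 m and a (coarse) relative depth of 2.1 m, the 2.25 m candidate is chosen.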
  • control and processor generates a depth map of the target object according to the actual depth value; alternatively, the TOF depth map is fused with a relative depth map obtained according to a relative depth value to generate the target Depth map of the object.
  • the depth calculation unit may design the convolutional neural network structure and train it with a loss function on known depth maps to obtain a depth model; when estimating the relative depth value, the two-dimensional image is directly input into the above-mentioned convolutional neural network structure, and the corresponding relative depth map is obtained by deep learning. It should be understood that in this method a color value (or gray value) is used to represent a relative depth value.
  • since the depth value calculated from the phase delay has high accuracy, high accuracy is not required of the relative depth value of the target object estimated from the two-dimensional image, so the design requirements of the convolutional neural network structure are relatively simple; in this way, it will not increase the power consumption of the depth measuring device or reduce its calculation rate, and it can still assist TOF unwrapping to obtain the precise actual distance value of the target object.
  • FIG. 2 shows the flow chart of the TOF-based depth measurement method.
  • the measurement method includes the following steps:
  • S21 Control the imaging module to collect the reflected beam returned by the target object and generate a corresponding electrical signal; and collect a two-dimensional image of the target object;
  • S22 Use the control and processor to receive the electrical signal generated by the imaging module and the two-dimensional image from the imaging module, and compute the electrical signal to obtain one or more TOF depth values of the target object; at the same time, perform deep learning on the two-dimensional image to obtain the relative depth value of the target object, and determine the actual depth value from the one or more TOF depth values based on the obtained relative depth value.
  • control and processor includes a depth calculation unit that designs a convolutional neural network structure to perform deep learning on the two-dimensional image to obtain the relative depth value of the target object.
  • in step S22 the control and processor subtracts the relative depth value from each of the multiple TOF depth values and takes the absolute value, selecting the TOF depth value with the smallest absolute difference as the actual depth value; or, using the continuity of the relative depth values obtained by deep learning, it guides the unwrapping of the TOF depth values to determine the actual depth value from the one or more TOF depth values.
  • it further includes:
  • the TOF depth map is fused with the relative depth map obtained according to the relative depth value to generate a depth map of the target object.
  • the light emitting module is configured to emit, under the control of the control and processor, a light beam whose amplitude is continuous-wave modulated in time at one or more modulation frequencies;
  • the imaging module is configured to capture at least part of the light beam and generate a corresponding electrical signal;
  • the control and processor is configured to calculate a phase difference from the electrical signal, calculate from the phase difference the flight time the light beam requires from emission to collection, and calculate the TOF depth value of each pixel from the flight time.
  • the TOF-based depth measurement method in the embodiment of the present application is specifically executed by the TOF-based depth measurement device in the foregoing embodiment.
  • for the specific implementation, please refer to the description in the embodiment of the TOF-based depth measurement device, which is not repeated here.
  • This application performs deep learning on a two-dimensional image to obtain a relative depth value and uses that relative depth value to assist ToF unwrapping, so as to improve the accuracy, completeness and frame rate of the depth map without increasing the cost of the existing depth measurement device.
  • the computer program can be stored in a computer-readable storage medium.
  • the computer program includes computer program code
  • the computer program code may be in the form of source code, object code, executable file, or some intermediate form.
  • the computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, U disk, mobile hard disk, magnetic disk, optical disk, computer memory, read-only memory (ROM, Read-Only Memory) , Random Access Memory (RAM, Random Access Memory), electrical carrier signal, telecommunications signal, and software distribution media, etc.
  • the content contained in the computer-readable medium can be appropriately added or deleted according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
  • an electronic device may be a desktop, a desktop-mounted device, a portable device, a wearable device or a vehicle-mounted device, a robot, and the like.
  • the device may be a notebook computer or another electronic device that allows gesture recognition or biometric recognition.
  • the device may be a head-mounted device that identifies objects or hazards in the user's surrounding environment to ensure safety.
  • a virtual reality system that obstructs the user's view of the environment can detect objects or hazards in the surroundings and provide the user with warnings about nearby objects or obstacles.
  • the electronic device 300 may be a mixed reality system that mixes virtual information and images with the user's surrounding environment, and can detect objects or people in the user's environment to integrate the virtual information with the physical environment and objects.
  • it may also be a device used in fields such as unmanned driving. Referring to FIG. 3, taking a mobile phone as an example, the electronic device 300 includes a housing 31, a screen 32, and the TOF-based depth measurement device described in the foregoing embodiment.
  • the light emitting module 11 and the imaging module 12 of the TOF-based depth measurement device are arranged on the same surface of the electronic device 300, and are used to emit a light beam toward the target object and to receive the flood beam reflected by the target object and form an electrical signal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

A TOF-based depth measurement device and method, and electronic equipment. The device (10) comprises: a light emitting module (11) for emitting a light beam toward a target object; an imaging module (12) for collecting the reflected beam returned by the target object and generating a corresponding electrical signal, and for capturing a two-dimensional image of the target object; and a control and processor (13), connected to the light emitting module (11) and the imaging module (12) respectively, for controlling the light emitting module (11) to emit a modulated beam, and for synchronously triggering the imaging module (12) to turn on and receiving the electrical signal and the two-dimensional image. The control and processor computes the electrical signal to obtain one or more TOF depth values of the target object, performs deep learning on the two-dimensional image to obtain a relative depth value, and then determines the actual depth value from the one or more TOF depth values based on the relative depth value. By performing deep learning on the two-dimensional image to obtain a relative depth value and using that value to assist TOF unwrapping, the device improves the accuracy of the depth map.

Description

A TOF-based depth measurement device and method, and electronic equipment
This application claims priority to Chinese patent application No. 202010445256.9, entitled "TOF-based depth measurement device and method, and electronic equipment", filed with the China National Intellectual Property Administration on May 24, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of optical measurement technology, and in particular to a TOF-based depth measurement device and method, and electronic equipment.
Background
ToF stands for Time-of-Flight. ToF ranging measures distance precisely by timing the round-trip flight of a light pulse between the emitting/receiving device and the target object. Within ToF, the measurement technique in which the emitted light signal is periodically modulated, the phase delay of the reflected signal relative to the emitted signal is measured, and the time of flight is then computed from that phase delay is called iToF (indirect ToF). According to the modulation and demodulation scheme, iToF can be divided into continuous-wave (CW) methods and pulse-modulated (PM) methods.
It should be understood that in phase-based iToF measurement, the phase of the returned beam can be used for an accurate measurement within a given phase "wrap" (i.e., one wavelength), but once the actual distance exceeds the maximum measurement distance of the imaging system, large measurement errors occur and the accuracy of the measured data is severely affected.
Summary
The purpose of this application is to provide a TOF-based depth measurement device and method, and electronic equipment, to solve at least one of the problems described in the background above.
An embodiment of this application provides a TOF-based depth measurement device, comprising: a light emitting module for emitting a light beam toward a target object; an imaging module for collecting the reflected beam returned by the target object and generating a corresponding electrical signal, and for capturing a two-dimensional image of the target object; and a control and processor, connected to the light emitting module and the imaging module respectively, for controlling the light emitting module to emit a modulated beam into the target space, and for synchronously triggering the imaging module to turn on and receiving the electrical signal and the two-dimensional image generated by the imaging module, computing the electrical signal to obtain one or more TOF depth values of the target object while performing deep learning on the two-dimensional image to obtain a relative depth value of the target object, and then determining the actual depth value from the one or more TOF depth values based on the relative depth value.
In some embodiments, the light emitting module is configured to emit, under the control of the control and processor, a beam whose amplitude is continuous-wave modulated in time at one or more modulation frequencies; the imaging module is configured to collect at least part of the beam and generate a corresponding electrical signal; the control and processor is configured to compute a phase difference from the electrical signal, compute from the phase difference the flight time the beam needs from emission to collection, and compute the TOF depth value of each pixel from that flight time.
In some embodiments, the control and processor further includes a depth calculation unit comprising a convolutional neural network structure, which performs deep learning on the two-dimensional image to obtain the relative depth value of the target object.
An embodiment of this application further provides a TOF-based depth measurement method, comprising the following steps:
S20. Control the light emitting module to emit a modulated beam toward the target object;
S21. Control the imaging module to collect the reflected beam returned by the target object and generate a corresponding electrical signal, and to capture a two-dimensional image of the target object;
S22. Use the control and processor to receive the electrical signal generated by the imaging module and the two-dimensional image from the imaging module, and compute the electrical signal to obtain one or more TOF depth values of the target object; at the same time, perform deep learning on the two-dimensional image to obtain the relative depth value of the target object, and determine the actual depth value from the one or more TOF depth values based on the obtained relative depth value.
In some embodiments, the light emitting module is configured to emit, under the control of the control and processor, a beam whose amplitude is continuous-wave modulated in time at one or more modulation frequencies; the imaging module is configured to collect at least part of the beam and generate a corresponding electrical signal; the control and processor is configured to compute a phase difference from the electrical signal, compute from the phase difference the flight time the beam needs from emission to collection, and compute the TOF depth value of each pixel from that flight time.
In some embodiments, the control and processor includes a depth calculation unit, and in step S22 the depth calculation unit performs deep learning on the two-dimensional image through a designed convolutional neural network structure to obtain the relative depth value of the target object.
In some embodiments, in step S22 the control and processor subtracts the relative depth value from each of the multiple TOF depth values, takes the absolute value, and selects the TOF depth value with the smallest absolute difference as the actual depth value; or it uses the continuity of the relative depth values obtained by deep learning to guide the unwrapping of the TOF depth values and thereby determine the actual depth value from the one or more TOF depth values.
In some embodiments, the control and processor generates a TOF depth map of the target object from the TOF depth values and, at the same time, a relative depth map of the target object from the relative depth values.
In some embodiments, the method further comprises the following steps:
generating the depth map of the target object from the actual depth values; or,
fusing the TOF depth map with the relative depth map to generate the depth map of the target object.
An embodiment of this application further provides electronic equipment comprising a housing, a screen, and a TOF-based depth measurement device, wherein the light emitting module and the imaging module of the TOF-based depth measurement device are arranged on the same face of the electronic equipment for emitting a beam toward the target object and receiving the beam reflected back by the target object and forming an electrical signal; the TOF-based depth measurement device comprises: a light emitting module for emitting a light beam toward the target object; an imaging module for collecting the reflected beam returned by the target object and generating a corresponding electrical signal, and for capturing a two-dimensional image of the target object; and a control and processor, connected to the light emitting module and the imaging module respectively, for controlling the light emitting module to emit a modulated beam into the target space, and for synchronously triggering the imaging module to turn on and receiving the electrical signal and the two-dimensional image generated by the imaging module, computing the electrical signal to obtain one or more TOF depth values of the target object while performing deep learning on the two-dimensional image to obtain a relative depth value of the target object, and then determining the actual depth value from the one or more TOF depth values based on the relative depth value.
An embodiment of this application thus provides a TOF-based depth measurement device as described above. By performing deep learning on the two-dimensional image to obtain a relative depth value and using that relative depth value to assist ToF unwrapping, the device improves the accuracy, completeness and frame rate of the depth map without increasing the cost of the existing depth measurement device.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of this application; for those of ordinary skill in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a schematic diagram of a TOF-based depth measurement device according to an embodiment of this application.
FIG. 2 is a flowchart of a TOF-based depth measurement method according to an embodiment of this application.
FIG. 3 is a schematic diagram of electronic equipment using the measurement device shown in FIG. 1, according to an embodiment of this application.
Detailed Description
To make the technical problems to be solved, the technical solutions and the beneficial effects of the embodiments of this application clearer, this application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain this application and are not intended to limit it.
It should be noted that when an element is described as being "fixed to" or "arranged on" another element, it may be directly on the other element or indirectly on it. When an element is described as being "connected to" another element, it may be directly connected to the other element or indirectly connected to it. In addition, a connection may serve for fixation or for electrical communication.
It should be understood that terms indicating orientation or positional relationship, such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer", are based on the orientations or positional relationships shown in the drawings, are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they are therefore not to be construed as limiting this application.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly specifying the number of technical features indicated. A feature qualified by "first" or "second" may thus explicitly or implicitly include one or more such features. In the description of the embodiments of this application, "multiple" means two or more, unless expressly and specifically defined otherwise.
For ease of understanding, the TOF depth measurement device is first described. It includes a light emitting module, an imaging module, and a control and processor. The light emitting module emits a beam into the target space to illuminate the target object in that space; at least part of the emitted beam is reflected by the target region to form a reflected beam, at least part of which is received by the imaging module. The control and processor, connected to the light emitting module and the imaging module respectively, synchronizes their trigger signals to compute the time the beam needs from emission to reception after reflection, i.e. the flight time t between the emitted beam and the reflected beam. Given the flight time t, the distance D of the corresponding point on the target object can be computed as:
D = c·t/2        (1)
where c is the speed of light.
Specifically, referring to FIG. 1, FIG. 1 is a schematic diagram of a TOF-based depth measurement device provided by one embodiment of this application. The TOF-based depth measurement device 10 includes a light emitting module 11, an imaging module 12 and a control and processor 13. The light emitting module 11 emits a beam 30 toward the target object; the imaging module 12 collects the reflected beam 40 formed when the emitted beam 30 is reflected by the target object, generates a corresponding electrical signal, and captures a two-dimensional image of the target object. The control and processor 13, connected to the light emitting module 11 and the imaging module 12 respectively, controls the light emitting module 11 to emit a modulated beam into the target space 20 to illuminate the target object in that space, synchronously triggers the imaging module 12 to turn on, receives the electrical signal and the two-dimensional image generated by the imaging module 12, computes the electrical signal to obtain one or more TOF depth values of the target object, performs deep learning on the two-dimensional image to obtain the relative depth value of the target object, and then determines the actual depth value among the obtained TOF depth values based on the relative depth value.
In some embodiments, the light emitting module 11 includes a light source and a light source driver (not shown). The light source may be a dot-matrix light source or an area-array light source. The dot-matrix light source may be a dot-matrix laser combined with a liquid-crystal switch, a diffuser or a diffractive optical element; the dot-matrix laser may be a light-emitting diode (LED), an edge-emitting laser (EEL), a vertical-cavity surface-emitting laser (VCSEL), or the like. A dot-matrix laser combined with a liquid-crystal switch or diffuser is equivalent to an area-array light source and can output a uniformly distributed area-array beam, which facilitates the subsequent charge integration. The liquid-crystal switch makes the beam emitted by the light source illuminate the whole target space more uniformly; the diffuser shapes the beam emitted by the dot-matrix laser into an area-array beam; the beam emitted by a dot-matrix laser combined with a diffractive optical element remains a speckle laser, and the diffractive optical element increases the density of the emitted speckle so that the energy is more concentrated, the energy per unit area is stronger, and the working distance is longer. The area-array light source may be a light source array composed of multiple dot-matrix lasers or a flood light source, for example an infrared flood light source, which likewise outputs a uniformly distributed area-array beam and facilitates the subsequent charge integration.
It can be understood that the beam emitted by the light source may be visible, infrared or ultraviolet light; considering ambient light, laser safety and similar issues, infrared light is generally used. Under the control of the light source driver, the light source emits a beam whose amplitude is modulated in time. For example, in some embodiments the light source, driven by the light source driver, emits a pulse-modulated, square-wave-modulated or sine-wave-modulated beam at a modulation frequency f. In some embodiments, the light emitting module also includes a collimating lens arranged above the light source to collimate the beam emitted by the light source. It should be understood that the light source driver may further be controlled by the control and processor, or may of course be integrated into the control and processor.
The imaging module 12 includes a TOF image sensor 121 and a lens unit (not shown), and in some embodiments also a filter (not shown). The lens unit is generally a condensing lens for focusing and imaging at least part of the reflected beam returned by the target object onto at least part of the TOF image sensor; the filter should be a narrow-band filter matched to the wavelength of the light source, to suppress background light noise in the other bands. The TOF image sensor 121 may be an image sensor array composed of charge-coupled devices (CCD), complementary metal-oxide semiconductors (CMOS), avalanche diodes (AD), single-photon avalanche diodes (SPAD), or the like; the array size represents the resolution of the depth camera, for example 320×240. Generally, the TOF image sensor 121 is also connected to a data processing unit and to a readout circuit (not shown) composed of one or more of a signal amplifier, a time-to-digital converter (TDC) and an analog-to-digital converter (ADC). It can be understood that since the scenes at different distances from the imaging module lie on concentric spheres of different diameters rather than on parallel planes, errors exist in actual use, and the data processing unit can correct this error.
It should be understood that the TOF image sensor includes at least one pixel. Compared with a conventional image sensor used only for photography, each pixel here contains two or more taps (used to store and read out, or output, the charge signal generated by incident photons under the control of the corresponding electrodes); for example, with three taps, the taps are switched in a certain order within a single frame period (or single exposure time) to collect the corresponding charge. In addition, the control and processor also supplies the demodulation signals (collection signals) of the taps in each pixel of the TOF image sensor; under the control of these demodulation signals, the taps collect the electrical signal (charge) generated by the reflected beam returned by the target object.
The control and processor is connected to the light emitting module and the imaging module respectively. While controlling the light emitting module to emit a beam, it triggers the imaging module to turn on, collect the part of the emitted beam reflected back by the target object, and convert it into an electrical signal. It then measures the phase difference φ between the emitted signal of the emitting module and the received signal of the imaging module to obtain the distance between the target object and the measurement device, where the phase difference φ and the distance value d are related by:
d = c·φ/(4π·f₁)
where c is the speed of light and f₁ is the modulation frequency.
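The phase-to-distance relation above can be checked numerically. The following sketch is illustrative (function and constant names are invented, not from the patent); it also shows that a full 2π of phase corresponds to the maximum unambiguous range c/(2f₁):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_depth(phi, f_mod):
    """Distance from the measured phase difference: d = c*phi / (4*pi*f)."""
    return C * phi / (4 * math.pi * f_mod)

# At a 60 MHz modulation frequency, a full 2*pi of phase corresponds to
# c/(2f), the maximum distance measurable before the phase wraps.
d_max = phase_to_depth(2 * math.pi, 60e6)
```

At 60 MHz this unambiguous range is about 2.5 m, which motivates the wrap discussion that follows.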
In some embodiments, suppose the light source emits a continuous sine-wave-modulated beam:
s(t) = a·(1 + sin(2πf₂t))
with modulation frequency f₂, amplitude a, period T = 1/f₂, and wavelength λ = c/f₂.
Let the reflected signal obtained after this signal is delayed by Δt be r(t). Its amplitude is attenuated to A through propagation, and the offset caused by the ambient light intensity is B, so the reflected signal is:
r(t) = A·(1 + sin(2πf₂(t − Δt))) + B = A·(1 + sin(2πf₂t − φ)) + B,  with φ = 2πf₂·Δt
To obtain the phase difference φ, in the embodiments of this application the received charge is sampled at four equidistant measurement phase points (generally 0°, 90°, 180° and 270°) within the effective integration time. That is, over four consecutive image frames (four exposure times), the charge is sampled starting at the four points whose phase differs from the emitted light by 0°, 90°, 180° and 270°. Substituting the corresponding time points t0 = 0, t1 = T/4, t2 = T/2 and t3 = 3T/4 into the equation for r(t) and solving gives:
φ = arctan(Q/I)
where I and Q are each the difference between two non-adjacent charge samples. This differencing eliminates the ambient-light offset, the largest source of noise in depth measurement, and the larger I and Q are, the higher the accuracy of the phase-difference measurement. It should be understood that in an actual depth measurement, to improve accuracy, multiple images are usually captured and multiple measurements made, and the depth value of each pixel is finally obtained by weighted averaging or similar methods to obtain a complete depth map.
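The four-phase sampling described above can be simulated end to end: sample r(t) at 0°, 90°, 180° and 270°, difference the non-adjacent samples so the ambient offset B cancels, and recover φ with an arctangent. This is a hedged sketch (names are invented, and the exact pairing of samples into I and Q depends on the sensor's sign convention):

```python
import math

def four_phase_demod(f_mod, phi_true, A=0.6, B=0.2):
    """Recover the phase of r(t) = A*(1 + sin(2*pi*f*t - phi)) + B from
    samples at t = 0, T/4, T/2 and 3T/4 (phases 0, 90, 180, 270 deg)."""
    T = 1.0 / f_mod
    r = lambda t: A * (1 + math.sin(2 * math.pi * f_mod * t - phi_true)) + B
    s0, s1, s2, s3 = r(0), r(T / 4), r(T / 2), r(3 * T / 4)
    I = s1 - s3  # proportional to cos(phi); ambient offset B cancels
    Q = s2 - s0  # proportional to sin(phi); ambient offset B cancels
    return math.atan2(Q, I) % (2 * math.pi)
```

Because both I and Q are proportional to the amplitude A, the arctangent also cancels the propagation attenuation, not just the offset B.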
It should be understood that a change in the distance of the target object changes the phase difference φ, but the phase difference obtained at a single frequency as above always lies in the range 0 to 2π. It can therefore be used effectively to compute an accurate measurement within a given phase "wrap" (i.e., one wavelength), but once the actual distance exceeds the maximum measurement distance c/(2f₂) (half the modulation wavelength λ = c/f₂), the solved phase difference begins to repeat: the same φ is obtained for a distance d and for d + c/(2f₂). Each single-frequency measurement therefore provides multiple measured distance values, each separated by one phase wrap:
dₙ = (c/(2f₂))·(φ/(2π) + n)
where n is the number of phase wraps; the multiple distances measured at a single frequency are called ambiguous distances. From the maximum measurement distance c/(2f₂) it can be seen that the modulation frequency affects the measurement range: in certain situations the range can be extended by lowering the modulation frequency (i.e., increasing its wavelength), but lowering the modulation frequency lowers the measurement accuracy. To satisfy the measurement range while guaranteeing measurement accuracy, TOF-camera-based depth measurement devices therefore usually introduce multi-frequency extended measurement techniques, described below.
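The ambiguous distances of a single frequency can be enumerated directly from the wrapped phase. This sketch assumes the relation dₙ = (c/(2f))·(φ/(2π) + n) from the paragraph above; names are illustrative:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def candidate_depths(phi, f_mod, d_limit):
    """All depths consistent with one wrapped phase at one frequency,
    d_n = (c / (2*f)) * (phi / (2*pi) + n), up to a search limit."""
    wrap = C / (2 * f_mod)  # ambiguity interval: one full phase wrap
    depths = []
    n = 0
    while True:
        d = wrap * (phi / (2 * math.pi) + n)
        if d > d_limit:
            return depths
        depths.append(d)
        n += 1

# phi = pi at 60 MHz yields candidates near 1.25 m, 3.75 m, 6.25 m, 8.74 m.
cands = candidate_depths(math.pi, 60e6, 10.0)
```

It is exactly this list of candidates that the relative depth from the two-dimensional image is later used to disambiguate.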
Multi-frequency techniques achieve frequency mixing by adding, at the light emitting module, one or more continuous modulation waves of different frequencies. Each modulation frequency has its own corresponding ambiguous distances; the distance jointly measured by all the modulation waves is the true distance of the measured target object, and the corresponding frequency is the greatest common divisor of the modulation frequencies, called the beat frequency for short. The beat frequency is obviously lower than the frequency of any modulation wave, which guarantees that the measurement range is extended without lowering the actual modulation frequencies. It should be understood that although this multi-frequency ranging method can efficiently de-alias the phase-difference data, compared with single-frequency ranging it must obtain multiple depth maps and therefore increase the exposure time of the pixels, which not only increases the power consumption caused by data transmission but also lowers the frame rate of the depth map.
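The beat frequency described above is the greatest common divisor of the modulation frequencies, and it sets the extended unambiguous range. A small sketch under that assumption (integer-valued frequencies in hertz; names invented for the example):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def extended_range(f1_hz, f2_hz):
    """Unambiguous range of a dual-frequency measurement: the range of the
    beat frequency, i.e. the greatest common divisor of the two frequencies."""
    f_beat = math.gcd(int(f1_hz), int(f2_hz))
    return C / (2 * f_beat)

# 60 MHz alone: ~2.50 m unambiguous range.
# 60 MHz + 80 MHz beat at 20 MHz: ~7.49 m, without lowering either frequency.
r_single = C / (2 * 60e6)
r_dual = extended_range(60e6, 80e6)
```

This illustrates the trade-off in the text: the range grows as if a low (beat) frequency were used, while each individual measurement keeps the accuracy of its high modulation frequency.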
Therefore, in the embodiments of this application, deep learning is performed on the two-dimensional image formed by the ambient light, or by the flood beam emitted by the light source, as collected by the imaging module, to obtain the relative depth value of the target object, and the relative depth value is then used to guide the unwrapping of the TOF depth values. It should be understood that, because the phase-delay (phase-difference) computation is highly accurate, the relative depth value obtained by deep learning from the two-dimensional image need not be very precise: the final depth value only needs to be the one of the multiple TOF depth values closest to the relative depth value. Moreover, since the two-dimensional image and the TOF depth map are both captured by the TOF image sensor from the same viewpoint, their pixels correspond strictly one to one, so a complex image-matching step can be omitted and additional power consumption of the device avoided. In this way, the depth measurement device of the embodiments of this application applies deep learning to the two-dimensional image to obtain a relative depth value and uses it to assist ToF unwrapping, improving the accuracy, completeness and frame rate of the depth map without increasing the cost of the existing depth measurement device.
In some embodiments, the control and processor includes a depth calculation unit that performs the deep learning described above; the depth calculation unit may be an FPGA, NPU, GPU or the like. It should be understood that the depth map comprises multiple depth values, each corresponding to a single pixel of the TOF image sensor. In some examples, the depth calculation unit outputs the relative depth value of each pixel of the two-dimensional image to guide the per-pixel TOF depth value obtained from the phase difference: from the multiple TOF depth values corresponding to the phase difference, the one closest to the relative depth value obtained by deep learning is selected as the actual depth value, yielding the final depth map. In some embodiments, the control and processor subtracts the relative depth value from each of the multiple TOF depth values, takes the absolute value, and selects the TOF depth value with the smallest absolute difference as the actual depth value. In some embodiments, the control and processor generates the depth map of the target object from the actual depth values, or fuses the TOF depth map with the relative depth map obtained from the relative depth values to generate the depth map of the target object.
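Because the two-dimensional image and the TOF depth map are pixel-aligned, this per-pixel selection can be vectorized over a whole frame. An illustrative NumPy sketch (array shapes and names are assumptions for the example, not from the patent):

```python
import numpy as np

def resolve_depth_map(tof_candidates, relative_depth):
    """Resolve the phase-wrap ambiguity for every pixel of a frame.

    tof_candidates: (N, H, W) array of the N wrapped-depth hypotheses
    for every TOF pixel.
    relative_depth: (H, W) relative depth from the two-dimensional image;
    pixels align one-to-one because both come from the same TOF sensor.
    """
    # Absolute error of each hypothesis against the relative depth.
    err = np.abs(tof_candidates - relative_depth[None, :, :])
    best = np.argmin(err, axis=0)  # wrap index chosen per pixel
    return np.take_along_axis(tof_candidates, best[None], axis=0)[0]
```

The result is the "actual depth" frame described in the text, obtained in a single vectorized pass instead of a per-pixel loop.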
In some embodiments, the depth calculation unit can obtain a depth model by designing a convolutional neural network structure and training it with a loss function on known depth maps; when estimating the relative depth value, the two-dimensional image is fed directly into the above convolutional neural network structure for deep learning to obtain the corresponding relative depth map. It should be understood that in this method a color value (or gray value) represents a relative depth value. As described above, the depth value computed from the phase delay (phase difference) is highly accurate, so the requirements on the relative depth value of the target object estimated from the two-dimensional image are modest, and the design of the convolutional neural network structure can be relatively simple. In this way, it neither increases the power of the depth measurement device nor lowers its computation rate, while still assisting the TOF in unwrapping to obtain a precise actual distance value of the target object.
Referring to FIG. 2, another embodiment of this application further provides a TOF-based depth measurement method. FIG. 2 is a flowchart of the TOF-based depth measurement method, which comprises the following steps:
S20. Control the light emitting module to emit a modulated beam toward the target object;
S21. Control the imaging module to collect the reflected beam returned by the target object and generate a corresponding electrical signal, and to capture a two-dimensional image of the target object;
S22. Use the control and processor to receive the electrical signal generated by the imaging module and the two-dimensional image from the imaging module, and compute the electrical signal to obtain one or more TOF depth values of the target object; at the same time, perform deep learning on the two-dimensional image to obtain the relative depth value of the target object, and determine the actual depth value from the one or more TOF depth values based on the obtained relative depth value.
In step S22, the control and processor includes a depth calculation unit, and the depth calculation unit performs deep learning on the two-dimensional image through a designed convolutional neural network structure to obtain the relative depth value of the target object.
In step S22, the control and processor subtracts the relative depth value from each of the multiple TOF depth values, takes the absolute value, and selects the TOF depth value with the smallest absolute difference as the actual depth value; or it uses the continuity of the relative depth values obtained by deep learning to guide the unwrapping of the TOF depth values and thereby determine the actual depth value from the one or more TOF depth values.
In some embodiments, the method further comprises:
generating the depth map of the target object from the actual depth values; or,
fusing the TOF depth map with the relative depth map obtained from the relative depth values to generate the depth map of the target object.
In some embodiments, the light emitting module is configured to emit, under the control of the control and processor, a beam whose amplitude is continuous-wave modulated in time at one or more modulation frequencies; the imaging module is configured to collect at least part of the beam and generate a corresponding electrical signal; the control and processor is configured to compute a phase difference from the electrical signal, compute from the phase difference the flight time the beam needs from emission to collection, and compute the TOF depth value of each pixel from that flight time.
The TOF-based depth measurement method of this embodiment is specifically executed by the TOF-based depth measurement device of the foregoing embodiments; for the specific implementation, refer to the description in the device embodiments, which is not repeated here.
This application performs deep learning on a two-dimensional image to obtain a relative depth value and uses that relative depth value to assist ToF unwrapping, thereby improving the accuracy, completeness and frame rate of the depth map without increasing the cost of the existing depth measurement device.
All or part of the processes in the methods of the above embodiments of this application can also be completed by instructing the relevant hardware through a computer program. The computer program can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of each of the above method embodiments. The computer program includes computer program code, which may be in source-code form, object-code form, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random-access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium can be appropriately added or deleted according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
As another embodiment of this application, electronic equipment is further provided. The electronic equipment may be a desktop device, a desk-mounted device, a portable device, a wearable device, a vehicle-mounted device, a robot, or the like. Specifically, the device may be a notebook computer or another electronic device that allows gesture recognition or biometric recognition. In other examples, the device may be a head-mounted device used to identify objects or hazards in the user's surroundings to ensure safety; for example, a virtual-reality system that blocks the user's view of the environment may detect objects or hazards in the surroundings to warn the user about nearby objects or obstacles. In other examples, it may be a mixed-reality system that blends virtual information and images with the user's surroundings and can detect objects or people in the user's environment to integrate the virtual information with the physical environment and objects. In other examples, it may be a device applied in fields such as unmanned driving. Referring to FIG. 3, taking a mobile phone as an example, the electronic equipment 300 includes a housing 31, a screen 32, and the TOF-based depth measurement device of the foregoing embodiments; the light emitting module 11 and the imaging module 12 of the TOF-based depth measurement device are arranged on the same face of the electronic equipment 300 to emit a beam toward the target object and to receive the flood beam reflected back by the target object and form an electrical signal.
It can be understood that the above is a further detailed description of this application in conjunction with specific/preferred embodiments, and the specific implementation of this application is not to be considered limited to these descriptions. Those of ordinary skill in the art to which this application belongs may, without departing from the concept of this application, make a number of substitutions or variations of the described embodiments, and such substitutions or variations shall all be regarded as falling within the protection scope of this application. In the description of this specification, reference to the terms "one embodiment", "some embodiments", "preferred embodiment", "example", "specific example" or "some examples" means that the specific feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of this application.
In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, where they are not mutually contradictory, those skilled in the art may combine the different embodiments or examples described in this specification and the features of those different embodiments or examples. Although the embodiments of this application and their advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the scope defined by the appended claims.
Furthermore, the scope of this application is not intended to be limited to the particular embodiments of the processes, machines, manufacture, compositions of matter, means, methods and steps described in the specification. One of ordinary skill in the art will readily appreciate from the above disclosure that processes, machines, manufacture, compositions of matter, means, methods or steps presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized. Accordingly, the appended claims are intended to include such processes, machines, manufacture, compositions of matter, means, methods or steps within their scope.

Claims (10)

  1. A TOF-based depth measurement device, characterized by comprising:
    a light emitting module for emitting a light beam toward a target object;
    an imaging module for collecting the reflected beam returned by the target object and generating a corresponding electrical signal, and for capturing a two-dimensional image of the target object;
    a control and processor, connected to the light emitting module and the imaging module respectively, for controlling the light emitting module to emit a modulated beam into the target space; and for synchronously triggering the imaging module to turn on and receiving the electrical signal and the two-dimensional image generated by the imaging module, computing the electrical signal to obtain one or more TOF depth values of the target object while performing deep learning on the two-dimensional image to obtain a relative depth value of the target object, and then determining the actual depth value from the one or more TOF depth values based on the relative depth value.
  2. The TOF-based depth measurement device according to claim 1, characterized in that:
    the light emitting module is configured to emit, under the control of the control and processor, a beam whose amplitude is continuous-wave modulated in time at one or more modulation frequencies;
    the imaging module is configured to collect at least part of the beam and generate a corresponding electrical signal;
    the control and processor is configured to compute a phase difference from the electrical signal, compute from the phase difference the flight time the beam needs from emission to collection, and compute the TOF depth value of each pixel from that flight time.
  3. The TOF-based depth measurement device according to claim 1, characterized in that the control and processor further includes a depth calculation unit comprising a convolutional neural network structure, through which deep learning is performed on the two-dimensional image to obtain the relative depth value of the target object.
  4. A TOF-based depth measurement method, characterized by comprising the following steps:
    S20. controlling the light emitting module to emit a modulated beam toward a target object;
    S21. controlling the imaging module to collect the reflected beam returned by the target object and generate a corresponding electrical signal, and to capture a two-dimensional image of the target object;
    S22. using the control and processor to receive the electrical signal generated by the imaging module and the two-dimensional image from the imaging module, and computing the electrical signal to obtain one or more TOF depth values of the target object; at the same time, performing deep learning on the two-dimensional image to obtain the relative depth value of the target object, and determining the actual depth value from the one or more TOF depth values based on the obtained relative depth value.
  5. The TOF-based depth measurement method according to claim 4, characterized in that:
    the light emitting module is configured to emit, under the control of the control and processor, a beam whose amplitude is continuous-wave modulated in time at one or more modulation frequencies;
    the imaging module is configured to collect at least part of the reflected beam returned by the target object and generate a corresponding electrical signal;
    the control and processor is configured to compute a phase difference from the electrical signal, compute from the phase difference the flight time the beam needs from emission to collection, and compute the TOF depth value of each pixel from that flight time.
  6. The TOF-based depth measurement method according to claim 4, characterized in that the control and processor includes a depth calculation unit, and in step S22 the depth calculation unit performs deep learning on the two-dimensional image through a designed convolutional neural network structure to obtain the relative depth value of the target object.
  7. The TOF-based depth measurement method according to claim 4, characterized in that in step S22 the control and processor subtracts the relative depth value from each of the multiple TOF depth values, takes the absolute value, and selects the TOF depth value with the smallest absolute difference as the actual depth value; or,
    uses the continuity of the relative depth values obtained by deep learning to guide the unwrapping of the TOF depth values and thereby determine the actual depth value from the one or more TOF depth values.
  8. The TOF-based depth measurement method according to claim 4, characterized in that the control and processor generates a TOF depth map of the target object from the TOF depth values and, at the same time, a relative depth map of the target object from the relative depth values.
  9. The TOF-based depth measurement method according to claim 8, characterized by further comprising the following steps:
    generating the depth map of the target object from the actual depth values; or,
    fusing the TOF depth map with the relative depth map to generate the depth map of the target object.
  10. Electronic equipment, characterized by comprising a housing, a screen, and a TOF-based depth measurement device, wherein the light emitting module and the imaging module of the TOF-based depth measurement device are arranged on the same face of the electronic equipment for emitting a beam toward a target object and receiving the beam reflected back by the target object and forming an electrical signal; the TOF-based depth measurement device comprises: a light emitting module for emitting a light beam toward the target object; an imaging module for collecting the reflected beam returned by the target object and generating a corresponding electrical signal, and for capturing a two-dimensional image of the target object; and a control and processor, connected to the light emitting module and the imaging module respectively, for controlling the light emitting module to emit a modulated beam into the target space, and for synchronously triggering the imaging module to turn on and receiving the electrical signal and the two-dimensional image generated by the imaging module, computing the electrical signal to obtain one or more TOF depth values of the target object while performing deep learning on the two-dimensional image to obtain a relative depth value of the target object, and then determining the actual depth value from the one or more TOF depth values based on the relative depth value.
PCT/CN2020/141868 2020-05-24 2020-12-30 TOF-based depth measurement device and method, and electronic equipment WO2021238213A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/748,406 US20220277467A1 (en) 2020-05-24 2022-05-19 Tof-based depth measuring device and method and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010445256.9 2020-05-24
CN202010445256.9A CN111736173B (zh) 2020-05-24 2020-05-24 一种基于tof的深度测量装置、方法及电子设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/748,406 Continuation US20220277467A1 (en) 2020-05-24 2022-05-19 Tof-based depth measuring device and method and electronic equipment

Publications (1)

Publication Number Publication Date
WO2021238213A1 true WO2021238213A1 (zh) 2021-12-02

Family

ID=72647663

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/141868 WO2021238213A1 (zh) TOF-based depth measurement device and method, and electronic equipment

Country Status (3)

Country Link
US (1) US20220277467A1 (zh)
CN (1) CN111736173B (zh)
WO (1) WO2021238213A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111736173B (zh) * 2020-05-24 2023-04-11 奥比中光科技集团股份有限公司 TOF-based depth measurement device and method, and electronic equipment
CN111965660B (zh) * 2020-10-26 2021-02-23 深圳市汇顶科技股份有限公司 Time-of-flight sensor, ranging system and electronic device
WO2022087776A1 (zh) * 2020-10-26 2022-05-05 深圳市汇顶科技股份有限公司 Time-of-flight sensor, ranging system and electronic device
CN113298778B (zh) * 2021-05-21 2023-04-07 奥比中光科技集团股份有限公司 Time-of-flight-based depth computation method, system and storage medium
CN113466884B (zh) * 2021-06-30 2022-11-01 深圳市汇顶科技股份有限公司 Time-of-flight depth measurement transmitting device and electronic equipment
CN113822919B (zh) * 2021-11-24 2022-02-25 中国海洋大学 Underwater image relative depth estimation method based on semantic information constraints

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107765260A (zh) * 2016-08-22 2018-03-06 三星电子株式会社 Method, device and computer-readable recording medium for acquiring distance information
US10242454B2 (en) * 2017-01-25 2019-03-26 Google Llc System for depth data filtering based on amplitude energy values
CN109889809A (zh) * 2019-04-12 2019-06-14 深圳市光微科技有限公司 Depth camera module, depth camera, depth map acquisition method, and depth camera module forming method
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (zh) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fused depth measurement device and distance measurement method
CN110471080A (zh) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on a TOF image sensor
CN110488240A (zh) * 2019-07-12 2019-11-22 深圳奥比中光科技有限公司 Depth computation chip architecture
CN111736173A (zh) * 2020-05-24 2020-10-02 深圳奥比中光科技有限公司 TOF-based depth measurement device and method, and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9760837B1 (en) * 2016-03-13 2017-09-12 Microsoft Technology Licensing, Llc Depth from time-of-flight using machine learning
US11181623B2 (en) * 2017-09-30 2021-11-23 Massachusetts Institute Of Technology Methods and apparatus for gigahertz time-of-flight imaging
CN109253708B (zh) * 2018-09-29 2020-09-11 南京理工大学 Deep-learning-based fringe projection temporal phase unwrapping method
CN109803079B (zh) * 2019-02-18 2021-04-27 Oppo广东移动通信有限公司 Mobile terminal, photographing method thereof, and computer storage medium
CN110425986B (zh) * 2019-07-17 2020-10-16 北京理工大学 Three-dimensional computational imaging method and device based on a single-pixel sensor
CN110686652B (zh) * 2019-09-16 2021-07-06 武汉科技大学 Depth measurement method combining deep learning and structured light


Also Published As

Publication number Publication date
CN111736173A (zh) 2020-10-02
CN111736173B (zh) 2023-04-11
US20220277467A1 (en) 2022-09-01

Similar Documents

Publication Publication Date Title
WO2021238212A1 (zh) Depth measurement device and method, and electronic equipment
WO2021238213A1 (zh) TOF-based depth measurement device and method, and electronic equipment
WO2021128587A1 (zh) Adjustable depth measurement device and measurement method
WO2021008209A1 (zh) Depth measurement device and distance measurement method
CN111123289B (zh) Depth measurement device and measurement method
WO2021051477A1 (zh) Time-of-flight distance measurement system with adjustable histogram, and measurement method
WO2021120402A1 (zh) Fused depth measurement device and measurement method
CN111025318B (zh) Depth measurement device and measurement method
WO2022017366A1 (zh) Depth imaging method and depth imaging system
CN111427048B (zh) ToF depth measurement device, control method and electronic equipment
CN111538024B (zh) Filtered ToF depth measurement method and device
WO2021212916A1 (zh) TOF depth measurement device and method, and electronic equipment
CN111025321B (zh) Variable-focus depth measurement device and measurement method
CN212694038U (zh) TOF depth measurement device and electronic equipment
CN209894976U (zh) Time-of-flight depth camera and electronic equipment
US20220043129A1 (en) Time flight depth camera and multi-frequency modulation and demodulation distance measuring method
CN110501714A (zh) Distance detector and distance detection method
CN111025319B (zh) Depth measurement device and measurement method
CN111712781A (zh) Efficient MEMS-based eye tracking system with silicon photomultiplier sensor
EP3709050B1 (en) Distance measuring device, distance measuring method, and signal processing method
WO2022241942A1 (zh) Depth camera and depth computation method
CN114488173A (zh) Time-of-flight-based distance detection method and system
CN114365007A (zh) Method for measuring distance to target
WO2022088492A1 (zh) Collector, distance measurement system and electronic equipment
CN114549609A (zh) Depth measurement system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20938021

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20938021

Country of ref document: EP

Kind code of ref document: A1