WO2024050895A1 - iTOF depth measurement system and depth measurement method - Google Patents

iTOF depth measurement system and depth measurement method

Info

Publication number
WO2024050895A1
Authority
WO
WIPO (PCT)
Prior art keywords
tap
exposure
control
time
exposure time
Prior art date
Application number
PCT/CN2022/122364
Other languages
English (en)
French (fr)
Inventor
孙瑞
Original Assignee
奥比中光科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 奥比中光科技集团股份有限公司
Publication of WO2024050895A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/36Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal

Definitions

  • the present application relates to the technical field of depth measurement, and in particular to an iTOF depth measurement system and a depth measurement method.
  • the iTOF depth measurement system is a system that uses the indirect time-of-flight principle for depth measurement. It has been widely used in various smart terminals, mobile robots and smart cars, and can meet the needs of various application scenarios such as background blur, night vision, three-dimensional scanning, obstacle avoidance and driver cockpit monitoring.
  • since the iTOF depth measurement system is at a certain distance from the target, the beam needs a certain transmission time from being emitted to being received. During this period the beam has not yet returned to the iTOF depth measurement system, so the optical signal received during this period is mainly ambient light; ambient light in outdoor scenes is strong, which further reduces the signal-to-noise ratio of the data collected by the iTOF measurement system and causes the measurement accuracy to drop.
  • This application provides an iTOF depth measurement system and a depth measurement method, which can effectively improve the depth measurement accuracy of the iTOF depth measurement system.
  • an iTOF depth measurement system including a transmitting module, a receiving module, and a control and processor.
  • the transmitting module is used to emit a beam toward the target; the receiving module is used to collect the beam reflected by the target; the control and processor is used to control the transmitting module to emit a first beam toward the target and synchronously control the exposure of the receiving module to collect the first beam reflected back by the target and obtain first beam data; calculate the shortest flight time of the first beam based on the first beam data, and determine the delayed exposure time of the receiving module based on the shortest flight time; control the transmitting module to emit a second beam toward the target, and control the receiving module to start exposure after the delayed exposure time to collect the second beam reflected back by the target and obtain second beam data; and calculate the depth distance of the target based on the second beam data.
  • the control and processor is specifically configured to: calculate, based on the first beam data, the minimum phase delay between the first beam being emitted and being received; calculate the shortest measurement distance based on the minimum phase delay; and calculate the shortest flight time based on the shortest measurement distance.
  • the control and processor is also used to: calculate the exposure extension duration of the receiving module according to the delayed exposure time; and control the receiving module to extend its exposure time by the exposure extension duration.
  • the control and processor is specifically configured to: obtain multiple preset delayed exposure times; and select, from the multiple preset delayed exposure times, the preset delayed exposure time that is smaller than and closest to the shortest flight time as the delayed exposure time.
  • the control and processor is specifically configured to: obtain the preconfigured mapping relationship between the delayed exposure time and the shortest flight time; and calculate the delayed exposure time based on the mapping relationship and the shortest flight time.
  • the receiving module includes an image sensor, the image sensor includes at least two taps, and different taps have different starting exposure times; the control and processor is further configured to: control at least one tap to delay its exposure by the delayed exposure time.
  • the image sensor includes a first tap, a second tap, a third tap and a fourth tap corresponding to the 0° phase, 180° phase, 90° phase and 270° phase respectively, and the control and processor is used to control the first tap and the fourth tap to start exposure after the delayed exposure time, while the second tap and the third tap do not delay their exposure.
  • the control and processor is also used to: calculate the exposure duration of each tap based on the delayed exposure time, the emission period of the transmitting module and the starting exposure time of each tap, and control each tap to expose for its corresponding exposure duration.
  • the control and processor is specifically configured to: calculate the time difference between the starting exposure time of each tap and the starting emission time of the transmitting module; calculate the duty cycle corresponding to each tap according to the time difference corresponding to each tap, the delayed exposure time and the duty cycle of the transmitting module; and calculate the exposure duration corresponding to each tap based on the duty cycle corresponding to each tap and the emission period of the transmitting module.
  • embodiments of the present application provide a depth measurement method.
  • the method includes: controlling the transmitting module to emit a first beam toward the target, and synchronously controlling the exposure of the receiving module to collect the first beam reflected by the target and obtain first beam data; calculating the shortest flight time of the first beam according to the first beam data, and determining the delayed exposure time of the receiving module according to the shortest flight time; controlling the transmitting module to emit a second beam toward the target, and controlling the receiving module to delay its exposure by the delayed exposure time to collect the second beam reflected back by the target and obtain second beam data; and calculating the depth distance of the target according to the second beam data.
  • the iTOF depth measurement system and depth measurement method provided by the embodiments of this application first calculate the shortest flight time of the first beam from emission to reception, then determine the delayed exposure time of the receiving module based on the shortest flight time, then control the transmitting module to emit the second beam and control the receiving module to start exposure after the delayed exposure time, and then calculate the depth based on the second beam data collected after the delayed exposure. This prevents the receiving module from collecting too much ambient light during the flight of the beam, so the second beam data collected by the receiving module contains relatively little ambient light, which improves the signal-to-noise ratio of the second beam data and can effectively improve the depth measurement accuracy of the iTOF depth measurement system.
  • Figure 1 is a schematic structural diagram of an iTOF depth measurement system provided in an embodiment of the present application.
  • Figure 2 is a working timing diagram of phase modulation and demodulation of an existing iTOF depth measurement system;
  • Figure 3 is a working sequence diagram of phase modulation and demodulation of an iTOF depth measurement system provided in the embodiment of the present application;
  • Figure 4 is a working sequence diagram of phase modulation and demodulation of another iTOF depth measurement system provided in the embodiment of the present application;
  • FIG. 5 is a schematic flowchart of steps of a depth measurement method provided in an embodiment of the present application.
  • module means any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic or combination of hardware or/and software code capable of performing the function associated with that element.
  • the indirect Time of Flight (iTOF) depth measurement system obtains the depth distance by indirectly measuring the flight time.
  • the iTOF depth measurement system 10 includes a transmitting module 11, a receiving module 12 and a control and processor 13.
  • the transmitting module 11 is used to emit a beam signal (i.e., the emitted beam in Figure 1)
  • the receiving module 12 is used to collect the beam signal reflected back by the target object 20 (i.e., the reflected beam in Figure 1) and generate an electrical signal.
  • the control and processor 13 is connected to the transmitting module 11 and the receiving module 12 respectively, and can control the transmitting module 11 and the receiving module 12 respectively.
  • the control and processor 13 is used to calculate the phase offset between the transmitting beam and the reflected beam according to the electrical signal generated by the receiving module 12, and then calculate the depth information.
  • the outdoor use effect of the iTOF depth measurement system is limited by the full well of the chip, ambient light noise, etc., resulting in low accuracy when measuring depth outdoors.
  • the ambient light is strong, and ambient light accounts for a large share of the light data collected by the receiving module.
  • the signal-to-noise ratio of the light data received by the receiving module is low, which affects the outdoor performance of the iTOF depth measurement system.
  • FIG 2 is a working sequence diagram of phase modulation and demodulation of the existing iTOF depth measurement system.
  • the modulation of the transmitting beam and the demodulation of the tap of the receiving module will be performed simultaneously.
  • both the transmitting module 11 and the receiving module 12 are at a certain distance from the target, so it takes a transmission time t2 for the beam to travel from being emitted by the transmitting module 11 to being received by the receiving module 12.
  • the light gray area is the laser (i.e., the emitted beam), and the dark gray area is the ambient light.
  • the proportion of ambient light in the collected demodulation phase data is relatively large, resulting in a low signal-to-noise ratio in each demodulation phase, which degrades the application effect.
  • an iTOF depth measurement system 10 is provided in an embodiment of the present application.
  • the control and processor 13 is used to control the transmitting module 11 to emit the first beam toward the target object and to synchronously control the exposure of the receiving module 12 to collect the first beam reflected by the target and obtain the first beam data; calculate the shortest flight time of the first beam based on the first beam data, and determine the delayed exposure time of the receiving module 12 based on the shortest flight time; control the transmitting module 11 to emit the second beam toward the target, and control the receiving module 12 to delay its exposure by the delayed exposure time to collect the second beam reflected by the target and obtain the second beam data; and calculate the depth distance of the target based on the second beam data.
  • the iTOF depth measurement system 10 provided by the embodiment of the present application first calculates the shortest flight time of the first beam from emission to reception, then determines the delayed exposure time of the receiving module 12 based on the shortest flight time, then controls the transmitting module 11 to emit the second beam and controls the receiving module 12 to start exposure after the delayed exposure time, and then calculates the depth based on the second beam data collected after the delayed exposure. This prevents the receiving module 12 from being exposed, and thus collecting ambient light, while the beam is in flight; the second beam data collected by the receiving module 12 contains relatively little ambient light, which improves the signal-to-noise ratio of the second beam data and thereby effectively improves the depth measurement accuracy of the iTOF depth measurement system.
  • the transmitting module 11, the receiving module 12 and the control and processor 13 in the iTOF depth measurement system are integrated into a depth camera.
  • the transmitting module 11 and the receiving module 12 are integrated into a depth camera, and the control and processor 13 is a peripheral device connected to the depth camera. It can be understood that the specific form of the iTOF system is not limited here.
  • the emission module 11 includes a light source and a diffuser.
  • the light source is used to generate a light beam
  • the diffuser is used to diffuse the light beam.
  • the receiving module 12 includes a lens, a filter and an image sensor. The lens converges the light to the filter.
  • the filter is used to filter part of the light.
  • the image sensor includes at least two taps. Different taps start exposure at different times, so there are certain differences in the beam data they collect.
  • the image sensor may be an image sensor composed of a charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS), or the like.
  • the image sensor is connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
  • the control and processor 13 can calculate, based on the collected first beam data, the minimum phase delay produced by the first beam from being emitted to being received, then calculate the shortest measurement distance based on the minimum phase delay, and further calculate the shortest flight time based on the shortest measurement distance.
  • the first beam data includes the first beam reflected at each point on the target; the control and processor 13 calculates the phase delay of each point on the target and can then obtain the minimum phase delay, and there is a fixed mapping relationship between measurement distance and phase delay.
  • the corresponding shortest measurement distance can therefore be calculated based on the minimum phase delay.
  • the shortest measurement distance can be the shortest distance between the iTOF measurement system and the target. Since the speed of the beam is known, the shortest flight time of the beam from being emitted by the transmitting module 11 to being received by the receiving module 12 can be determined from the shortest measurement distance.
  • the control and processor 13 can determine the delayed exposure time of the receiving module 12 based on the shortest flight time, where the delayed exposure time is less than or equal to the shortest flight time, so that when collecting the second beam data the receiving module 12 can receive as many of the beams reflected back from the target as possible, avoiding the situation where beams reflected back from some points or regions on the target cannot be received by the receiving module 12.
  • multiple preset delayed exposure times are stored in advance.
  • the preset delayed exposure times that are smaller than the shortest flight time can first be determined from the multiple preset delayed exposure times, and the preset delayed exposure time closest to the shortest flight time is then selected from them and used as the delayed exposure time. For example, assuming the register is configured with multiple delayed exposure times such as 1ns, 2ns, 3ns, 4ns, 5ns, 6ns and 7ns, if the shortest flight time is 1.4ns, 1ns can be selected as the delayed exposure time; if the shortest flight time is 4.8ns, 4ns can be selected as the delayed exposure time.
  • the mapping relationship between the delayed exposure time and the shortest flight time is pre-configured.
  • based on the mapping relationship and the shortest flight time, the delayed exposure time corresponding to the shortest flight time can be calculated directly; for example, if the delayed exposure time is t and the shortest flight time is t_min, the mapping relationship can be t = 0.9·t_min, 0.8·t_min, 0.7·t_min, and so on.
  • the mapping relationship can be obtained through pre-calibration, is not limited to the above examples, and will not be introduced in detail here.
  • the shortest flight time can also be directly set as the delayed exposure time, which is not limited here.
  • the image sensor includes at least one pixel.
  • each pixel of the image sensor of this embodiment includes two or more taps, and the taps are used to collect the reflected beam and/or ambient light.
  • the control and processor 13 can control one tap, some taps, or all taps to delay the exposure by the delayed exposure time, which is not limited here.
  • the image sensor includes a first tap and a second tap.
  • in the existing scheme, the first tap is powered on synchronously with the transmitting module 11, and the second tap is powered on some time after the first tap.
  • building on the existing scheme, the control and processor 13 can control the first tap to be powered on only after the delayed exposure time, so that the first tap starts to expose after the delayed exposure time. In this way, while the second beam emitted by the transmitting module 11 travels to the target and the target reflects the second beam back to the receiving module 12, the first tap is in the off state (that is, not powered on) and does not receive ambient light, which greatly reduces the intensity of ambient light collected by the first tap and improves the signal-to-noise ratio of the beam data it collects.
  • the control and processor 13 may also synchronously control the second tap to delay its exposure, which can be decided according to whether the start exposure time of the second tap falls within or outside the delayed exposure time. For example, when the start exposure time of the second tap is within the delayed exposure time, the second tap can be controlled to delay its exposure, and the delayed exposure time of the second tap can be smaller than that of the first tap; when the start exposure time of the second tap is outside the delayed exposure time, the second tap can be controlled not to delay its exposure and to expose at its original exposure time.
  • the control and processor 13 can be specifically configured to calculate the exposure duration of each tap based on the delayed exposure time, the emission period of the transmitting module 11 and the starting exposure time of each tap, and then control each tap to expose for its corresponding exposure duration, so that the signal-to-noise ratio of the data collected by each tap is high and the accuracy of the depth measurement is improved.
  • the control and processor 13 can be used to calculate the time difference between the starting exposure time of each tap and the starting emission time of the transmitting module 11; then calculate the duty cycle corresponding to each tap based on the time difference corresponding to each tap, the delayed exposure time and the duty cycle of the transmitting module 11; and then calculate the exposure duration corresponding to each tap based on the duty cycle corresponding to each tap and the emission period of the transmitting module. In this way, the exposure duration of each tap is calculated more precisely, and the signal-to-noise ratio of the data collected after each tap is exposed is higher.
  • each pixel of the image sensor includes a first tap, a second tap, a third tap, and a fourth tap.
  • the first tap, the second tap, the third tap and the fourth tap correspond to the 0° phase, 180° phase, 90° phase and 270° phase, respectively.
  • the control and processor 13 controls the first tap and the fourth tap to delay their starting exposure by the delayed exposure time, and controls the second tap and the third tap not to delay their exposure, that is, the second tap and the third tap are still exposed at their original starting exposure times.
  • in other embodiments, the control and processor 13 can also control the second tap and the third tap to delay their exposure.
  • the delayed exposure times of the second tap and the third tap can be the same or different, and can specifically be determined according to the delayed exposure time and the starting exposure times of the second tap and the third tap.
  • the light gray area is the laser data (i.e., the reflected beam), and the dark gray area is the ambient light data. It can be seen from Figure 3 that while the second beam travels to the target, is reflected back to the receiving module and is then received by the receiving module, that is, during the flight of the second beam, the first tap is in the off state. Because the first tap does not receive ambient light during this process, the intensity of the ambient light signal collected by the first tap can be greatly reduced and the signal-to-noise ratio of the signal collected by the first tap can be improved.
  • in one embodiment, the pulse width of the emitted signal is T = 4ns, the duty cycle Duty = 50%, the entire period Tz is 8ns, and the frequency of both the emitted signal and the reflected signal is f = 125MHz.
  • when the target object moves, the control and processor 13 needs to recalculate the shortest flight time and redetermine the delayed exposure time; the transmitting module 11 then emits the beam and the receiving module 12 delays its exposure according to the delayed exposure time in order to measure the depth of the moved target. It is understandable that after the target moves, the minimum phase delay of the iTOF depth measurement system changes and the corresponding shortest flight time also changes, so the delayed exposure time needs to be redetermined to make subsequent depth measurement results more accurate.
  • because the delayed exposure reduces the share of ambient light collected by the taps and makes the pixels less likely to be overexposed, the exposure time of the receiving module 12 can be appropriately increased, thereby further improving the signal-to-noise ratio and improving the outdoor performance.
  • the iTOF depth measurement system can obtain multiple frames of second beam data according to the above embodiment, and then calculate the depth by combining the multiple frames of second beam data, so that the depth calculated in this way is more accurate. It can be understood that one cycle is one frame.
  • FIG. 5 is a schematic flowchart of the steps of a depth measurement method provided in the embodiment of the present application.
  • the above depth measurement method includes the following steps:
  • the depth measurement method provided by the embodiment of the present application first calculates the shortest flight time of the first beam from emission to reception, then determines the delayed exposure time of the receiving module based on the shortest flight time, and then controls the transmitting module to emit the second beam. It also controls the receiving module to start exposure after the delayed exposure time, and then calculates the depth based on the second beam data collected after the delayed exposure. This prevents the receiving module from collecting too much ambient light during the flight of the beam.
  • the receiving module collects relatively little ambient light in the second beam data, thereby improving the signal-to-noise ratio of the second beam data, which can effectively improve the depth measurement accuracy of the iTOF depth measurement system in outdoor use scenarios.
  • the above depth measurement method can be applied to the iTOF depth measurement system described in the above embodiment.
  • the above depth measurement method may be executed by the above control and processor 13.
  • the above-mentioned calculation of the shortest flight time of the first beam based on the first beam data includes: calculating, based on the first beam data, the minimum phase delay between the first beam being emitted and being received; calculating the shortest measurement distance based on the minimum phase delay; and calculating the shortest flight time based on the shortest measurement distance.
  • the above step of determining the delayed exposure time of the receiving module based on the shortest flight time includes: obtaining multiple preset delayed exposure times; and selecting, from the multiple preset delayed exposure times, the preset delayed exposure time that is smaller than and closest to the shortest flight time as the delayed exposure time.
  • the above step of determining the delayed exposure time of the receiving module based on the shortest flight time includes: obtaining the preconfigured mapping relationship between the delayed exposure time and the shortest flight time; and calculating the delayed exposure time based on the mapping relationship and the shortest flight time.
  • the above-mentioned receiving module includes an image sensor, and the image sensor includes at least two taps, and different taps have different starting exposure times.
  • the above step of controlling the receiving module to delay its exposure by the delayed exposure time includes: controlling the first tap and the fourth tap to start exposure after the delayed exposure time, while the second tap and the third tap do not delay their exposure.
  • the above method further includes the following steps: calculating the exposure duration of each tap based on the delayed exposure time, the emission period of the transmitting module and the starting exposure time of each tap, and controlling each tap to expose for its corresponding exposure duration.
  • the above step of calculating the exposure duration of each tap based on the delayed exposure time, the emission period of the transmitting module and the starting exposure time of each tap includes the following steps: calculating the time difference between the starting exposure time of each tap and the starting emission time of the transmitting module; calculating the duty cycle corresponding to each tap based on the time difference corresponding to each tap, the delayed exposure time and the duty cycle of the transmitting module; and calculating the exposure duration corresponding to each tap based on the duty cycle corresponding to each tap and the emission period of the transmitting module.
  • the method further includes the following steps: calculating the exposure extension time of the receiving module according to the delayed exposure time; controlling the receiving module to extend the exposure time according to the exposure extension time.
  • the above-mentioned iTOF depth measurement system and depth measurement method are especially suitable for outdoor static or low-frame-rate application scenarios. It can be understood that in an outdoor static scene the target does not move easily or moves only a very small distance, the ambient light in the outdoor scene is strong, and the minimum phase delay does not change easily, so the improvement in signal-to-noise ratio that this application obtains by reducing the collected ambient light is more pronounced; in outdoor dynamic scenes the target often moves and the minimum phase delay often changes, so the delayed exposure time often needs to be adjusted, which is very inconvenient to apply and introduces large errors; in low-frame-rate application scenarios the refresh frequency is low, so delayed exposure has little impact on imaging efficiency, whereas in high-frame-rate application scenarios the refresh frequency is high and delayed exposure has a greater impact on imaging efficiency.
  • embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores computer-executable instructions which, when executed by a computer, implement the steps executed by the control and processor in the above embodiments.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules is only a division by logical function. In actual implementation there may be other division methods; for example, multiple modules may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, indirect coupling or communication connection of devices or modules, and may be in electrical, mechanical or other forms.
  • modules described as separate components may or may not be physically separated, and the components shown as modules may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional module in each embodiment of the present application can be integrated into a processing unit, or each module can exist physically alone, or two or more modules can be integrated into one unit.
  • the units formed by the above modules can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-mentioned integrated modules implemented in the form of software function modules can be stored in a computer-readable storage medium.
  • the above-mentioned software function modules are stored in a storage medium and include a number of instructions to cause a computer device (which can be a personal computer, a server, or a network device, etc.) or a processor (English: processor) to execute the various embodiments of this application. Some steps of the method.
  • the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in the application can be directly implemented by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the above storage medium can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
  • Storage media can be any available media that can be accessed by a general purpose or special purpose computer.
  • the aforementioned program can be stored in a computer-readable storage medium.
  • the steps including the above-mentioned method embodiments are executed; and the aforementioned storage media include: ROM, RAM, magnetic disks, optical disks and other media that can store program codes.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

An iTOF depth measurement system (10) and a depth measurement method. The iTOF depth measurement system (10) includes a transmitting module (11), a receiving module (12), and a control and processor (13). The control and processor (13) is used to control the transmitting module (11) to emit a first beam toward a target (20) and synchronously control the exposure of the receiving module (12), so as to collect the first beam reflected by the target (20) and obtain first beam data; calculate the shortest flight time of the first beam from the first beam data, and determine a delayed exposure time of the receiving module (12) from the shortest flight time; control the transmitting module (11) to emit a second beam toward the target (20), and control the receiving module (12) to start exposure after the delayed exposure time, so as to collect the second beam reflected back by the target (20) and obtain second beam data; and calculate the depth distance of the target (20) from the second beam data, which can effectively improve the depth measurement accuracy of the iTOF depth measurement system (10).

Description

iTOF depth measurement system and depth measurement method
This application claims priority to the Chinese patent application No. 202211096222.9, filed with the Chinese Patent Office on September 8, 2022 and entitled "iTOF depth measurement system and depth measurement method", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of depth measurement, and in particular to an iTOF depth measurement system and a depth measurement method.
Background Art
An iTOF depth measurement system is a system that performs depth measurement using the indirect time-of-flight principle. It has been widely used in various smart terminals, mobile robots and smart cars, and can meet the needs of various application scenarios such as background blur, night vision, three-dimensional scanning, obstacle avoidance and driver cockpit monitoring.
Because the iTOF depth measurement system is at a certain distance from the target, the beam needs a certain transmission time from being emitted to being received. During this period the beam has not yet returned to the iTOF depth measurement system, so the optical signal received during this period is mainly ambient light; ambient light in outdoor scenes is strong, which further reduces the signal-to-noise ratio of the data collected by the iTOF measurement system and causes the measurement accuracy to drop.
Technical Solution
This application provides an iTOF depth measurement system and a depth measurement method, which can effectively improve the depth measurement accuracy of the iTOF depth measurement system.
In a first aspect, an embodiment of this application provides an iTOF depth measurement system, including a transmitting module, a receiving module, and a control and processor. The transmitting module is used to emit a beam toward a target; the receiving module is used to collect the beam reflected by the target; the control and processor is used to control the transmitting module to emit a first beam toward the target and synchronously control the exposure of the receiving module, so as to collect the first beam reflected back by the target and obtain first beam data; calculate the shortest flight time of the first beam from the first beam data, and determine a delayed exposure time of the receiving module from the shortest flight time; control the transmitting module to emit a second beam toward the target, and control the receiving module to start exposure after the delayed exposure time, so as to collect the second beam reflected back by the target and obtain second beam data; and calculate the depth distance of the target from the second beam data.
In some embodiments, the control and processor is specifically configured to: calculate, from the first beam data, the minimum phase delay between emission and reception of the first beam; calculate the shortest measurement distance from the minimum phase delay; and calculate the shortest flight time from the shortest measurement distance. In some embodiments, the control and processor is further configured to: calculate an exposure extension duration of the receiving module from the delayed exposure time; and control the receiving module to extend its exposure time by the exposure extension duration.
In some embodiments, the control and processor is specifically configured to: obtain multiple preset delayed exposure times; and select, from the multiple preset delayed exposure times, the preset delayed exposure time that is smaller than and closest to the shortest flight time as the delayed exposure time. In other embodiments, the control and processor is specifically configured to: obtain a preconfigured mapping relationship between the delayed exposure time and the shortest flight time; and calculate the delayed exposure time from the mapping relationship and the shortest flight time.
In some embodiments, the receiving module includes an image sensor, the image sensor includes at least two taps, and different taps start exposure at different times; the control and processor is further configured to control at least one tap to delay its exposure by the delayed exposure time. In some of these embodiments, the image sensor includes a first tap, a second tap, a third tap and a fourth tap corresponding to the 0° phase, 180° phase, 90° phase and 270° phase respectively, and the control and processor is used to control the first tap and the fourth tap to start exposure after the delayed exposure time, while the second tap and the third tap do not delay their exposure.
In some of these embodiments, the control and processor is further configured to: calculate the exposure duration of each tap from the delayed exposure time, the emission period of the transmitting module and the starting exposure time of each tap, and control each tap to expose for its corresponding exposure duration. In some of these embodiments, the control and processor is specifically configured to: calculate the time difference between the starting exposure time of each tap and the starting emission time of the transmitting module; calculate the duty cycle corresponding to each tap from the time difference corresponding to each tap, the delayed exposure time and the duty cycle of the transmitting module; and calculate the exposure duration corresponding to each tap from the duty cycle corresponding to each tap and the emission period of the transmitting module.
In a second aspect, an embodiment of this application provides a depth measurement method. The method includes: controlling a transmitting module to emit a first beam toward a target, and synchronously controlling the exposure of a receiving module, so as to collect the first beam reflected by the target and obtain first beam data; calculating the shortest flight time of the first beam from the first beam data, and determining a delayed exposure time of the receiving module from the shortest flight time; controlling the transmitting module to emit a second beam toward the target, and controlling the receiving module to delay its exposure by the delayed exposure time, so as to collect the second beam reflected back by the target and obtain second beam data; and calculating the depth distance of the target from the second beam data.
Beneficial Effects
The iTOF depth measurement system and depth measurement method provided by the embodiments of this application first calculate the shortest flight time of the first beam from emission to reception, then determine the delayed exposure time of the receiving module from the shortest flight time, then control the transmitting module to emit the second beam and control the receiving module to start exposure after the delayed exposure time, and finally calculate the depth from the second beam data collected after the delayed exposure. This prevents the receiving module from collecting too much ambient light while the beam is in flight, so the second beam data collected by the receiving module contains relatively little ambient light, which improves the signal-to-noise ratio of the second beam data and can therefore effectively improve the depth measurement accuracy of the iTOF depth measurement system.
Brief Description of the Drawings
Figure 1 is a schematic structural diagram of an iTOF depth measurement system provided in an embodiment of this application;
Figure 2 is a working timing diagram of phase modulation and demodulation of an existing iTOF system;
Figure 3 is a working timing diagram of phase modulation and demodulation of an iTOF depth measurement system provided in an embodiment of this application;
Figure 4 is a working timing diagram of phase modulation and demodulation of another iTOF depth measurement system provided in an embodiment of this application;
Figure 5 is a schematic flowchart of the steps of a depth measurement method provided in an embodiment of this application.
Embodiments of the Invention
To make the purposes, technical solutions and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application will be described clearly and completely below with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are only some, rather than all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the scope of protection of this application. In addition, although the disclosure herein is presented in terms of one or several exemplary examples, it should be understood that each aspect of the disclosure may also independently constitute a complete embodiment.
It should be noted that the brief explanations of terms in this application are only intended to facilitate understanding of the embodiments described below, and are not intended to limit the embodiments of this application. Unless otherwise stated, these terms should be understood according to their ordinary and usual meanings.
The terms "first", "second" and the like in the specification, claims and drawings of this application are used to distinguish similar or like objects or entities, and do not necessarily imply a particular order or sequence unless otherwise noted. It should be understood that terms so used are interchangeable where appropriate; for example, the embodiments can be implemented in an order other than that illustrated or described herein.
In addition, the terms "include" and "have" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a product or device that includes a series of components is not necessarily limited to those components explicitly listed, but may include other components not explicitly listed or inherent to such a product or device.
The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the functions associated with that element.
An indirect time-of-flight (iTOF) depth measurement system obtains the depth distance by measuring the flight time indirectly. Specifically, as shown in Figure 1, the iTOF depth measurement system 10 includes a transmitting module 11, a receiving module 12 and a control and processor 13. The transmitting module 11 is used to emit a beam signal (i.e., the emitted beam in Figure 1), and the receiving module 12 is used to collect the beam signal reflected back by the target 20 (i.e., the reflected beam in Figure 1) and generate an electrical signal. The control and processor 13 is connected to the transmitting module 11 and the receiving module 12 respectively and can control each of them; the control and processor 13 is used to calculate the phase offset between the emitted beam and the reflected beam from the electrical signal generated by the receiving module 12, and then calculate the depth information.
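For illustration only, the following is a minimal Python sketch of this phase-offset-to-depth step using the standard four-phase demodulation formula; the function name, the single modulation frequency, and the sign convention are assumptions made for the sketch and are not taken from this application.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_four_taps(q0, q90, q180, q270, f_mod):
    """Estimate depth from the four tap accumulations sampled at the
    0°, 90°, 180° and 270° demodulation phases (standard 4-phase iTOF)."""
    # Phase offset between the emitted and reflected beam; the sign convention
    # depends on how the sensor defines its taps.
    phase = math.atan2(q90 - q270, q0 - q180)
    if phase < 0.0:
        phase += 2.0 * math.pi           # wrap into [0, 2*pi)
    round_trip_time = phase / (2.0 * math.pi * f_mod)
    return 0.5 * C * round_trip_time     # one-way depth distance in meters
```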
However, the outdoor performance of the iTOF depth measurement system is limited by the full-well capacity of the chip, ambient light noise and the like, so its accuracy when measuring depth outdoors is low. For example, in outdoor use the ambient light is strong and accounts for a large share of the light data collected by the receiving module, so the signal-to-noise ratio of the light data received by the receiving module is low, which affects the outdoor performance of the iTOF depth measurement system.
Figure 2 is a working timing diagram of phase modulation and demodulation of an existing iTOF depth measurement system. As shown in Figure 2, the modulation of the emitted beam and the demodulation at the taps of the receiving module are performed synchronously. However, there is a circuit delay t1 between sending the control command and the beam being emitted, and both the transmitting module 11 and the receiving module 12 are at a certain distance from the target, so the beam needs a transmission time t2 from being emitted by the transmitting module 11 to being received by the receiving module 12. When the iTOF depth measurement system works outdoors, during t1 and t2 the beam has not yet entered the taps of the receiving module, so the collected demodulation phase data is mainly ambient light. In Figure 2, the light gray area is the laser (i.e., the emitted beam) and the dark gray area is the ambient light; as a result, the proportion of ambient light in the collected demodulation phase data is relatively large, the signal-to-noise ratio of each demodulation phase is low, and the application effect is affected.
In view of the above technical problems, an embodiment of this application provides an iTOF depth measurement system 10. In the iTOF depth measurement system 10 of this application, when a depth measurement is required, the control and processor 13 is used to control the transmitting module 11 to emit a first beam toward the target and synchronously control the exposure of the receiving module 12 so as to collect the first beam reflected by the target and obtain first beam data; calculate the shortest flight time of the first beam from the first beam data, and determine a delayed exposure time of the receiving module 12 from the shortest flight time; control the transmitting module 11 to emit a second beam toward the target, and control the receiving module 12 to delay its exposure by the delayed exposure time so as to collect the second beam reflected by the target and obtain second beam data; and calculate the depth distance of the target from the second beam data.
The iTOF depth measurement system 10 provided by the embodiment of this application first calculates the shortest flight time of the first beam from emission to reception, then determines the delayed exposure time of the receiving module 12 from the shortest flight time, then controls the transmitting module 11 to emit the second beam and controls the receiving module 12 to start exposure after the delayed exposure time, and finally calculates the depth from the second beam data collected after the delayed exposure. This prevents the receiving module 12 from being exposed, and thus collecting ambient light, while the beam is in flight; the second beam data collected by the receiving module 12 contains relatively little ambient light, which improves the signal-to-noise ratio of the second beam data and can therefore effectively improve the depth measurement accuracy of the iTOF depth measurement system.
In one embodiment, the transmitting module 11, the receiving module 12 and the control and processor 13 of the above iTOF depth measurement system are integrated in one depth camera. In another embodiment, the transmitting module 11 and the receiving module 12 are integrated in one depth camera, and the control and processor 13 is a peripheral device connected to the depth camera. It can be understood that the specific form of the iTOF system is not limited here.
The transmitting module 11 includes a light source and a diffuser; the light source is used to generate a beam, and the diffuser is used to diffuse the beam. The receiving module 12 includes a lens, a filter and an image sensor; the lens converges light onto the filter, the filter is used to filter out part of the light, and the image sensor includes at least two taps. Different taps start exposure at different times, so there are certain differences in the beam data they collect.
Further, the image sensor may be an image sensor composed of a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) device, or the like. In addition, the image sensor is connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC) and other devices.
In some embodiments, the control and processor 13 may calculate, from the collected first beam data, the minimum phase delay produced by the first beam from emission to reception, then calculate the shortest measurement distance from the minimum phase delay, and further calculate the shortest flight time from the shortest measurement distance. It can be understood that the first beam data includes the first beam reflected at each point on the target; the control and processor 13 calculates the phase delay of each point on the target and can then obtain the minimum phase delay. Since there is a fixed mapping relationship between measurement distance and phase delay, the corresponding shortest measurement distance can be calculated from the minimum phase delay. The shortest measurement distance may be the shortest distance between the iTOF measurement system and the target; since the speed of the beam is known, the shortest flight time of the beam from being emitted by the transmitting module 11 to being received by the receiving module 12 can be determined from the shortest measurement distance.
In some embodiments, after the shortest flight time is calculated, the control and processor 13 may determine the delayed exposure time of the receiving module 12 from the shortest flight time, where the delayed exposure time is less than or equal to the shortest flight time, so that when collecting the second beam data the receiving module 12 can receive as many of the beams reflected back from the target as possible, avoiding the situation where beams reflected back from some points or regions on the target cannot be received by the receiving module 12.
In one of these embodiments, multiple preset delayed exposure times are stored in advance. The preset delayed exposure times that are smaller than the shortest flight time can first be determined from the multiple preset delayed exposure times, and the preset delayed exposure time closest to the shortest flight time is then found among them and used as the delayed exposure time. For example, assuming that the register is configured with multiple delayed exposure times such as 1 ns, 2 ns, 3 ns, 4 ns, 5 ns, 6 ns and 7 ns, if the shortest flight time is 1.4 ns, 1 ns can be selected as the delayed exposure time; if the shortest flight time is 4.8 ns, 4 ns can be selected as the delayed exposure time.
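A minimal sketch of this lookup; the preset values mirror the example above and the function name is illustrative:

```python
def pick_delayed_exposure(presets_ns, shortest_flight_ns):
    """Return the largest preset delayed exposure time that is still smaller
    than the shortest flight time, or None if no preset qualifies."""
    candidates = [p for p in presets_ns if p < shortest_flight_ns]
    return max(candidates) if candidates else None

presets = [1, 2, 3, 4, 5, 6, 7]                    # ns, as configured in the register
assert pick_delayed_exposure(presets, 1.4) == 1    # shortest flight time 1.4 ns -> 1 ns
assert pick_delayed_exposure(presets, 4.8) == 4    # shortest flight time 4.8 ns -> 4 ns
```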
In another of these embodiments, the mapping relationship between the delayed exposure time and the shortest flight time is preconfigured; from the mapping relationship and the shortest flight time, the delayed exposure time corresponding to the shortest flight time can be calculated directly. For example, if the delayed exposure time is t and the shortest flight time is t_min, the mapping relationship may be t = 0.9·t_min, 0.8·t_min, 0.7·t_min, and so on. The mapping relationship can be obtained through pre-calibration, is not limited to the above examples, and will not be described in detail here.
In other embodiments, the shortest flight time may also be directly set as the delayed exposure time, which is not limited here.
In some embodiments, the image sensor includes at least one pixel. Compared with a conventional image sensor used only for photography, each pixel of the image sensor of this embodiment includes two or more taps, and the taps are used to collect the reflected beam and/or ambient light. After the delayed exposure time is determined, the control and processor 13 may control one tap, some of the taps, or all of the taps to delay their exposure by the delayed exposure time, which is not limited here.
In one of these embodiments, the image sensor includes a first tap and a second tap. In the existing scheme, the first tap is powered on synchronously with the transmitting module 11, and the second tap is powered on some time after the first tap. In this embodiment, building on the existing scheme, the control and processor 13 may control the first tap to be powered on only after the delayed exposure time, so that the first tap starts exposure after the delayed exposure time. In this way, while the second beam emitted by the transmitting module 11 travels to the target and the target reflects the second beam back to the receiving module 12, the first tap is in the off state (i.e., not powered on) and does not receive ambient light, which greatly reduces the intensity of ambient light collected by the first tap and improves the signal-to-noise ratio of the beam data collected by the first tap. Of course, in other embodiments, the control and processor 13 may also synchronously control the second tap to delay its exposure by a delayed exposure time, which may be determined according to whether the starting exposure time of the second tap falls within or outside the delayed exposure time. For example, when the starting exposure time of the second tap is within the delayed exposure time, the second tap may be controlled to delay its exposure, and the delayed exposure time of the second tap may be smaller than that of the first tap; when the starting exposure time of the second tap is outside the delayed exposure time, the second tap may be controlled not to delay its exposure and to expose with its original exposure time.
Further, the control and processor 13 may be specifically configured to calculate the exposure duration of each tap from the delayed exposure time, the emission period of the transmitting module 11 and the starting exposure time of each tap, and then control each tap to expose for its corresponding exposure duration, so that the signal-to-noise ratio of the data collected by each tap is high and the accuracy of depth measurement is improved. More specifically, the control and processor 13 may be used to calculate the time difference between the starting exposure time of each tap and the starting emission time of the transmitting module 11; then calculate the duty cycle corresponding to each tap from the time difference corresponding to each tap, the delayed exposure time and the duty cycle of the transmitting module 11; and then calculate the exposure duration corresponding to each tap from the duty cycle corresponding to each tap and the emission period of the transmitting module. In this way, the exposure duration of each tap is calculated more precisely, and the signal-to-noise ratio of the data collected after each tap is exposed is higher.
In one embodiment, as shown in Figure 3, each pixel of the image sensor includes a first tap, a second tap, a third tap and a fourth tap, corresponding to the 0° phase, 180° phase, 90° phase and 270° phase respectively. After the delayed exposure time is determined, the control and processor 13 controls both the first tap and the fourth tap to delay their starting exposure time by the delayed exposure time, and controls the second tap and the third tap not to delay their exposure, i.e., the second tap and the third tap still expose at their original starting exposure times. Of course, in other embodiments, the control and processor 13 may also control the second tap and the third tap to delay their exposure; the delayed exposure times of the second tap and the third tap may be the same or different, and may specifically be determined from the delayed exposure time and the starting exposure times of the second tap and the third tap.
The control and processor 13 may calculate the exposure durations of the first tap, the second tap, the third tap and the fourth tap in the following way. Assume the delayed exposure time is t, the emission period of the transmitting module 11 is Tz, and the duty cycle of the transmitting module 11 is Duty; the pulse width of the transmitting module 11 is then T = Tz × Duty. The time difference corresponding to the first tap is the delayed exposure time, t_0 = t, and its exposure duration is t1 = Duty_0 × Tz; the time difference corresponding to the second tap is t_180 = T, and its exposure duration is t2 = Duty_180 × Tz; the time difference corresponding to the third tap follows from its 90° phase, and its exposure duration is t3 = Duty_90 × Tz; the time difference corresponding to the fourth tap follows from its 270° phase, and its exposure duration is t4 = Duty_270 × Tz. In each case the duty cycle Duty_0, Duty_180, Duty_90 or Duty_270 is calculated from the corresponding time difference, the delayed exposure time t and the duty cycle Duty of the transmitting module, as described above; the example below works out the resulting values for a concrete case.
In Figure 3, the light gray area is the laser data (i.e., the reflected beam) and the dark gray area is the ambient light data. It can be seen from Figure 3 that while the second beam travels to the target, is reflected back to the receiving module and is then received by the receiving module, i.e., during the flight of the second beam, the first tap is in the off state. Because the first tap does not receive ambient light during this process, the intensity of the ambient light signal collected by the first tap can be greatly reduced, and the signal-to-noise ratio of the signal collected by the first tap can be improved.
In one embodiment, as shown in Figure 4, the pulse width of the emitted signal is T = 4 ns, the duty cycle is Duty = 50%, the full period Tz is 8 ns, and the frequency of both the emitted signal and the reflected signal is f = 125 MHz. If the delayed exposure time calculated from the collected first beam data is t = 1 ns, the exposure durations of the first, second, third and fourth taps are calculated as follows:
(1) The time difference corresponding to the first tap is t_0 = t = 1 ns, and its duty cycle works out to 37.5%, so the corresponding exposure duration is t1 = 37.5% × 8 ns = 3 ns.
(2) The time difference corresponding to the second tap is t_180 = T = 4 ns, and its duty cycle works out to 12.5%, so the corresponding exposure duration is t2 = 12.5% × 8 ns = 1 ns.
(3) The time difference corresponding to the third tap follows from its 90° phase, and its duty cycle works out to 37.5%, so the corresponding exposure duration is t3 = 37.5% × 8 ns = 3 ns.
(4) The time difference corresponding to the fourth tap follows from its 270° phase, and its duty cycle works out to 12.5%, so the corresponding exposure duration is t4 = 12.5% × 8 ns = 1 ns.
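One way to reproduce the numbers in this example is to clip each tap's original exposure window to the earliest interval [t, t + T] (taken modulo the period Tz) in which reflected light can appear. The sketch below implements that reading; the function name, the assumption that each tap originally exposes for half a period, and the clipping rule itself are our interpretation of the worked example, not formula text from this application.

```python
def tap_exposure_durations(delay_t, period_tz, duty, tap_starts):
    """Clip each tap's exposure window to [delay_t, delay_t + T], with
    T = Tz * Duty, treating time modulo the period Tz. Taps are assumed to
    expose for half a period before clipping (assumption for this sketch)."""
    pulse_t = period_tz * duty           # emitted pulse width T
    tap_window = period_tz / 2.0         # original per-tap exposure window (assumption)
    durations = []
    for start in tap_starts:
        total = 0.0
        for shift in (0.0, period_tz):   # account for wrap-around of the lit interval
            lo = max(start, delay_t + shift)
            hi = min(start + tap_window, delay_t + pulse_t + shift)
            total += max(0.0, hi - lo)
        durations.append(total)
    return durations

# Tz = 8 ns, Duty = 50 %, delayed exposure time t = 1 ns, taps at the 0°, 90°, 180°, 270° phases:
print(tap_exposure_durations(1.0, 8.0, 0.5, [0.0, 2.0, 4.0, 6.0]))
# -> [3.0, 3.0, 1.0, 1.0], i.e. t1 = 3 ns, t3 = 3 ns, t2 = 1 ns, t4 = 1 ns as in the example above.
```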
In some embodiments, when the target moves, the control and processor 13 needs to recalculate the shortest flight time and redetermine the delayed exposure time; the transmitting module 11 then emits the beam and the receiving module 12 delays its exposure by the delayed exposure time, so as to measure the depth of the moved target. It can be understood that after the target moves, the minimum phase delay of the iTOF depth measurement system changes and the corresponding shortest flight time also changes, so the delayed exposure time needs to be redetermined to make subsequent depth measurement results more accurate.
In some embodiments, when the target moves only slightly, it is also possible to choose not to re-acquire the shortest flight time needed by the taps.
In some embodiments, because the delayed exposure reduces the share of ambient light signal collected by the taps and makes the pixels less likely to be overexposed, the exposure time of the receiving module 12 can be increased appropriately, further improving the signal-to-noise ratio and achieving better outdoor performance. For example, the exposure extension duration can be calculated from the delayed exposure time: when t = 0.9·t_min, the exposure extension duration may be 0.1 times the original exposure duration, i.e., the new exposure duration is 1.1 times the original; when t = 0.8·t_min, the exposure extension duration may be 0.2 times the original exposure duration, i.e., the new exposure duration is 1.2 times the original. Other calculation methods may also be used, which are not limited here.
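A small illustration of the proportional rule above (the function name is ours, and this is only one of the possible ways to compute the extension):

```python
def extended_exposure(original_exposure, delay_t, t_min):
    """Extend the exposure by the fraction of the shortest flight time that the
    delayed exposure leaves unused, e.g. t = 0.9 * t_min -> 1.1x the original."""
    extension_ratio = 1.0 - delay_t / t_min      # 0.1 when t = 0.9 * t_min
    return original_exposure * (1.0 + extension_ratio)

# t = 0.9 * t_min -> 1.1 * original exposure; t = 0.8 * t_min -> 1.2 * original exposure
```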
In one embodiment, after obtaining the delayed exposure time, the iTOF depth measurement system may acquire multiple frames of second beam data according to the above embodiments and then calculate the depth by combining the multiple frames of second beam data; the depth calculated in this way is more accurate. It can be understood that one period corresponds to one frame.
Based on the content described in the above embodiments, this application further provides a depth measurement method. Referring to Figure 5, Figure 5 is a schematic flowchart of the steps of a depth measurement method provided in an embodiment of this application.
In a feasible implementation, the above depth measurement method includes the following steps:
S501: Control the transmitting module to emit a first beam toward the target, and synchronously control the exposure of the receiving module, so as to collect the first beam reflected back by the target and obtain first beam data.
S502: Calculate the shortest flight time of the first beam from the first beam data, and determine the delayed exposure time of the receiving module from the shortest flight time.
S503: Control the transmitting module to emit a second beam toward the target, and control the receiving module to start exposure after the delayed exposure time, so as to collect the second beam reflected by the target and obtain second beam data, and calculate the depth distance of the target from the second beam data.
The depth measurement method provided by the embodiment of this application first calculates the shortest flight time of the first beam from emission to reception, then determines the delayed exposure time of the receiving module from the shortest flight time, then controls the transmitting module to emit the second beam and controls the receiving module to start exposure after the delayed exposure time, and finally calculates the depth from the second beam data collected after the delayed exposure. This prevents the receiving module from collecting too much ambient light while the beam is in flight, so the second beam data collected by the receiving module contains relatively little ambient light, which improves the signal-to-noise ratio of the second beam data and can therefore effectively improve the depth measurement accuracy of the iTOF depth measurement system in outdoor use scenarios.
In a feasible implementation, the above depth measurement method can be applied to the iTOF depth measurement system described in the above embodiments. Optionally, the above depth measurement method may be executed by the above control and processor 13.
In a feasible implementation, calculating the shortest flight time of the first beam from the first beam data includes: calculating, from the first beam data, the minimum phase delay between emission and reception of the first beam; calculating the shortest measurement distance from the minimum phase delay; and calculating the shortest flight time from the shortest measurement distance.
In a feasible implementation, the step of determining the delayed exposure time of the receiving module from the shortest flight time includes: obtaining multiple preset delayed exposure times; and selecting, from the multiple preset delayed exposure times, the preset delayed exposure time that is smaller than and closest to the shortest flight time as the delayed exposure time.
In a feasible implementation, the step of determining the delayed exposure time of the receiving module from the shortest flight time includes: obtaining the preconfigured mapping relationship between the delayed exposure time and the shortest flight time; and calculating the delayed exposure time from the mapping relationship and the shortest flight time.
In a feasible implementation, the above receiving module includes an image sensor, the image sensor includes at least two taps, and different taps start exposure at different times. The step of controlling the receiving module to delay its exposure by the delayed exposure time includes: controlling the first tap and the fourth tap to start exposure after the delayed exposure time, while the second tap and the third tap do not delay their exposure.
In a feasible implementation, the above method further includes the following steps: calculating the exposure duration of each tap from the delayed exposure time, the emission period of the transmitting module and the starting exposure time of each tap, and controlling each tap to expose for its corresponding exposure duration.
In a feasible implementation, the step of calculating the exposure duration of each tap from the delayed exposure time, the emission period of the transmitting module and the starting exposure time of each tap includes the following steps: calculating the time difference between the starting exposure time of each tap and the starting emission time of the transmitting module; calculating the duty cycle corresponding to each tap from the time difference corresponding to each tap, the delayed exposure time and the duty cycle of the transmitting module; and calculating the exposure duration corresponding to each tap from the duty cycle corresponding to each tap and the emission period of the transmitting module.
In some embodiments, the method further includes the following steps: calculating the exposure extension duration of the receiving module from the delayed exposure time; and controlling the receiving module to extend its exposure time by the exposure extension duration.
The above iTOF depth measurement system and depth measurement method are particularly suitable for outdoor static or low-frame-rate application scenarios. It can be understood that in an outdoor static scene the target does not move easily or moves only a very small distance, the ambient light in the outdoor scene is strong, and the minimum phase delay does not change easily, so the improvement in signal-to-noise ratio obtained by this application through reducing the collected ambient light is more pronounced. In outdoor dynamic scenes the target moves frequently and the minimum phase delay changes frequently, so the delayed exposure time frequently needs to be adjusted, which is very inconvenient to apply and leads to large errors. In low-frame-rate application scenarios the refresh frequency is low, so delayed exposure has little impact on imaging efficiency, whereas in high-frame-rate application scenarios the refresh frequency is high and delayed exposure has a greater impact on imaging efficiency.
Further, based on the content described in the above embodiments, an embodiment of this application also provides a computer-readable storage medium. The computer-readable storage medium stores computer-executable instructions which, when executed by a computer, implement the steps executed by the control and processor as in the above embodiments.
It should be understood that, in the several embodiments provided in this application, the disclosed devices and methods may be implemented in other ways. For example, the device embodiments described above are merely illustrative; the division into modules is merely a division by logical function, and other divisions are possible in actual implementation, for example multiple modules may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices or modules, and may be electrical, mechanical or in other forms.
The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional modules in the embodiments of this application may be integrated into one processing unit, or each module may exist physically alone, or two or more modules may be integrated into one unit. The unit formed by the above modules may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The above integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The above software functional module is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute some of the steps of the methods described in the embodiments of this application.
It should be understood that the above processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in this application may be embodied directly as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The above storage medium may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
A person of ordinary skill in the art can understand that all or some of the steps of the above method embodiments can be completed by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium. When the program is executed, the steps of the above method embodiments are executed; and the aforementioned storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks or optical disks.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of this application and not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some or all of the technical features therein can be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (10)

  1. An iTOF depth measurement system, characterized in that it comprises:
    a transmitting module, configured to emit a beam toward a target;
    a receiving module, configured to collect the beam reflected back by the target;
    a control and processor, configured to:
    control the transmitting module to emit a first beam toward the target and synchronously control the exposure of the receiving module, so as to collect the first beam reflected back by the target and obtain first beam data;
    calculate the shortest flight time of the first beam according to the first beam data, and determine a delayed exposure time of the receiving module according to the shortest flight time;
    control the transmitting module to emit a second beam toward the target, and control the receiving module to start exposure after the delayed exposure time, so as to collect the second beam reflected by the target and obtain second beam data; and
    calculate a depth distance of the target according to the second beam data.
  2. The iTOF depth measurement system according to claim 1, characterized in that the control and processor is specifically configured to:
    calculate, according to the first beam data, a minimum phase delay of the first beam between being emitted and being received;
    calculate a shortest measurement distance according to the minimum phase delay; and
    calculate the shortest flight time according to the shortest measurement distance.
  3. The iTOF depth measurement system according to claim 1 or 2, characterized in that the control and processor is specifically configured to:
    obtain multiple preset delayed exposure times; and
    select, from the multiple preset delayed exposure times, a preset delayed exposure time that is smaller than and closest to the shortest flight time as the delayed exposure time.
  4. The iTOF depth measurement system according to claim 1 or 2, characterized in that the control and processor is specifically configured to:
    obtain a preconfigured mapping relationship between the delayed exposure time and the shortest flight time; and
    calculate the delayed exposure time according to the mapping relationship and the shortest flight time.
  5. The iTOF depth measurement system according to claim 1, characterized in that the receiving module comprises an image sensor, the image sensor comprises at least two taps, and different taps start exposure at different times;
    the control and processor is further configured to: control at least one of the taps to delay its exposure by the delayed exposure time.
  6. The iTOF depth measurement system according to claim 5, characterized in that the image sensor comprises a first tap, a second tap, a third tap and a fourth tap corresponding to a 0° phase, a 180° phase, a 90° phase and a 270° phase respectively, and the control and processor is configured to control the first tap and the fourth tap to start exposure after the delayed exposure time, while the second tap and the third tap are not delayed.
  7. The iTOF depth measurement system according to claim 5, characterized in that the control and processor is further configured to:
    calculate an exposure duration of each tap according to the delayed exposure time, an emission period of the transmitting module and a starting exposure time of each tap, and control each tap to expose for its corresponding exposure duration.
  8. The iTOF depth measurement system according to claim 7, characterized in that the control and processor is specifically configured to:
    calculate a time difference between the starting exposure time of each tap and a starting emission time of the transmitting module;
    calculate a duty cycle corresponding to each tap according to the time difference corresponding to each tap, the delayed exposure time and a duty cycle of the transmitting module; and
    calculate the exposure duration corresponding to each tap according to the duty cycle corresponding to each tap and the emission period of the transmitting module.
  9. The iTOF depth measurement system according to claim 1, characterized in that the control and processor is further configured to:
    calculate an exposure extension duration of the receiving module according to the delayed exposure time; and
    control the receiving module to extend the exposure time by the exposure extension duration.
  10. A depth measurement method, characterized in that the method comprises:
    controlling a transmitting module to emit a first beam toward a target, and synchronously controlling the exposure of a receiving module, so as to collect the first beam reflected back by the target and obtain first beam data;
    calculating the shortest flight time of the first beam according to the first beam data, and determining a delayed exposure time of the receiving module according to the shortest flight time;
    controlling the transmitting module to emit a second beam toward the target, and controlling the receiving module to start exposure after being delayed by the delayed exposure time, so as to collect the second beam reflected back by the target and obtain second beam data; and
    calculating a depth distance of the target according to the second beam data.
PCT/CN2022/122364 2022-09-08 2022-09-29 iTOF depth measurement system and depth measurement method WO2024050895A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211096222.9 2022-09-08
CN202211096222.9A CN115524714A (zh) 2022-09-08 2022-09-08 iTOF depth measurement system and depth measurement method

Publications (1)

Publication Number Publication Date
WO2024050895A1 true WO2024050895A1 (zh) 2024-03-14

Family

ID=84698257

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/122364 WO2024050895A1 (zh) 2022-09-08 2022-09-29 iTOF depth measurement system and depth measurement method

Country Status (2)

Country Link
CN (1) CN115524714A (zh)
WO (1) WO2024050895A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108259744A * 2018-01-24 2018-07-06 北京图森未来科技有限公司 Image acquisition control method and device, image acquisition system and TOF camera
US10031229B1 (en) * 2014-12-15 2018-07-24 Rockwell Collins, Inc. Object designator system and method
CN108781259A * 2017-07-31 2018-11-09 深圳市大疆创新科技有限公司 Image capture control method, control device and control system
CN109996008A * 2019-03-18 2019-07-09 深圳奥比中光科技有限公司 Method, apparatus and device for reducing interference between multiple depth camera systems
CN111025315A * 2019-11-28 2020-04-17 深圳奥比中光科技有限公司 Depth measurement system and method
CN114693793A * 2020-12-25 2022-07-01 瑞芯微电子股份有限公司 Calibration method, control method, medium, structured light module and electronic device


Also Published As

Publication number Publication date
CN115524714A (zh) 2022-12-27

Similar Documents

Publication Publication Date Title
WO2021051477A1 (zh) Time-of-flight distance measurement system with adjustable histogram and measurement method
US10630884B2 (en) Camera focusing method, apparatus, and device for terminal
WO2021051478A1 (zh) Time-of-flight distance measurement system with dual shared TDC circuits and measurement method
WO2021051479A1 (zh) Interpolation-based time-of-flight measurement method and measurement system
JP3797543B2 (ja) Automatic focus adjustment device
CN110546530B (zh) Pixel structure
WO2021051480A1 (zh) Time-of-flight distance measurement method and measurement system with dynamic histogram drawing
JP2021520154A (ja) Image processing method, computer-readable storage medium, and electronic device
US20190147624A1 (en) Method for Processing a Raw Image of a Time-of-Flight Camera, Image Processing Apparatus and Computer Program
JP6304567B2 (ja) Distance measuring device and distance measuring method
WO2016000330A1 (zh) Focal length adjustment method, device and terminal, and computer storage medium
EP3308193A1 (en) Time-of-flight (tof) system calibration
TWI780462B (zh) Distance image capturing device and distance image capturing method
WO2021136078A1 (zh) Image processing method, image processing system, computer-readable medium and electronic device
JP7094937B2 (ja) Built-in calibration of time-of-flight depth imaging system
WO2022166723A1 (zh) Depth measurement method, chip and electronic device
CN112037295B (zh) Event-based ToF camera encoding and decoding method, apparatus, medium and device
JP2010190675A (ja) Distance image sensor system and distance image generation method
WO2021058016A1 (zh) Lidar and method for generating laser point cloud data
WO2022188884A1 (zh) Distance measurement method, system and apparatus
WO2024050895A1 (zh) iTOF depth measurement system and depth measurement method
WO2023279621A1 (zh) iTOF ranging system and method for calculating reflectivity of a measured object
WO2022160622A1 (zh) Distance measurement method, apparatus and system
WO2022188885A1 (zh) Time-of-flight measurement method and apparatus, and time-of-flight depth camera
JP2006065355A (ja) Automatic focus adjustment device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22957863

Country of ref document: EP

Kind code of ref document: A1