WO2014101408A1 - Three-dimensional imaging radar system and method based on multiple integrations - Google Patents

Three-dimensional imaging radar system and method based on multiple integrations

Info

Publication number
WO2014101408A1
WO2014101408A1 (application PCT/CN2013/080416, CN2013080416W)
Authority
WO
WIPO (PCT)
Prior art keywords
light
image sensor
image
frame
pulse
Prior art date
Application number
PCT/CN2013/080416
Other languages
English (en)
French (fr)
Inventor
符建
张秀达
吕俊
谷颖杰
Original Assignee
Fu Jian
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fu Jian filed Critical Fu Jian
Priority to EP13868805.6A priority Critical patent/EP2975428B1/en
Priority to US15/039,453 priority patent/US9958547B2/en
Publication of WO2014101408A1 publication Critical patent/WO2014101408A1/zh

Classifications

    • G: PHYSICS
      • G01: MEASURING; TESTING
        • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
          • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
            • G01S7/48: Details of systems according to group G01S17/00
              • G01S7/483: Details of pulse systems
                • G01S7/486: Receivers
                  • G01S7/4861: Circuits for detection, sampling, integration or read-out
                  • G01S7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
          • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
            • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
              • G01S17/06: Systems determining position data of a target
                • G01S17/08: Systems determining position data of a target for measuring distance only
                  • G01S17/10: Systems using transmission of interrupted, pulse-modulated waves
                    • G01S17/18: Systems wherein range gates are used
            • G01S17/88: Lidar systems specially adapted for specific applications
              • G01S17/89: Lidar systems specially adapted for mapping or imaging
                • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • the present invention relates to an imaging radar system, and more particularly to a three-dimensional imaging radar system and method based on flight spectrum.
  • 3D imaging radar technology can be widely applied in many fields, such as vehicle anti-collision safety systems, highway photographic speed-measurement systems, ranging telescopes, machine vision, etc.
  • A 3D imaging radar is an imaging system with ranging capability; the system consists of transmitting, receiving and information-processing sections.
  • The principles of radar ranging can be roughly divided into three methods: time-of-flight measurement, phase-difference measurement and triangulation.
  • The first method is the time-of-flight method, which uses a pulsed light source and calculates the distance of the target object by measuring the time difference between emitting a light pulse and receiving the pulse reflected by the target.
  • This method can reach very high precision, generally centimeter-level accuracy within a range of several kilometers, but high-resolution three-dimensional imaging requires point-by-point scanning; it is currently the most common form of laser imaging radar, yet its imaging speed is very slow and its imaging resolution very poor.
  • Another approach is to use an area-array detector in which every unit has pulse-detection and time-counting capability, for example by placing an intensified CCD (ICCD) with high-speed modulation in front of the imaging device.
  • Scannerless laser 3D radars essentially all adopt this approach; its measurement accuracy is limited by the shape of the light pulse, its imaging resolution is limited by the image intensifier, and it is very expensive, so it is currently used only for military and defense purposes.
  • Another method is phase measurement, which modulates the light source and uses the phase difference between the reflected light and a reference oscillation to obtain the distance of the target object. Since the phase is ambiguous beyond 2π, this method is limited in measurement range, to only tens of meters, and its accuracy is not high; radar systems that implement area-array phase measurement with an intensified CCD (ICCD) also exist.
  • The third method is triangulation, which calculates the distance of the object from the light source by means of the triangular relationship between the light spot that a structured light source projects onto the target object and its image.
  • Although this method has high ranging accuracy, its applicable measurement distance is even shorter; it is commonly used in precision mold manufacturing, integrated circuits, SMT circuit-board inspection and similar settings.
  • Projecting color-coded structured light into two-dimensional space for three-dimensional imaging also belongs to this category.
  • The above radar ranging methods can only obtain distance information for a single point; to image the measured object in three dimensions, it must either be sampled point by point or an area-array detector must be used for parallel data acquisition.
  • Existing lidar sensors have various shortcomings: scanning lidar places relatively low demands on the devices and has a long working distance, but it requires a demanding scanning mechanism and has a low frame rate and poor real-time performance, whereas area-array lidar offers good real-time performance but needs large area-array devices for high-resolution imaging, whose cost and development difficulty are very high; all of these lidars require nanosecond-scale light sources or fast-response detectors.
  • the invention utilizes a relatively inexpensive LED/laser source and a conventional CCD or CMOS area array detector for 3D radar imaging.
  • a three-dimensional imaging radar system based on multiple integration comprising: an LED light source, an optical band pass filter, an image sensor, an electronic shutter, a data processor and a display terminal;
  • the optical band pass filter and the electronic shutter are both fixed on the image sensor, and the LED light source and the image sensor are both connected to the data processor, and the data processor is connected to the display terminal;
  • the LED light source generates a series of optical pulse trains.
  • The LED light source consists of one or more pulsed light sources capable of generating microsecond- or nanosecond-scale light pulses; the pulsed light source is an LED or a laser.
  • A three-dimensional imaging method based on multiple integrations using the above system includes the following steps:
  • The LED light source generates a light pulse.
  • Within the exposure time controlled by the electronic shutter, the image sensor successively collects, three times, the light reflected from this pulse by the object at the same position; the acquisition interval is τ, and three successive frames are obtained.
  • The pixel count of the first two frames is lower, and that of the third frame is higher;
  • The light intensities produced in the two frames acquired at wavelength λ vary with distance according to $I(S) = \int x_\lambda(t - 2S/C)\,g(t)\,dt$ and $I(S') = \int_\tau^{\tau+T} x_\lambda(t - 2S'/C)\,g(t)\,dt$;
  • S denotes the object distance for the previous frame and S' that for the next frame;
  • C denotes the speed of light;
  • t denotes time;
  • $x_\lambda(t - 2S/C)$ denotes the light-pulse waveform of the previous frame at wavelength λ;
  • c denotes a weighting factor;
  • $x_\lambda(t - 2S'/C)$ denotes the light-pulse waveform of the next frame at wavelength λ;
  • g(t) denotes the waveform of the electronic shutter;
  • T denotes the integration time;
  • Each of the image sensor's three exposures integrates the intensity of the collected light.
  • Three integral values are thus obtained for the same pixel, and since the light collected by the CCD is a convolution of the light intensity in the time domain, the intensity-versus-delay curve has a vertex; the three points determine a quadratic curve.
  • From this quadratic curve the vertex can be computed, and the abscissa of the vertex represents the round-trip travel time of the signal light, that is, the distance information point of the object;
  • The data processor combines the distance information points of the object obtained in step (3) with the contour information of the third frame to obtain the specific distance information of the object, which is finally displayed on the display terminal 6.
  • The beneficial effect of the invention is that it exploits the fact that the operation of the image sensor's photosensitive device is a light-intensity integration process: exposures at different times yield integral images over different time windows, and dividing and comparing two such integral images yields the distance and depth information of the target.
  • Three successive exposures give the positions of all reflective objects over the entire depth of field, which reduces the difficulty of data processing and produces a three-dimensional image with stronger interference resistance and higher precision.
  • The invention can realize a three-dimensional imaging radar with an ordinary LED light source and an ordinary CCD or CMOS area array, which not only greatly reduces system cost but also achieves high-speed, high-resolution 3D imaging, potentially opening new applications for 3D imaging radar in areas such as collision avoidance for vehicles and helicopters and 3D terrain mapping.
  • FIG. 1 is a schematic diagram of the principle of the three-dimensional imaging radar system based on multiple integrations of the present invention;
  • FIG. 2 is a schematic diagram of the principle of single-shot range detection using time-delayed dual area-array CCD or CMOS time-domain convolution imaging signals;
  • In the figures: LED light source 1, optical band-pass filter 2, image sensor 3, electronic shutter 4, data processor 5, display terminal 6, beam splitter 7.
  • the present invention is based on a multi-integration three-dimensional imaging radar system, which comprises: an LED light source 1, an optical band pass filter 2, an image sensor 3, an electronic shutter 4, a data processor 5, and a display terminal 6;
  • the optical band pass filter 2 and the electronic shutter 4 are both fixed on the image sensor 3, and the LED light source 1 and the image sensor 3 are both connected to the data processor 5, and the data processor 5 is connected to the display terminal 6;
  • the reflected light is detected by the image sensor 3 through the optical band pass filter 2, and an image is formed on the image sensor 3;
  • For the reflected light of the same pulse, the image sensor exposes it three consecutive times under the action of the electronic shutter, and the contour and distance information of the object is obtained by analyzing the three images in the data processor 5;
  • The LED light source 1 is composed of one or more pulsed light sources capable of generating microsecond- or nanosecond-scale light pulses, the pulsed light source being an LED or a laser.
  • The optical band-pass filter 2 is an optical device arranged on the image sensor 3 that allows only light within a set wavelength range to pass, such as the RGB filters placed in front of a color CMOS or CCD.
  • The image sensor 3 is an image-sensing CMOS or CCD device whose exposure time is controlled by the electronic shutter 4 and whose frame rate can reach 60 frames per second or more.
  • the electronic shutter 4 is an electronic device or component that is disposed on the image sensor 3 and can control the global exposure time to 20 microseconds or less.
  • the data processor 5 can be implemented by a single chip microcomputer, an embedded system or a PC.
  • the working process of the data processor 5 is as follows:
  • the data processor 5 issues an electrical signal that controls the LED light source 1 to generate a light pulse
  • After a precisely controlled delay, the data processor 5 issues a synchronization pulse to the electronic shutter 4, thereby controlling the exposure time and the delay interval between the image sensor exposures;
  • the image sensor 3 transmits the acquired image data to the data processor 5;
  • the data processor 5 processes the acquired image signal while outputting the image to the display 6;
  • the invention is based on a three-dimensional imaging method of multiple integrations, comprising the following steps:
  • the LED light source 1 generates a light pulse.
  • Within the exposure time controlled by the electronic shutter 4, the image sensor 3 successively collects, three times, the light reflected from this pulse by the object at the same position; the acquisition interval is τ, and three successive frames are obtained, of which the first two frames have a lower pixel count and the third frame a higher pixel count;
  • By pixel binning, the third frame is fitted to an image with the same pixel count as the first two frames, so a high-pixel and a low-pixel version of the third frame are obtained; owing to its higher pixel count, the high-pixel image contains more object contour information.
  • The light intensities produced in the two frames acquired at wavelength λ vary with distance according to $I(S) = \int x_\lambda(t - 2S/C)\,g(t)\,dt$ and $I(S') = \int_\tau^{\tau+T} x_\lambda(t - 2S'/C)\,g(t)\,dt$:
  • S denotes the object distance for the previous frame and S' that for the next frame;
  • C denotes the speed of light;
  • t denotes time;
  • $x_\lambda(t - 2S/C)$ denotes the light-pulse waveform of the previous frame at wavelength λ;
  • c denotes a weighting factor;
  • $x_\lambda(t - 2S'/C)$ denotes the light-pulse waveform of the next frame at wavelength λ;
  • g(t) denotes the waveform of the electronic shutter;
  • T denotes the integration time.
  • The data processor 5 obtains the specific distance information of the object by combining the distance information points obtained in step 3 with the contour information of the third frame, and the result is finally displayed on the display terminal 6.
  • Figure 2 shows a schematic diagram of the principle of single-shot range detection using time-delayed dual area-array CCD or CMOS time-domain convolution imaging signals.
  • The LED light source 1 generates a light pulse of wavelength λ with pulse width τ1.
  • The delay time of the electronic shutter 4 is τ2 and its gate width is τ3.
  • The exposure interval of the image sensor 3 is τ.
  • The target light intensity obtained by the image sensor 3 is the integral over the overlap between the pulse echo and the electronic shutter gate, and intensity and distance have the correspondence shown in the figure.
  • The image sensor 3 obtains three successive sets of convolution waveforms, and the position and magnitude of the vertices of the three sets of convolution waveforms determine the distance information of this pixel of the object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention discloses a three-dimensional imaging radar system and method based on multiple integrations, comprising an LED light source, an optical band-pass filter, an image sensor, an electronic shutter, a data processor and a display terminal. The LED light source generates a series of light pulse trains, which illuminate an object, and the object reflects the light back in turn; the reflected light passes through the optical band-pass filter and is sensed by the image sensor, forming an image on the image sensor. For the reflected light of the same pulse, the image sensor exposes it three consecutive times under the control of the electronic shutter, and analysis of the three images by the data processor yields the contour and distance information of the object, which is finally shown on the display terminal. The invention achieves high-speed, high-resolution three-dimensional radar imaging at short to medium range at low cost.

Description

Three-dimensional imaging radar system and method based on multiple integrations
Technical Field
The present invention relates to an imaging radar system, and more particularly to a three-dimensional imaging radar system and method based on the flight spectrum.
Background Art
Three-dimensional imaging radar technology can be widely applied in many fields, such as vehicle anti-collision safety systems, highway photographic speed-measurement systems, ranging telescopes and machine vision. A three-dimensional imaging radar is an imaging system with ranging capability; the system consists of transmitting, receiving and information-processing sections. The principles of radar ranging can be roughly divided into three methods: time-of-flight measurement, phase-difference measurement and triangulation.
The first method is the time-of-flight method, which uses a pulsed light source and calculates the distance of the target object by measuring the time difference between emitting a light pulse and receiving the pulse reflected by the target. This method can reach very high precision, generally centimeter-level accuracy within a range of several kilometers, but high-resolution three-dimensional imaging requires point-by-point scanning; it is currently the most common form of laser imaging radar, yet its imaging speed is very slow and its imaging resolution very poor. Another approach is to use an area-array detector in which every unit has pulse-detection and time-counting capability, for example by placing an intensified CCD (ICCD) with high-speed modulation in front of the imaging device; scannerless laser 3D radars essentially all adopt this approach. Its measurement accuracy is limited by the shape of the light pulse, its imaging resolution is limited by the image intensifier, and it is very expensive, so it is currently used only for military and defense purposes.
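All of the pulsed approaches above rest on the same round-trip time-of-flight relation between the measured delay and the target distance; as a worked example (the numbers are illustrative only, not taken from this patent):

$$d = \frac{c\,\Delta t}{2}, \qquad \Delta t = 1\ \mu\mathrm{s} \;\Rightarrow\; d = \frac{3\times 10^{8}\ \mathrm{m/s}\times 10^{-6}\ \mathrm{s}}{2} = 150\ \mathrm{m}.$$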
Another method is phase measurement, which modulates the light source and uses the phase difference between the reflected light and a reference oscillation to obtain the distance of the target object. Since the phase is ambiguous beyond 2π, this method is limited in measurement range, to only tens of meters, and its accuracy is not high. There are also radar systems that implement area-array phase measurement with the aid of an intensified CCD (ICCD).
The third method is triangulation, which calculates the distance of the object from the light source by means of the triangular relationship between the light spot that a structured light source projects onto the target object and its image. Although this method has high ranging accuracy, its applicable measurement distance is even shorter; it is commonly used in precision mold manufacturing, integrated circuits, SMT circuit-board inspection and similar settings. Projecting color-coded structured light into two-dimensional space for three-dimensional imaging also belongs to this category.
The above radar ranging methods can only obtain distance information for a single point. To image the measured object in three dimensions, it must either be sampled point by point or an area-array detector must be used for parallel data acquisition. Existing lidar sensors have various shortcomings: scanning lidar places relatively low demands on the devices and has a long working distance, but it requires a demanding scanning mechanism and has a low frame rate and poor real-time performance, whereas area-array lidar offers good real-time performance but needs large area-array devices for high-resolution imaging, whose cost and development difficulty are very high. All of these lidars require nanosecond-scale light sources or fast-response detectors.
In recent years, Optics Letters reported a technique by French scientists that achieves three-dimensional imaging based on intensity integration using microsecond laser pulses and a high-speed CCD camera (Optics Letters, Vol. 32, pp. 3146-3148, 2007). The cost of this approach is far lower than that of other area-array techniques, but because it uses expensive components such as lasers, the overall system cost remains rather high, and its detection range and accuracy are considerably limited.
Summary of the Invention
The object of the present invention is to address the limitations and shortcomings of the prior art by providing a three-dimensional imaging radar system and method based on multiple integrations. The invention uses a relatively inexpensive LED/laser light source and an ordinary CCD or CMOS area-array detector to achieve three-dimensional radar imaging.
The object of the invention is achieved by the following technical solution: a three-dimensional imaging radar system based on multiple integrations, comprising an LED light source, an optical band-pass filter, an image sensor, an electronic shutter, a data processor and a display terminal, wherein the optical band-pass filter and the electronic shutter are both fixed on the image sensor, the LED light source and the image sensor are both connected to the data processor, and the data processor is connected to the display terminal. The LED light source generates a series of light pulse trains, which illuminate an object, and the object reflects the light back in turn; the reflected light passes through the optical band-pass filter and is sensed by the image sensor, forming an image on the image sensor. For the reflected light of the same pulse, the image sensor exposes it three consecutive times under the control of the electronic shutter, the data processor analyzes the three images to obtain the contour and distance information of the object, and the result is finally shown on the display terminal. The LED light source consists of one or more pulsed light sources capable of generating microsecond- or nanosecond-scale light pulses; the pulsed light source is an LED or a laser.
A three-dimensional imaging method based on multiple integrations using the above system comprises the following steps:
(1) The LED light source generates a light pulse. Within the exposure time controlled by the electronic shutter, the image sensor successively collects, three times, the light reflected from this pulse by the object at the same position, with an acquisition interval of τ, yielding three successive frames, of which the first two frames have a lower pixel count and the third frame a higher pixel count;
(2) By pixel binning, the third frame is fitted to an image with the same pixel count as the first two frames, so a high-pixel and a low-pixel version of the third frame are obtained; owing to its higher pixel count, the high-pixel image contains more object contour information;
(3) The data processor processes the first two frames and the low-pixel version of the third frame to obtain the distance information points of the object. The light intensities produced in the two frames acquired at wavelength λ vary with distance according to:
$I(S) = \int x_\lambda(t - 2S/C)\,g(t)\,dt$;
$I(S') = \int_\tau^{\tau+T} x_\lambda(t - 2S'/C)\,g(t)\,dt$
where S denotes the object distance for the previous frame and S' that for the next frame, C the speed of light, t time, $x_\lambda(t - 2S/C)$ the light-pulse waveform of the previous frame at wavelength λ, c a weighting factor, $x_\lambda(t - 2S'/C)$ the light-pulse waveform of the next frame, g(t) the waveform of the electronic shutter, and T the integration time. From the two equations above, for reflected light from an object at a given distance the light-intensity integration time on the image sensor differs between exposures: a longer integration time yields a larger intensity integral and a shorter one a smaller integral. Each of the image sensor's three exposures integrates the intensity of the collected light, so for the same pixel three integral values are obtained; since the light collected by the CCD is a convolution of the light intensity in the time domain, the integral curve has a vertex, and three points determine a quadratic curve from which the vertex can be computed. The abscissa of this vertex represents the round-trip time of the signal light, that is, the distance information point of the object (a numerical sketch of this vertex fit is given after step (4) below);
(4) The data processor combines the distance information points obtained in step (3) with the contour information of the third frame to obtain the specific distance information of the object, which is finally displayed on the display terminal 6.
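To make the vertex-finding in step (3) concrete, the following is a minimal numerical sketch under stated assumptions: three gated-intensity samples of one pixel, taken at shutter delays spaced by τ and referenced to the pulse emission, are fitted with a quadratic, and the abscissa of the parabola's vertex is converted to a distance via d = C·t/2. The function name, the value of τ and the sample intensities are illustrative assumptions, not values from the patent.

```python
import numpy as np

C = 3.0e8  # speed of light, m/s

def distance_from_three_integrals(delays, intensities):
    """Fit a quadratic I(t) = a*t**2 + b*t + c through three
    (shutter delay, integrated intensity) samples and return the
    round-trip time at the parabola's vertex together with the
    corresponding distance d = C * t / 2. Assumes the intensity
    peak lies inside the sampled delay range."""
    a, b, c = np.polyfit(delays, intensities, 2)  # quadratic coefficients, highest power first
    t_vertex = -b / (2.0 * a)                     # abscissa of the vertex
    return t_vertex, C * t_vertex / 2.0

# Illustrative numbers: three exposures whose shutter delays are spaced
# tau = 100 ns apart, measured from the emission of the light pulse.
tau = 100e-9
delays = np.array([6 * tau, 7 * tau, 8 * tau])
intensities = np.array([0.42, 0.95, 0.61])        # gated integrals for one pixel
t_rt, dist = distance_from_three_integrals(delays, intensities)
print(f"round-trip time ~ {t_rt * 1e9:.1f} ns, distance ~ {dist:.1f} m")
```

In a full implementation this fit would be evaluated for every pixel of the low-pixel frames, and the resulting distance map would then be combined with the contour information of the high-pixel third frame as described in step (4).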
The beneficial effects of the invention are as follows. The invention exploits the fact that the operation of the image sensor's photosensitive device is a light-intensity integration process: exposures at different times yield integral images over different time windows, and dividing and comparing two such integral images yields the distance and depth information of the target. Three successive exposures give the positions of all reflective objects over the entire depth of field, which reduces the difficulty of data processing and produces a three-dimensional image with stronger interference resistance and higher precision. Unlike conventional three-dimensional laser radars, which require nanosecond-scale light sources and detectors, the invention can realize a three-dimensional imaging radar with an ordinary LED light source and an ordinary CCD or CMOS area array; this not only greatly reduces system cost but also achieves high-speed, high-resolution three-dimensional imaging, potentially opening new applications for three-dimensional imaging radar in areas such as collision avoidance for vehicles and helicopters and three-dimensional terrain mapping.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the principle of the three-dimensional imaging radar system based on multiple integrations of the present invention;
Figure 2 is a schematic diagram of the principle of single-shot range detection using time-delayed dual area-array CCD or CMOS time-domain convolution imaging signals;
In the figures: LED light source 1, optical band-pass filter 2, image sensor 3, electronic shutter 4, data processor 5, display terminal 6, beam splitter 7.
Detailed Description of the Embodiments
The present invention is described in detail below with reference to the accompanying drawings, from which its objects and effects will become more apparent. As shown in Figure 1, the three-dimensional imaging radar system based on multiple integrations of the present invention comprises an LED light source 1, an optical band-pass filter 2, an image sensor 3, an electronic shutter 4, a data processor 5 and a display terminal 6, wherein the optical band-pass filter 2 and the electronic shutter 4 are both fixed on the image sensor 3, the LED light source 1 and the image sensor 3 are both connected to the data processor 5, and the data processor 5 is connected to the display terminal 6. The LED light source 1 generates a series of light pulse trains, which illuminate an object, and the object reflects the light back in turn; the reflected light passes through the optical band-pass filter 2 and is sensed by the image sensor 3, forming an image on the image sensor 3. For the reflected light of the same pulse, the image sensor exposes it three consecutive times under the control of the electronic shutter, the data processor 5 analyzes the three images to obtain the contour and distance information of the object, and the result is finally shown on the display terminal 6.
The LED light source 1 consists of one or more pulsed light sources capable of generating microsecond- or nanosecond-scale light pulses; the pulsed light source is an LED or a laser.
The optical band-pass filter 2 is an optical device arranged on the image sensor 3 that allows only light within a set wavelength range to pass, such as the RGB filters placed in front of a color CMOS or CCD.
The image sensor 3 is an image-sensing CMOS or CCD device whose exposure time is controlled by the electronic shutter 4 and whose frame rate can reach 60 frames per second or more.
The electronic shutter 4 is an electronic device or component arranged on the image sensor 3 that can control the global exposure time to 20 microseconds or less.
The data processor 5 can be implemented by a single-chip microcomputer, an embedded system or a PC. The working process of the data processor 5 is as follows (a control-loop sketch follows this list):
(1) The data processor 5 issues the electrical signal that drives the LED light source 1 to generate a light pulse;
(2) After a precisely controlled delay, the data processor 5 issues a synchronization pulse to the electronic shutter 4, thereby controlling the exposure time and the delay interval between the image sensor exposures;
(3) The image sensor 3 transmits the acquired image data to the data processor 5;
(4) The data processor 5 processes the acquired image signals and outputs the image to the display terminal 6;
(5) The system prepares for the next light pulse and the acquisition of the next frame.
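Steps (1) to (5) can be read as a simple timed acquisition loop. The sketch below is only one plausible reading under stated assumptions: fire_led_pulse, open_shutter and read_frame are hypothetical placeholder hooks rather than a real driver API, the shutter delay is assumed to be stepped by τ on successive cycles so that three differently gated frames are obtained, and the timing constants are illustrative. The patent text itself describes collecting three reflections of the same pulse, so whether one pulse is reused or one pulse is fired per gated frame, as sketched here, is left to the implementation.

```python
# Hypothetical hardware hooks; real drivers for the LED source, the
# electronic shutter and the CCD/CMOS sensor would be substituted here.
def fire_led_pulse(width_s):
    pass                       # step (1): emit one light pulse of the given width

def open_shutter(delay_s, gate_s):
    pass                       # step (2): gate the sensor after a controlled delay

def read_frame():
    return None                # step (3): placeholder for the sensor's pixel array

TAU = 100e-9                   # assumed delay step between the three exposures
PULSE_WIDTH = 1e-6             # microsecond-scale LED pulse
GATE_WIDTH = 10e-6             # electronic-shutter gate, no more than 20 microseconds

def acquire_three_frames(base_delay_s):
    """One acquisition cycle per frame, following steps (1) to (5) above:
    trigger a pulse, gate the sensor with a delay stepped by TAU,
    read the frame, then prepare the next pulse and the next frame."""
    frames = []
    for k in range(3):
        fire_led_pulse(PULSE_WIDTH)
        open_shutter(delay_s=base_delay_s + k * TAU, gate_s=GATE_WIDTH)
        frames.append(read_frame())
    return frames              # step (4): hand the three frames to the processing stage
```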
The three-dimensional imaging method based on multiple integrations of the present invention comprises the following steps:
1. The LED light source 1 generates a light pulse. Within the exposure time controlled by the electronic shutter 4, the image sensor 3 successively collects, three times, the light reflected from this pulse by the object at the same position, with an acquisition interval of τ, yielding three successive frames, of which the first two frames have a lower pixel count and the third frame a higher pixel count.
2. By pixel binning, the third frame is fitted to an image with the same pixel count as the first two frames, so a high-pixel and a low-pixel version of the third frame are obtained; owing to its higher pixel count, the high-pixel image contains more object contour information (a binning sketch is given after step 4 below).
3. The data processor 5 processes the first two frames and the low-pixel version of the third frame to obtain the distance information points of the object.
The light intensities produced in the two frames acquired at wavelength λ vary with distance according to:
$I(S) = \int x_\lambda(t - 2S/C)\,g(t)\,dt$; $I(S') = \int_\tau^{\tau+T} x_\lambda(t - 2S'/C)\,g(t)\,dt$, where S denotes the object distance for the previous frame and S' that for the next frame, C the speed of light, t time, $x_\lambda(t - 2S/C)$ the light-pulse waveform of the previous frame at wavelength λ, c a weighting factor, $x_\lambda(t - 2S'/C)$ the light-pulse waveform of the next frame, g(t) the waveform of the electronic shutter, and T the integration time.
From the two equations above, for reflected light from an object at a given distance the light-intensity integration time on the image sensor 3 differs between exposures: a longer integration time yields a larger intensity integral and a shorter one a smaller integral. Each of the three exposures of the image sensor 3 integrates the intensity of the collected light, so for the same pixel three integral values are obtained; since the light collected by the CCD is a convolution of the light intensity in the time domain, the integral curve has a vertex, and three points determine a quadratic curve from which the vertex can be computed. The abscissa of this vertex represents the round-trip time of the signal light, that is, the distance information point of the object.
4. The data processor 5 combines the distance information points obtained in step 3 with the contour information of the third frame to obtain the specific distance information of the object, which is finally displayed on the display terminal 6.
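The pixel binning referred to in step 2 can be sketched as plain block averaging. This is an illustrative sketch only: the 2x2 block size and the frame resolutions are assumptions, since the patent does not fix them.

```python
import numpy as np

def bin_pixels(frame, factor=2):
    """Downsample a high-pixel frame to the resolution of the low-pixel
    frames by averaging non-overlapping factor-by-factor blocks
    (ordinary pixel binning; the block size is an assumption)."""
    h, w = frame.shape
    h2, w2 = h - h % factor, w - w % factor      # crop to a multiple of the factor
    blocks = frame[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

# Example with assumed resolutions: a 480x640 third frame binned down to
# 240x320 to match hypothetical 240x320 first and second frames.
high_res_frame = np.random.rand(480, 640)
low_res_version = bin_pixels(high_res_frame, factor=2)
print(low_res_version.shape)   # (240, 320)
```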
Figure 2 shows a schematic diagram of the principle of single-shot range detection using time-delayed dual area-array CCD or CMOS time-domain convolution imaging signals. In the figure, the LED light source 1 generates a light pulse of wavelength λ with pulse width τ1; the delay time of the electronic shutter 4 is τ2 and its gate width is τ3; the exposure interval of the image sensor 3 is τ. The target light intensity obtained by the image sensor 3 is the integral over the overlap between the pulse echo and the electronic shutter gate, and intensity and distance have the correspondence shown in the figure. By choosing suitable values of the wavelength and of the parameters τ1, τ2, τ3 and τ, and by timing the electronic shutter, the image sensor 3 obtains three successive sets of convolution waveforms; the position and magnitude of their vertices determine the distance information of this pixel of the object (a minimal numerical sketch of this gated-integration model follows).
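As a minimal numerical sketch of the gated-integration relationship illustrated in Figure 2 (the rectangular waveform shapes, the widths, the delays and the object distance below are illustrative assumptions, not values from the patent): a pulse echo delayed by 2S/C is multiplied by a shutter gate and integrated, and repeating this for three gate delays stepped by τ gives the three convolution samples whose peak location tracks the distance.

```python
import numpy as np

C = 3.0e8                       # speed of light, m/s
dt = 1e-9                       # 1 ns time grid
t = np.arange(0.0, 4e-6, dt)    # 4 microsecond observation window

def rect(t, start, width):
    """Unit rectangular window on [start, start + width)."""
    return ((t >= start) & (t < start + width)).astype(float)

def gated_integral(distance_m, gate_delay_s, pulse_width=1e-6, gate_width=1e-6):
    """I(S) = integral of x(t - 2S/C) * g(t) dt for a rectangular pulse x and a
    rectangular shutter gate g, i.e. the overlap integral of Figure 2."""
    echo = rect(t, 2 * distance_m / C, pulse_width)   # pulse echo delayed by 2S/C
    gate = rect(t, gate_delay_s, gate_width)          # electronic-shutter gate
    return np.sum(echo * gate) * dt

# Three exposures whose gate delays are stepped by tau, for one object distance.
tau = 0.5e-6
distance = 90.0                                       # meters, illustrative
samples = [gated_integral(distance, k * tau) for k in range(3)]
print(samples)   # the middle sample is largest here, so the vertex lies near the second gate
```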

Claims

1. A three-dimensional imaging radar system based on multiple integrations, characterized in that it comprises: an LED light source (1), an optical band-pass filter (2), an image sensor (3), an electronic shutter (4), a data processor (5) and a display terminal (6); wherein the optical band-pass filter (2) and the electronic shutter (4) are both fixed on the image sensor (3), the LED light source (1) and the image sensor (3) are both connected to the data processor (5), and the data processor (5) is connected to the display terminal (6); the LED light source (1) generates a series of light pulse trains, which illuminate an object, and the object reflects the light back in turn; the reflected light passes through the optical band-pass filter (2) and is sensed by the image sensor (3), forming an image on the image sensor (3); for the reflected light of the same pulse, the image sensor exposes it three consecutive times under the control of the electronic shutter, and the data processor (5) analyzes the three images to obtain the contour and distance information of the object, which is finally displayed on the display terminal (6); the LED light source (1) consists of one or more pulsed light sources capable of generating microsecond- or nanosecond-scale light pulses, the pulsed light source being an LED or a laser.
2. A three-dimensional imaging method based on multiple integrations using the system of claim 1, characterized in that it comprises the following steps:
(1) the LED light source (1) generates a light pulse; within the exposure time controlled by the electronic shutter (4), the image sensor (3) successively collects, three times, the light reflected from this pulse by the object at the same position, with an acquisition interval of τ, yielding three successive frames, of which the first two frames have a lower pixel count and the third frame a higher pixel count;
(2) by pixel binning, the third frame is fitted to an image with the same pixel count as the first two frames, so a high-pixel and a low-pixel version of the third frame are obtained, the high-pixel image containing more object contour information owing to its higher pixel count;
(3) the data processor (5) processes the first two frames and the low-pixel version of the third frame to obtain the distance information points of the object;
the light intensities produced in the two frames acquired at wavelength λ vary with distance according to:
$I(S) = \int x_\lambda(t - 2S/C)\,g(t)\,dt$;
$I(S') = \int_\tau^{\tau+T} x_\lambda(t - 2S'/C)\,g(t)\,dt$
where S denotes the object distance for the previous frame and S' that for the next frame, C the speed of light, t time, $x_\lambda(t - 2S/C)$ the light-pulse waveform of the previous frame at wavelength λ, c a weighting factor, $x_\lambda(t - 2S'/C)$ the light-pulse waveform of the next frame, g(t) the waveform of the electronic shutter, and T the integration time; from the two equations above, for reflected light from an object at a given distance the light-intensity integration time on the image sensor (3) differs between exposures, a longer integration time yielding a larger intensity integral and a shorter one a smaller integral; each of the three exposures of the image sensor (3) integrates the intensity of the collected light, so for the same pixel three integral values are obtained, and since the light collected by the CCD is a convolution of the light intensity in the time domain, the integral curve has a vertex; three points determine a quadratic curve from which the vertex can be computed, and the abscissa of this vertex represents the round-trip time of the signal light, that is, the distance information point of the object;
(4) the data processor (5) combines the distance information points obtained in step (3) with the contour information of the third frame to obtain the specific distance information of the object, which is finally displayed on the display terminal (6).
PCT/CN2013/080416 2012-12-25 2013-07-30 Three-dimensional imaging radar system and method based on multiple integrations WO2014101408A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP13868805.6A EP2975428B1 (en) 2012-12-25 2013-07-30 Three-dimensional imaging radar system
US15/039,453 US9958547B2 (en) 2012-12-25 2013-07-30 Three-dimensional imaging radar system and method based on a plurality of times of integral

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201210571277.0A CN103064087B (zh) 2012-12-25 2012-12-25 Three-dimensional imaging radar system and method based on multiple integrations
CN201210571277.0 2012-12-25

Publications (1)

Publication Number Publication Date
WO2014101408A1 true WO2014101408A1 (zh) 2014-07-03

Family

ID=48106791

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/080416 WO2014101408A1 (zh) 2012-12-25 2013-07-30 Three-dimensional imaging radar system and method based on multiple integrations

Country Status (4)

Country Link
US (1) US9958547B2 (zh)
EP (1) EP2975428B1 (zh)
CN (1) CN103064087B (zh)
WO (1) WO2014101408A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110809704A (zh) * 2017-05-08 2020-02-18 威力登激光雷达有限公司 Lidar data acquisition and control
CN114040186A (zh) * 2021-11-16 2022-02-11 凌云光技术股份有限公司 Optical motion capture method and device

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103064087B (zh) 2012-12-25 2015-02-25 符建 Three-dimensional imaging radar system and method based on multiple integrations
CN104049257B (zh) * 2014-06-04 2016-08-24 西安电子科技大学 Multi-camera laser stereo imaging device and method for space targets
CN104049258B (zh) * 2014-06-04 2016-10-19 王一诺 Stereo imaging device and method for space targets
EP3057067B1 (en) * 2015-02-16 2017-08-23 Thomson Licensing Device and method for estimating a glossy part of radiation
US10627490B2 (en) * 2016-01-31 2020-04-21 Velodyne Lidar, Inc. Multiple pulse, LIDAR based 3-D imaging
US9866816B2 (en) 2016-03-03 2018-01-09 4D Intellectual Properties, Llc Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis
US10067509B1 (en) * 2017-03-10 2018-09-04 TuSimple System and method for occluding contour detection
EP3388864A1 (en) * 2017-04-10 2018-10-17 Bea S.A. Method of human body recognition and human body recognition sensor
CN107607960A (zh) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 Optical ranging method and device
JP7257275B2 (ja) * 2019-07-05 2023-04-13 株式会社日立エルジーデータストレージ Three-dimensional distance measuring device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1700038A (zh) * 2005-03-25 2005-11-23 浙江大学 Scannerless pulse-modulated three-dimensional imaging method and system
CN101788667A (zh) * 2010-01-19 2010-07-28 浙江大学 Optically amplified three-dimensional imaging method and system
CN102798868A (zh) * 2012-07-27 2012-11-28 符建 Three-dimensional imaging radar system based on flight spectrum
CN103064087A (zh) * 2012-12-25 2013-04-24 符建 Three-dimensional imaging radar system and method based on multiple integrations

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7418346B2 (en) * 1997-10-22 2008-08-26 Intelligent Technologies International, Inc. Collision avoidance methods and systems
US7962285B2 (en) * 1997-10-22 2011-06-14 Intelligent Technologies International, Inc. Inertial measurement unit for aircraft
US6535275B2 (en) * 2000-08-09 2003-03-18 Dialog Semiconductor Gmbh High resolution 3-D imaging range finder
DE10051918A1 (de) * 2000-10-19 2002-05-02 Peter Lux Verfahren und Gerät zur Aufnahme von Entfernungsbildern
GB2374743A (en) * 2001-04-04 2002-10-23 Instro Prec Ltd Surface profile measurement
DE50307744D1 (de) * 2003-10-29 2007-08-30 Fraunhofer Ges Forschung Abstandssensor und verfahren zur abstandserfassung
US20090140887A1 (en) * 2007-11-29 2009-06-04 Breed David S Mapping Techniques Using Probe Vehicles
EP2260325B1 (en) * 2008-02-29 2015-08-05 Leddartech Inc. Light-integrating rangefinding device and method
EP2477043A1 (en) * 2011-01-12 2012-07-18 Sony Corporation 3D time-of-flight camera and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1700038A (zh) * 2005-03-25 2005-11-23 浙江大学 Scannerless pulse-modulated three-dimensional imaging method and system
CN101788667A (zh) * 2010-01-19 2010-07-28 浙江大学 Optically amplified three-dimensional imaging method and system
CN102798868A (zh) * 2012-07-27 2012-11-28 符建 Three-dimensional imaging radar system based on flight spectrum
CN103064087A (zh) * 2012-12-25 2013-04-24 符建 Three-dimensional imaging radar system and method based on multiple integrations

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
OPTICS LETTERS, vol. 32, 2007, pages 3146 - 3148
See also references of EP2975428A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110809704A (zh) * 2017-05-08 2020-02-18 威力登激光雷达有限公司 Lidar data acquisition and control
CN110809704B (zh) * 2017-05-08 2022-11-01 威力登激光雷达美国有限公司 Lidar data acquisition and control
CN114040186A (zh) * 2021-11-16 2022-02-11 凌云光技术股份有限公司 Optical motion capture method and device

Also Published As

Publication number Publication date
US9958547B2 (en) 2018-05-01
EP2975428A4 (en) 2017-06-28
US20160313446A1 (en) 2016-10-27
EP2975428B1 (en) 2018-09-26
CN103064087A (zh) 2013-04-24
EP2975428A1 (en) 2016-01-20
CN103064087B (zh) 2015-02-25

Similar Documents

Publication Publication Date Title
WO2014101408A1 (zh) 基于多次积分的三维成像雷达***及方法
CN105115445A (zh) 基于深度相机与双目视觉复合的三维成像***及成像方法
CN102798868B (zh) 基于飞行光谱的三维成像雷达***
CN101526619B (zh) 基于无扫描激光雷达与ccd相机的同步测距测速***
EP3470774A1 (en) Three-dimensional scanner having pixel memory
CN102635056B (zh) 一种沥青路面构造深度的测量方法
CN108375773A (zh) 一种多通道激光雷达三维点云测量***及测量方法
CN110121659A (zh) 用于对车辆的周围环境进行特征描述的***
US20160299218A1 (en) Time-of-light-based systems using reduced illumination duty cycles
US11531104B2 (en) Full waveform multi-pulse optical rangefinder instrument
CN102375144A (zh) 单光子计数压缩采样激光三维成像方法
CN106707295B (zh) 基于时间关联的三维成像装置和成像方法
CN209676383U (zh) 深度相机模组、深度相机、移动终端以及成像装置
US20220120908A1 (en) Distance measurement apparatus, information processing method, and information processing apparatus
CN103983981A (zh) 基于相位测距原理的三维压缩成像方法及装置
CN202794523U (zh) 一种基于飞行光谱的三维成像雷达***
CN104483097B (zh) 测量选通像增强器光学门宽的装置及方法
US11733359B2 (en) Configurable array of single-photon detectors
US7274815B1 (en) Parallel phase-sensitive three-dimensional imaging camera
Wallace et al. 3D imaging and ranging by time-correlated single photon counting
TUDOR et al. LiDAR sensors used for improving safety of electronic-controlled vehicles
CN205826867U (zh) 一种大气风速分布探测的装置
CN103697825A (zh) 一种超分辨3d激光测量***及方法
Hussmann et al. Systematic distance deviation error compensation for a ToF-camera in the close-up range
Chaldina et al. Study of the Time-of-Flight Method for Measuring Distances to Objects Using an Active-Pulse Television Measuring System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13868805

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2013868805

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 15039453

Country of ref document: US