CN202406199U - Three-dimensional measure chip and system based on double-array image sensor

Publication number: CN202406199U
Authority: CN (China)
Legal status: Expired - Fee Related
Application number: CN2011203533902U
Other languages: Chinese (zh)
Inventors: 刘立, 倪海日, 王建, 王天慧
Assignee (original and current): Tianjin University
Application filed by Tianjin University; priority to CN2011203533902U
Publication of CN202406199U

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model relates to the field of range imaging and three-dimensional imaging technology, and provides a three-dimensional measurement chip based on a dual-array image sensor, comprising two optical lenses, two CMOS image sensor arrays, and an image sensor control and processing circuit. The two CMOS image sensor arrays and the image sensor control and processing circuit are fabricated on the same semiconductor substrate. The two optical lenses are wafer-level optical lenses positioned respectively above the two CMOS image sensor arrays; one optical lens carries an infrared filter or coating and is used to obtain an ordinary two-dimensional color image, while the other optical lens is used to obtain a depth image. The image sensor control and processing circuit constructs a three-dimensional stereo image from the image information obtained by the two CMOS image sensor arrays. The utility model also provides a three-dimensional measurement system employing the chip. The utility model effectively reduces the size of the measurement chip and system and improves the real-time performance and accuracy of the measurement.

Description

Three-dimensional measurement chip and system based on a dual-array image sensor
Technical field
The utility model relates to the technical field of range imaging and three-dimensional imaging. More specifically, it relates to a three-dimensional measurement chip for obtaining stereoscopic image data. The chip integrates two CMOS optical sensor arrays: one is used to obtain a depth image, and the other is used to obtain an ordinary two-dimensional image. By acquiring both kinds of information simultaneously, a three-dimensional stereo image is constructed.
Background art
The CMOS chip
CMOS technology is the mainstream technology for very large-scale integrated circuits (VLSI); its integration density is high, and multiple functions can be integrated onto a single chip as required. A CMOS image sensor chip uses CMOS technology to integrate the image acquisition unit and the signal processing unit on one chip. The internal structure of a CMOS chip mainly consists of a photosensitive array, frame (row) control and timing circuits, an analog signal readout circuit, an A/D conversion circuit, a digital signal processing circuit, and an interface circuit, all of which are usually integrated on the same silicon die. Its working cycle can generally be divided into reset, photoelectric conversion, integration, and readout phases. The digital signal processing circuit mainly performs automatic exposure control, non-uniformity compensation, white balance processing, black level control, gamma correction, defective pixel detection, and so on. For fast computation, a programmable DSP can even be integrated with the CMOS device, forming a single-chip digital camera and image processing system. Compared with a traditional CCD image sensor, integrating the entire imaging system on one chip as a single-chip imaging system reduces cost, saves system power, improves image quality, and offers advantages such as light weight and a small footprint.
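As a rough illustration of the per-pixel corrections such an on-chip digital signal processing circuit performs, the following Python sketch applies black-level subtraction, white-balance gains, and gamma correction to a raw frame. The constants are illustrative placeholders only, not values taken from the utility model.

    import numpy as np

    def isp_correct(raw, black_level=64, wb_gains=(1.8, 1.0, 1.5), gamma=2.2):
        """Illustrative on-chip corrections: black level, white balance, gamma.

        raw: HxWx3 array of 10-bit RGB values (0..1023). All constants here
        are placeholders chosen for illustration.
        """
        img = np.clip(raw.astype(np.float32) - black_level, 0, None)   # black level control
        img *= np.array(wb_gains, dtype=np.float32)                    # white balance gains
        img = np.clip(img / (1023.0 - black_level), 0.0, 1.0)          # normalize to [0, 1]
        return img ** (1.0 / gamma)                                    # gamma correction

    # Example: correct a synthetic 4x4 raw frame
    frame = np.random.randint(0, 1024, size=(4, 4, 3))
    print(isp_correct(frame).shape)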
Besides capturing visible-light images, a CMOS image sensing chip can also image in the infrared (non-visible) band. Its sensitivity in the 890-980 nm range is higher than that of a CCD image sensing chip, and its sensitivity falls off more gradually as the wavelength increases.
Wafer-level lens module (Wafer-Lens) technology
Wafer-level lens technology raises optical element manufacturing to the wafer level: using semiconductor processes, thousands of lens elements can be fabricated on a single wafer, arranged and bonded with wafer-level packaging, and then diced into individual lens modules. The lenses are first arranged and bonded on the wafer, then assembled onto the optical elements, and finally mounted on the image sensor chip. This not only reduces the number of components and simplifies the process flow, enabling high-volume production, but also lowers production cost.
This technology has cut the size of the optical module in a mobile-phone camera module to roughly half of its current size.
Arranging and bonding the lenses on the wafer eliminates expensive manual focusing. The optical elements use materials that can withstand reflow soldering and support both wire bonding and BGA bonding. With BGA bonding, the optical element can be mounted directly on the board.
Chinese patent CN101685767A proposes a "manufacturing method and structure of a wafer-level image module". It discloses a manufacturing method for a wafer-level optical lens module that solves the dimensional-accuracy problems that often occur when assembling ordinary wafer-level lenses, namely offset of the imaging lens during stack assembly and misalignment of the optical axes, while also simplifying the manufacturing of wafer-level lenses [2].
Commonly used methods for obtaining three-dimensional information about a target include:
Time-of-flight (TOF) ranging
Time of flight means computing the flight time of light. The device first emits pulsed light and receives the light reflected by the target at the emitter; the distance to the object is calculated from the measured time difference, so changes in the relative positions of points in the scene can be sensed and a gray-scale depth image can be constructed. Because the sensing chip must measure the flight time of light, it needs a femtosecond-level shutter, which pushes its clock frequency up to hundreds of gigahertz and raises its cost. Unlike ordinary illumination, the purpose of the TOF illumination unit is not to light the scene but to measure distance from the change between the emitted and reflected optical signals, so the TOF illumination is high-frequency modulated before emission. Like an ordinary camera, the front end of a TOF camera chip needs a lens to collect light, but unlike an ordinary optical lens, a band-pass filter must be added so that only light at the same wavelength as the illumination source enters.
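For orientation, the basic distance relations behind TOF ranging can be written down directly. The sketch below (illustrative values only, not taken from the cited patents) shows the pulsed form, d = c*dt/2, and the phase form used with high-frequency amplitude-modulated illumination.

    import math

    C = 299_792_458.0  # speed of light, m/s

    def pulsed_tof_distance(round_trip_time_s):
        """Distance from a measured round-trip time (pulsed TOF)."""
        return C * round_trip_time_s / 2.0

    def cw_tof_distance(phase_shift_rad, modulation_hz):
        """Distance from the phase shift of amplitude-modulated light (CW TOF)."""
        return C * phase_shift_rad / (4.0 * math.pi * modulation_hz)

    # A 6.67 ns round trip corresponds to roughly 1 m of range
    print(pulsed_tof_distance(6.67e-9))
    # A pi/2 phase shift at 20 MHz modulation corresponds to about 1.87 m
    print(cw_tof_distance(math.pi / 2, 20e6))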
Chinese patent CN101866056A, "Three-dimensional imaging method and system based on LED-array common-lens TOF depth measurement", proposes a novel TOF depth measurement method that uses an LED array as the active illumination source, with only one LED in the array emitting at a time. This method addresses the drawbacks of traditional TOF depth measurement, which requires a precise, bulky, and expensive mechanical scanner to sweep a laser beam across the scene in the other two directions in order to obtain three-dimensional depth information, resulting in slow depth-image acquisition and poor real-time performance [3]. The accompanying problem, however, is that the number of LEDs in the array is limited while a large LED array is too bulky, which leads to relatively low resolution in the direction perpendicular to the optical axis.
Binocular vision
The basic principle of binocular stereo vision is to imitate human eyes and human three-dimensional visual perception: the same scene is observed from two viewpoints to obtain images under different viewing angles, and the three-dimensional information of the scene is obtained from the positional disparity between image pixels using the triangulation principle. Two image sensors arranged in parallel observe the target from different directions, so the images they acquire differ from each other; the unique range information is obtained by searching for corresponding points in the two images and comparing the left and right images. Searching for corresponding points in the two acquired images requires a large amount of image processing. A complete binocular vision system can usually be divided into six major parts: image acquisition, camera calibration, feature extraction, stereo matching, depth recovery, and depth interpolation.
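The triangulation step that converts a pixel disparity into a depth value can be sketched as follows for two parallel sensors of identical resolution. The focal length, baseline, and disparity values below are assumed example numbers, not values from the utility model.

    def stereo_depth(disparity_px, focal_length_px, baseline_m):
        """Depth of a point from its disparity between two parallel, rectified cameras.

        disparity_px: horizontal pixel offset of the same scene point between the
        left and right images; focal_length_px: focal length in pixels;
        baseline_m: center-to-center distance of the two sensors.
        """
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a point in front of the rig")
        return focal_length_px * baseline_m / disparity_px

    # A 20-pixel disparity with f = 800 px and a 10 mm baseline gives 0.4 m
    print(stereo_depth(20.0, 800.0, 0.010))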
Chinese patent CN101401443A, "CMOS stereo camera for obtaining three-dimensional images", proposes a three-dimensional image acquisition method based on binocular vision. The device is simple in structure and fast in processing [1], but because the center-to-center spacing of the two image sensor arrays is too small, its depth measurement accuracy is somewhat unsatisfactory.
Structured light (Structured Light)
The structured-light method is an active optical measurement technique based on the optical triangulation principle. Its basic principle is that a structured-light projector (which can be a laser or a projector) projects a light spot, light stripe, or light-plane structure of a certain pattern onto the surface of the object under test, forming on the surface an optical three-dimensional image modulated by the object's surface shape. The image is acquired by an image sensor (such as a camera), and the three-dimensional coordinates of the object are calculated from the geometry of the system using the triangulation principle. Once the relative position between the projection device and the camera is fixed, the three-dimensional profile of the object surface can be reconstructed from the acquired optical image.
In the method for structured light,, obtain the depth information of scene through handled is carried out in the object surface structure light image of obtaining.The depth computing method that has had comparative maturity.As, document " with the reconstructing 3 D contour and the measurement of structure light coding method realization object " (author: the Wei Yuan Hu Jiasheng of Dalian University of Technology) introduced a kind of reconstructing 3 D contour and measuring system of obtaining the body surface three-dimensional coordinate with the structure light coding method [4]Have such as Chinese patent CN101667303A " a kind of three-dimensional rebuilding method " and proposed a kind of computational process, high, high three-dimensional rebuilding method of reconstruction precision of coupling accuracy simplified based on coded structured light based on coded structured light [5]Through optimizing, can realize the hardwareization of depth information computational methods easily, and be integrated into the digital signal processing module of the utility model.
List of references
[1] Siliconfile Technologies Inc. CMOS stereo camera for obtaining three-dimensional images: China, CN101401443A [P]. 2009-04-01 [2011-06-02]. http://www.soopa.com/Patent/200780008388
[2] Guangbao Technology Co., Ltd.; Silitek Electronic (Guangzhou) Co., Ltd. Manufacturing method and structure of a wafer-level image module: China, CN101685767A [P]. 2010-03-31 [2011-06-02]. http://www.soopat.com/Patent/200810211431
[3] Hefei Institutes of Physical Science, Chinese Academy of Sciences. Three-dimensional imaging method and system based on LED-array common-lens TOF depth measurement: China, CN101866056A [P]. 2010-10-20 [2011-06-02]. http://www.soopat.com/Paent/201010190028
[4] Wei Yuan, Hu Jiasheng. Three-dimensional contour reconstruction and measurement of objects using a structured-light coding method [EB/OL]. http://cn.newmaker.com/art_22093.html
[5] Zhejiang Polytechnical University. A three-dimensional reconstruction method based on coded structured light: China, CN101667303A [P]. 2010-03-10 [2011-06-02]. http://utils.soopat.com:8080/TiffFile/PdfView/E92F5BOD01FC8FEA7B2148B4F6ADE452
Content of the utility model
The purpose of the utility model is to address the deficiencies of existing three-dimensional imaging methods and systems: slow acquisition of depth information, poor registration accuracy between the depth image and the two-dimensional image, the large number of hardware components required for a complete three-dimensional imaging system, and complicated system calibration. It proposes a dual image sensor array chip for spatial measurement that enables fast, high-precision acquisition of three-dimensional images. The technical scheme of the utility model is as follows:
A three-dimensional measurement chip based on a dual-array image sensor comprises two optical lenses, two CMOS image sensor arrays, and an image sensor control and processing circuit. The two CMOS image sensor arrays and the image sensor control and processing circuit are fabricated on the same semiconductor substrate. The optical lenses are wafer-level optical lenses, located respectively above the two CMOS image sensor arrays, and one of the optical lenses carries an infrared filter or coating. The image sensor control and processing circuit constructs a three-dimensional stereo image from the image information obtained by the two CMOS image sensor arrays.
In a preferred embodiment, the optical axes of the two image sensor arrays are parallel to each other and perpendicular to the image plane, and the two arrays have the same resolution.
The utility model also provides a three-dimensional measurement system based on a dual-array image sensor, comprising an infrared light source, two optical lenses, two CMOS image sensor arrays, and an image sensor control and processing circuit, the infrared light source being used to generate structured infrared light. The two CMOS image sensor arrays and the image sensor control and processing circuit are fabricated on the same semiconductor substrate; the optical lenses are wafer-level optical lenses located respectively above the two CMOS image sensor arrays, and one of the optical lenses carries an infrared filter or coating. The image sensor control and processing circuit constructs a three-dimensional stereo image from the image information obtained by the two CMOS image sensor arrays.
The utility model integrates two image sensor arrays together with the corresponding image sensor array control circuit, imaging signal processing circuit, and digital signal processing circuit on one semiconductor substrate. The acquired infrared image data are processed in hardware and the depth information is computed directly, which relieves the computational load on the back-end system, reduces the bandwidth required for data transfer, and improves the real-time performance of three-dimensional measurement. Such a chip also solves the calibration problem that arises when an ordinary three-dimensional imaging system uses discrete image sensors: the center-to-center spacing of the two image sensor arrays is fixed and relatively small, so the calibration problem can be eliminated with a simple hardware circuit, and the output two-dimensional color image data and depth data correspond one to one. By applying wafer-level lens technology and placing the optical lenses directly on the image sensor arrays, the size of the chip and of the measurement system is effectively reduced. Combining the two kinds of data makes the utility model easy to apply in many fields, such as consumer electronics, security monitoring, vehicle driving safety, and automatic control and recognition.
Description of drawings
Fig. 1 is a structural schematic diagram of the utility model: (a) sectional view; (b) top view.
Figs. 2A and 2B are schematic diagrams of the imaging geometry of a point in space on the two image sensors 210 and 220; the horizontal spacing in Fig. 2B is greater than the horizontal spacing D in Fig. 2A.
Fig. 3 shows a concrete implementation structure of the image sensor control and processing circuit 230.
Fig. 4 is the system block diagram of the utility model chip.
Fig. 5 shows a typical application system of the utility model.
Embodiment
The utility model fabricates two image sensor arrays on one semiconductor substrate, located respectively on the left and right sides of the substrate. Between and below the two image sensor arrays are fabricated the image sensor control circuit, the imaging signal processing circuit (ISP), and the digital signal processing circuit (DSP, mainly used to compute depth information), connected to the left and right image sensor arrays through a bus. Wafer-level lens (Wafer Level Lens) technology is used to place optical lenses on the two image sensor arrays. The two image sensor arrays work in different modes: one acts as an ordinary image sensor array and directly senses ordinary two-dimensional image data; the other works in an infrared sensing mode, where the utility model uses an active light source that emits structured infrared light of a specific pattern to illuminate the target object, so that this image sensor array senses infrared image data. After the two image sensor arrays capture image data, the data are sent over the bus to the imaging signal processing circuit and the digital signal processing circuit, which then output the image data and the corresponding depth data.
The utility model is described in detail below with reference to the accompanying drawings and embodiments.
Fig. 1 shows the external appearance of the utility model, including a sectional view and a top view.
Referring to Fig. 1, the utility model 100 shows, from the outside, two optical lenses 110 and 120 and an infrared light source 130. Below lens 110 is image sensor array 210, and below lens 120 is image sensor array 220; lenses 110 and 120 are separated in the vertical direction by a distance h from image sensor arrays 210 and 220. Between and below image sensor arrays 210 and 220 is the image sensor control and processing circuit 230. Image sensor arrays 210 and 220 and the image sensor control and processing circuit 230 are realized on one semiconductor substrate. In addition, the optical axes of image sensor arrays 210 and 220 are parallel to each other and perpendicular to the image plane, and the two arrays have the same resolution.
During operation, the infrared light source 130 continuously radiates structured infrared light of a specific pattern. Image sensor array 210 acts as an ordinary image sensor array; its optical lens contains an infrared-absorbing (IR-cut) filter, which eliminates interference with this array from the utility model's infrared light source and from infrared light in the environment, so this array obtains ordinary color image data. Image sensor array 220 senses image data in the infrared band; for this purpose an infrared band-pass filter is placed on optical lens 120, and the acquired image data are processed to obtain the depth information of each pixel.
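The division of labor between the two arrays can be summarized in a small sketch: the color frame from array 210 is paired pixel-for-pixel with a depth map decoded from the infrared frame of array 220. The decoder below is a stand-in placeholder for the actual structured-light depth computation, and the array sizes are arbitrary.

    import numpy as np

    def build_rgbd(color_img, ir_img, depth_from_ir):
        """Pair each color pixel with a depth value computed from the IR image.

        color_img: HxWx3 frame from the sensor behind the IR-cut filter.
        ir_img: HxW frame from the sensor behind the IR band-pass filter.
        depth_from_ir: callable decoding the structured-light pattern into an
        HxW depth map; here it stands in for the on-chip DSP step.
        """
        depth = depth_from_ir(ir_img)                  # HxW depth map
        assert depth.shape == color_img.shape[:2]
        return np.dstack([color_img.astype(np.float32), depth[..., None]])  # HxWx4 RGB-D

    # Toy usage with a fake decoder that maps IR intensity to depth
    fake_decoder = lambda ir: 0.5 + ir.astype(np.float32) / 255.0
    rgbd = build_rgbd(np.zeros((4, 4, 3), np.uint8), np.full((4, 4), 128, np.uint8), fake_decoder)
    print(rgbd.shape)  # (4, 4, 4)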
Fig. 2 shows the imaging geometry of a point in space on the two image sensor arrays 210 and 220.
Referring to Fig. 2A, a point 200 in space lies at distance d from image sensor arrays 210 and 220, and the center-to-center spacing between arrays 210 and 220 is D. The horizontal distance from the imaging point on array 210 to the center of array 210 is t1, and the distance from the imaging point on array 220 to the center of array 220 is t2; it can be seen that t1 ≠ t2. This shows that because the two arrays 210 and 220 are separated by a horizontal spacing D, their images differ slightly, with a horizontal translation t of a certain size. Comparing Figs. 2A and 2B, the size of this horizontal translation is influenced by the horizontal spacing D between arrays 210 and 220: the spacing D1 in Fig. 2B is greater than the spacing D in Fig. 2A, and in Fig. 2B the distance from the imaging point on array 220 to the center of array 220 is t3, with t3 > t2. The smaller the horizontal spacing D between arrays 210 and 220, the smaller the horizontal translation t between their images.
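The dependence of the horizontal translation t on the spacing D, and the recovery of the distance d from it, can be illustrated with a short sketch. The focal length and spacings below are assumed example values, not parameters of the utility model.

    def image_shift(focal_length_px, spacing_m, distance_m):
        """Horizontal shift t between the two imaging points of one scene point.

        For parallel sensor arrays, a center-to-center spacing D produces a
        shift of roughly t = f * D / d pixels for a point at distance d, so a
        smaller D gives a smaller shift.
        """
        return focal_length_px * spacing_m / distance_m

    f = 800.0                          # focal length in pixels (illustrative)
    for D in (0.005, 0.02):            # 5 mm on-chip spacing vs. 20 mm discrete spacing
        t = image_shift(f, D, 1.0)     # point 1 m away
        print(f"D = {D*1000:.0f} mm -> shift t = {t:.1f} px, recovered d = {f*D/t:.2f} m")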
Integrating the two image sensor arrays on one semiconductor substrate effectively reduces the horizontal spacing D, which simplifies later signal processing. Because the horizontal translation t exists and is unavoidable (the spacing D between arrays 210 and 220 cannot be zero), the images of the two arrays 210 and 220 do not match: the data at corresponding pixels of the two arrays are not collected from the same point in space. To achieve correct matching, the collected image data must be processed. An ordinary three-dimensional imaging system uses discrete image sensors, whose center-to-center spacing is large and easily changed, making calibration complicated. The center-to-center spacing of the two image sensor arrays of the utility model chip is fixed and relatively small, so the calibration problem can be solved with a simple hardware circuit, which improves the real-time performance of the measurement and allows higher measurement accuracy.
Fig. 3 shows a concrete implementation structure of the image sensor control and processing circuit 230.
Referring to Fig. 3, the image sensor control and processing circuit 230 consists of readout circuit modules 310 and 320, an A/D converter module 302, a clock controller module 304, an input/output module 330, an image signal processing module 340, and a digital signal processing module 350. Each readout circuit module comprises row-selection logic, an analog signal processing unit, and column-selection logic. The image signal processing module mainly performs automatic exposure control, non-uniformity compensation, white balance processing, black level control, gamma correction, defective pixel detection, and so on. Readout circuit module 310 is connected to image sensor 210, and readout circuit module 320 is connected to image sensor 220. The A/D converter module 302, clock controller module 304, input/output module 330, and image signal processing module 340 are shared by image sensors 210 and 220. After image sensors 210 and 220 are exposed, the analog image signals are read out by readout circuits 310 and 320 respectively and converted into digital image signals by the A/D converter module 302. The image signal processing module 340 processes the digital image signals; the processed digital image signal of image sensor 210 is output by the input/output module 330, while the processed digital image signal of image sensor 220 is sent to the digital signal processing module 350, which computes the depth information of each pixel and then processes it to eliminate the influence of the horizontal displacement; finally, the depth information of each pixel is output by the input/output module 330.
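The data flow through circuit 230 described above can be summarized in a short functional sketch. The callables stand in for the hardware blocks; this is an assumed high-level model for illustration, not a register-level description of the circuit.

    import numpy as np

    def process_frame(analog_210, analog_220, adc, isp, compute_depth, register):
        """Sketch of the data flow in the control and processing circuit 230.

        adc digitizes an analog frame, isp performs exposure/white-balance/gamma
        type corrections, compute_depth derives per-pixel depth from the infrared
        frame, and register shifts the depth map so that it aligns pixel-for-pixel
        with the color frame.
        """
        color = isp(adc(analog_210))                  # color path: readout -> A/D -> ISP -> output
        infrared = isp(adc(analog_220))               # IR path: readout -> A/D -> ISP
        depth = register(compute_depth(infrared))     # DSP: depth computation + parallax correction
        return color, depth

    # Toy usage with trivial stand-ins for each block
    frame = np.ones((4, 4), np.float32)
    color, depth = process_frame(
        frame, frame,
        adc=lambda a: (a * 255).astype(np.uint8),        # 8-bit quantization
        isp=lambda d: d.astype(np.float32) / 255.0,      # normalization as a placeholder ISP
        compute_depth=lambda ir: 1.0 / (ir + 0.5),       # placeholder depth decoder
        register=lambda z: np.roll(z, shift=1, axis=1),  # placeholder parallax shift
    )
    print(color.shape, depth.shape)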
Fig. 4 shows the system structure of the utility model chip.
Referring to Fig. 4, the utility model exploits the advantages of CMOS technology by integrating all of the required functions on one semiconductor substrate. This effectively reduces system cost and also solves the calibration problem of three-dimensional measurement systems built from discrete image sensors. By integrating the digital processing circuit, the acquired infrared image is processed in real time, improving the real-time performance of the utility model. Combining wafer-level lens (Wafer Level Lens) technology to place the lenses directly on the utility model further reduces its size and provides a practical basis for its wide application.
Fig. 5 shows a typical application system of the utility model.
Referring to Fig. 5, this typical application uses an embedded processor as its processing core. The utility model 500 acquires image data, which after processing are buffered in the memory buffer 510. The embedded processor 520 reads and processes the buffered data and then sends it to the color display 570 for display. The microphone 540 captures real-time audio data, which is processed by the audio codec 580 and sent to the processor 520. The processed image data and audio data can both be stored in the data memory 560, such as a hard disk or Flash memory. The user controls the whole system through the user control 530.
All of the implementation methods in the utility model can be realized with existing technology, and these technologies are mature and reliable, which provides a practical basis for the concrete realization of the utility model.

Claims (4)

1. A three-dimensional measurement chip based on a dual-array image sensor, comprising two optical lenses, two CMOS image sensor arrays, and an image sensor control and processing circuit, characterized in that the two CMOS image sensor arrays and the image sensor control and processing circuit are fabricated on the same semiconductor substrate; the optical lenses are wafer-level optical lenses, located respectively above the two CMOS image sensor arrays; one of the optical lenses carries an infrared filter or coating and is used to obtain an ordinary two-dimensional color image, while the other optical lens is used to obtain a depth image; and the image sensor control and processing circuit constructs a three-dimensional stereo image from the image information obtained by the two CMOS image sensor arrays.
2. The three-dimensional measurement chip according to claim 1, characterized in that the optical axes of the two image sensor arrays are parallel to each other and perpendicular to the image plane, and the two arrays have the same resolution.
3. A three-dimensional measurement system based on a dual-array image sensor, comprising an infrared light source, two optical lenses, two CMOS image sensor arrays, and an image sensor control and processing circuit, the infrared light source being used to generate structured infrared light, characterized in that the two CMOS image sensor arrays and the image sensor control and processing circuit are fabricated on the same semiconductor substrate; the optical lenses are wafer-level optical lenses, located respectively above the two CMOS image sensor arrays; one of the optical lenses carries an infrared filter or coating and is used to obtain an ordinary two-dimensional color image, while the other optical lens is used to obtain a depth image; and the image sensor control and processing circuit constructs a three-dimensional stereo image from the image information obtained by the two CMOS image sensor arrays.
4. The three-dimensional measurement system according to claim 3, characterized in that the optical axes of the two image sensor arrays are parallel to each other and perpendicular to the image plane, and the two arrays have the same resolution.
CN2011203533902U 2011-09-20 2011-09-20 Three-dimensional measure chip and system based on double-array image sensor Expired - Fee Related CN202406199U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011203533902U CN202406199U (en) 2011-09-20 2011-09-20 Three-dimensional measure chip and system based on double-array image sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011203533902U CN202406199U (en) 2011-09-20 2011-09-20 Three-dimensional measure chip and system based on double-array image sensor

Publications (1)

Publication Number Publication Date
CN202406199U true CN202406199U (en) 2012-08-29

Family

ID=46703813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011203533902U Expired - Fee Related CN202406199U (en) 2011-09-20 2011-09-20 Three-dimensional measure chip and system based on double-array image sensor

Country Status (1)

Country Link
CN (1) CN202406199U (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102438111A (en) * 2011-09-20 2012-05-02 天津大学 Three-dimensional measurement chip and system based on double-array image sensor
CN105115445A (en) * 2015-09-14 2015-12-02 杭州光珀智能科技有限公司 Three-dimensional imaging system and imaging method based on combination of depth camera and binocular vision
CN105163014A (en) * 2015-09-15 2015-12-16 上海图甲信息科技有限公司 Road monitoring device and method
CN105262983A (en) * 2015-09-15 2016-01-20 上海图甲信息科技有限公司 Road monitoring system and method based on internet of lamps
CN105163014B (en) * 2015-09-15 2018-07-24 上海图甲信息科技有限公司 Road monitoring device and method
TWI707163B (en) * 2019-05-06 2020-10-11 大陸商三贏科技(深圳)有限公司 Camera module
CN111601051A (en) * 2020-05-13 2020-08-28 长江存储科技有限责任公司 Alignment image acquisition method, device and system

Similar Documents

Publication Publication Date Title
CN102438111A (en) Three-dimensional measurement chip and system based on double-array image sensor
CN110044300A (en) Amphibious 3D vision detection device and detection method based on laser
CN202406199U (en) Three-dimensional measure chip and system based on double-array image sensor
JP6550536B2 (en) Multi-line array laser light three-dimensional scanning system and multi-line array laser light three-dimensional scanning method
CN106303228B (en) A kind of rendering method and system of focus type light-field camera
CN105432080B (en) Transition time camera system
CN102679959B (en) Omnibearing 3D (Three-Dimensional) modeling system based on initiative omnidirectional vision sensor
CN105004324B (en) A kind of monocular vision sensor with range of triangle function
CN106911888A (en) A kind of device
CN110390719A (en) Based on flight time point cloud reconstructing apparatus
US11632535B2 (en) Light field imaging system by projecting near-infrared spot in remote sensing based on multifocal microlens array
CN107860337B (en) Structured light three-dimensional reconstruction method and device based on array camera
CN107783353A (en) For catching the apparatus and system of stereopsis
JP2009300268A (en) Three-dimensional information detection device
CN104567818B (en) A kind of portable round-the-clock actively panoramic vision sensor
CN104079916A (en) Panoramic three-dimensional visual sensor and using method
CN106170086B (en) Method and device thereof, the system of drawing three-dimensional image
CN110728745B (en) Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model
CN106597469B (en) Imaging method of active imaging laser camera
CN106405566A (en) High-measurement-precision laser radar distance measurement method
CN108803067A (en) A kind of optical depth camera and its signal optical source processing method
US11326874B2 (en) Structured light projection optical system for obtaining 3D data of object surface
CN111337013B (en) Four-linear array CCD-based multi-target point distinguishing and positioning system
Ma et al. Single-shot 3D reconstruction imaging approach based on polarization properties of reflection lights
CN209181735U (en) Amphibious 3D vision detection device based on laser

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120829

Termination date: 20130920