CN205305018U - Acquire equipment of depth map of three-dimensional scene - Google Patents

Acquire equipment of depth map of three-dimensional scene

Info

Publication number
CN205305018U
Authority
CN
China
Prior art keywords
infrared
camera
video camera
light
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201520718377.0U
Other languages
Chinese (zh)
Inventor
宋金龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Timeng Information Technology Co Ltd
Original Assignee
Shanghai Timeng Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Timeng Information Technology Co Ltd filed Critical Shanghai Timeng Information Technology Co Ltd
Priority to CN201520718377.0U priority Critical patent/CN205305018U/en
Application granted granted Critical
Publication of CN205305018U publication Critical patent/CN205305018U/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The utility model relates to a device for obtaining a depth map of a three-dimensional scene, comprising an infrared projector, a first camera, a second camera and an infrared supplementary light source. The first camera is configured to receive infrared light, and the second camera is configured to receive colored visible light; the distance between the first camera and the second camera satisfies binocular vision. The infrared projector is configured to project structured light in a first mode and not to project structured light in a second mode, and the infrared supplementary light source is configured to optionally project supplementary light in the second mode.

Description

Device for obtaining a depth map of a three-dimensional scene
Technical field
The utility model relates to the scanning and reconstruction of three-dimensional scenes, and in particular to a device for obtaining a depth map of a three-dimensional scene.
Background technology
Obtaining the distance of each point in a scene from the camera is one of the essential tasks of a computer vision system. The distance of each point in the scene relative to the camera can be represented by a depth map (DepthMap), in which each pixel value represents the distance between a point in the scene and the camera. The techniques by which a machine vision system obtains a depth map of a scene fall into two broad classes: passive range sensing and active range sensing. In passive range sensing, the vision system receives the light energy emitted or reflected by the scene and forms a light-energy distribution function of the scene, that is, a grayscale image, and then recovers the depth information of the scene from such images. In active range sensing, the vision system first emits energy toward the scene and then receives the energy reflected back by the scene.
A ranging device based on structured laser light (also called a depth camera) is an active range sensing technique, whose principle is shown in Fig. 1. A laser beam emitted by the projector 110 in Fig. 1 (in practice tens of thousands of beams; a single beam is taken as the signal here) strikes different planes at distances Z1 and Z2, and the spot imaged on the camera shifts horizontally from Xc1 to Xc2. By setting a reference plane Z0 and detecting the displacement, relative to the reference plane Z0, of the spot in a plane Zk at an arbitrary distance, the distance of the plane Zk can be deduced. In other words, the computing device stores the laser speckle pattern of the reference plane Z0 and the actual distance of the plane Z0 (a constant z0); when an image of an arbitrary measurement plane is input, ranging is achieved by detecting the displacement of the spots. The spot displacement is detected by matching the similarity of local blocks around the spots in the reference image and in the input image.
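For illustration only, the block matching and reference-plane triangulation described above can be sketched in a few lines of code. The sketch below is not the implementation of the utility model; the function names, the sum-of-squared-differences matching, and the simplified relation 1/Zk = 1/Z0 + d/(f*b) between spot shift and distance are assumptions introduced for clarity, and the sign of the shift depends on the relative positions of the projector and the camera.

```python
import numpy as np

def spot_shift(ref_img, in_img, y, x, half=4, max_shift=30):
    """Horizontal shift (in pixels) of the speckle block around (y, x):
    the block in the input image is matched against the stored reference-plane
    image by sum of squared differences (border handling omitted for brevity)."""
    ref_blk = ref_img[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_shift, best_cost = 0, np.inf
    for d in range(-max_shift, max_shift + 1):
        blk = in_img[y - half:y + half + 1,
                     x + d - half:x + d + half + 1].astype(np.float32)
        if blk.shape != ref_blk.shape:
            continue  # shifted window fell outside the image
        cost = float(np.sum((blk - ref_blk) ** 2))
        if cost < best_cost:
            best_cost, best_shift = cost, d
    return best_shift

def plane_distance(shift_px, z0, focal_px, baseline):
    """Distance Zk of the measured plane from the spot shift relative to the
    reference plane at the stored distance z0 (1/Zk = 1/Z0 + d/(f*b))."""
    return 1.0 / (1.0 / z0 + shift_px / (focal_px * baseline))
```

Repeating the matching densely over the speckle image yields a shift, and hence a distance, for every pixel and therefore a full depth map.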
Fig. 2 is a schematic diagram of a known depth camera based on structured light. As shown in Fig. 2, the depth camera comprises an infrared projector 210, an infrared camera 220 and a color camera 230. The infrared projector 210 and the infrared camera 220 together perform the depth detection described in connection with Fig. 1. The laser beam emitted by the infrared projector 210 passes through a diffractive optical element (DOE) and is split into tens of thousands of fine laser beams, whose function is to produce random spots (a texture) on the object. The infrared camera 220 captures the random speckle image, from which a dense depth image is computed. The color camera 230 acquires a color image; mapping the color image onto the depth camera provides color information for the three-dimensional scene. The color image can supplement the depth image, but it is not essential to the depth camera.
The laser light source normally used by a depth camera is invisible infrared light, which makes it easily disturbed by the infrared band contained in sunlight or other ambient light. In an outdoor environment, the infrared camera 220 therefore cannot detect the infrared speckle pattern emitted by the infrared projector 210, and depth detection fails; the infrared camera 220 degenerates into a camera that can only capture grayscale images. On the other hand, there is a certain distance between the color camera 230 and the infrared camera 220, which causes image parallax, so mapping the RGB color information of the color image into the IR image space of the depth image produces errors.
Utility model content
The technical problem to be solved by the utility model is to provide a device for obtaining a depth map of a three-dimensional scene that overcomes both the susceptibility to interference from external ambient light and the parallax problem between the infrared camera and the color camera.
The technical solution adopted by the utility model to solve the above technical problem is a device for obtaining a depth map of a three-dimensional scene, comprising an infrared projector, a first camera, a second camera and an infrared supplementary light source. The first camera is configured to receive infrared light, the second camera is configured to receive colored visible light, and the distance between the first camera and the second camera satisfies binocular vision. The infrared projector is configured to project structured light in a first mode and not to project structured light in a second mode, and the infrared supplementary light source is configured to optionally project supplementary light in the second mode.
In an embodiment of the present utility model, the second camera is also configured to receive infrared light.
In an embodiment of the present utility model, the first camera is configured to work in the first mode and in the second mode, and the second camera is configured to work in the second mode.
In an embodiment of the present utility model, the first camera is configured to work in the first mode and in the second mode, and the second camera is configured to work in the first mode and in the second mode.
In an embodiment of the present utility model, when the first camera and the second camera work simultaneously, they form a binocular vision system.
In an embodiment of the present utility model, the first mode is a mode without infrared interference, and the second mode is a mode with infrared interference.
In an embodiment of the present utility model, the infrared projector projects infrared laser light.
Because the utility model adopts the above technical solution, compared with the prior art it can adapt to different application scenarios by switching the states of its components in different modes and combining them. In an environment without infrared interference, the infrared projector and the first camera can form a depth camera; in an environment with infrared interference, the first camera and the second camera can form a depth camera, and the infrared supplementary light source can additionally provide fill light, so the device is applicable in a variety of scenes.
Brief description of the drawings
In order to make the above objects, features and advantages of the utility model more apparent, the specific embodiments of the utility model are described in detail below with reference to the accompanying drawings, in which:
Fig. 1 illustrates the laser triangulation principle.
Fig. 2 illustrates the structure of a known depth camera based on structured light.
Fig. 3 illustrates the structure of a device for obtaining a depth map of a three-dimensional scene according to the first embodiment of the utility model.
Fig. 4 illustrates the structure of a device for obtaining a depth map of a three-dimensional scene according to the second embodiment of the utility model.
Detailed description of the invention
The embodiments of the present utility model describe a device for obtaining a depth map of a three-dimensional scene, which adapts to different application scenarios by switching the states of its components in different modes and combining them.
Fig. 3 illustrates a device 300 for obtaining a depth map of a three-dimensional scene, comprising an infrared projector 310, a first camera 320, a second camera 330 and an infrared supplementary light source 340. The first camera 320 is configured to receive infrared light; for example, the first camera 320 is an infrared camera. The second camera 330 is configured to receive colored visible light; for example, the second camera 330 is a color camera that can capture color images. Unlike known devices, the first camera 320 and the second camera 330 are not close to each other but are arranged on the device 300 at a relatively large distance, and this distance satisfies binocular vision. The infrared projector 310 and the infrared supplementary light source 340 may be arranged between the first camera 320 and the second camera 330. It should be understood, however, that this is not essential; the infrared projector 310 and the infrared supplementary light source 340 only need to be arranged on the same side of the device as the first camera 320 and the second camera 330.
The infrared projector 310 is configured to project structured light, and for this purpose the infrared projector 310 may be equipped with a diffractive optical element (DOE). For example, what the infrared projector 310 projects is infrared laser light. The infrared supplementary light source 340 is configured to project supplementary infrared light.
Under normal circumstances, the infrared projector 310 in cooperation with the first camera 320 can obtain a depth image of the three-dimensional scene. However, because of interference from infrared light in the environment (for example sunlight), this mode of operation is not reliable. For this reason, the device of this embodiment has multiple working modes and allows each component to switch its state in different working modes.
Specifically, a first mode and a second mode can be set. The first mode can be a mode without infrared interference, and the second mode can be a mode with infrared interference. The infrared projector 310 is configured to project structured light in the first mode and not to project structured light in the second mode. The infrared supplementary light source 340 is configured to optionally project supplementary light in the second mode. The first camera 320 is configured to work in both the first mode and the second mode, and the second camera 330 is configured to work in the second mode. When the first camera 320 and the second camera 330 work simultaneously, they form a binocular vision system.
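As an illustration of the mode-dependent behaviour just described (a sketch only, not part of the disclosure of the utility model; the class and function names below are hypothetical), the component states of the device 300 can be summarized as a small state table:

```python
from dataclasses import dataclass

@dataclass
class ComponentStates:
    projector_on: bool       # infrared projector 310 projects structured light
    fill_light_on: bool      # infrared supplementary light source 340
    first_camera_on: bool    # infrared camera 320
    second_camera_on: bool   # color camera 330

def states_for_mode(mode: str, use_fill_light: bool = False) -> ComponentStates:
    """Component states for the first mode (no infrared interference) and the
    second mode (infrared interference present)."""
    if mode == "first":    # structured-light depth: projector + first camera
        return ComponentStates(True, False, True, False)
    if mode == "second":   # binocular depth: both cameras, optional fill light
        return ComponentStates(False, use_fill_light, True, True)
    raise ValueError(f"unknown mode: {mode}")
```

Setting `use_fill_light` to True corresponds to the fill-light operation in the second mode described below.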
In an indoor environment without sunlight, the device 300 can operate in the first mode, in which it relies on the infrared projector 310 and the first camera 320 to form a depth camera. In an outdoor environment with strong sunlight, the device 300 can operate in the second mode, in which the first camera 320 and the second camera 330 form a binocular stereo depth camera, and the depth extraction method is the same as in general binocular vision. Here, the first camera 320 and the second camera 330 may be required to have the same geometric parameters, such as field of view and resolution.
When the device 300 operates in the second mode, the infrared supplementary light source 340 can also provide fill light, and the first camera 320 and the second camera 330 can still form a depth camera.
The above modes complement one another and can cover a wider range of application scenarios than known depth cameras.
Fig. 4 illustrates a device 400 for obtaining a depth map of a three-dimensional scene according to the second embodiment of the utility model, comprising an infrared projector 410, a first camera 420, a second camera 430 and an infrared supplementary light source 440. The first camera 420 is configured to receive infrared light; for example, the first camera 420 is an infrared camera. The second camera 430 is configured to receive both infrared light and colored visible light; for example, the second camera 430 includes a photosensitive element that can sense visible light and infrared light simultaneously. Unlike known devices, the first camera 420 and the second camera 430 are not close to each other but are arranged on the device 400 at a relatively large distance, and this distance satisfies binocular vision. The infrared projector 410 and the infrared supplementary light source 440 may be arranged between the first camera 420 and the second camera 430. It should be understood, however, that this is not essential; the infrared projector 410 and the infrared supplementary light source 440 only need to be arranged on the same side of the device as the first camera 420 and the second camera 430.
The infrared projector 410 is configured to project structured light, and for this purpose the infrared projector 410 may be equipped with a diffractive optical element (DOE). For example, what the infrared projector 410 projects is infrared laser light. The infrared supplementary light source 440 is configured to project supplementary infrared light.
The device of this embodiment has multiple working modes and allows each component to switch its state in different working modes.
Specifically, a first mode and a second mode can be set. The first mode can be a mode without infrared interference, and the second mode can be a mode with infrared interference. The infrared projector 410 is configured to project structured light in the first mode and not to project structured light in the second mode. The infrared supplementary light source 440 is configured to optionally project supplementary light in the second mode. The first camera 420 is configured to work in both the first mode and the second mode, and the second camera 430 is also configured to work in both the first mode and the second mode. When the first camera 420 and the second camera 430 work simultaneously, they form a binocular vision system.
In an indoor environment without sunlight, the device 400 can operate in the first mode, in which it relies on the infrared projector 410, the first camera 420 and the second camera 430 to form a depth camera. The infrared projector 410 provides texture information for the scene by projecting structured light, and the first camera 420 and the second camera 430 form a binocular stereo imaging pair. Because the infrared projector 410 provides this texture information, the depth information obtained by the device 400 is more accurate, and the binocular imaging approach also lowers the manufacturing and processing requirements on the DOE element in the infrared projector 410.
In an outdoor environment with strong sunlight, the device 400 can operate in the second mode, in which the first camera 420 and the second camera 430 form a binocular stereo depth camera, while the second camera 430 can also provide the color information of the three-dimensional scene at the same time. This mode has an additional advantage: in the second camera 430, the color image information and the infrared image information have no parallax, and each sampled pixel obtains depth and color information simultaneously.
For the binocular depth image, the depth map can be computed on the basis of the binocular vision principle, by calculating the disparity of matched image points between the two cameras.
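As a minimal illustration of the standard binocular relation referred to above (again a sketch rather than part of the disclosure; the function and parameter names are assumptions), depth is recovered from the disparity of matched points in a rectified camera pair as z = f*b/d:

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_px: float,
                         baseline_m: float) -> np.ndarray:
    """Convert a disparity map (pixels) from a rectified stereo pair into a
    depth map (meters) using z = f * b / d; non-positive disparities are
    marked as invalid (infinite depth)."""
    disparity = np.asarray(disparity_px, dtype=np.float64)
    with np.errstate(divide="ignore"):
        depth = focal_px * baseline_m / disparity
    depth[disparity <= 0] = np.inf
    return depth
```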
In the first and second embodiments described above, the images obtained by the devices 300 and 400 can be used by a depth calculation unit to compute depth. The depth calculation unit can be arranged either inside or outside the devices 300 and 400. For example, the devices 300 and 400 can send the images to an external host such as a personal computer, a tablet computer or a smartphone, and the general-purpose computing unit on that host completes the depth calculation.
Although the utility model has been described with reference to the present specific embodiments, those of ordinary skill in the art will appreciate that the above embodiments are merely illustrative of the utility model, and that various equivalent changes or substitutions can be made without departing from the spirit of the utility model. Therefore, changes and modifications to the above embodiments that remain within the substantive meaning of the utility model shall all fall within the scope of the claims of the present application.

Claims (3)

1. A device for obtaining a depth map of a three-dimensional scene, characterized by comprising an infrared projector, a first camera, a second camera and an infrared supplementary light source, wherein the first camera is configured to receive infrared light, the second camera is configured to receive colored visible light, the distance between the first camera and the second camera satisfies binocular vision, the infrared projector is configured to project structured light, and the infrared supplementary light source is configured to optionally project supplementary light.
2. The device for obtaining a depth map of a three-dimensional scene according to claim 1, characterized in that the second camera is also configured to receive infrared light.
3. The device for obtaining a depth map of a three-dimensional scene according to claim 1, characterized in that the infrared projector projects infrared laser light.
CN201520718377.0U 2015-09-16 2015-09-16 Acquire equipment of depth map of three-dimensional scene Expired - Fee Related CN205305018U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201520718377.0U CN205305018U (en) 2015-09-16 2015-09-16 Acquire equipment of depth map of three-dimensional scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520718377.0U CN205305018U (en) 2015-09-16 2015-09-16 Acquire equipment of depth map of three-dimensional scene

Publications (1)

Publication Number Publication Date
CN205305018U true CN205305018U (en) 2016-06-08

Family

ID=56438939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520718377.0U Expired - Fee Related CN205305018U (en) 2015-09-16 2015-09-16 Acquire equipment of depth map of three-dimensional scene

Country Status (1)

Country Link
CN (1) CN205305018U (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106550228A (en) * 2015-09-16 2017-03-29 上海图檬信息科技有限公司 Obtain the equipment of the depth map of three-dimensional scenic
CN106550228B (en) * 2015-09-16 2019-10-15 上海图檬信息科技有限公司 The equipment for obtaining the depth map of three-dimensional scenic
CN108377378A (en) * 2016-11-08 2018-08-07 聚晶半导体股份有限公司 Photographic device
CN108377378B (en) * 2016-11-08 2020-09-01 聚晶半导体股份有限公司 Image pickup apparatus
CN106845449A (en) * 2017-02-22 2017-06-13 浙江维尔科技有限公司 A kind of image processing apparatus, method and face identification system
CN108513078A (en) * 2017-02-24 2018-09-07 灯塔人工智能公司 Method and system for capturing video image under low light condition using light emitting by depth sensing camera
CN108513078B (en) * 2017-02-24 2021-08-10 苹果公司 Method and system for capturing video imagery under low light conditions using light emission by a depth sensing camera
CN111514001A (en) * 2020-05-06 2020-08-11 南京中医药大学 Full-automatic intelligent scraping device and working method thereof


Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160608

Termination date: 20160916