CN107025666A - Depth detection method and device based on a single camera, and electronic device - Google Patents

Depth detection method and device based on a single camera, and electronic device

Info

Publication number
CN107025666A
Authority
CN
China
Prior art keywords
camera
image
current scene
depth detection
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710138684.5A
Other languages
Chinese (zh)
Inventor
孙剑波 (Sun Jianbo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710138684.5A
Publication of CN107025666A
Pending legal-status Critical Current

Landscapes

  • Studio Devices (AREA)

Abstract

Embodiments of the invention disclose a depth detection method based on a single camera. The depth detection method includes: controlling the camera at a first position to capture a current scene to obtain a first image; after the camera has moved from the first position to a second position in a direction perpendicular to the axial direction of the camera, controlling the camera to capture a second image of the current scene; and processing the first image and the second image to obtain depth information of the current scene. The invention also discloses a depth detection apparatus based on a single camera and an electronic device. Because the depth detection method, depth detection apparatus and electronic device of the embodiments of the present invention capture images with a single movable camera, a better-matched stereo image pair can be obtained on the one hand; on the other hand, the constraints on the relative camera positions when the stereo image pair is captured are looser and the camera's freedom of movement is greater, so inaccurate depth detection caused by changes in relative position can be avoided.

Description

Depth detection method and device based on a single camera, and electronic device
Technical field
The present invention relates to imaging technology, and more particularly to a depth detection method and device based on a single camera, and an electronic device.
Background art
In a depth detection method based on binocular stereo vision, the specifications of the two cameras must match in order to obtain a well-matched stereo image pair, and the relative position between the two cameras must remain fixed; only then can the accuracy of the depth data be guaranteed. However, current camera manufacturing can hardly guarantee that the specifications of two cameras are identical. In addition, the relative position of the two cameras may change due to, for example, the device being dropped, so that accurate depth data cannot be obtained.
Summary of the invention
Embodiments of the present invention provide a depth detection method and device based on a single camera, and an electronic device.
The depth detection method based on a single camera according to an embodiment of the present invention includes the following steps:
controlling the camera at a first position to capture a current scene to obtain a first image;
after the camera has moved from the first position to a second position in a direction perpendicular to the axial direction of the camera, controlling the camera to capture a second image of the current scene; and
processing the first image and the second image to obtain depth information of the current scene.
The depth detection apparatus based on a single camera according to an embodiment of the present invention is used in an electronic device. The depth detection apparatus includes a first control module, a second control module and a processing module. The first control module is configured to control the camera at a first position to capture a current scene to obtain a first image; the second control module is configured to control the camera to capture a second image of the current scene after the camera has moved from the first position to a second position in a direction perpendicular to the axial direction of the camera; and the processing module is configured to process the first image and the second image to obtain depth information of the current scene.
The electronic device according to an embodiment of the present invention includes a camera, a motion sensor and the depth detection apparatus described above. The depth detection apparatus is electrically connected with the camera and the motion sensor.
Because the depth detection method based on a single camera, the depth detection apparatus based on a single camera and the electronic device of the embodiments of the present invention capture images with a single movable camera, a better-matched stereo image pair can be obtained on the one hand; on the other hand, the constraints on the relative camera positions when the stereo image pair is captured are looser and the camera's freedom of movement is greater, so inaccurate depth detection caused by changes in relative position can be avoided.
Additional aspects and advantages of the present invention will be set forth in part in the following description, will partly become apparent from the following description, or will be learned through practice of the present invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of a depth detection method based on a single camera according to an embodiment of the present invention;
Fig. 2 is a schematic functional block diagram of an electronic device according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a depth detection method according to some embodiments of the present invention;
Fig. 4 is a schematic diagram of a depth detection method according to some embodiments of the present invention;
Fig. 5 is a schematic diagram of a depth detection method according to some embodiments of the present invention;
Fig. 6 is a schematic flowchart of a depth detection method according to some embodiments of the present invention;
Fig. 7 is a schematic functional block diagram of a second control module according to some embodiments of the present invention;
Fig. 8 is a schematic flowchart of a depth detection method according to some embodiments of the present invention;
Fig. 9 is a schematic functional block diagram of a second control module according to some embodiments of the present invention;
Fig. 10 is a schematic diagram of a depth detection method according to some embodiments of the present invention;
Fig. 11 is a schematic flowchart of a depth detection method according to some embodiments of the present invention;
Fig. 12 is a schematic functional block diagram of a processing module according to some embodiments of the present invention; and
Fig. 13 is a schematic diagram of a depth detection method according to some embodiments of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, examples of which are shown in the accompanying drawings, where the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the accompanying drawings are exemplary, are only intended to explain the present invention, and are not to be construed as limiting the present invention.
Referring to Figs. 1 and 2, the depth detection method based on a single camera according to an embodiment of the present invention includes the following steps:
S11: controlling a camera 20 at a first position to capture a current scene to obtain a first image;
S12: after the camera 20 has moved from the first position to a second position in a direction perpendicular to the axial direction of the camera 20, controlling the camera 20 to capture a second image of the current scene;
S13: processing the first image and the second image to obtain depth information of the current scene.
The depth detection method based on a single camera according to an embodiment of the present invention is used in a depth detection apparatus 10 based on a single camera. The depth detection apparatus 10 includes a first control module 11, a second control module 12 and a processing module 13. Step S11 may be implemented by the first control module 11, step S12 by the second control module 12, and step S13 by the processing module 13.
In other words, the first control module 11 is configured to control the camera 20 at the first position to capture the current scene to obtain the first image; the second control module 12 is configured to control the camera 20 to capture the second image of the current scene after the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20; and the processing module 13 is configured to process the first image and the second image to obtain the depth information of the current scene.
The depth detection apparatus 10 based on a single camera according to an embodiment of the present invention is applied to the electronic device 100 of an embodiment of the present invention. In other words, the electronic device 100 of the embodiment of the present invention includes the depth detection apparatus 10 based on a single camera. The electronic device 100 further includes a camera 20 and a motion sensor 30, and the depth detection apparatus 10 is electrically connected with the camera 20 and the motion sensor 30.
In some embodiments, the electronic device 100 may be a mobile phone, a tablet computer, a notebook computer, a smart watch, a smart bracelet, smart glasses or the like, without limitation. In the specific embodiments of the present invention, the electronic device 100 is a mobile phone.
Specifically, referring to Figs. 3 to 5, in the figures Xl is the shortest distance from the midpoint of the cup bottom in the image to the left edge of the image, Xr is the shortest distance from the midpoint of the cup bottom in the image to the right edge of the image, and C is the width of the image. The first image is first captured at the first position Ol, and the camera 20 is then moved to the second position Or to capture the second image. The line connecting the first position Ol and the second position Or is perpendicular to the axial direction of the camera 20. In this way, the first image and the second image form a stereo image pair, and the depth information of the current scene can be obtained by processing this stereo image pair.
It should be noted that the axial direction of the camera 20 refers to the direction parallel to the optical axis of the camera 20 when the current scene is captured.
It will be appreciated that a depth detection method based on binocular stereo vision can hardly guarantee that the specifications of the two cameras are exactly the same, and the relative position between the two cameras may change due to external factors such as the electronic device 100 being dropped, which affects the accuracy of depth detection. The depth detection method based on a single camera according to the embodiments of the present invention captures images with a single movable camera, so a better-matched stereo image pair can be obtained on the one hand; on the other hand, the constraints on the relative camera positions when the stereo image pair is captured are looser and the camera's freedom of movement is greater, so inaccurate depth detection caused by changes in relative position can be avoided.
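Taken as a whole, steps S11 to S13 amount to a capture, move, capture and triangulate flow. The sketch below is only an illustration of that control flow under stated assumptions, not the patented implementation: read_baseline_from_motion_sensor and compute_depth are hypothetical stand-ins for the motion-sensor handling and the matching and triangulation detailed later in this description.

```python
import cv2  # assumption: OpenCV is available for frame capture


def read_baseline_from_motion_sensor() -> float:
    """Hypothetical stub: block until the motion sensor reports that the
    camera has translated perpendicular to its optical axis, then return
    the linear distance S (in metres) between positions Ol and Or."""
    return 0.05  # placeholder value for illustration only


def compute_depth(first, second, baseline_m):
    """Hypothetical stub for feature matching plus triangulation;
    see the sketches accompanying steps S131 to S133 below."""
    raise NotImplementedError


def single_camera_depth(camera_index: int = 0):
    cam = cv2.VideoCapture(camera_index)
    _, first = cam.read()                            # S11: first image at Ol
    baseline = read_baseline_from_motion_sensor()    # S12: perpendicular move
    _, second = cam.read()                           #      second image at Or
    cam.release()
    return compute_depth(first, second, baseline)    # S13: depth information
```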
Referring to Fig. 6, in some embodiments, step S12 of controlling the camera 20 to capture the second image of the current scene after the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20 includes the following steps:
S121: determining, according to the detection data of the motion sensor 30, whether the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20;
S122: controlling the camera 20 to capture the second image when the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20; and
S123: controlling the electronic device 100 to issue a prompt when the camera 20 has not moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20.
Referring to Fig. 7, in some embodiments, the second control module 12 includes a judging unit 121, a control unit 122 and a prompt unit 123. Step S121 may be implemented by the judging unit 121, step S122 by the control unit 122, and step S123 by the prompt unit 123.
In other words, the judging unit 121 is configured to determine, according to the detection data of the motion sensor 30, whether the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20; the control unit 122 is configured to control the camera 20 to capture the second image when the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20; and the prompt unit 123 is configured to control the electronic device 100 to issue a prompt when the camera 20 has not moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20.
Referring to Fig. 8, specifically, a spatial rectangular coordinate system X-Y-Z is established in the space in which the electronic device 100 is located. In the specific embodiments of the present invention, the axial direction of the camera 20 is taken as the Y-axis direction of the spatial rectangular coordinate system. The X-axis direction is a first movement direction perpendicular to the axial direction of the camera 20, the Y-axis direction is a second movement direction parallel to the axial direction of the camera 20, and the Z-axis direction is a third movement direction perpendicular to the axial direction of the camera 20. For the camera 20 to move from the first position Ol in a direction perpendicular to its axial direction, the camera 20 should move along the X-axis, i.e. the first movement direction, or along the Z-axis, i.e. the third movement direction, of the spatial rectangular coordinate system. In this way, two frames captured at different positions are obtained as a stereo image pair, and this stereo image pair is processed in subsequent steps to obtain depth information. If the movement direction of the camera deviates from the X-axis or Z-axis direction, the electronic device 100 needs to issue a prompt to remind the user to find a correct shooting position, so as to ensure that a qualified stereo image pair is obtained.
In some embodiments, the motion sensor 30 includes a gyroscope or an acceleration sensor. In the specific embodiments of the present invention, the motion sensor 30 is a gyroscope.
It will be appreciated that the gyroscope can detect the deflection state of the electronic device 100. The detection data of the gyroscope helps the user correct the movement direction of the camera 20 and determine the second position Or, so as to obtain a second image that can be used for depth information calculation in subsequent steps.
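As a rough illustration of the determination in step S121, the check can be phrased as comparing the camera's displacement vector with its optical axis. The following is a minimal sketch under the assumption that the displacement between Ol and Or has already been derived from the motion-sensor data (for example by integrating accelerometer readings), which is outside the scope of the snippet.

```python
import numpy as np


def is_perpendicular_move(displacement: np.ndarray,
                          optical_axis: np.ndarray,
                          tolerance_deg: float = 5.0) -> bool:
    """Return True if the camera displacement (Ol -> Or) is perpendicular to
    the optical axis within the given angular tolerance.

    displacement: 3-vector of the camera translation in device coordinates.
    optical_axis: 3-vector of the camera's axial (Y) direction.
    """
    d = displacement / np.linalg.norm(displacement)
    a = optical_axis / np.linalg.norm(optical_axis)
    # Angle between displacement and optical axis; 90 degrees means the
    # move stayed in the X-Z plane, as required for a valid stereo pair.
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(d, a)), 0.0, 1.0)))
    return abs(angle - 90.0) <= tolerance_deg


# Example: a pure X-axis translation while the optical axis lies along Y.
print(is_perpendicular_move(np.array([0.05, 0.0, 0.0]),
                            np.array([0.0, 1.0, 0.0])))  # True: capture (S122)
```

When the check fails, the device would issue the prompt of step S123 instead of capturing the second image.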
Referring to Fig. 9, in some embodiments, step S122 of controlling the camera 20 to capture the second image when the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20 includes the following steps:
S1221: detecting whether the current scene is in a moving state when the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20; and
S1222: controlling the camera 20 to capture the second image of the current scene when the current scene is not in a moving state.
Referring to Fig. 10, in some embodiments, the control unit 122 includes a detection sub-unit 1221 and a control sub-unit 1222. Step S1221 may be implemented by the detection sub-unit 1221, and step S1222 by the control sub-unit 1222.
In other words, the detection sub-unit 1221 is configured to detect whether the current scene is in a moving state when the camera 20 has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera 20; and the control sub-unit 1222 is configured to control the camera 20 to capture the second image of the current scene when the current scene is not in a moving state.
It will be appreciated that, when the camera 20 has moved to the second position in a direction perpendicular to its axial direction, if the current scene is in a moving state, i.e. people or objects in the current scene are moving, the second image captured by the camera 20 may be blurred. The matched pixels corresponding to the feature points may then not be identifiable in the blurred second image in subsequent steps, so the first image and the second image cannot be used to obtain depth information. Therefore, capturing is suspended while the current scene is in a moving state; when the current scene is not in a moving state, the camera 20 is controlled to capture the current scene to obtain the second image.
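The description does not spell out how the moving state of the scene is detected. One common approach, shown below purely as an assumption rather than the patented method, is simple frame differencing between two consecutive preview frames: if too large a fraction of the pixels changes, the scene is treated as moving and the capture of the second image is postponed.

```python
import cv2
import numpy as np


def scene_is_moving(prev_frame: np.ndarray,
                    curr_frame: np.ndarray,
                    pixel_thresh: int = 25,
                    ratio_thresh: float = 0.02) -> bool:
    """Heuristic motion check: the scene is treated as moving when more than
    ratio_thresh of the pixels change by more than pixel_thresh grey levels
    between two consecutive preview frames."""
    g0 = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(g0, g1)
    changed = np.count_nonzero(diff > pixel_thresh)
    return changed / diff.size > ratio_thresh
```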
Referring to Fig. 11, in some embodiments, step S13 of processing the first image and the second image to obtain the depth information of the current scene includes the following steps:
S131: obtaining the linear distance between the first position and the second position according to the detection data of the motion sensor 30;
S132: obtaining the feature points of the focused subject of the current scene in the first image and the matched pixels corresponding to the feature points in the second image; and
S133: calculating the depth information of the focused subject according to the parameters of the camera 20, the linear distance, and the coordinates of the feature points and the matched pixels.
Referring to Fig. 12, in some embodiments, the processing module 13 includes a first acquisition unit 131, a second acquisition unit 132 and a calculation unit 133. Step S131 may be implemented by the first acquisition unit 131, step S132 by the second acquisition unit 132, and step S133 by the calculation unit 133.
In other words, the first acquisition unit 131 is configured to obtain the linear distance between the first position and the second position according to the detection data of the motion sensor 30; the second acquisition unit 132 is configured to obtain the feature points of the focused subject of the current scene in the first image and the matched pixels corresponding to the feature points in the second image; and the calculation unit 133 is configured to calculate the depth information of the focused subject according to the parameters of the camera, the linear distance, and the coordinates of the feature points and the matched pixels.
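Step S132 needs feature points of the focused subject in the first image and the pixels matched to them in the second image. The description does not prescribe a particular feature detector, so the sketch below uses ORB features and a brute-force matcher from OpenCV as one possible, assumed way of obtaining such correspondences.

```python
import cv2


def match_feature_points(first_image, second_image, max_matches: int = 50):
    """Detect ORB feature points in the first image and find the pixels
    matched to them in the second image. Returns two lists of (x, y)
    coordinates: feature points in the first image and matched pixels in
    the second image."""
    to_gray = lambda img: (cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
                           if img.ndim == 3 else img)
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(to_gray(first_image), None)
    kp2, des2 = orb.detectAndCompute(to_gray(second_image), None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = [kp1[m.queryIdx].pt for m in matches[:max_matches]]
    pts2 = [kp2[m.trainIdx].pt for m in matches[:max_matches]]
    return pts1, pts2
```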
Referring to Figs. 4, 5 and 13, in the specific embodiments of the present invention, the camera 20 is translated along the X-axis direction. In the figures, P is a pixel of the focused subject of interest to the user in the current scene, and F is the focal length of the camera 20 when the first image and the second image are captured. It will be appreciated that the depth information corresponding to the current scene is the combination of the depth information of multiple pixels; in other words, the depth information of multiple pixels needs to be obtained in order to obtain the depth information of the current scene. In addition, the viewfinder frame of the camera 20 is essentially the same at the first position Ol and the second position Or; in other words, the focused subject of the current scene lies within the viewfinder frame of the camera 20 at both positions. Feature point recognition and matching of the corresponding pixels are therefore performed on the captured first image and second image, where the feature points are feature points of the focused subject of interest to the user in the current scene. For example, in Fig. 4, i.e. the first image, and Fig. 5, i.e. the second image, the midpoint of the cup bottom is the feature point to be recognized, and the pixels corresponding to the midpoint of the cup bottom in the two images are the matched pixels. The depth information of each matched pixel is then calculated from the parameters of the camera 20, the linear distance S, and the coordinates of the recognized feature points and matched pixels. Specifically, the linear distance S between the first position Ol and the second position Or can be obtained from the detection data of the motion sensor 30, and the depth information D of the matched pixel corresponding to a feature point can then be calculated from the linear distance S. By similar triangles, the depth is D = F × S / d, where d is the disparity of the matched pixels between the two images; with the quantities defined above, d can be expressed in terms of Xl, Xr and the image width C. After the depth information D of each matched pixel in the current scene has been calculated using this formula, the matched pixels carrying depth information together constitute the depth information corresponding to the current scene.
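Given the matched pixels and the baseline S reported by the motion sensor, the depth of each matched pixel follows from the similar-triangles relation D = F × S / d. The snippet below is a minimal sketch under the assumptions that the focal length F has already been converted to pixel units and that the disparity d is taken along the direction of the camera translation (the X-axis here); it is an illustration, not the exact formula of the patent.

```python
def depth_from_disparity(pts1, pts2, focal_px: float, baseline_m: float):
    """Compute depth D = F * S / d for each pair of matched pixels.

    pts1, pts2 : matched (x, y) pixel coordinates in the first and second image
    focal_px   : focal length F of the camera, expressed in pixels
    baseline_m : linear distance S between the first and second position, in metres
    """
    depths = []
    for (x1, _), (x2, _) in zip(pts1, pts2):
        d = abs(x1 - x2)  # disparity along the translation (X) direction
        depths.append(focal_px * baseline_m / d if d > 0 else float("inf"))
    return depths
```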
In other embodiments, the camera 20 may instead be moved from the first position to the second position along the Z-axis direction. The depth information is then still calculated from the linear distance S between the first position and the second position, the parameters of the camera 20, and the coordinates of the feature points and the matched pixels.
The electronic device 100 further includes a housing, a memory, a circuit board and a power supply circuit. The circuit board is arranged in the space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is used to supply power to the circuits or components of the electronic device 100; the memory is used to store executable program code; and the depth detection apparatus 10 reads the executable program code stored in the memory and runs the program corresponding to the executable program code, so as to implement the depth detection method of any of the above embodiments of the present invention.
In the description of this specification, reference to the terms "an embodiment", "some embodiments", "a schematic embodiment", "an example", "a specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment or portion of code including one or more executable instructions for implementing the steps of a specific logical function or process, and the scope of the preferred embodiments of the present invention includes other implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
Logic and/or steps represented in a flowchart or otherwise described herein, for example an ordered list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate or transmit a program for use by, or in combination with, an instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection portion (electronic device) having one or more wires, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or, if necessary, otherwise suitably processing it, and then stored in a computer memory.
It should be understood that the various parts of the present invention may be implemented in hardware, software, firmware or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware that is stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented with any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those of ordinary skill in the art will understand that all or part of the steps carried in the methods of the above embodiments may be completed by a program instructing related hardware, and the program may be stored in a computer-readable storage medium; when executed, the program performs one of, or a combination of, the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated in one processing module, or each unit may exist physically on its own, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention, and those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the present invention.

Claims (11)

1. A depth detection method based on a single camera, characterized in that the depth detection method comprises the following steps:
controlling the camera at a first position to capture a current scene to obtain a first image;
after the camera has moved from the first position to a second position in a direction perpendicular to the axial direction of the camera, controlling the camera to capture a second image of the current scene; and
processing the first image and the second image to obtain depth information of the current scene.
2. The depth detection method as claimed in claim 1, characterized in that the camera is applied to an electronic device, the electronic device comprises a motion sensor, and the step of controlling the camera to capture the second image of the current scene after the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera comprises the following steps:
determining, according to detection data of the motion sensor, whether the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera;
controlling the camera to capture the second image when the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera; and
controlling the electronic device to issue a prompt when the camera has not moved from the first position to the second position in a direction perpendicular to the axial direction of the camera.
3. The depth detection method as claimed in claim 2, characterized in that the step of controlling the camera to capture the second image when the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera further comprises the following steps:
detecting whether the current scene is in a moving state when the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera; and
controlling the camera to capture the second image of the current scene when the current scene is not in a moving state.
4. The depth detection method as claimed in claim 2, characterized in that the step of processing the first image and the second image to obtain the depth information of the current scene comprises the following steps:
obtaining a linear distance between the first position and the second position according to the detection data of the motion sensor;
obtaining feature points of a focused subject of the current scene in the first image and matched pixels corresponding to the feature points in the second image; and
calculating depth information of the focused subject according to parameters of the camera, the linear distance, and coordinates of the feature points and the matched pixels.
5. A depth detection apparatus based on a single camera, for an electronic device, characterized in that the depth detection apparatus comprises:
a first control module, configured to control the camera at a first position to capture a current scene to obtain a first image;
a second control module, configured to control the camera to capture a second image of the current scene after the camera has moved from the first position to a second position in a direction perpendicular to the axial direction of the camera; and
a processing module, configured to process the first image and the second image to obtain depth information of the current scene.
6. The depth detection apparatus as claimed in claim 5, characterized in that the camera is applied to an electronic device, the electronic device comprises a motion sensor, and the second control module comprises:
a judging unit, configured to determine, according to detection data of the motion sensor, whether the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera;
a control unit, configured to control the camera to capture the second image when the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera; and
a prompt unit, configured to control the electronic device to issue a prompt when the camera has not moved from the first position to the second position in a direction perpendicular to the axial direction of the camera.
7. The depth detection apparatus as claimed in claim 6, characterized in that the control unit comprises:
a detection sub-unit, configured to detect whether the current scene is in a moving state when the camera has moved from the first position to the second position in a direction perpendicular to the axial direction of the camera; and
a control sub-unit, configured to control the camera to capture the second image of the current scene when the current scene is not in a moving state.
8. The depth detection apparatus as claimed in claim 6, characterized in that the processing module comprises:
a first acquisition unit, configured to obtain a linear distance between the first position and the second position according to the detection data of the motion sensor;
a second acquisition unit, configured to obtain feature points of a focused subject of the current scene in the first image and matched pixels corresponding to the feature points in the second image; and
a calculation unit, configured to calculate depth information of the focused subject according to parameters of the camera, the linear distance, and coordinates of the feature points and the matched pixels.
9. The depth detection apparatus as claimed in claim 6, wherein the motion sensor comprises a gyroscope or an acceleration sensor.
10. An electronic device, characterized in that the electronic device comprises:
a camera;
a motion sensor; and
the depth detection apparatus as claimed in any one of claims 5 to 9, the depth detection apparatus being electrically connected with the camera and the motion sensor.
11. The electronic device as claimed in claim 10, characterized in that the electronic device comprises a mobile phone and/or a tablet computer.
CN201710138684.5A 2017-03-09 2017-03-09 Depth detection method and device based on a single camera, and electronic device Pending CN107025666A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710138684.5A CN107025666A (en) Depth detection method and device based on a single camera, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710138684.5A CN107025666A (en) Depth detection method and device based on a single camera, and electronic device

Publications (1)

Publication Number Publication Date
CN107025666A true CN107025666A (en) 2017-08-08

Family

ID=59525923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710138684.5A Pending CN107025666A (en) Depth detection method and device based on a single camera, and electronic device

Country Status (1)

Country Link
CN (1) CN107025666A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105989626A (en) * 2015-02-10 2016-10-05 深圳超多维光电子有限公司 Three-dimensional scene construction method and apparatus thereof
CN105141942A (en) * 2015-09-02 2015-12-09 小米科技有限责任公司 3d image synthesizing method and device
CN105376484A (en) * 2015-11-04 2016-03-02 深圳市金立通信设备有限公司 Image processing method and terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
许凌羽 (Xu Lingyu): "视觉坐标测量机仿真模型的研究" [Research on a simulation model of a vision coordinate measuring machine], China Excellent Doctoral and Master's Dissertations Full-text Database (Master's), Information Science and Technology series *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107749069A (en) * 2017-09-28 2018-03-02 联想(北京)有限公司 Image processing method, electronic equipment and image processing system
CN110800023A (en) * 2018-07-24 2020-02-14 深圳市大疆创新科技有限公司 Image processing method and equipment, camera device and unmanned aerial vehicle
CN111213364A (en) * 2018-12-21 2020-05-29 深圳市大疆创新科技有限公司 Shooting equipment control method, shooting equipment control device and shooting equipment
CN109889709A (en) * 2019-02-21 2019-06-14 维沃移动通信有限公司 A kind of camera module control system, method and mobile terminal
CN110517305A (en) * 2019-08-16 2019-11-29 兰州大学 A kind of fixed object 3-D image reconstructing method based on image sequence
CN110517305B (en) * 2019-08-16 2022-11-04 兰州大学 Image sequence-based fixed object three-dimensional image reconstruction method
CN112771576A (en) * 2020-05-06 2021-05-07 深圳市大疆创新科技有限公司 Position information acquisition method, device and storage medium

Similar Documents

Publication Publication Date Title
CN107025666A (en) Depth detection method and device based on a single camera, and electronic device
US7711201B2 (en) Method of and apparatus for generating a depth map utilized in autofocusing
US6038074A (en) Three-dimensional measuring apparatus and method, image pickup apparatus, and apparatus and method for inputting image
CN102812416B (en) Pointing input device, indicative input method, program, recording medium and integrated circuit
CN106993112A Background blurring method and device based on depth of field, and electronic device
CN106625673A (en) Narrow space assembly system and assembly method
EP2597597A2 (en) Apparatus and method for calculating three dimensional (3D) positions of feature points
WO2015013022A1 (en) Method and computations for calculating an optical axis vector of an imaged eye
CN113639963A (en) Device, system and method for determining one or more optical parameters of an ophthalmic lens
US11042984B2 (en) Systems and methods for providing image depth information
CN105376484A (en) Image processing method and terminal
WO2019080046A1 (en) Drift calibration method and device for inertial measurement unit, and unmanned aerial vehicle
CN109451240B (en) Focusing method, focusing device, computer equipment and readable storage medium
CN105739706B (en) Control method, control device and electronic device
JPH11306363A (en) Image input device and image inputting method
CN106210527B PDAF calibration method and device based on MEMS movement
WO2020124517A1 (en) Photographing equipment control method, photographing equipment control device and photographing equipment
CN110194173A Occupant monitoring device
US20040175057A1 (en) Affine transformation analysis system and method for image matching
CN115035546A (en) Three-dimensional human body posture detection method and device and electronic equipment
CN106919928A Gesture recognition system, method and display device
CN106991376A Profile face verification method and device combined with depth information, and electronic device
CN116439652A Diopter detection method and device, host computer, and diopter detection system
JP5168629B2 (en) Azimuth angle measuring apparatus and azimuth angle measuring method
CN115862124A Gaze estimation method and device, readable storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170808

RJ01 Rejection of invention patent application after publication