CN111213364A - Shooting equipment control method, shooting equipment control device and shooting equipment


Info

Publication number
CN111213364A
Authority
CN
China
Prior art keywords
image
information
determining
target object
photographing apparatus
Prior art date
Legal status
Pending
Application number
CN201880065930.1A
Other languages
Chinese (zh)
Inventor
胡攀
郑洪涌
Current Assignee
SZ DJI Technology Co Ltd
Shenzhen DJ Innovation Industry Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN111213364A publication Critical patent/CN111213364A/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/958 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N 23/959 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules

Abstract

A control method of a photographing apparatus includes: (S12) acquiring a first image of a target object (P) when the photographing apparatus (100) is located at a first position; (S14) acquiring a second image of the target object (P) when the photographing apparatus (100) is located at a second position; (S16) determining depth information of the target object (P) from the first image and the second image; and (S18) controlling the photographing apparatus (100) to focus on the target object (P) at the second position according to the depth information. The application also discloses a control device (10) of the photographing apparatus (100) and the photographing apparatus (100).

Description

Shooting equipment control method, shooting equipment control device and shooting equipment
Technical Field
The present disclosure relates to the field of photography technologies, and in particular, to a control method of a photography device, a control apparatus of a photography device, and a photography device.
Background
In the related art, for a camera that can focus only at the center of the frame, if another area of the frame needs to be in focus, the camera must be moved so that the target object lies at the center of the frame before refocusing. However, this changes the composition of the frame, so that when the target object needs to be in focus, only a centered composition can be used, and asymmetric compositions such as the rule-of-thirds composition and the S-shaped composition cannot be used. Therefore, how to realize multi-point focusing with a camera that focuses only at the center has become a problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides a control method of shooting equipment, a control device of the shooting equipment and the shooting equipment.
The control method of the shooting equipment comprises the following steps:
when the shooting equipment is located at a first position, acquiring a first image of a target object;
when the shooting equipment is located at a second position, acquiring a second image of the target object;
determining depth information of the target object according to the first image and the second image;
and controlling the shooting equipment to focus on the target object at the second position according to the depth information.
The control device of the shooting equipment of the embodiment of the application comprises:
the first acquisition module is used for acquiring a first image of a target object when the shooting equipment is positioned at a first position;
the second acquisition module is used for acquiring a second image of the target object when the shooting equipment is positioned at a second position;
a determination module to determine depth information of the target object from the first image and the second image;
and the focusing module is used for controlling the shooting equipment to focus on the target object at the second position according to the depth information.
The photographing apparatus of an embodiment of the present application includes a processor and a memory, the memory storing one or more programs, the processor being configured to execute the one or more programs to implement the method of controlling the photographing apparatus of the above embodiment.
According to the control method of the shooting equipment, the control device of the shooting equipment, and the shooting equipment of the embodiments of the present application, the depth information of the target object is determined from images captured by the shooting equipment at different positions, and the target object is brought into focus according to the depth information, which saves hardware cost while realizing multi-point focusing simply and conveniently.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart illustrating a control method of a photographing apparatus according to an embodiment of the present application;
fig. 2 is a block schematic diagram of a photographing apparatus according to an embodiment of the present application;
fig. 3 is another block diagram of the photographing apparatus according to the embodiment of the present application;
FIG. 4 is a schematic view of the focusing principle of the photographing apparatus according to the embodiment of the present application;
fig. 5 is a scene schematic diagram of a control method of a photographing apparatus according to an embodiment of the present application;
fig. 6 is another scene schematic diagram of a control method of a photographing apparatus according to an embodiment of the present application;
fig. 7 is a flowchart illustrating a control method of a photographing apparatus according to another embodiment of the present application;
fig. 8 is a block diagram schematically illustrating a control method of a photographing apparatus according to another embodiment of the present application;
fig. 9 is a flowchart illustrating a control method of a photographing apparatus according to still another embodiment of the present application;
fig. 10 is a block diagram schematically illustrating a control method of a photographing apparatus according to still another embodiment of the present application;
fig. 11 is a flowchart illustrating a control method of a photographing apparatus according to still another embodiment of the present application;
fig. 12 is a block diagram schematically illustrating a control method of a photographing apparatus according to still another embodiment of the present application.
Description of the main element symbols:
the image capturing apparatus 100, the optical axis 101, the control device 10, the first acquisition module 12, the second acquisition module 14, the determination module 16, the first determination unit 162, the first determination subunit 1622, the second determination subunit 1624, the second determination unit 164, the focusing module 18, the third determination unit 182, the fourth determination unit 184, the adjustment unit 186, the inertial measurement unit 30, the lens 40, the image sensor 50, the processor 60, and the memory 70.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, it is to be understood that the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
The following disclosure provides many different embodiments or examples for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to fig. 1 and 2, embodiments of the present application provide a control method for a shooting device 100, a control apparatus 10 for the shooting device 100, and a shooting device.
The control method of the photographing apparatus 100 of the embodiment of the present application includes:
step S12: acquiring a first image of the target object P while the photographing apparatus 100 is located at a first position;
step S14: acquiring a second image of the target object P while the photographing apparatus 100 is located at the second position;
step S16: determining the depth information of the target object P according to the first image and the second image;
step S18: and controlling the photographing apparatus 100 to focus on the target object P at the second position according to the depth information.
The control device 10 of the photographing apparatus 100 of the embodiment of the present application includes a first acquisition module 12, a second acquisition module 14, a determination module 16, and a focusing module 18. The first acquiring module 12 is configured to acquire a first image of the target object P when the photographing apparatus 100 is located at a first position. The second acquiring module 14 is configured to acquire a second image of the target object P when the photographing apparatus 100 is located at a second position. The determining module 16 is configured to determine depth information of the target object P according to the first image and the second image. The focusing module 18 is used for controlling the shooting device 100 to focus on the target object P at the second position according to the depth information.
According to the control method of the shooting device 100, the control device of the shooting device 100 and the shooting device 100, the depth information of the target object P is determined through images shot by the shooting device 100 at different positions, and the target object P is focused according to the depth information, so that the hardware cost is saved, and meanwhile, the multi-point focusing can be simply and conveniently realized.
In addition, as shown in fig. 3, the photographing apparatus 100 according to another embodiment of the present application includes a processor 60 and a memory 70, the memory 70 stores one or more programs, and the processor 60 is configured to execute the one or more programs to implement the method of controlling the photographing apparatus 100 according to any embodiment of the present application. The photographing apparatus 100 further includes an inertial measurement unit 30, a lens 40, and an image sensor 50. The inertial measurement unit 30, the lens 40, the image sensor 50, the processor 60, and the memory 70 are connected by a bus 11. Light from a subject passes through the lens 40 and is imaged on the image sensor 50. The processor 60 of the photographing apparatus 100 controls the photographing apparatus 100 and processes the image captured by the image sensor 50. The operation principle of the photographing apparatus 100 in fig. 2 is similar to that of the photographing apparatus 100 in fig. 3, except that the photographing apparatus 100 in fig. 2 is controlled by the control device 10 of the photographing apparatus 100; details are not repeated here to avoid redundancy.
In the embodiment of fig. 1, step S12 is executed before step S14. In other embodiments, step S14 may be performed before step S12.
The photographing apparatus 100 includes, but is not limited to, a camera and other electronic devices having a photographing function, such as a mobile phone, a tablet computer, a smart wearable device, a personal computer, a drone, a handheld gimbal device, a notebook computer, and the like. The following description takes a camera as an example.
In addition, the photographing apparatus 100 may capture more pictures at multiple positions or in multiple attitudes to balance matching accuracy against calculation error, resulting in more accurate depth information. That is, the first position and the second position are merely two different positions used to distinguish the captures, and are not exhaustive.
Of course, the photographing apparatus 100 may also be provided with a depth camera, the depth information of the target object in the picture captured by the photographing apparatus 100 is obtained directly by the depth camera, and the subsequent adjustment of the focusing plane is then performed based on this depth information. Further, the depth camera may be a Time of Flight (TOF) camera. It can be understood that, as long as the relative pose between the TOF camera and the photographing apparatus 100 is calibrated, the TOF camera can obtain a depth map from a single capture. There are some differences between the TOF camera and the photographing apparatus 100 in translation and Field of View (FOV); after matching, the corresponding point on the TOF depth map can be found for each image point according to the matching relation, so as to obtain the depth of that point in the image. Further, the relative pose of the TOF camera and the photographing apparatus 100 may be calibrated with a dedicated calibration tool.
Referring to fig. 4, the focusing principle of the camera is as follows: when the camera takes a picture, a point that is not on the alignment (focus) plane forms a blur spot (circle of confusion) on the image plane; if the angle this blur spot subtends at the human eye is smaller than the eye's limiting resolution (about 1'), the image is not perceived as unclear. The range of distances allowed in front of and behind the alignment plane, limited by the size of the blur spot, is the depth of field.
The specific depth of field calculation is as follows:

ΔL1 = F·σ·L² / (f² + F·σ·L) (depth of field in front of the alignment plane)
ΔL2 = F·σ·L² / (f² − F·σ·L) (depth of field behind the alignment plane)
ΔL = ΔL1 + ΔL2 = 2·f²·F·σ·L² / (f⁴ − F²·σ²·L²)

where L is the distance of the target object (the alignment plane), F is the aperture value, which equals the ratio of the focal length to the aperture diameter, f is the camera focal length, and σ is the minimum allowed diameter of the blur spot.
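As a minimal illustration (not part of the patent text), the depth-of-field relation above can be evaluated directly; the lens and distance values below are assumptions chosen only to show the calculation.

import math  # not strictly needed, kept for clarity of units

def depth_of_field(L, F, f, sigma):
    """Return (near_limit, far_limit, total_dof) for the relation above.
    L: focus distance, F: aperture value, f: focal length, sigma: blur spot
    diameter; all lengths in millimetres."""
    near = F * sigma * L**2 / (f**2 + F * sigma * L)   # depth of field in front of the alignment plane
    far = F * sigma * L**2 / (f**2 - F * sigma * L)    # depth of field behind the alignment plane
    return near, far, near + far

# Hypothetical example: 50 mm lens at f/2.8 focused at 2 m, sigma = 0.03 mm.
print(depth_of_field(L=2000.0, F=2.8, f=50.0, sigma=0.03))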
In addition, there are two common auto-focus approaches: active and passive. Active auto-focus measures distance by emitting infrared light, ultrasonic waves, or laser from the camera body and receiving the reflected echo, and then adjusts the lens focus according to the depth information and a focusing curve. Passive auto-focus includes phase-detection focusing and contrast-detection focusing. For a single-lens reflex camera, phase-detection focusing requires a dedicated focusing light path and focusing sensor to acquire phase information, while mirrorless digital cameras mostly use a Phase Detection Auto Focus (PDAF) image sensor to acquire phase information directly on the imaging light path. Contrast-detection focusing is mainly used in ordinary digital cameras; this focusing mode is slow and sensitive to the design of the contrast-information filter.
At present, in order to meet the requirements of image quality and focusing speed, a multi-point focusing system, such as a multi-point focusing sensor and a specially designed light path, needs to be added to the hardware of the camera, which leads to a sharp increase in hardware cost.
For a camera with only a center focus point, if another area of the picture needs to be in focus, the camera has to be moved to center the target object in the picture and refocus. However, this changes the composition of the frame, so that when the target object needs to be in focus, only a centered composition can be used, and asymmetric compositions such as the rule-of-thirds composition and the S-shaped composition cannot. Therefore, a camera that focuses only at the center generally needs to consider other ways to achieve multi-point focusing.
Specifically, in one example, referring to fig. 5, the photographing apparatus 100 photographs the target object P at a first position to obtain a first image of the target object P, where the focusing plane S1 is at the human eye and is perpendicular to the optical axis 101 of the photographing apparatus 100. Then, the photographing apparatus 100 photographs the target object P at a second position to acquire a second image of the target object P; since the focusing plane is not adjusted and the position of the photographing apparatus 100 has changed, the focusing plane S1 is now at the ear of the person and perpendicular to the optical axis 101 of the photographing apparatus 100. That is, after the photographing apparatus 100 having only the center focus function is moved from the first position to the second position, if no adjustment is made, the focusing plane S1 changes from a plane at the human eye and perpendicular to the optical axis 101 of the photographing apparatus 100 to a plane at the human ear and perpendicular to the optical axis 101 of the photographing apparatus 100, so that the plane S2 at the human eye and perpendicular to the optical axis 101 of the photographing apparatus 100 may be out of focus.
To solve the above problem, the related art equips the shooting device with a high-precision sensor to record the rotation of the shooting device, calculates an adjustment value for the depth of the focusing plane from the rotation angle, and moves the lens or the image sensor according to the focusing table (focus table) of the lens, so that after the shooting device moves from the first position to the second position, the focusing plane still falls on the plane that was in focus at the first position (i.e., the plane at the human eye and perpendicular to the optical axis 101 of the photographing apparatus 100), thereby realizing focusing compensation. However, given the hardware cost and the technical difficulty of high-precision instruments and of calculating the focus adjustment, and considering the balance between the endurance and the performance of the shooting device, it is difficult for a shooting device with only center focus to implement focusing compensation in this way.
Based on the above discussion, referring to fig. 6, the control method of the photographing apparatus 100, the control device of the photographing apparatus 100, and the photographing apparatus 100 of the embodiments of the present application capture the same target object P twice, from different viewing angles, at two different positions of the photographing apparatus 100 to obtain a first image and a second image, thereby obtaining depth information of the target object P, and finally adjust the focusing plane of the camera based on the depth information to realize focusing compensation. Compared with the existing multi-point focusing camera solution and the single-point focusing camera solution that requires a high-precision processor, the advantages are: (1) hardware cost is saved; (2) image information is used as the reference, and, compared with calculating the object-distance change from a high-precision sensor, multi-point depths can be calculated at one time and multi-point focusing realized at one time, without adjusting the camera attitude repeatedly; (3) the density of focus points is higher.
Referring to fig. 7, in some embodiments, step S16 includes:
step S162: determining the space coordinates of the target object P according to the first image and the second image;
step S164: depth information is determined from the spatial coordinates.
Referring to fig. 8, in some embodiments, the determination module 16 includes a first determination unit 162 and a second determination unit 164. The first determination unit 162 is configured to determine the spatial coordinates of the target object P from the first image and the second image; the second determination unit 164 is configured to determine depth information from the spatial coordinates.
In this way, determination of the depth information of the target object P is achieved. Note that the "spatial coordinates" here may be the spatial coordinates X of all points in the same field of view in the camera coordinate system at the time of the first image capture, and the "depth information" here may be the depth information of the target object P at the time of the second image capture. Further, the spatial coordinate X′ of the corresponding point in the camera coordinate system at the time of the second image capture may be calculated according to the formula X′ = R⁻¹(X − T), where R is a rotation matrix and T is a translation matrix. The specific calculation of the rotation matrix R and the translation matrix T is described in detail later. It can be understood that the Z-axis value of the spatial coordinate X in the camera coordinate system when the first image is captured at the first position, and the Z-axis value of the spatial coordinate X′ of the corresponding point in the camera coordinate system when the second image is captured at the second position, are depths, and thus the depth information can be determined.
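The short sketch below (an illustration under assumed values, not the patent's implementation) shows the coordinate transform X′ = R⁻¹(X − T) and the reading of the Z component as depth; the numeric R, T, and X are hypothetical.

import numpy as np

def depth_in_second_frame(X, R, T):
    """X: (3,) point in the first-capture camera frame; R, T: relative attitude
    in the convention of the formula above."""
    X_prime = np.linalg.inv(R) @ (X - T)   # coordinates in the second-capture camera frame
    return X_prime[2]                      # the Z value is the depth used for focusing

# Hypothetical values purely for illustration (pure translation of 0.1 along x).
R = np.eye(3)
T = np.array([0.1, 0.0, 0.0])
X = np.array([0.2, 0.0, 2.5])
print(depth_in_second_frame(X, R, T))      # -> 2.5 for this pure-translation example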
Referring to fig. 9, in some embodiments, step S162 includes:
step S1622: determining relative posture information of the photographing apparatus 100 at the first position and the second position from the first image and the second image;
step S1624: and determining the space coordinates of the target object P according to the relative attitude information.
In some embodiments, the first determination unit 162 includes a first determination subunit 1622 and a second determination subunit 1624. The first determining subunit 1622 is configured to determine the relative posture information of the photographing apparatus 100 at the first position and the second position from the first image and the second image. The second determining subunit 1624 is configured to determine the spatial coordinates of the target object P according to the relative posture information.
In this way, determination of the spatial coordinates of the target object P from the first image and the second image is achieved.
Specifically, step S1622 includes:
processing the first image and the second image to obtain a first matching set M of the first image and the second image;
the relative posture information is determined from the first matching set M and the parameter information of the photographing apparatus 100.
Referring to fig. 10, the first determining subunit 1622 is configured to process the first image and the second image to obtain a first matching set M of the first image and the second image, and is configured to determine the relative pose information according to the first matching set M and the parameter information of the photographing apparatus 100.
In this manner, determination of the relative posture information of the photographing apparatus 100 at the first position and the second position from the first image and the second image is achieved.
In some embodiments, processing the first image and the second image to obtain a first matching set M of the first image and the second image comprises:
determining a first feature point set I1 of the first image and a second feature point set I2 of the second image;
matching the first feature point set I1 and the second feature point set I2 to obtain the first matching set M.
In some embodiments, the first determining subunit 1622 is configured to determine a first feature point set I1 of the first image and a second feature point set I2 of the second image, and to match the first feature point set I1 and the second feature point set I2 to obtain the first matching set M.
In this manner, processing the first image and the second image to obtain the first matching set M of the first image and the second image is achieved. Specifically, determining the first feature point set I1 of the first image and the second feature point set I2 of the second image comprises: determining the first feature point set I1 and the second feature point set I2 by at least one of feature extraction and block matching. Similarly, the first determining subunit 1622 is configured to determine the first feature point set I1 and the second feature point set I2 by at least one of feature extraction and block matching.
Note that in the embodiment of the present application, the first image and the second image may be processed by image sparse matching to obtain the first matching set M of the first image and the second image.
Further, when determining the first feature point set I1 of the first image and the second feature point set I2 of the second image, the algorithms for feature point extraction include, but are not limited to, the Oriented FAST and Rotated BRIEF (ORB) algorithm, the HARRIS corner extraction algorithm, the Scale-Invariant Feature Transform (SIFT) algorithm, and the Speeded Up Robust Features (SURF) algorithm.
After the first feature point set I1 of the first image and the second feature point set I2 of the second image are obtained, the first feature point set I1 and the second feature point set I2 are matched, thereby calculating the first matching set M:
M = {(x1, x2) | (K⁻¹·x2)ᵀ · E · (K⁻¹·x1) ≈ 0, x1 ∈ I1, x2 ∈ I2}
where x1 is an element of the first feature point set I1, x2 is an element of the second feature point set I2, K is the intrinsic parameter matrix of the photographing apparatus 100, and E is the essential matrix described later. Further, the content of one element includes: two-dimensional pixel coordinates, a feature descriptor, and a neighborhood size. The two-dimensional pixel coordinates are the location of the feature point. The feature descriptor describes the features of a neighborhood image block centered on the feature point; in general, the feature descriptor is a vector of one or more dimensions, such as a SIFT feature or a SURF feature, and in the simplest case it may even be the mean of the pixel values of the image block: if the image is in RGB format the descriptor is an RGB value, and if in YUV format it is a YUV value. Of course, feature descriptors are generally not such simple features; they include statistically combined features such as gradients and directions.
In addition, matching pairs can be formed from the elements of x1 and x2 whose feature vectors have the highest similarity or whose similarity exceeds a certain threshold. It can be understood that the formula for calculating the first matching set uses the "approximately equal" sign because strict equality holds only when the two image points represent the same object point, in which case the match is perfect; the method of extracting feature points and then matching them by similarity does not necessarily yield points that correspond exactly, due to precision errors and the like, and there may be a deviation of several pixels.
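A minimal OpenCV sketch of this sparse-matching step is given below, assuming ORB (one of the algorithms listed above); the image file names, feature count, and distance threshold are placeholders, not values from the patent.

import cv2

img1 = cv2.imread("first_image.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("second_image.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)   # first feature point set I1
kp2, des2 = orb.detectAndCompute(img2, None)   # second feature point set I2

# Hamming distance suits ORB's binary descriptors; cross-check keeps only
# mutually best matches, approximating the "highest similarity" rule above.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
matches = [m for m in matches if m.distance < 50]   # assumed similarity threshold

# First matching set M as pairs of two-dimensional pixel coordinates.
M = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches]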
In some embodiments, the photographing apparatus 100 includes an Inertial Measurement Unit (IMU) 30, and matching the first feature point set I1 and the second feature point set I2 to obtain the first matching set M comprises:
detecting motion information of the photographing apparatus 100 using the inertial measurement unit 30;
matching the first feature point set I1 and the second feature point set I2 according to the motion information to obtain the first matching set M.
In some embodiments, the photographing apparatus 100 comprises an inertial measurement unit 30, and the first determining subunit 1622 is configured to detect motion information of the photographing apparatus 100 using the inertial measurement unit 30, and to match the first feature point set I1 and the second feature point set I2 according to the motion information to obtain the first matching set M.
In this way, matching the first feature point set I1 and the second feature point set I2 to obtain the first matching set M is achieved. Specifically, the motion information may be the camera rotation and translation information provided by the IMU, and the search area for matching the image feature points may be guided according to this motion information. In this embodiment, since the IMU provides 3-axis acceleration and 3-axis angular velocity and can output rotation angles and translation amounts about the yaw (YAW), roll (ROLL), and pitch (PITCH) axes, it can guide the search area when matching the image feature points and thus improve matching efficiency. In addition, when the IMU accuracy is sufficient, the rotation matrix R and the translation matrix T may be determined directly from the motion information.
In some embodiments, the relative pose information includes an essential matrix E, a rotation matrix R, and a translation matrix T, and determining the relative pose information according to the first matching set M and the parameter information of the photographing apparatus 100 includes:
determining an essential matrix E under a preset constraint condition according to the first matching set M and the parameter information;
the intrinsic matrix E is decomposed to obtain a rotation matrix R and a translation matrix T.
In some embodiments, the relative pose information includes an essential matrix E, a rotation matrix R, and a translation matrix T, and the first determining subunit 1622 is configured to determine the essential matrix E according to the first matching set M and the parameter information under a preset constraint condition; and for decomposing the essential matrix E to obtain a rotation matrix R and a translation matrix T.
In this way, determination of the relative attitude information from the first matching set M and the parameter information of the photographing apparatus 100 is achieved. Note that in this embodiment, the relative attitude information may be determined from the first matching set M and the parameter information of the photographing apparatus 100 by calculating the camera rotation and translation information based on sparse matching.
Specifically, the parameter information of the photographing apparatus 100 may be the intrinsic parameter matrix K of the photographing apparatus 100. Using the intrinsic parameter matrix K of the photographing apparatus 100 and the first matching set M, the optimized essential matrix E can be calculated by an optimization method under the following constraint:
(K⁻¹·x2)ᵀ · E · (K⁻¹·x1) = 0 for all (x1, x2) ∈ M
the optimized rotation matrix R and translation matrix T can be derived by decomposing the essential matrix E:
Figure BDA0002444102460000102
Figure BDA0002444102460000103
the rotation matrix R and the translation matrix T are relative attitude changes of the photographing apparatus 100 when the first image and the second image are photographed, that is, relative attitude information.
Note that the reference coordinate system of the rotation matrix R and the translation matrix T is the camera coordinate system at the time the first image is captured or at the time the second image is captured, depending on the direction of the "relative attitude change": if the attitude change of the first image is expressed relative to the second image, the reference is the camera coordinate system at the time the second image is captured. In addition, decomposing the essential matrix E to obtain the rotation matrix R and the translation matrix T may be performed by Singular Value Decomposition (SVD).
Further, the optimization proceeds by requiring the point set to satisfy the above constraint, solving the resulting system of equations, and then re-checking with RANSAC (or a least-median scheme) to obtain the optimal result. Reference may be made in particular to the findEssentialMat function of OpenCV, which is broadly analogous to the findHomography function.
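Assuming the matching set M and the intrinsic parameter matrix K from the earlier sketches, the essential-matrix estimation and decomposition described above can be sketched with OpenCV's findEssentialMat and recoverPose. Note that OpenCV returns R and T that map first-capture coordinates into the second camera frame (x2 = R·x1 + T), the inverse of the convention used in X′ = R⁻¹(X − T) above, and that T is recovered only up to scale.

import cv2
import numpy as np

pts1 = np.float32([p1 for p1, p2 in M])   # pixel coordinates in the first image
pts2 = np.float32([p2 for p1, p2 in M])   # pixel coordinates in the second image

E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K,
                                      method=cv2.RANSAC,   # re-check outliers as described above
                                      prob=0.999, threshold=1.0)

# recoverPose performs the SVD-based decomposition and keeps the (R, T)
# hypothesis that places the triangulated points in front of both cameras.
_, R, T, pose_mask = cv2.recoverPose(E, pts1, pts2, K, mask=inlier_mask)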
In addition, the camera intrinsic parameter matrix is briefly described as follows:

K = | fx  0  cx |
    |  0 fy  cy |
    |  0  0   1 |

where fx and fy are the camera focal lengths in pixels in the x and y directions, and cx and cy are the offsets of the image center in pixels in the x and y directions.
If camera distortion is considered, radial distortion parameters such as k1, k2 and tangential distortion parameters such as p1, p2 are also included. Specifically, without distortion the projection is:

u = fx·x′ + cx
v = fy·y′ + cy

With distortion, the normalized coordinates (x′, y′) are first corrected:

x″ = x′·(1 + k1·r² + k2·r⁴) + 2·p1·x′·y′ + p2·(r² + 2·x′²)
y″ = y′·(1 + k1·r² + k2·r⁴) + p1·(r² + 2·y′²) + 2·p2·x′·y′

where r² = x′² + y′², and then:

u = fx·x″ + cx
v = fy·y″ + cy

where u and v are the coordinates of a pixel point, in pixels.
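A small sketch of the projection model written out above, purely for illustration: it maps a point (x′, y′) on the normalized image plane to pixel coordinates (u, v) after applying the radial (k1, k2) and tangential (p1, p2) distortion terms. All parameter values passed to it would be assumptions or calibration results.

def project_with_distortion(xp, yp, fx, fy, cx, cy, k1, k2, p1, p2):
    """Project normalized coordinates (xp, yp) to pixel coordinates (u, v)."""
    r2 = xp * xp + yp * yp
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xpp = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp * xp)   # x'' above
    ypp = yp * radial + p1 * (r2 + 2 * yp * yp) + 2 * p2 * xp * yp   # y'' above
    return fx * xpp + cx, fy * ypp + cy

# Hypothetical usage with illustrative calibration values.
print(project_with_distortion(0.1, -0.05, 1200, 1200, 960, 540, 0.01, -0.002, 0.0, 0.0))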
In some embodiments, the relative pose information includes an essential matrix E, a rotation matrix R, and a translation matrix T, and determining the spatial coordinates of the target object P from the relative pose information includes:
processing the first image and the second image according to the essential matrix E to obtain a second matching set N of the first image and the second image;
determining a third image according to the second matching set N and the first image, wherein the third image is an image corresponding to the second matching set N in the first image;
and processing the third image according to the rotation matrix R and the translation matrix T to obtain the space coordinate of the target object P.
In some embodiments, the relative pose information includes an essential matrix E, a rotation matrix R, and a translation matrix T, and the second determining subunit 1624 is configured to process the first image and the second image according to the essential matrix E to obtain a second matching set N of the first image and the second image; the image matching device is used for determining a third image according to the second matching set N and the first image, wherein the third image is an image corresponding to the second matching set N in the first image; and processing the third image according to the rotation matrix R and the translation matrix T to obtain the space coordinate of the target object P.
In this way, the determination of the spatial coordinates of the target object P from the relative attitude information is achieved. Note that in the embodiment of the present application, the first image and the second image may be processed according to the essential matrix E in a dense matching manner to obtain the second matching set N of the first image and the second image.
Specifically, with the essential matrix E obtained by sparse matching as a reference, a second matching set N of more corresponding pixel points in the first image and the second image can be calculated:
N = {(u1, u2) | (K⁻¹·u2)ᵀ · E · (K⁻¹·u1) = 0, u1 ∈ P1, u2 ∈ P2}
where P1 and P2 are the densely matched pixel points of the same field of view in the first image and the second image, respectively.
Then, the image corresponding to the pixel point in the first image corresponding to the second matching set N is used as a "common image", that is, a third image.
Finally, the final rotation matrix R and translation matrix T may be used to restore the three-dimensional coordinate X of each pixel point in the third image (corresponding to the same object point), so as to obtain the spatial coordinates of the target object P. For a matched pair (u1, u2), X satisfies, up to scale, the projection relations u1 ≃ K·X and u2 ≃ K·X′ = K·R⁻¹·(X − T), from which X can be solved by triangulation.
As described above, the three-dimensional coordinates here are referenced to the camera coordinate system when the first image is captured at the first position. The spatial coordinate X′ of the corresponding point in the camera coordinate system when the second image is captured at the second position can be calculated according to the formula X′ = R⁻¹(X − T). The Z-axis value of the spatial coordinate X in the camera coordinate system when the first image is captured at the first position, and the Z-axis value of the spatial coordinate X′ of the corresponding point in the camera coordinate system when the second image is captured at the second position, are depths, and thus the depth information can be determined.
Referring to fig. 11, in some embodiments, step S18 includes:
step S182: when the target object P is focused at the second position, determining the depth of an adjusting point of the second image according to the depth information, wherein the adjusting point of the second image is related to the focus of the first image;
step S184: determining adjustment information of the photographing apparatus 100 according to the depth of the adjustment point;
step S186: the photographing apparatus 100 is adjusted according to the adjustment information so that the photographing apparatus 100 is focused on the target object P at the second position.
Referring to fig. 12, in some embodiments, the focusing module 18 includes a third determining unit 182, a fourth determining unit 184 and an adjusting unit 186. The third determining unit 182 is configured to determine a depth of an adjustment point of the second image, which is related to the focus of the first image, according to the depth information when the target object P is focused at the second position. The fourth determination unit 184 is configured to determine adjustment information of the photographing apparatus 100 according to the depth of the adjustment point. The adjusting unit 186 is configured to adjust the photographing apparatus 100 according to the adjustment information so that the photographing apparatus 100 is focused on the target object P at the second position.
In this way, controlling the photographing apparatus 100 to focus on the target object P at the second position according to the depth information is achieved. Referring to fig. 5 again, it can be understood that, in this example, when the first image is captured at the first position, the focusing plane passes through the human eye; when the second image is captured at the second position, because the focus has not been adjusted, the focusing plane passes through the human ear. Therefore, the photographing apparatus 100 needs to be adjusted so that the adjusted focusing plane passes through the human eye.
Specifically, when the first image is captured at the first position, the focus plane is S1, the depth of the focus Q1 is L1, and the focus plane S1 passes through the human eye.
When the second image is captured at the second position, because the focus has not been adjusted, the focusing plane is still the plane S1; due to the change in position, the focusing plane S1 now passes through the human ear rather than the human eye, so that the plane S2, which is perpendicular to the optical axis 101 and passes through the human eye, is out of focus. That is, the focusing plane needs to be adjusted from the plane S1 to the plane S2.
In the second image, the adjustment point corresponding to the focal point Q1 of the first image is the intersection Q2 of the optical axis 101 with the plane that passes through the human eye and is perpendicular to the optical axis 101. The depth L2 of the adjustment point Q2 may be determined from the depth information. In this way, the photographing apparatus 100 may be adjusted according to L1 and L2 so that the focusing plane is adjusted from the plane S1 to the plane S2, thereby focusing the photographing apparatus 100 on the target object P, at the human eye, at the second position.
In certain embodiments, step S184 comprises:
the adjustment information is determined according to the depth of the adjustment point and a preset adjustment relationship of the photographing apparatus 100.
In some embodiments, the fourth determining unit 184 is configured to determine the adjustment information according to the depth of the adjustment point and a preset adjustment relationship of the photographing apparatus 100.
In this manner, adjusting the photographing apparatus 100 according to the adjustment information so that the photographing apparatus 100 focuses on the target object P at the second position is achieved. Specifically, the preset adjustment relationship may be a focus table, and after the depth L1 of the focal point Q1 and the depth L2 of the adjustment point Q2 are determined, the focus table may be queried accordingly to determine the adjustment information. Further, the adjustment information includes at least one of lens adjustment information and image sensor adjustment information.
In one example, the distance the lens needs to move to shift the focus from L1 to L2 can be queried from the focus table, so that at the second position the photographing apparatus 100 is focused on the plane through the human eye of the target object P, that is, the plane S2, thereby realizing focusing compensation.
In another example, the distance the image sensor needs to move to shift the focus from L1 to L2 can be queried from the focus table, so that at the second position the photographing apparatus 100 is focused on the plane through the human eye of the target object P, that is, the plane S2, thereby realizing focusing compensation.
In still another example, both the distance the lens needs to move and the distance the image sensor needs to move to shift the focus from L1 to L2 can be queried from the focus table, so that at the second position the photographing apparatus 100 is focused on the plane through the human eye of the target object P, that is, the plane S2, thereby realizing focusing compensation.
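A minimal sketch (assumed, not taken from the patent) of querying a preset focus table to turn the change from depth L1 to depth L2 into a lens-movement command; the table entries, units, and interpolation scheme are purely hypothetical.

import numpy as np

# Hypothetical focus table: object distance in mm -> lens position in motor steps.
FOCUS_TABLE = {500: 120, 1000: 80, 2000: 50, 4000: 30, 8000: 15}

def lens_position(depth_mm):
    """Linearly interpolate the lens position for a given object depth."""
    dists = np.array(sorted(FOCUS_TABLE))
    steps = np.array([FOCUS_TABLE[d] for d in dists])
    return float(np.interp(depth_mm, dists, steps))

def focus_adjustment(L1_mm, L2_mm):
    """Lens movement needed to shift the focus plane from depth L1 to depth L2."""
    return lens_position(L2_mm) - lens_position(L1_mm)

print(focus_adjustment(1500.0, 2500.0))   # e.g. move the lens by this many steps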
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, various steps or methods may be performed by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for performing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried out in the method of implementing the above embodiments may be implemented by hardware associated with instructions of a program, which may be stored in a computer-readable storage medium, and which, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (25)

1. A control method of a photographing apparatus, comprising:
when the shooting equipment is located at a first position, acquiring a first image of a target object;
when the shooting equipment is located at a second position, acquiring a second image of the target object;
determining depth information of the target object according to the first image and the second image;
and controlling the shooting equipment to focus on the target object at the second position according to the depth information.
2. The method of controlling a photographing apparatus according to claim 1, wherein determining depth information of a target object from the first image and the second image comprises:
determining the space coordinates of the target object according to the first image and the second image;
and determining the depth information according to the space coordinates.
3. The method of controlling the photographing apparatus according to claim 2, wherein determining the spatial coordinates of the target object from the first image and the second image comprises:
determining relative posture information of the shooting device at the first position and the second position according to the first image and the second image;
and determining the space coordinates of the target object according to the relative attitude information.
4. The method of controlling the photographing apparatus according to claim 3, wherein determining the relative posture information of the photographing apparatus at the first position and the second position from the first image and the second image comprises:
processing the first image and the second image to obtain a first matching set of the first image and the second image;
and determining the relative attitude information according to the first matching set and the parameter information of the shooting equipment.
5. The method of controlling a photographing apparatus of claim 4, wherein processing the first image and the second image to obtain a first matching set of the first image and the second image comprises:
determining a first set of feature points of the first image and a second set of feature points of the second image;
and matching the first feature point set and the second feature point set to obtain the first matching set.
6. The method of controlling a photographing apparatus according to claim 5, wherein the photographing apparatus includes an inertial measurement unit, and matching the first feature point set and the second feature point set to obtain the first matching set includes:
detecting motion information of the photographing apparatus using the inertial measurement unit;
and matching the first characteristic point set and the second characteristic point set according to the motion information to obtain the first matching set.
7. The method of controlling a photographing apparatus according to claim 5, wherein determining a first feature point set of the first image and a second feature point set of the second image comprises:
determining the first set of feature points and the second set of feature points by at least one of feature extraction and block matching.
8. The method of controlling a photographing apparatus according to claim 4, wherein the relative posture information includes an essential matrix, a rotation matrix, and a translation matrix, and determining the relative posture information according to the first matching set and parameter information of the photographing apparatus comprises:
determining the essential matrix according to the first matching set and the parameter information under a preset constraint condition;
decomposing the essential matrix to obtain the rotation matrix and the translation matrix.
9. The method of controlling a photographing apparatus according to claim 3, wherein the relative posture information includes an essential matrix, a rotation matrix, and a translation matrix, and determining the spatial coordinates of the target object according to the relative posture information comprises:
processing the first image and the second image according to the essential matrix to obtain a second matching set of the first image and the second image;
determining a third image according to the second matching set and the first image, wherein the third image is an image corresponding to the second matching set in the first image;
and processing the third image according to the rotation matrix and the translation matrix to obtain the space coordinate of the target object.
10. The method for controlling the photographing apparatus according to claim 1, wherein controlling the photographing apparatus to focus on the target object at the second position according to the depth information comprises:
when the target object is focused at the second position, determining the depth of an adjusting point of the second image according to the depth information, wherein the adjusting point of the second image is related to the focus of the first image;
determining the adjustment information of the shooting equipment according to the depth of the adjustment point;
and adjusting the shooting equipment according to the adjustment information so that the shooting equipment focuses on the target object at the second position.
11. The method of controlling a photographing apparatus according to claim 10, wherein determining the adjustment information of the photographing apparatus according to the depth of the adjustment point comprises:
and determining the adjustment information according to the depth of the adjustment point and a preset adjustment relation of the shooting equipment.
12. The method of controlling a photographing apparatus according to claim 10, wherein the adjustment information includes at least one of lens adjustment information and image sensor adjustment information.
13. A control device of a photographing apparatus, comprising:
a first acquisition module, configured to acquire a first image of a target object when the photographing apparatus is located at a first position;
a second acquisition module, configured to acquire a second image of the target object when the photographing apparatus is located at a second position;
a determination module, configured to determine depth information of the target object from the first image and the second image;
and a focusing module, configured to control the photographing apparatus to focus on the target object at the second position according to the depth information.
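A minimal structural sketch of how the four modules of claim 13 could be wired together; the class name and the callback-style wiring are assumptions for illustration only.

```python
class PhotographingApparatusController:
    """Control device built from an image-acquisition function, a depth
    estimator (determination module) and a focus drive (focusing module)."""

    def __init__(self, acquire_image, estimate_depth, drive_focus):
        self._acquire = acquire_image        # backs the first and second acquisition modules
        self._estimate_depth = estimate_depth
        self._drive_focus = drive_focus

    def refocus_after_move(self):
        first_image = self._acquire()        # at the first position
        # ... the apparatus moves to the second position ...
        second_image = self._acquire()       # at the second position
        depth_info = self._estimate_depth(first_image, second_image)
        self._drive_focus(depth_info)        # focus on the target at the second position
```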
14. The control device of the photographing apparatus according to claim 13, wherein the determination module comprises:
a first determination unit configured to determine spatial coordinates of the target object from the first image and the second image;
and a second determination unit configured to determine the depth information from the spatial coordinates.
15. The control device of the photographing apparatus according to claim 14, wherein the first determination unit comprises:
a first determining subunit, configured to determine, from the first image and the second image, relative pose information of the photographing apparatus at the first position and the second position;
and a second determining subunit, configured to determine the spatial coordinates of the target object according to the relative pose information.
16. The control device of the photographing apparatus according to claim 15, wherein the first determining subunit is configured to:
process the first image and the second image to obtain a first matching set of the first image and the second image;
and determine the relative pose information according to the first matching set and parameter information of the photographing apparatus.
17. The control device of the photographing apparatus according to claim 16, wherein the first determining subunit is configured to determine a first feature point set of the first image and a second feature point set of the second image, and to match the first feature point set and the second feature point set to obtain the first matching set.
18. The control device of the photographing apparatus according to claim 17, wherein the photographing apparatus includes an inertial measurement unit, and the first determining subunit is configured to detect motion information of the photographing apparatus using the inertial measurement unit, and to match the first feature point set and the second feature point set according to the motion information to obtain the first matching set.
19. The control device of the photographing apparatus according to claim 17, wherein the first determining subunit is configured to:
determine the first feature point set and the second feature point set by at least one of feature extraction and block matching.
20. The control device of the photographing apparatus according to claim 16, wherein the relative pose information includes an essential matrix, a rotation matrix, and a translation matrix, and the first determining subunit is configured to determine the essential matrix according to the first matching set and the parameter information under a preset constraint condition, and to decompose the essential matrix to obtain the rotation matrix and the translation matrix.
21. The control device of the photographing apparatus according to claim 15, wherein the relative pose information includes an essential matrix, a rotation matrix, and a translation matrix, and the second determining subunit is configured to process the first image and the second image according to the essential matrix to obtain a second matching set of the first image and the second image, to determine a third image according to the second matching set and the first image, wherein the third image is the image corresponding to the second matching set in the first image, and to process the third image according to the rotation matrix and the translation matrix to obtain the spatial coordinates of the target object.
22. The control device of the photographing apparatus according to claim 13, wherein the focusing module comprises:
a third determination unit, configured to determine, when focusing on the target object at the second position, the depth of an adjustment point of the second image according to the depth information, wherein the adjustment point of the second image is related to the focus of the first image;
a fourth determination unit, configured to determine adjustment information of the photographing apparatus according to the depth of the adjustment point;
and an adjustment unit, configured to adjust the photographing apparatus according to the adjustment information so that the photographing apparatus focuses on the target object at the second position.
23. The control device of the photographing apparatus according to claim 22, wherein the fourth determination unit is configured to:
determine the adjustment information according to the depth of the adjustment point and a preset adjustment relationship of the photographing apparatus.
24. The control device of the photographing apparatus according to claim 22, wherein the adjustment information includes at least one of lens adjustment information and image sensor adjustment information.
25. A photographing apparatus comprising a processor and a memory, the memory storing one or more programs, the processor being configured to execute the one or more programs to implement the method of controlling the photographing apparatus of any one of claims 1 to 12.
CN201880065930.1A 2018-12-21 2018-12-21 Shooting equipment control method, shooting equipment control device and shooting equipment Pending CN111213364A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/122523 WO2020124517A1 (en) 2018-12-21 2018-12-21 Photographing equipment control method, photographing equipment control device and photographing equipment

Publications (1)

Publication Number Publication Date
CN111213364A (en) 2020-05-29

Family

ID=70790041

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880065930.1A Pending CN111213364A (en) 2018-12-21 2018-12-21 Shooting equipment control method, shooting equipment control device and shooting equipment

Country Status (2)

Country Link
CN (1) CN111213364A (en)
WO (1) WO2020124517A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114500842A (en) * 2022-01-25 2022-05-13 维沃移动通信有限公司 Visual inertia calibration method and device


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102168954B (en) * 2011-01-14 2012-11-21 浙江大学 Monocular-camera-based method for measuring depth, depth field and sizes of objects
US20130057655A1 (en) * 2011-09-02 2013-03-07 Wen-Yueh Su Image processing system and automatic focusing method
JP2015017999A (en) * 2011-11-09 2015-01-29 パナソニック株式会社 Imaging device
CN103292695B (en) * 2013-05-10 2016-02-24 河北科技大学 A kind of single eye stereo vision measuring method
CN108711166B (en) * 2018-04-12 2022-05-03 浙江工业大学 Monocular camera scale estimation method based on quad-rotor unmanned aerial vehicle

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156859A (en) * 2011-04-21 2011-08-17 刘津甦 Sensing method for gesture and spatial location of hand
JPWO2014083737A1 (en) * 2012-11-30 2017-01-05 パナソニックIpマネジメント株式会社 Image processing apparatus and image processing method
CN104102068A (en) * 2013-04-11 2014-10-15 聚晶半导体股份有限公司 Automatic focusing method and automatic focusing device
CN103246130A (en) * 2013-04-16 2013-08-14 广东欧珀移动通信有限公司 Focusing method and device
CN105744138A (en) * 2014-12-09 2016-07-06 联想(北京)有限公司 Quick focusing method and electronic equipment
CN106412433A (en) * 2016-10-09 2017-02-15 深圳奥比中光科技有限公司 Automatic focusing method and system based on RGB-IR depth camera
CN106846403A (en) * 2017-01-04 2017-06-13 北京未动科技有限公司 The method of hand positioning, device and smart machine in a kind of three dimensions
CN107025666A (en) * 2017-03-09 2017-08-08 广东欧珀移动通信有限公司 Depth detection method and device and electronic installation based on single camera
CN107509027A (en) * 2017-08-08 2017-12-22 深圳市明日实业股份有限公司 A kind of monocular quick focusing method and system
CN108717712A (en) * 2018-05-29 2018-10-30 东北大学 A kind of vision inertial navigation SLAM methods assumed based on ground level

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022141271A1 (en) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 Control method and control device for platform system, platform system, and storage medium
CN113301248A (en) * 2021-04-13 2021-08-24 中科创达软件股份有限公司 Shooting method, shooting device, electronic equipment and computer storage medium
CN113301248B (en) * 2021-04-13 2022-09-06 中科创达软件股份有限公司 Shooting method and device, electronic equipment and computer storage medium
WO2023072030A1 (en) * 2021-11-01 2023-05-04 中兴通讯股份有限公司 Automatic focusing method and apparatus for lens, and electronic device and computer-readable storage medium

Also Published As

Publication number Publication date
WO2020124517A1 (en) 2020-06-25

Similar Documents

Publication Publication Date Title
US9998650B2 (en) Image processing apparatus and image pickup apparatus for adding blur in an image according to depth map
KR102143456B1 (en) Depth information acquisition method and apparatus, and image collection device
CN107924104B (en) Depth sensing autofocus multi-camera system
CN111213364A (en) Shooting equipment control method, shooting equipment control device and shooting equipment
EP3089449B1 (en) Method for obtaining light-field data using a non-light-field imaging device, corresponding device, computer program product and non-transitory computer-readable carrier medium
JP5163409B2 (en) Imaging apparatus, imaging method, and program
US20070189750A1 (en) Method of and apparatus for simultaneously capturing and generating multiple blurred images
CN108574825B (en) Method and device for adjusting pan-tilt camera
US20210044725A1 (en) Camera-specific distortion correction
JP5938281B2 (en) Imaging apparatus, control method therefor, and program
JP2010147635A (en) Imaging apparatus, imaging method, and program
JP2010136302A (en) Imaging apparatus, imaging method and program
CN107113376A (en) A kind of image processing method, device and video camera
CN106998413A (en) Image processing equipment, picture pick-up device and image processing method
KR102200866B1 (en) 3-dimensional modeling method using 2-dimensional image
US12007617B2 (en) Imaging lens system, image capturing unit and electronic device
CN109002796A (en) A kind of image-pickup method, device and system and electronic equipment
US20220187509A1 (en) Enhanced imaging device using liquid lens, embedded digital signal processor, and software
US10880536B2 (en) Three-dimensional image capturing device and method
KR102601288B1 (en) Camera module and image operating method performed therein
JP7254562B2 (en) Imaging device and its control device
CN111292380A (en) Image processing method and device
KR102061087B1 (en) Method, apparatus and program stored in storage medium for focusing for video projector
CN113382166B (en) Optical center alignment method and device for image pickup equipment, storage medium and electronic equipment
JP2020067511A (en) Camera system, control method and program of the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200529