CN114449173B - Optical anti-shake control method and device, storage medium and electronic equipment - Google Patents



Publication number
CN114449173B
CN114449173B
Authority
CN
China
Prior art keywords: object distance, camera, shake, determining, optical anti-shake
Prior art date
Legal status
Active
Application number
CN202210181901.XA
Other languages
Chinese (zh)
Other versions
CN114449173A
Inventor
陈伟 (Chen Wei)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210181901.XA
Publication of CN114449173A
Application granted
Publication of CN114449173B


Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 — Control of cameras or camera modules
    • H04N23/68 — Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 — Motion detection
    • H04N23/6812 — Motion detection based on additional sensors, e.g. acceleration sensors
    • H04N23/682 — Vibration or motion blur correction
    • H04N23/685 — Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687 — Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure provides an optical anti-shake control method, an optical anti-shake control device, a computer-readable storage medium, and an electronic device, and relates to the field of image technology. The optical anti-shake control method comprises the following steps: acquiring the object distance between a photographed subject and a camera; determining a target class among the data classes of an inertial sensor according to the object distance; determining pose information of the camera based on the inertial sensing data of the target class; and obtaining optical anti-shake control parameters by analyzing the pose information of the camera. The disclosure can realize optical anti-shake at different object distances, broaden the application range of optical anti-shake, and improve its response speed and efficiency.

Description

Optical anti-shake control method and device, storage medium and electronic equipment
Technical Field
The disclosure relates to the field of image technology, and in particular, to an optical anti-shake control method, an optical anti-shake control device, a computer readable storage medium, and an electronic apparatus.
Background
With the development of imaging technology, electronic devices equipped with cameras are increasingly used to capture images or videos to record various information. During shooting, shake of the photographer's hand, vibration in the environment, or the like may cause the captured picture to shake, resulting in a blurred image.
In the related art, optical anti-shake (Optical Image Stabilization, OIS) is used to compensate for the lens shift that occurs during shake, thereby achieving an anti-shake effect. However, current optical anti-shake is applicable only in limited scenarios.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure provides an optical anti-shake control method, an optical anti-shake control device, a computer readable storage medium and an electronic device, so as to widen the application range of optical anti-shake to at least a certain extent.
According to a first aspect of the present disclosure, there is provided an optical anti-shake control method including: acquiring an object distance between a shot object and a camera; determining a target class in the data classes of the inertial sensor according to the object distance; determining pose information of the camera based on the inertial sensing data of the target class; and obtaining the optical anti-shake control parameters by analyzing the pose information of the camera.
According to a second aspect of the present disclosure, there is provided an optical anti-shake control apparatus comprising: an object distance acquisition module configured to acquire the object distance between a photographed subject and a camera; a target class determination module configured to determine a target class among the data classes of an inertial sensor according to the object distance; a pose information determination module configured to determine pose information of the camera based on inertial sensing data of the target class; and a control parameter determination module configured to obtain optical anti-shake control parameters by analyzing the pose information of the camera.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the optical anti-shake control method of the first aspect described above and possible implementations thereof.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; a memory for storing executable instructions of the processor; and a camera including an optical anti-shake system. The processor is configured to perform the optical anti-shake control method of the first aspect described above and its possible implementations by executing the executable instructions.
The technical scheme of the present disclosure has the following beneficial effects:
according to the object distance between the photographed subject and the camera, a target class is determined among the inertial sensing data; pose information of the camera is determined using the inertial sensing data of that target class; and optical anti-shake control parameters are then obtained by analyzing the pose information, which can be used to realize accurate optical anti-shake control. On the one hand, optical anti-shake can be realized at different object distances, breaking through the object-distance limitation in the related art and widening the application range of optical anti-shake. On the other hand, by using only the target class of inertial sensing data suited to the current object distance for pose calculation, the amount of inertial sensing data to be processed is reduced to a certain extent, thereby improving the response speed and efficiency of optical anti-shake and mitigating potential device heating during shooting.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those skilled in the art from this disclosure that the drawings described below are merely some embodiments of the present disclosure and that other drawings may be made from these drawings without the exercise of inventive effort.
Fig. 1 shows a schematic configuration diagram of an electronic device in the present exemplary embodiment;
fig. 2 is a schematic diagram showing the principle of optical anti-shake in the present exemplary embodiment;
Fig. 3 shows a flowchart of an optical anti-shake control method in the present exemplary embodiment;
Fig. 4 is a schematic diagram showing that rotation and translation of the lens causes a screen shift in the present exemplary embodiment;
FIG. 5 illustrates a flow chart of determining a first object distance threshold in the present exemplary embodiment;
Fig. 6 shows a schematic diagram of determining first, second, and third object distance thresholds in the present exemplary embodiment;
Fig. 7 shows a flowchart of determining position adjustment information in the present exemplary embodiment;
Fig. 8 shows a schematic view in which a lens is moved in the X direction in the present exemplary embodiment;
fig. 9 is a schematic diagram showing an optical anti-shake process in the present exemplary embodiment;
fig. 10 is a schematic diagram showing a structure of an optical anti-shake control apparatus according to the present exemplary embodiment.
Detailed Description
Exemplary embodiments of the present disclosure will now be described with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many different forms and should not be construed as limited to the examples set forth herein. These embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the exemplary embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. However, those skilled in the art will recognize that the aspects of the present disclosure may be practiced with one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
In the related art, the optical anti-shake has a certain limitation on the object distance, and a better anti-shake effect can be generally realized only in a specific object distance range, so that applicable scenes are limited.
In view of the above, exemplary embodiments of the present disclosure provide an optical anti-shake control method, and an electronic apparatus for performing the optical anti-shake control method. The electronic device will be first described below.
The electronic device may include a processor, a memory, and a camera. The camera includes an optical anti-shake system. The memory stores executable instructions of the processor, such as program code. The processor executes the optical anti-shake control method in the present exemplary embodiment by executing these instructions. The electronic device may be a mobile phone, a tablet computer, a digital camera, an unmanned aerial vehicle, a smart wearable device, or the like.
The configuration of the electronic device will be exemplarily described below, taking the mobile terminal 100 in fig. 1 as an example. It will be appreciated by those skilled in the art that, aside from components used specifically for mobile purposes, the configuration of fig. 1 can also be applied to stationary devices.
Referring to fig. 1, the mobile terminal 100 may specifically include: processor 110, memory 120, communication module 130, bus 140, display module 150, power module 160, camera 170, and sensor module 180.
The processor 110 may include one or more processing units, for example: an AP (application processor), a modem processor, a GPU (graphics processing unit), an ISP (image signal processor), a controller, an encoder, a decoder, a DSP (digital signal processor), a baseband processor, and/or an NPU (neural-network processing unit). The optical anti-shake control method in the present exemplary embodiment may be performed by one or more of the AP, GPU, DSP, or ISP. When the method involves a neural network, it may also be performed by the NPU.
The processor 110 may form a connection with the memory 120 or other components via a bus 140.
Memory 120 may be used to store computer-executable program code, which includes instructions. The processor 110 performs the various functional applications and data processing of the mobile terminal 100 by executing the instructions stored in the memory 120. For example, the memory 120 may store the program code of the optical anti-shake control method in the present exemplary embodiment, and the processor 110 implements the method by executing this code. The memory 120 may also store application data, such as image and video files.
The communication functions of the mobile terminal 100 may be implemented by the communication module 130 and an antenna, a modem processor, a baseband processor, etc. Antennas are used to transmit and receive electromagnetic wave signals, such as radio frequency signals. The communication module 130 may provide a mobile communication solution of 3G, 4G, 5G, etc. applied on the mobile terminal 100, or a wireless communication solution of wireless local area network, bluetooth, near field communication, etc.
The display module 150 is used to provide display functions of the mobile terminal 100, such as displaying a graphical user interface. The power module 160 is used to implement power management functions such as charging a battery, powering a device, monitoring a battery status, etc.
The camera 170 is used for capturing images or videos, and may include components such as a lens 171, an image sensor 172, and an optical anti-shake system 173, as well as other components not shown, such as a cover plate and a filter. The lens 171 is used to control the optical path. The image sensor 172 receives the optical signal and converts it into an electrical signal, which may be further converted into a digital signal. The optical anti-shake system 173 provides the optical anti-shake function of the camera 170, ensuring that the camera 170 can capture clearer images or videos when shake occurs.
In the present exemplary embodiment, an optical anti-shake control method may be performed by the processor 110 to control the optical anti-shake system 173. For example, an ISP may be provided in association with the camera 170, and the ISP performs an optical anti-shake control method, and transmits the generated control signal to the optical anti-shake system 173 to achieve corresponding control. In addition, ISP can also be used for realizing functions of automatic focusing, automatic exposure, automatic white balance, flicker detection, black level compensation and the like.
The sensor module 180 may include one or more sensors for implementing corresponding sensing functions. In the present exemplary embodiment, the sensor module 180 may include an inertial sensor 181, such as an accelerometer, a gyroscope, and a magnetometer, for sensing and collecting acceleration data, angular velocity data (or angular acceleration data, which can be converted into angular velocity data by integration and is therefore treated as equivalent below), and magnetometer data, respectively. The inertial sensor 181 senses the pose of the mobile terminal 100, from which the pose of the camera 170 is obtained.
In one embodiment, the inertial sensor 181 may be configured with the camera 170, for example, the inertial sensor 181 is disposed in the camera 170, which can accurately sense the pose of the camera 170.
In addition, the sensor module 180 may also include other sensors, such as a depth sensor, for sensing depth information. In one embodiment, a depth sensor may be provided in conjunction with camera 170 to form a depth camera. Also, the mobile terminal 100 may further include components not shown in other figures, such as an audio module, a touch input module, and the like.
The number of the above components is not limited in the present disclosure, for example, the number of cameras 170 is not limited, and two or three cameras may be provided according to actual requirements, and the like.
Based on the above mobile terminal 100, optical anti-shake is implemented as follows: the inertial sensor 181 senses the shake of the camera 170 during shooting and transmits the shake-related data to the ISP; the ISP performs the optical anti-shake control method in the present exemplary embodiment to obtain optical anti-shake control parameters; the ISP then transmits these parameters to a driver circuit (Driver IC), which controls the optical anti-shake system to make the corresponding adjustment. For example, the position of the lens 171 or the image sensor 172 in the camera 170 may be adjusted by controlling a motor in the optical anti-shake system to compensate for the shake, thereby reducing the image blur it causes.
Fig. 2 shows the principle of optical anti-shake. Under the condition of no shake, the optical axis direction of the lens 220 can be aligned to the shot object 230, and stable imaging can be performed on the imaging plane 210, so that the image is clearer. When shake occurs and optical anti-shake is not turned on, the optical axis of the lens 220 is shifted compared with the photographed object 230, so that the whole picture is shifted, and imaging is unstable due to movement of the lens 220, and an image is blurred. When shake occurs and optical anti-shake is started, shake can be compensated by adjusting the lens 220, so that the optical axis can be aligned to the shot object 230, imaging is stable, and the image is clearer.
The optical anti-shake control method in the present exemplary embodiment is described below with reference to fig. 3. FIG. 3 illustrates an exemplary flow of the optical anti-shake control method, which may include:
Step S310, obtaining the object distance between the shot object and the camera;
Step S320, determining a target class in the data classes of the inertial sensor according to the object distance;
step S330, determining pose information of the camera based on the inertial sensing data of the target class;
step S340, obtaining the optical anti-shake control parameters by analyzing the pose information of the camera.
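Taken together, steps S310 to S340 can be sketched as a simple control loop. The following sketch is illustrative only: the function names, the 1 m threshold, the calibration table, and the linear gain are all invented for the example and are not specified by this disclosure.

```python
# Hypothetical sketch of steps S310-S340. All names, thresholds, and
# numeric values below are invented for illustration.

def get_object_distance(af_code, calibration):
    """S310: nearest-neighbour lookup of object distance (m) from a
    pre-calibrated list of (af_code, distance) pairs."""
    return min(calibration, key=lambda p: abs(p[0] - af_code))[1]

def select_target_class(object_distance, threshold=1.0):
    """S320: at small object distances translation dominates, so use
    acceleration data; at large ones rotation dominates, so use
    angular velocity data."""
    return "acceleration" if object_distance < threshold else "angular_velocity"

def estimate_pose(samples, target_class):
    """S330: use only the target class of inertial samples (a plain
    sum stands in for integrating the shake signal)."""
    return sum(value for cls, value in samples if cls == target_class)

def compute_ois_parameter(pose, gain=0.2):
    """S340: map the pose estimate to a compensation command (e.g. a
    lens shift), here with a placeholder linear gain."""
    return -gain * pose

calibration = [(100, 0.1), (400, 0.5), (800, 2.0)]  # (AF code, metres)
samples = [("acceleration", 0.02), ("angular_velocity", 0.5),
           ("acceleration", -0.01)]

d = get_object_distance(420, calibration)   # -> 0.5 m
cls = select_target_class(d)                # -> "acceleration"
pose = estimate_pose(samples, cls)
command = compute_ois_parameter(pose)
```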
In the above optical anti-shake control method, a target class is determined among the inertial sensing data according to the object distance between the photographed subject and the camera; pose information of the camera is determined using the inertial sensing data of that target class; and optical anti-shake control parameters are then obtained by analyzing the pose information, which can be used to realize accurate optical anti-shake control. On the one hand, optical anti-shake can be realized at different object distances, breaking through the object-distance limitation in the related art and widening the application range of optical anti-shake. On the other hand, by using only the target class of inertial sensing data suited to the current object distance for pose calculation, the amount of inertial sensing data to be processed is reduced to a certain extent, thereby improving the response speed and efficiency of optical anti-shake and mitigating potential device heating during shooting.
Each step in fig. 3 is specifically described below.
In step S310, the object distance between the subject and the camera is acquired.
The object distance generally refers to the distance from the photographed subject to the optical center of the lens. In the present exemplary embodiment it is used more broadly to measure the distance between the subject and the camera, and may be the distance between the subject and any component of the camera (not limited to the lens), such as the distance between the subject and the camera's image sensor. The photographed subject may be a foreground object in the picture, an object at the center of the picture, an in-focus object, or the like. In step S310, the object distance measured when the previous frame or frames were captured may be used for optical anti-shake control of the current frame, or the object distance may be obtained based on a preview image.
The present disclosure is not limited to the particular manner in which the object distance is determined, and several examples are provided below.
In one embodiment, the step of obtaining the object distance between the photographed object and the camera may include the following steps:
Acquiring focusing parameters when a camera is aligned with a shot object;
And determining the object distance according to the focusing parameters.
The focus parameters may include a focal length, an auto-focus control parameter, and the like. The focal length may be determined by auto-focus or manual focus. The auto-focus control parameter may be a parameter that adjusts the lens position during auto-focus, such as an AF (Auto Focus) code. For example, the AF code may range from 0 to 1023 and indicate the position of the lens along the optical axis, with a larger AF code meaning the lens is farther from the image sensor. The AF code correlates strongly with the focal length and can therefore also be used to calculate the object distance.
The above-mentioned focusing parameter may be a focusing parameter for capturing an image of the current frame. In capturing successive images, since the focus parameter does not generally vary much between successive frames, the focus parameter in capturing the image of the previous frame or frames may also be used.
In one embodiment, where the focal length is obtained, the object distance may be calculated from the relationship of focal length, object distance, image distance, etc. in lens imaging.
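As a concrete illustration of this relationship, the Gaussian (thin-lens) equation 1/f = 1/u + 1/v can be solved for the object distance u once the focal length f and the image distance v are known. The numeric values below are invented for the example:

```python
def object_distance_from_lens(focal_length_mm, image_distance_mm):
    """Solve the thin-lens equation 1/f = 1/u + 1/v for the object
    distance u = f*v / (v - f). Returns None when the lens is focused
    at (or beyond) infinity, i.e. v <= f."""
    if image_distance_mm <= focal_length_mm:
        return None
    return (focal_length_mm * image_distance_mm
            / (image_distance_mm - focal_length_mm))

# Illustrative numbers: a 26 mm lens with the image plane 26.7 mm from
# the optical centre puts the subject at roughly 992 mm (about 1 m).
u_mm = object_distance_from_lens(26.0, 26.7)
```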
In one embodiment, the determining the object distance according to the focusing parameter may include the following steps:
and determining the object distance corresponding to the automatic focusing control parameter when the camera is aligned to the shot object based on the calibration relation between the automatic focusing control parameter and the object distance.
Generally, different object distances lead to different focal lengths during auto-focus, and hence different auto-focus control parameters for adjusting the lens position; that is, there is a correspondence between the auto-focus control parameter and the object distance. This correspondence can be calibrated in advance, for example by determining the auto-focus control parameter at each of several known object distances, yielding a calibration relationship between the two. When the current frame is captured, the corresponding object distance can then be looked up from the calibration relationship once the current auto-focus control parameter is obtained.
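A minimal sketch of such a calibration lookup follows. Linear interpolation between calibration points is an assumption (the disclosure does not prescribe the interpolation scheme), and the table values are invented:

```python
def calibrate_af_to_distance(samples):
    """Build an object-distance lookup from pre-measured
    (af_code, object_distance) pairs, linearly interpolating between
    calibration points and clamping outside the calibrated range."""
    pts = sorted(samples)

    def lookup(af_code):
        if af_code <= pts[0][0]:
            return pts[0][1]
        if af_code >= pts[-1][0]:
            return pts[-1][1]
        for (c0, d0), (c1, d1) in zip(pts, pts[1:]):
            if c0 <= af_code <= c1:
                t = (af_code - c0) / (c1 - c0)
                return d0 + t * (d1 - d0)

    return lookup

# Invented table over the 0-1023 AF code range mentioned in the text
# (larger code = lens farther from the sensor = nearer focus here).
lookup = calibrate_af_to_distance([(0, 5.0), (512, 1.0), (1023, 0.1)])
```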
In one embodiment, if the camera is a depth camera, depth information of the photographed subject may be acquired and the object distance determined from it. For example, the depth value of the subject may be obtained and used directly as the object distance; when the depth value is not unique, for example when different parts of the subject have different depth values, the depth values may be fused (e.g., by an average or weighted average) to obtain the object distance.
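A minimal sketch of such depth fusion; the particular weighting scheme is an illustrative assumption:

```python
def fuse_depths(depth_values, weights=None):
    """Fuse several depth readings of the subject into one object
    distance: a plain mean by default, or a weighted mean if weights
    are given."""
    if weights is None:
        return sum(depth_values) / len(depth_values)
    return sum(d * w for d, w in zip(depth_values, weights)) / sum(weights)

d_plain = fuse_depths([1.8, 2.0, 2.2])               # mean, about 2.0 m
d_weighted = fuse_depths([1.8, 2.0, 2.2], [1, 1, 2]) # weights favour 2.2 m
```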
In one embodiment, the object distance may be determined from two images of the photographed subject taken at different viewing angles. For example, the electronic device may include two cameras (a binocular camera) that capture two images of the subject; or, if the camera is moving, two images may be acquired at different moments, for example the two frames preceding the current frame, or preview images captured at two moments. Given the two images, the three-dimensional information of the subject can be reconstructed by the triangulation principle, and the object distance determined from it.
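As one concrete instance of triangulation, for a rectified stereo pair the depth follows Z = f·B/d from the focal length, baseline, and disparity. This is a simplification of the general three-dimensional reconstruction the text describes, and all numbers are illustrative:

```python
def depth_from_stereo(focal_px, baseline_m, disparity_px):
    """Triangulation for a rectified stereo pair: Z = f * B / d, with
    focal length f in pixels, baseline B (metres) between the two
    camera positions, and disparity d in pixels of the subject between
    the two images."""
    if disparity_px <= 0:
        raise ValueError("subject must have positive disparity")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 1000 px focal length, 12 mm baseline between
# the two views, 15 px disparity -> the subject is about 0.8 m away.
z_m = depth_from_stereo(1000.0, 0.012, 15.0)
```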
With continued reference to fig. 3, in step S320, a target class is determined among the data classes of the inertial sensors according to the object distance described above.
The data categories of the inertial sensor may be divided by the category of sensor, or in other ways, such as by the form of the data itself. By way of example, the data categories of the inertial sensor may include acceleration data and angular velocity data; magnetometer data, gravity data, and the like may also be included.
Camera shake generally involves two kinds of motion, rotation and translation. Taking lens shake as an example, the influence of rotation and translation on the picture offset is described below with reference to fig. 4; shake of the lens can be treated as equivalent to shake of the image sensor. Fig. 4 shows the picture offset at different object distances caused by lens rotation, and the picture offset at different object distances caused by lens translation. As can be seen from fig. 4, when the lens 220 rotates, the angle of the optical axis changes; at a small object distance, the distance between the optical axis (or focus) and the photographed subject 230 is small, i.e., the picture offset is small, and as the object distance increases, this distance, and hence the picture offset, becomes larger. When the lens 220 translates, the angle of the optical axis does not change, but the whole lens shifts relative to the subject 230; the distance between the optical axis (or focus) and the subject 230 then does not change with object distance, i.e., the picture offset is constant. Therefore, when the object distance is small, the picture offset is insensitive to lens rotation and sensitive to lens translation, so it is mainly translation that needs to be compensated; when the object distance is large, the picture offset is insensitive to lens translation and sensitive to lens rotation, so it is mainly rotation that needs to be compensated.
In fig. 4, the straight-line distance between the subject 230 and the actual optical axis is used as the screen offset, which is merely illustrative. The screen shift amount may be determined based on the curve distance, or based on the displacement of the feature points in the image, or the like. The principle is similar in either way of calculating the screen offset, and the relationship shown in fig. 4 is also true.
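The object-distance dependence described above can be illustrated with a small-angle model: rotation produces a picture offset that grows with object distance (roughly d·tan θ), while translation produces a constant offset. The 0.1-degree and 0.2 mm test amounts below are invented for illustration:

```python
import math

def offset_from_rotation(object_distance_m, angle_deg):
    """Picture offset caused by rotating the lens: grows with object
    distance (the rotation behaviour in fig. 4), modelled here as
    d * tan(angle)."""
    return object_distance_m * math.tan(math.radians(angle_deg))

def offset_from_translation(shift_m):
    """Picture offset caused by translating the lens: independent of
    object distance (the translation behaviour in fig. 4)."""
    return shift_m

# Invented test amounts: a 0.1-degree rotation vs a 0.2 mm translation.
near_rot = offset_from_rotation(0.05, 0.1)   # subject 5 cm away
far_rot = offset_from_rotation(2.0, 0.1)     # subject 2 m away
trans = offset_from_translation(0.0002)
# Close up, translation causes the larger offset; far away, rotation does.
```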
Therefore, in the present exemplary embodiment, whether the translational class or the rotational class of inertial sensing data serves as the target class can be decided according to the object distance, and shake compensation performed accordingly. The inertial sensing data of the translational class may be acceleration data, and the inertial sensing data of the rotational class may be angular velocity data.
In one embodiment, the determining the target class from the data classes of the inertial sensor according to the object distance may include the steps of:
responsive to the object distance being less than a first object distance threshold, determining acceleration data as a target class;
In response to the object distance being greater than the first object distance threshold, angular velocity data is determined as the target class.
The first object distance threshold may be an object distance when the image offset caused by rotation and translation of the lens is the same, and may be obtained according to experience or a pre-test. If the object distance between the photographed object and the camera is smaller than the first object distance threshold, that is, if the object distance is smaller, the acceleration data is determined as the target class because the influence of the translation on the screen shift is larger. If the object distance between the photographed object and the camera is greater than the first object distance threshold, that is, if the object distance is greater, the angular velocity data is determined as the target class because the influence of the rotation on the screen shift is greater.
In one embodiment, the first object distance threshold may be determined by a pre-test. Specifically, referring to fig. 5, the optical anti-shake control method may further include the following steps S510 to S530:
step S510, based on the relation between the translational compensation and the rotation compensation, obtaining a translational test amount and a corresponding rotation test amount.
The relationship between translation compensation and rotation compensation is the correspondence between an amount of translation and an amount of rotation that provide equivalent compensation. For example, optical anti-shake systems generally compensate for shake by translating the lens rather than rotating it; even when the lens rotates, the effect of the rotation can be reduced or even eliminated by a translational compensation. Empirically, a 1-degree rotation of the lens may be compensated by a 0.2 mm translation, so the relationship may include a correspondence of 0.2 mm to 1 degree. The relationship may be linear or nonlinear, which is not limited by the present disclosure.
Based on the relationship between translation compensation and rotation compensation, at least one set of a translation test amount and a corresponding rotation test amount may be obtained, which may be expressed in the form (translation amount A, rotation angle B). For example, the translation test amount may be 0.2 mm and the rotation test amount 1 degree, forming one set of test data.
In step S520, a first relationship between the object distance and the picture offset is obtained when the translation test amount is used to apply a translation to the camera, and a second relationship between the object distance and the picture offset is obtained when the rotation test amount is used to apply a rotation to the camera.
In this exemplary embodiment, the translation test amount may be used to apply a translation to the camera, for example translating the lens by 0.2 mm; the picture offset is then measured at different object distances to obtain the first relationship between the object distance and the picture offset. Likewise, the rotation test amount may be used to apply a rotation to the camera, for example rotating the lens by 1 degree; the picture offset is then measured at different object distances to obtain the second relationship between the object distance and the picture offset.
In step S530, a first object distance threshold is determined according to the first relationship and the second relationship.
According to the first relationship and the second relationship, the object distance at which the picture offset caused by the translation test amount equals the picture offset caused by the rotation test amount can be determined; this object distance is the first object distance threshold.
For example, referring to fig. 6, the first relationship and the second relationship may each be plotted as a curve, where the abscissa is the object distance and the ordinate is the picture offset. In general, the first relationship may be a straight line with a constant ordinate, while the second relationship may be a linear or quadratic curve, etc. The first object distance threshold is the object distance corresponding to the intersection of the first relationship curve and the second relationship curve.
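As a sketch of this step, the intersection can be located numerically from the two measured relationships. All distances, offset values, and the linear-interpolation helper below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Hypothetical measurements: picture offset at each tested object distance (mm)
# for a fixed translation test amount and a fixed rotation test amount.
object_distances = np.array([100, 200, 500, 1000, 2000, 5000], dtype=float)
offset_translation = np.full_like(object_distances, 12.0)  # first relationship: roughly constant
offset_rotation = 0.006 * object_distances                 # second relationship: grows with distance

def first_object_distance_threshold(d, off_t, off_r):
    """Object distance where the two offset curves intersect, i.e. where the
    translation test amount and rotation test amount cause equal picture offsets."""
    diff = off_r - off_t
    for k in range(len(d) - 1):
        if diff[k] * diff[k + 1] <= 0:  # sign change -> the curves cross in this segment
            t = diff[k] / (diff[k] - diff[k + 1])
            return d[k] + t * (d[k + 1] - d[k])
    return None  # curves do not cross within the tested range

threshold = first_object_distance_threshold(
    object_distances, offset_translation, offset_rotation)
```

With these made-up curves the crossing falls at an object distance of 2000 mm; in practice the curves would come from the pre-test of step S520.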
In one embodiment, the determining the target class from the data classes of the inertial sensor according to the object distance may include the steps of:
In response to the object distance being less than a second object distance threshold, determining acceleration data as the target class;
in response to the object distance being greater than the second object distance threshold and less than a third object distance threshold, determining both acceleration data and angular velocity data as target classes;
in response to the object distance being greater than the third object distance threshold, determining angular velocity data as the target class.
When the object distance is smaller than the second object distance threshold, the picture offset caused by rotation is small because the object distance is small; the influence of rotation can be ignored and only translation considered, so only the acceleration data is determined as the target class. When the object distance is greater than the third object distance threshold, the picture offset caused by translation is small because the object distance is large; the influence of translation can be ignored and only rotation considered, so only the angular velocity data is determined as the target class. When the object distance lies between the second object distance threshold and the third object distance threshold, the influence of both translation and rotation needs to be taken into account, so both are determined as target classes.
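The three-way selection above amounts to a simple dispatch on the object distance. The threshold values and class names in this sketch are placeholders, not values from the disclosure:

```python
def select_target_classes(object_distance, second_threshold, third_threshold):
    """Choose which inertial-sensor data classes to use for a given object distance."""
    if object_distance < second_threshold:
        return {"acceleration"}                     # translation dominates; ignore rotation
    if object_distance > third_threshold:
        return {"angular_velocity"}                 # rotation dominates; ignore translation
    return {"acceleration", "angular_velocity"}     # both contributions matter

# Example with placeholder thresholds (mm):
classes = select_target_classes(150, second_threshold=300, third_threshold=2000)
```

The single-threshold variant described earlier is the same dispatch with only the first branch and the last `return` replaced by `{"angular_velocity"}`.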
The second object distance threshold and the third object distance threshold may be obtained empirically or by a pre-test. For example, a preset ratio (e.g. 90% or another empirical value) may be set: when the picture offset caused by translation (or by rotation) accounts for more than the preset ratio of the total picture offset, the other contribution may be ignored. Referring to fig. 6, the picture offsets caused by the translation amount A and by the rotation angle B may be measured at different object distances, and the proportion of each in the total picture offset calculated. The object distance at which the proportion of the translation-induced picture offset equals the preset ratio is determined as the second object distance threshold; when the object distance is smaller than the second object distance threshold, the proportion of the rotation-induced offset is low enough to be ignored. The object distance at which the proportion of the rotation-induced picture offset equals the preset ratio is determined as the third object distance threshold; when the object distance is greater than the third object distance threshold, the proportion of the translation-induced offset is low enough to be ignored.
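As a sketch of this calibration, the two thresholds can be read off the share curves where each contribution's proportion of the total picture offset crosses the preset ratio. The offset curves and distances below are illustrative assumptions; only the 90% preset ratio comes from the text:

```python
import numpy as np

def crossing_distance(d, values, level):
    """First object distance where `values` crosses `level` (linear interpolation)."""
    diff = values - level
    for k in range(len(d) - 1):
        if diff[k] * diff[k + 1] <= 0:
            t = diff[k] / (diff[k] - diff[k + 1])
            return d[k] + t * (d[k + 1] - d[k])
    return None

# Hypothetical picture offsets measured for translation amount A and rotation angle B.
d = np.array([50, 100, 200, 500, 1000, 2000, 5000, 10000, 20000], dtype=float)
off_translation = np.full_like(d, 12.0)   # roughly constant with distance
off_rotation = 0.006 * d                  # grows with distance
total = off_translation + off_rotation
preset_ratio = 0.9

# Below the second threshold, translation supplies >= 90% of the total offset;
# above the third threshold, rotation supplies >= 90% of the total offset.
second_threshold = crossing_distance(d, off_translation / total, preset_ratio)
third_threshold = crossing_distance(d, off_rotation / total, preset_ratio)
```

Because the translation share falls with distance and the rotation share rises, the second threshold is always below the third for curves of this shape.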
Based on the first object distance threshold, or on the second and third object distance thresholds, a suitable target class may be determined for different object distances. Optical anti-shake control is then performed using only the inertial sensing data of the target class; inertial sensing data of other classes is not required.
In one embodiment, in the event that a target class is determined, inertial sensors of non-target classes may also be turned off. For example, if acceleration data is determined to be a target class, without using angular velocity data, a gyroscope for sensing angular velocity data may be turned off, thereby further reducing power consumption.
With continued reference to fig. 3, in step S330, pose information of the camera is determined based on inertial sensing data of the target class.
The pose information of the camera may include at least one of position data and pose data of the camera. The position data represents the absolute or relative position of the camera, and typically includes coordinates on the three X, Y, Z axes, although the disclosure is not limited thereto; the position data may also be expressed as spherical coordinates, etc. The pose data represents the orientation state of the camera. The form and specific content of the pose data are not limited by the present disclosure; it may be, for example, an absolute pose in a certain coordinate system, or a relative pose with respect to a reference pose. By way of example, the pose data may include any one or more of a pose quaternion, Euler angles, and a rotation matrix.
In one embodiment, the target class may include acceleration data. After the acceleration data are obtained, the triaxial displacement data of the camera, namely translation data, can be obtained by correcting the acceleration data and integrating the acceleration data in the time domain.
In one embodiment, the target class may include angular velocity data. After angular velocity data are acquired, three-axis angle change data, namely rotation data, of the camera can be obtained through correction and integration of the angular velocity data in the time domain.
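The two integrations above can be sketched per axis as follows; a constant-bias subtraction stands in for the unspecified correction step, and the sample values are illustrative:

```python
import numpy as np

def integrate_acceleration(acc, dt, bias=0.0):
    """Double-integrate bias-corrected acceleration samples (one axis) over
    the time domain to obtain displacement, i.e. translation data."""
    acc = np.asarray(acc, dtype=float) - bias
    velocity = np.cumsum(acc) * dt       # first integral: velocity
    return np.cumsum(velocity) * dt      # second integral: displacement

def integrate_angular_velocity(gyro, dt, bias=0.0):
    """Integrate bias-corrected angular-velocity samples (one axis) over
    the time domain to obtain the angle change, i.e. rotation data."""
    gyro = np.asarray(gyro, dtype=float) - bias
    return np.cumsum(gyro) * dt

dt = 0.02                                            # 50 Hz sensing period (s)
angles = integrate_angular_velocity([0.1] * 10, dt)  # constant 0.1 rad/s input
```

Real implementations would additionally handle three axes jointly and suppress integration drift; this sketch only shows the time-domain integration itself.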
In one embodiment, the pose information of the camera may be determined in a machine learning manner. For example, a pose prediction model may be trained in advance, and the pose prediction model may be an LSTM (Long Short-Term Memory) network, for example. In step S330, inertial sensor data of the target class is input into the pose prediction model, and pose information of the camera is output through processing of the model.
In one embodiment, the pose information of the camera may be represented as a vector Q_i (where i denotes the current moment i), which may include the position data and pose data in the X, Y, Z axial directions.
In one embodiment, pose information of the camera may be periodically determined. For example, the sensing frequency of the inertial sensor is 50Hz, that is, the period is 20ms, which means that the inertial sensor collects inertial sensing data once every 20ms, and then pose information of the camera can be determined once every 20 ms.
In one embodiment, since the pose information of the camera is used to determine its shake condition, relative pose information of the camera may be used for the calculation; that is, relative pose information of the camera may be determined in step S330. For example, the relative pose information of the camera at the current moment with respect to a preamble moment of the current moment may be determined based on the inertial sensing data of the target class.
In one embodiment, the target class may include data classes of at least two inertial sensors. If the sensing frequencies of the at least two inertial sensors are different, the frequencies of the data output by the different kinds of inertial sensors can be the same by upsampling or downsampling the data of at least one of the inertial sensors. For example, the target class includes acceleration data and angular velocity data, and the frequency of the accelerometer collecting the acceleration data is different from the frequency of the gyroscope collecting the angular velocity data, so that the acceleration data or the angular velocity data can be up-sampled or down-sampled, and the two data can be time-synchronized.
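One common way to equalize the output frequencies is to interpolate one stream onto the other stream's timestamps. The rates and signal in this sketch are illustrative assumptions:

```python
import numpy as np

def resample_to(t_src, v_src, t_dst):
    """Up- or down-sample one sensor stream onto another stream's timestamps
    by linear interpolation, so both streams share the same time base."""
    return np.interp(t_dst, t_src, v_src)

# Illustrative: a 100 Hz accelerometer stream resampled onto a 50 Hz gyroscope grid.
t_acc = np.arange(10) * 0.01          # 100 Hz timestamps (s)
acc_x = np.linspace(0.0, 0.9, 10)     # some X-axis acceleration signal
t_gyro = np.arange(5) * 0.02          # 50 Hz timestamps (s)
acc_on_gyro_grid = resample_to(t_acc, acc_x, t_gyro)
```

After this step the acceleration and angular-velocity samples can be paired index by index on the shared 50 Hz grid.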
There may also be a time error between different kinds of inertial sensors. For example, an accelerometer and a gyroscope may both output data stamped at time t, yet due to sensor response delay, data transmission delay, and other factors, the actual sensing times corresponding to the two outputs may differ. In one embodiment, the time error between different kinds of inertial sensors may be corrected in advance, for example by calibrating the time error between the accelerometer and the gyroscope with the aid of other or external sensors and compensating so as to minimize the time error.
With continued reference to fig. 3, in step S340, the optical anti-shake control parameter is obtained by analyzing pose information of the camera.
Wherein the optical anti-shake control parameter is used to control one or more components in the optical anti-shake system, for example to control the lens or the image sensor to perform a position adjustment; that is, the optical anti-shake control parameter may include position adjustment information for the lens or the image sensor in the camera.
The shake condition of the camera can be obtained by analyzing its pose information. In this exemplary embodiment, camera shake can be represented by anti-shake compensation information. Camera shake may refer to the amount of additional motion of the camera relative to a smooth motion state. The smooth motion state may include: a stationary state, a motion state in which the speed (translational speed or rotational angular speed) remains unchanged, a motion state in which the acceleration (linear or angular) remains unchanged, a motion state in which the jerk remains unchanged, and the like. The anti-shake compensation information may be the above additional motion amount, or its inverse (used to compensate for the additional motion amount).
In one embodiment, referring to fig. 7, the obtaining the optical anti-shake control parameter by analyzing the pose information of the camera may include the following steps S710 and S720:
Step S710, determining anti-shake compensation information according to the pose information of the camera at the current moment and the pose information of at least one preamble moment of the current moment.
The preamble moment of the current moment may be any moment before the current moment. For example, when the current moment is denoted as moment i, the preamble moment may be moment i-1 (the most recent inertial sensing data acquisition moment before the current moment), or moment i-2 (the second most recent acquisition moment), etc.
Smoothed pose information at the current moment, i.e. the expected pose information of the camera had it maintained smooth motion, can be obtained from the pose information at the preamble moments. For example, the pose information of the camera at multiple preamble moments may be smoothed to obtain the smoothed pose information at the current moment; the smoothing may include fitting the pose information at the preamble moments and reading the smoothed pose at the current moment from the fitted curve. The pose information of the camera at the current moment obtained in step S330 serves as the actual pose information, and the anti-shake compensation information may be calculated as the difference between the actual pose information and the smoothed pose information.
In one embodiment, the pose information of the camera at the current moment and at the preamble moments may be filtered to obtain filtered pose information at the current moment, which may serve as the smoothed pose information; anti-shake compensation information is then determined based on the actual pose information and the filtered pose information at the current moment. In the filtering process, the pose information of a preamble moment may be either the filtered pose information or the actual pose information of that moment. For example, after the actual pose information at moment i is obtained, it may be fused (e.g. by weighted fusion) with the filtered pose information at moment i-1 to obtain the filtered pose information at moment i.
In one embodiment, the filtering pose information at the current time may be determined based on the anti-shake strength, the pose information at the current time, and the pose information at the preamble time; and then determining anti-shake compensation information based on the actual pose information at the current moment and the filtering pose information at the current moment.
The anti-shake intensity indicates the degree of optical anti-shake applied: the greater the anti-shake intensity, the stronger the applied optical anti-shake, i.e. the better the picture stabilization effect. The anti-shake intensity may be set by the user or automatically by the system; for example, the system may set different anti-shake intensities for different shooting modes, or pre-configure anti-shake intensities for different motion states (e.g. the higher the motion speed, the greater the anti-shake intensity) and determine the current anti-shake intensity from the current motion state (which may be obtained from the pose information). The anti-shake intensity may determine the filtering intensity: when filtering the pose information at the current moment, the anti-shake intensity may serve as the weight of the pose information at the preceding moment, so that the greater the anti-shake intensity, the higher that weight and the stronger the smoothing.
The specific filtering method is not limited by this disclosure; for example, weighted fusion filtering may be used, as may other methods such as Kalman filtering.
For example, the filtered pose Qfilter_i at the current moment may be determined based on the anti-shake intensity alpha, the actual pose Q_i at the current moment, and the filtered pose Qfilter_(i-1) at the preceding moment, as follows:
Qfilter_i = f(Q_i, Qfilter_(i-1), alpha) (1)
In the above filtering process, Qfilter_(i-1) has a weight of alpha and Q_i has a weight of 1-alpha.
Further, the filtered pose Qfilter_i may be subtracted from the actual pose Q_i at the current moment to obtain the anti-shake compensation amount ΔQ, that is, the anti-shake compensation information. The formula is as follows:
ΔQ = Q_i - Qfilter_i (2)
The anti-shake compensation amount ΔQ is a vector; it may be an angle compensation amount, a translation compensation amount, or a fusion of the two.
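A minimal sketch of formulas (1) and (2), taking f to be the weighted fusion just described; the 3-axis pose values and the alpha value are illustrative assumptions:

```python
import numpy as np

def filtered_pose(q_i, q_filter_prev, alpha):
    """Formula (1): the previous filtered pose Qfilter_(i-1) carries weight
    alpha, the current actual pose Q_i carries weight 1 - alpha."""
    return alpha * q_filter_prev + (1.0 - alpha) * q_i

def anti_shake_compensation(q_i, q_filter_i):
    """Formula (2): compensation amount is actual pose minus filtered pose."""
    return q_i - q_filter_i

# Illustrative 3-axis Euler-angle pose (degrees).
q_filter_prev = np.array([0.0, 0.0, 0.0])   # filtered pose at moment i-1
q_i = np.array([0.5, -0.2, 0.1])            # actual (shaky) pose at moment i
alpha = 0.8                                  # anti-shake intensity

q_filter_i = filtered_pose(q_i, q_filter_prev, alpha)
delta_q = anti_shake_compensation(q_i, q_filter_i)
```

The larger alpha is, the more the filtered pose tracks its own history rather than the shaky measurement, so more of the measured motion lands in the compensation amount.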
When the target class includes only angular velocity data, the actual pose Q_i includes only rotation data, which may be, for example, Euler angles (X, Y, Z), and the calculated anti-shake compensation amount ΔQ may be an angle compensation amount including the rotation-angle compensation amounts Δx, Δy, Δz on the X, Y, Z axes.
When the target class includes only acceleration data, the actual pose Q_i includes only translation data, such as displacements (shift_x, shift_y, shift_z), and the calculated anti-shake compensation amount ΔQ may be a translation compensation amount including the translation compensation amounts shift_x, shift_y, shift_z on the X, Y, Z axes.
When the target class includes both angular velocity data and acceleration data, the actual pose Q_i may include both rotation data and translation data, for example in the form (X, Y, Z, shift_x, shift_y, shift_z), where the first three components are the Euler angles of the three axes and the last three are the displacements along the three axes; the calculated anti-shake compensation amount ΔQ may then be a fusion of the angle compensation amount and the translation compensation amount. For example, the angle compensation amount ΔQ_1 may be calculated from the rotation data (X, Y, Z) according to the above formula, and the translation compensation amount ΔQ_2 from the translation data (shift_x, shift_y, shift_z). ΔQ_1 and ΔQ_2 are then fused in the corresponding directions to obtain the anti-shake compensation amount ΔQ; specifically, ΔQ_1 and ΔQ_2 may be weighted and summed on each corresponding axis with their respective weights to obtain ΔQ.
Step S720, determining position adjustment information according to the anti-shake compensation information.
The optical anti-shake system can compensate shake of the camera by moving the position of the lens or the image sensor so as to realize optical anti-shake. The position adjustment information may be used to adjust the position of the lens or the image sensor.
In one embodiment, adjusting the position of the lens is taken as an example. The lens may be moved in one or more directions. For example, in current optical anti-shake systems the lens may be moved within a plane (such as the XY plane), that is, in the X direction and the Y direction. Of course, the present disclosure is not limited thereto; the lens may also be moved within a three-dimensional space, that is, in the X direction, the Y direction, and the Z direction.
Fig. 8 shows a schematic view of the lens being moved in the X direction. The range of travel of the lens is limited, constrained by the hardware of the optical anti-shake system (e.g. frame, motor, etc.); the full stroke includes a linear stroke and a nonlinear stroke. When the lens is at the middle position of the full stroke, its linear stroke to the left accounts for half of the whole linear stroke, and likewise to the right, so the compensation capability is the same in both directions. Since the shake direction of the camera is typically random, the lens may be positioned at the stroke center by default so that the left and right compensation capabilities are equal. The position adjustment information may include the stroke by which the lens is moved, expressed for example as a code value from -1023 to 1023: when the lens is at the stroke center the code value is 0; moving left is negative, with code value -1023 indicating the lens has moved left to the stroke boundary; moving right is positive, with code value 1023 indicating the lens has moved right to the stroke boundary.
In one embodiment, the lens may be moved in the X direction and the Y direction, and the position adjustment information may include a target stroke position in the X direction and a target stroke position in the Y direction, i.e. the stroke positions after the movement. After the anti-shake compensation amount ΔQ is obtained, the anti-shake compensation amounts in the X direction and the Y direction can be extracted, and the strokes in the X direction and the Y direction calculated correspondingly.
In one embodiment, a preset calibration value may be acquired, and the position adjustment information determined based on the anti-shake compensation information and the preset calibration value. The preset calibration value may be used to represent the relationship between the anti-shake compensation information and the stroke code value; it may be a preset empirical value, or a value set by the user or automatically by the system.
For example, the sum of the stroke center position and the product of the anti-shake compensation amount and the preset calibration value may be used as the corresponding compensation stroke value, and the target stroke position is then obtained by fusion with the current stroke position. The formulas are as follows:
Δcode_x = (Δx · gain_x) + center_code_x
Δcode_y = (Δy · gain_y) + center_code_y (3)
target_Hall_x = f(Δcode_x, Hall_x)
target_Hall_y = f(Δcode_y, Hall_y) (4)
Here gain_x and gain_y are the preset calibration values in the X and Y directions; center_code_x and center_code_y are the middle positions of the full stroke of the lens; Hall_x and Hall_y are the current stroke positions of the lens; and Δcode_x and Δcode_y are the calculated compensation stroke values. f denotes the fusion of the current stroke values with the compensation stroke values Δcode_x and Δcode_y; the fusion may be a filter (such as a Kalman filter) or another fusion algorithm. target_Hall_x and target_Hall_y are the target stroke positions in the X and Y directions, which constitute the position adjustment information.
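A per-axis sketch of formulas (3) and (4). The gain values, the weighted blend standing in for the unspecified fusion f, and the clamp to the ±1023 code range are illustrative assumptions:

```python
import numpy as np

FULL_STROKE_CODE = 1023  # code value at the stroke boundary (see Fig. 8 discussion)

def target_stroke(delta, gain, center_code, hall, blend=0.9):
    """Per-axis target stroke position.
    Formula (3): compensation stroke code from the anti-shake compensation amount.
    Formula (4): fuse with the current Hall position (a weighted blend is used
    here as a stand-in for the unspecified fusion f), then clamp to the
    physical stroke range."""
    delta_code = delta * gain + center_code
    target = blend * delta_code + (1.0 - blend) * hall
    return float(np.clip(target, -FULL_STROKE_CODE, FULL_STROKE_CODE))

# Illustrative values for both axes (gains and compensation amounts are made up).
target_hall_x = target_stroke(delta=0.4, gain=500.0, center_code=0.0, hall=10.0)
target_hall_y = target_stroke(delta=-0.16, gain=500.0, center_code=0.0, hall=-5.0)
```

The clamp reflects that a compensation demand beyond the stroke boundary can only be satisfied up to the hardware limit.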
It should be appreciated that the present exemplary embodiment may also achieve optical anti-shake by moving the image sensor, or by moving the lens and the image sensor simultaneously. The principle and method for calculating the position adjustment information of the image sensor are the same as those for the lens, and are therefore not repeated here.
Fig. 9 shows a schematic diagram of the optical anti-shake process. The target class is determined according to the object distance, and inertial sensing data of the target class is acquired. The real-time position values of the camera lens in the X and Y directions at the current moment, namely the current stroke position Hall_x in the X direction and the current stroke position Hall_y in the Y direction, are acquired. The inertial sensing data is input together with the current stroke positions. The real-time pose of the target object is calculated from the inertial sensing data of the target class, where the target object may be the inertial sensor, the camera, or the camera's lens or image sensor; the filtered pose and the corresponding anti-shake compensation amount are calculated; the target stroke positions are calculated from the preset calibration values, the current stroke positions and the anti-shake compensation amount; and the calculated target stroke positions are sent to the driving circuit to drive the motor to move the lens to the target stroke positions target_Hall_x and target_Hall_y, completing the optical anti-shake compensation.
Exemplary embodiments of the present disclosure also provide an optical anti-shake control apparatus. Referring to fig. 10, the optical anti-shake control apparatus 1000 may include:
an object distance acquiring module 1010 configured to acquire an object distance between a subject and a camera;
A target class determination module 1020 configured to determine a target class from among data classes of the inertial sensor based on the object distance;
a pose information determination module 1030 configured to determine pose information of the camera based on inertial sensing data of the target class;
the control parameter determining module 1040 is configured to obtain the optical anti-shake control parameter by analyzing pose information of the camera.
In one embodiment, determining the target class from among data classes of the inertial sensor according to the object distance includes:
Responsive to the object distance being less than the first object distance threshold, determining acceleration data as a target class;
In response to the object distance being greater than the first object distance threshold, angular velocity data is determined as the target class.
In one embodiment, object distance acquisition module 1010 is further configured to:
Based on the relation between translation compensation and rotation compensation, obtaining translation test quantity and corresponding rotation test quantity;
Under the condition that translation is applied to the camera by using the translation test quantity, a first relation between the object distance and the picture offset is obtained, and under the condition that rotation is applied to the camera by using the rotation test quantity, a second relation between the object distance and the picture offset is obtained;
a first object distance threshold is determined from the first relationship and the second relationship.
In one embodiment, determining the target class from among data classes of the inertial sensor according to the object distance includes:
Responsive to the object distance being less than the second object distance threshold, determining acceleration data as a target class;
determining both the acceleration data and the angular velocity data as target categories in response to the object distance being greater than the second object distance threshold and less than the third object distance threshold;
in response to the object distance being greater than the third object distance threshold, the angular velocity data is determined to be the target class.
In one embodiment, the acquiring the object distance between the photographed object and the camera includes:
Acquiring focusing parameters when a camera is aligned with a shot object;
And determining the object distance according to the focusing parameters.
In one embodiment, the focus parameters may include autofocus control parameters. The determining the object distance according to the focusing parameter includes:
and determining the object distance corresponding to the automatic focusing control parameter when the camera is aligned to the shot object based on the calibration relation between the automatic focusing control parameter and the object distance.
In one embodiment, the optical anti-shake control parameter includes position adjustment information for a lens or an image sensor in the camera; the method for obtaining the optical anti-shake control parameters by analyzing the pose information of the camera comprises the following steps:
Determining anti-shake compensation information according to the pose information of the camera at the current moment and the pose information of at least one preamble moment of the current moment;
and determining position adjustment information according to the anti-shake compensation information.
The specific details of each part in the above apparatus are already described in the method part embodiments, and the details not disclosed can refer to the embodiment content of the method part, so that the details are not repeated.
Exemplary embodiments of the present disclosure also provide a computer readable storage medium, which may be implemented in the form of a program product comprising program code for causing an electronic device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the above section of the "exemplary method" when the program product is run on the electronic device. In an alternative embodiment, the program product may be implemented as a portable compact disc read only memory (CD-ROM) and comprises program code and may run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (7)

1. An optical anti-shake control method, comprising:
acquiring an object distance between a photographed object and a camera;
in response to the object distance being less than a first object distance threshold, determining acceleration data as a target class, and in response to the object distance being greater than the first object distance threshold, determining angular velocity data as the target class; or, in response to the object distance being less than a second object distance threshold, determining acceleration data as the target class, in response to the object distance being greater than the second object distance threshold and less than a third object distance threshold, determining both acceleration data and angular velocity data as the target class, and in response to the object distance being greater than the third object distance threshold, determining angular velocity data as the target class;
determining pose information of the camera based on the acceleration data and/or the angular velocity data of the target class;
analyzing pose information of the camera to obtain optical anti-shake control parameters;
wherein the optical anti-shake control parameters include position adjustment information for a lens or an image sensor in the camera, and the analyzing the pose information of the camera to obtain the optical anti-shake control parameters comprises:
determining anti-shake compensation information according to pose information of the camera at a current moment and pose information of at least one moment preceding the current moment;
and determining the position adjustment information according to the anti-shake compensation information.
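The selection logic of claim 1 can be sketched in a few lines. This is an illustrative reading, not the patent's implementation; the threshold names and the use of Python sets are our own choices:

```python
def select_target_class(object_distance, first_threshold):
    """Single-threshold variant of claim 1: at close range, translational
    shake dominates the picture offset, so acceleration data is selected;
    at long range, rotational shake dominates, so angular velocity data is
    selected instead."""
    if object_distance < first_threshold:
        return {"acceleration"}
    return {"angular_velocity"}


def select_target_class_dual(object_distance, second_threshold, third_threshold):
    """Two-threshold variant of claim 1: an intermediate distance band
    uses both sensor data types for pose estimation."""
    if object_distance < second_threshold:
        return {"acceleration"}
    if object_distance < third_threshold:
        return {"acceleration", "angular_velocity"}
    return {"angular_velocity"}
```

With thresholds in meters, `select_target_class_dual(0.5, 0.3, 1.0)` falls in the intermediate band and returns both classes.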
2. The method according to claim 1, wherein the method further comprises:
obtaining a translation test quantity and a corresponding rotation test quantity based on a relation between translation compensation and rotation compensation;
obtaining a first relation between object distance and picture offset when a translation of the translation test quantity is applied to the camera, and obtaining a second relation between object distance and picture offset when a rotation of the rotation test quantity is applied to the camera;
and determining the first object distance threshold according to the first relation and the second relation.
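One plausible way to derive the threshold from the two relations in claim 2, under a thin-lens assumption that is ours rather than the patent's: a camera translation shifts the picture by an amount that falls off with object distance, while a rotation shifts it by a roughly distance-independent amount, so the threshold can be taken as the crossover distance:

```python
def first_object_distance_threshold(trans_coeff, rot_offset):
    """Hypothetical sketch: the first relation is modeled as
    picture_offset = trans_coeff / d for object distance d (translation
    test), and the second relation as a constant rot_offset (rotation
    test).  The crossover distance where the two offsets are equal is one
    plausible choice of first object distance threshold."""
    # trans_coeff / d == rot_offset  =>  d == trans_coeff / rot_offset
    return trans_coeff / rot_offset
```

Below the returned distance, translation-induced offset exceeds rotation-induced offset, matching the claim's use of acceleration data at close range.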
3. The method of claim 1, wherein the acquiring the object distance between the photographed object and the camera comprises:
acquiring a focusing parameter when the camera is aimed at the photographed object;
and determining the object distance according to the focusing parameter.
4. A method according to claim 3, wherein the focus parameter comprises an autofocus control parameter; the determining the object distance according to the focusing parameter comprises the following steps:
determining, based on a calibration relation between the automatic focusing control parameter and the object distance, the object distance corresponding to the automatic focusing control parameter when the camera is aimed at the photographed object.
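The calibration relation in claim 4 is naturally stored as a table of (autofocus code, object distance) pairs. A minimal lookup sketch, with interpolation between calibration points as our own assumption (the patent does not specify the table format or interpolation):

```python
import bisect

def object_distance_from_af(af_code, calibration):
    """Map an autofocus control parameter to an object distance via a
    calibrated table of (af_code, distance) pairs sorted by af_code,
    linearly interpolating between adjacent calibration points and
    clamping outside the calibrated range. Table values are illustrative."""
    codes = [c for c, _ in calibration]
    dists = [d for _, d in calibration]
    if af_code <= codes[0]:
        return dists[0]
    if af_code >= codes[-1]:
        return dists[-1]
    i = bisect.bisect_left(codes, af_code)   # first code >= af_code
    c0, c1 = codes[i - 1], codes[i]
    d0, d1 = dists[i - 1], dists[i]
    return d0 + (d1 - d0) * (af_code - c0) / (c1 - c0)
```

For example, with a calibration table `[(100, 0.1), (200, 0.5), (300, 2.0)]`, an autofocus code of 150 interpolates to an object distance of 0.3.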
5. An optical anti-shake control apparatus, comprising:
an object distance acquisition module configured to acquire an object distance between a photographed object and a camera;
a target class determination module configured to determine acceleration data as a target class in response to the object distance being less than a first object distance threshold, and to determine angular velocity data as the target class in response to the object distance being greater than the first object distance threshold; or, to determine acceleration data as the target class in response to the object distance being less than a second object distance threshold, to determine both acceleration data and angular velocity data as the target class in response to the object distance being greater than the second object distance threshold and less than a third object distance threshold, and to determine angular velocity data as the target class in response to the object distance being greater than the third object distance threshold;
a pose information determination module configured to determine pose information of the camera based on the acceleration data and/or the angular velocity data of the target class;
and a control parameter determination module configured to obtain optical anti-shake control parameters by analyzing the pose information of the camera, wherein the optical anti-shake control parameters include position adjustment information for a lens or an image sensor in the camera, and the control parameter determination module is configured to: determine anti-shake compensation information according to pose information of the camera at a current moment and pose information of at least one moment preceding the current moment; and determine the position adjustment information according to the anti-shake compensation information.
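The compensation step shared by claims 1 and 5 can be sketched as follows. The choice of a running-mean reference over the preceding poses and a unit gain are illustrative assumptions; the claims only require that compensation be derived from the current pose and at least one preceding pose:

```python
def anti_shake_compensation(poses, gain=1.0):
    """Sketch of the claimed compensation step: treat the deviation of the
    current pose (last element) from the mean of the preceding poses as
    unwanted shake, and return a position adjustment that drives the lens
    or image sensor the opposite way."""
    *history, current = poses
    reference = sum(history) / len(history)  # smoothed "intended" pose
    shake = current - reference
    return -gain * shake  # position adjustment information
```

A pose sequence `[0.0, 0.0, 2.0]` (a sudden 2-unit jolt) yields an adjustment of `-2.0`, moving the lens or sensor against the jolt.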
6. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any of claims 1 to 4.
7. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
and a camera comprising an optical anti-shake system;
Wherein the processor is configured to perform the method of any one of claims 1 to 4 via execution of the executable instructions.
CN202210181901.XA 2022-02-25 2022-02-25 Optical anti-shake control method and device, storage medium and electronic equipment Active CN114449173B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210181901.XA CN114449173B (en) 2022-02-25 2022-02-25 Optical anti-shake control method and device, storage medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN114449173A (en) 2022-05-06
CN114449173B true CN114449173B (en) 2024-07-02

Family

ID=81373640

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210181901.XA Active CN114449173B (en) 2022-02-25 2022-02-25 Optical anti-shake control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114449173B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115103108B (en) * 2022-06-06 2024-07-16 Oppo广东移动通信有限公司 Anti-shake processing method, device, electronic equipment and computer readable storage medium
CN115134525B (en) * 2022-06-27 2024-05-17 维沃移动通信有限公司 Data transmission method, inertial measurement unit and optical anti-shake unit
CN116300294B (en) * 2022-10-25 2024-04-12 荣耀终端有限公司 Method and device for simulating human body shake
CN117119303A (en) * 2023-04-07 2023-11-24 荣耀终端有限公司 Control method of camera module
CN116095489B (en) * 2023-04-11 2023-06-09 北京城建智控科技股份有限公司 Collaborative anti-shake method based on camera device and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103685950A (en) * 2013-12-06 2014-03-26 华为技术有限公司 Method and device for preventing shaking of video image
CN111355888A (en) * 2020-03-06 2020-06-30 Oppo广东移动通信有限公司 Video shooting method and device, storage medium and terminal

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105100614B (en) * 2015-07-24 2018-07-31 小米科技有限责任公司 The implementation method and device of optical anti-vibration, electronic equipment
JP2018146663A (en) * 2017-03-02 2018-09-20 キヤノン株式会社 Image tremor correction device, control method of the same, imaging device, and lens device
CN109639893A (en) * 2018-12-14 2019-04-16 Oppo广东移动通信有限公司 Play parameter method of adjustment, device, electronic equipment and storage medium
WO2021258321A1 (en) * 2020-06-24 2021-12-30 华为技术有限公司 Image acquisition method and apparatus
CN112637489A (en) * 2020-12-18 2021-04-09 努比亚技术有限公司 Image shooting method, terminal and storage medium
CN113452914A (en) * 2021-06-28 2021-09-28 上海艾为电子技术股份有限公司 Optical anti-shake control device, optical anti-shake control method thereof and mobile terminal


Also Published As

Publication number Publication date
CN114449173A (en) 2022-05-06

Similar Documents

Publication Publication Date Title
CN114449173B (en) Optical anti-shake control method and device, storage medium and electronic equipment
US9979889B2 (en) Combined optical and electronic image stabilization
CN109194876B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
KR101528860B1 (en) Method and apparatus for correcting a shakiness in digital photographing apparatus
CN110035228B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
KR102143456B1 (en) Depth information acquisition method and apparatus, and image collection device
US10277809B2 (en) Imaging device and imaging method
CN104065868B (en) Image capture apparatus and control method thereof
CN104853084B (en) Image processing equipment and its control method
WO2020037959A1 (en) Image processing method, image processing apparatus, electronic device and storage medium
US9407827B2 (en) Method and system for capturing sequences of images with compensation for variations in magnification
CN107615744B (en) Image shooting parameter determining method and camera device
CN110300263B (en) Gyroscope processing method and device, electronic equipment and computer readable storage medium
JP6932531B2 (en) Image blur correction device, image pickup device, control method of image pickup device
EP3267675B1 (en) Terminal device and photographing method
CN114531546A (en) Lens adjusting method and device, storage medium and electronic equipment
US9407811B2 (en) Focus control unit in imaging apparatus, method of controlling the focus control unit and medium for controlling the focus control unit
WO2015033810A1 (en) Imaging device, method and program
US20160275657A1 (en) Imaging apparatus, image processing apparatus and method of processing image
US20190098215A1 (en) Image blur correction device and control method
JP5393877B2 (en) Imaging device and integrated circuit
JP2020136774A (en) Image processing apparatus for detecting motion vector, control method of the same, and program
EP4013030A1 (en) Image processing method and apparatus, and electronic device and computer-readable storage medium
CN115022540B (en) Anti-shake control method, device and system and electronic equipment
CN118413739A (en) Electronic image stabilization method, electronic image stabilization device, medium and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant