WO2024119367A1 - Method for shooting video, computer-readable storage medium and electronic device - Google Patents

Method for shooting video, computer-readable storage medium and electronic device

Info

Publication number
WO2024119367A1
Authority
WO
WIPO (PCT)
Prior art keywords
time
exposure
centering
start time
exposure start
Prior art date
Application number
PCT/CN2022/136894
Other languages
French (fr)
Inventor
Atsushi Matsutani
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2022/136894
Publication of WO2024119367A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/685Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A method of shooting a video is disclosed. The method includes acquiring a photometric value in a captured image (S1), calculating an exposure start time and an exposure end time based on the photometric value (S2), and performing an exposure shift process of delaying the exposure start time and the exposure end time such that a centering time for an anti-vibration lens can be secured before the exposure start time.

Description

METHOD FOR SHOOTING VIDEO, COMPUTER-READABLE STORAGE MEDIUM AND ELECTRONIC DEVICE FIELD
The present disclosure relates generally to the electronic technology field, and more particularly, to a method of shooting a video, a computer-readable storage medium and an electronic device.
BACKGROUND
Camera shake can be corrected by an optical correction, which physically adjusts the optical axis by moving a lens, or by an electronic correction, which corrects the image data captured by an image sensor. In the lens shift method, known as one of the optical correction methods, a correcting lens (anti-vibration lens) is moved so that the camera shake is canceled. Optical correction has the advantage that image quality is not degraded, since the captured image itself is not modified.
However, when a user shoots video with an electronic device such as a smartphone while walking at night, the optical correction may not be performed properly. One reason is that the centering operation, which returns the anti-vibration lens to its center position, cannot be performed often enough: the low light requires a long exposure time, while the walking motion makes the amount of movement of the anti-vibration lens large.
The inability to perform sufficient optical correction degrades video quality. For example, a light source appears smeared or doubled in the image.
SUMMARY
The present disclosure provides a method for shooting video, a computer-readable storage medium and an electronic device to perform a centering operation at high frequency.
In accordance with the present disclosure, a method for shooting video is provided. The method includes:
acquiring a photometric value in a captured image;
calculating an exposure start time and an exposure end time based on the photometric value; and
performing an exposure shift process of delaying the exposure start time and the exposure end time such that a centering time for an anti-vibration lens can be secured before the exposure start time.
In accordance with the present disclosure, a computer-readable storage medium, on which a computer program is stored, is provided. The computer program is executed by a computer to implement the method according to the present disclosure.
In accordance with the present disclosure, an electronic device is provided. The device includes:
an acquiring module configured to acquire a photometric value in a captured image;
a calculating module configured to calculate an exposure start time and an exposure end time based on the photometric value; and
a delaying module configured to delay the exposure start time and the exposure end time such that a centering time for an anti-vibration lens can be secured before the exposure start time.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 is a functional block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure;
FIG. 2 is a functional block diagram illustrating a configuration of a camera unit according to an embodiment of the present disclosure;
FIG. 3 is a diagram for explaining a centering operation of an anti-vibration lens of an OIS (Optical Image Stabilization) unit;
FIG. 4 is a diagram for explaining an exposure time in a series of frames;
FIG. 5 is a functional block diagram of an application processor of the electronic device according to an embodiment of the present disclosure;
FIG. 6 is a flowchart illustrating a method for shooting video according to an embodiment of the present disclosure;
FIG. 7 is a diagram for explaining the timing at which the centering operation becomes active according to a comparative example without an exposure shift process; and
FIG. 8 is a diagram for explaining the timing at which the centering operation becomes active according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory and aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
<Electronic device 100>
An electronic device 100 according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is a schematic diagram illustrating an example of a configuration of the electronic device 100.
The electronic device 100 is a mobile device such as a smartphone or a tablet computer in this embodiment, but the electronic device 100 may be other types of electronic devices equipped with a camera unit.
As shown in FIG. 1, the electronic device 100 includes a camera unit 10, an application processor 20 that controls the camera unit 10, a global navigation satellite system (GNSS) unit 31, a wireless communication unit 32, a CODEC 33, a speaker 34, a microphone 35, a display unit 36, an input unit 37, an inertial measurement unit (IMU) 38, a processor 40 and a memory 50. These elements are arranged within a housing 60 of the electronic device 100. The electronic device 100 may comprise a plurality of the camera units 10 such as a main camera and an ultra-wide camera.
The camera unit 10 includes an OIS unit 11, an OIS driver 12, a CMOS Image Sensor (CIS) 13 and a Gyro sensor 14, as shown in FIG. 2.
The OIS unit 11 is configured to perform the Optical Image Stabilization that makes use of the lens shift method. The OIS unit 11 includes an anti-vibration lens, a permanent magnet associated with the anti-vibration lens, a coil arranged near the magnet and a Hall element for detecting a position of the anti-vibration lens. The anti-vibration lens can move within a moving area provided in the OIS unit 11. The OIS unit 11 further includes an auto-focus (AF) mechanism.
By passing an electric current through the coil, a repulsive or attractive force generated between the coil and the magnet moves the anti-vibration lens together with the magnet. The current is controlled so that a shake of the camera unit 10 (i.e., the housing 60) can be canceled. Thus, an OIS operation is performed.
The OIS driver 12 drives the OIS unit 11 to move the anti-vibration lens in a direction so as to cancel a shake of the camera unit 10 (i.e., the housing 60) . The OIS driver 12 receives a signal from the Gyro sensor 14 to sense a vibration of the housing 60.
The CIS 13 is an image sensor that receives light emitted from an object and passed through the anti-vibration lens, and converts the brightness of the light into an electrical signal. The CIS 13 has a photodiode and a transistor for each pixel, and the transistor amplifies the charge signal generated by the photodiode.
The Gyro Sensor 14 detects a shake of the camera unit 10 (i.e., the housing 60) and sends a detection signal to the OIS driver 12. The OIS driver 12 sends the gyro information to the application processor 20. The OIS driver 12 also sends the position information of the anti-vibration lens to the application processor 20.
The CIS 13 sends a signal for deactivating the OIS operation and activating a centering operation to the OIS driver 12. When the signal is received, the OIS driver 12 operates the OIS unit 11 to return the anti-vibration lens to the center C of the moving area, as shown in FIG. 3. It takes a certain amount of time to perform the centering operation. For example, if the anti-vibration lens is positioned at the edge of the moving area, it takes 10 msec to return the anti-vibration lens to the center of the moving area.
The application processor 20 controls the camera unit 10. The application processor 20 performs the auto exposure (AE) process, which assesses the brightness of the subject and automatically determines the exposure (aperture and exposure time). The application processor 20 sends an exposure start time, an exposure end time and the ISO sensitivity to the CIS 13 for each frame. The exposure start time indicates a timing to start exposure and the exposure end time indicates a timing to end exposure. The details of the AE process according to an embodiment of the present disclosure will be described later.
As shown in FIG. 4, a frame for acquiring an image is repeated at a predetermined period T_p. T_p is determined by the frame rate of the video. Specifically, T_p is equal to 1/F_r, where F_r is the frame rate. T_p is about 33 msec when the frame rate is 30 fps.
The CIS 13 is exposed in each of frame 1, frame 2 and frame 3. Parameters t_11, t_21 and t_31 are exposure start times of the first line of the CIS 13. Similarly, t_12, t_22 and t_32 are exposure start times of the last line of the CIS 13, t_13, t_23 and t_33 are exposure end times of the first line, and t_14, t_24 and t_34 are exposure end times of the last line. An exposure time T_e1 of the frame 1 is t_14 - t_12, an exposure time T_e2 of the frame 2 is t_24 - t_22, and an exposure time T_e3 of the frame 3 is t_34 - t_32.
T_s is a rolling shutter time of the CIS 13. T_r is a read out time of the CIS 13. T_s and T_r are determined by the specification of the CIS 13 and are therefore the same for each frame. (t_13 - t_11) is equal to (t_14 - t_12). Similarly, (t_23 - t_21) is equal to (t_24 - t_22), and (t_33 - t_31) is equal to (t_34 - t_32).
T_b1, T_b2 and T_b3 are the blank times of frame 1, frame 2 and frame 3, respectively. The blank time T_b1 is given by the following equation (1).
T_b1 = T_p - (T_e1 + T_s)       ... (1)
Similarly, the blank times T_b2 and T_b3 are given by the following equations (2) and (3), respectively.
T_b2 = T_p - (T_e2 + T_s)       ... (2)
T_b3 = T_p - (T_e3 + T_s)       ... (3)
The centering operation is performed in the blank time. A centering time is the time necessary to perform the centering operation. For example, if the blank time is greater than 10 msec, it is possible to secure the needed centering time before exposing an image in the next frame.
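To make the timing relation concrete, the following is a minimal sketch in Python of the blank-time relation in equations (1) to (3), with all times in milliseconds. The numeric values are the examples used in this description (30 fps, a 5 msec rolling shutter time, a 10 msec centering time), not values prescribed by the disclosure.

```python
# Sketch of the blank-time relation T_b = T_p - (T_e + T_s); all times in msec.
def blank_time(frame_period_ms: float, exposure_ms: float,
               rolling_shutter_ms: float) -> float:
    return frame_period_ms - (exposure_ms + rolling_shutter_ms)

frame_period = 1000.0 / 30  # T_p is about 33.3 msec at 30 fps
t_b = blank_time(frame_period, exposure_ms=18.0, rolling_shutter_ms=5.0)
print(t_b, t_b >= 10.0)  # about 10.3, True: the 10 msec centering time fits
```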
The centering time is defined as a predetermined constant time required to return the anti-vibration lens from an edge of the moving area to a center of the moving area.
Optionally, the centering time may be defined as a variable time based on a position of the anti-vibration lens. The centering time is shorter than 10 msec when the anti-vibration lens is located near the center of the moving area.
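As an illustration of this variable definition, the sketch below assumes the centering time scales linearly with the displacement of the lens from the center, reaching the 10 msec worst case at the edge of the moving area. The linear model is an assumption of this example; the disclosure only states that the time may depend on the lens position.

```python
# Hypothetical position-dependent centering time: linear in the displacement,
# reaching the 10 msec edge-to-center worst case mentioned above.
def centering_time_ms(displacement: float, max_displacement: float,
                      worst_case_ms: float = 10.0) -> float:
    ratio = min(abs(displacement) / max_displacement, 1.0)
    return worst_case_ms * ratio

print(centering_time_ms(0.2, 1.0))  # 2.0 msec when the lens is near the center
```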
The application processor 20 also performs an Electronic Image Stabilization (EIS) process that compensates for blur by digital processing that cuts out a part of the captured image.
The GNSS unit 31 measures a current position of the electronic device 100. The wireless communication unit 32 performs wireless communications with the Internet. The CODEC 33 bi-directionally performs encoding and decoding, using a predetermined encoding/decoding method. The speaker 34 outputs a sound in accordance with sound data decoded by the CODEC 33. The microphone 35 picks up sound and outputs corresponding sound data to the CODEC 33.
The display unit 36 is provided on a front of the housing 60. The display unit 36 may display a video captured by the camera unit 10. The input unit 37 receives information via a user’s operation. For example, the input unit 37 receives the user’s command to start or end video shooting. The input unit 37 may be a touch panel.
The processor 40 controls the GNSS unit 31, the wireless communication unit 32, the CODEC 33, the speaker 34, the microphone 35, the display unit 36, and the input unit 37. The processor 40 may perform the AE process and/or the EIS process.
The memory 50 stores data of images captured by the camera unit 10. Further,  the memory 50 stores a program which runs on processors such as the application processor 20 and the processor 40.
The schematic configuration of the electronic device 100 has been described above. It should be noted that the configuration of the electronic device 100 is not limited to the above description. For example, the electronic device 100 may comprise one or more camera units in addition to the camera unit 10. The electronic device 100 may have a range sensor unit that measures the distance between the electronic device 100 and the subject. The electronic device 100 may have one or more processors in addition to the application processor 20 and the processor 40.
Next, the functions of the application processor 20 according to one embodiment are described in detail.
The application processor 20 includes an acquiring module 21, a calculating module 22 and a delaying module 23, as shown in FIG. 5.
The acquiring module 21 is configured to acquire a photometric value in a captured image. The photometric value indicates a brightness of a subject. In FIG. 4, the acquiring module 21 acquires a photometric value in the captured image of the frame 1 while an image is exposed in the frame 2.
The calculating module 22 is configured to calculate an exposure start time and an exposure end time based on the photometric value. In FIG. 4, the calculating module 22 calculates, while an image is exposed in the frame 2, the exposure start time t_31 and the exposure end time t_33 for an exposure in the frame 3 based on the photometric value of the frame 1. Instead, the calculating module 22 may calculate the exposure start time t_32 and the exposure end time t_34.
The calculating module 22 calculates the exposure time T_e3 by subtracting the exposure start time t_31 from the exposure end time t_33. Instead, the exposure time T_e3 may be calculated by the delaying module 23. The exposure time varies depending on the brightness of the environment (i.e., the photometric value).
Optionally, the calculating module 22 may increase an ISO sensitivity to shorten the exposure time. This makes it easier to secure the centering time.
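The ISO/exposure trade mentioned here follows the usual reciprocity between sensor gain and exposure time. The sketch below is illustrative only and assumes image brightness is held constant by keeping the product of exposure time and ISO fixed, which is standard photography practice rather than a formula given in the disclosure.

```python
# Keep exposure_ms * iso roughly constant: doubling ISO halves the exposure time.
def shorten_exposure(exposure_ms: float, iso: float, new_iso: float) -> float:
    return exposure_ms * iso / new_iso

print(shorten_exposure(24.0, iso=800, new_iso=1600))  # 12.0 msec
```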
The delaying module 23 is configured to perform an exposure shift process. That is to say, the delaying module 23 delays the exposure start time and the exposure end time when an exposure time determined by the exposure start time and the exposure end time is greater than a threshold time. By delaying the exposure start time and the exposure end time, the centering time can be secured before the exposure start time.
In FIG. 4, when the exposure time T_e3 is greater than a threshold time, the delaying module 23 delays the exposure start time t_31 (t_32) and the exposure end time t_33 (t_34). The delaying process is performed while an image is exposed in the frame 2.
The threshold time is based on the frame rate F_r, the read out time T_r of the CIS 13, and the centering time T_c of the OIS unit 11. Specifically, the threshold time T_h is given by the following equation (4).
T_h = 1/F_r - (T_r + T_c)      ... (4)
If F_r, T_r and T_c are 30 fps, 5 msec and 10 msec respectively, T_h is approximately 18 msec. The blank time is less than the centering time when the exposure time is greater than the threshold time. Therefore, it is necessary to delay the exposure start time and the exposure end time in order to secure the centering time.
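Equation (4) can be checked numerically; the sketch below uses the example values from the preceding paragraph.

```python
# T_h = 1/F_r - (T_r + T_c), in milliseconds.
def threshold_ms(frame_rate_fps: float, readout_ms: float,
                 centering_ms: float) -> float:
    return 1000.0 / frame_rate_fps - (readout_ms + centering_ms)

print(round(threshold_ms(30.0, 5.0, 10.0), 1))  # 18.3, i.e. approximately 18 msec
```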
Optionally, the delaying module 23 may perform the exposure shift process every predetermined number of frames. The predetermined number of frames is 3, for example.
The modules 21, 22 and 23 may be implemented by software, hardware, firmware, or a combination thereof. The modules may execute steps of the method according to the present disclosure by use of a circuit, one or more Application Specific Integrated Circuits (ASICs), one or more general-purpose integrated circuits, one or more microprocessors, one or more programmable logic devices, or a combination of the above-mentioned circuits or devices, or another proper circuit or device.
<Method of shooting video>
A method of shooting video using a CMOS image sensor according to an embodiment of the present disclosure will be described with reference to the flowchart shown in FIG. 6.
In a step S1, the acquiring module 21 acquires a photometric value in a captured image.
In a step S2, the calculating module 22 calculates an exposure start time and an exposure end time for an exposure in a next frame based on the photometric value acquired in the step S1. The calculating module 22 calculates an exposure time by subtracting the exposure start time from the exposure end time.
In the step S2, the calculating module 22 may shorten the exposure time by increasing the ISO sensitivity.
In a step S3, the delaying module 23 determines whether the exposure time is greater than the threshold time. The threshold time may be given by the equation (4). If the exposure time is greater than the threshold time (S3: Yes), the process proceeds to a step S4. Otherwise, if the exposure time is equal to or less than the threshold time (S3: No), the process proceeds to a step S7.
In the step S4, the delaying module 23 determines whether a current frame number is a multiple of 3. If the current frame number is a multiple of 3 (S4: Yes) , the process proceeds to a step S5. On the contrary, if the current frame number is not a multiple of 3 (S4: No) , the process proceeds to the step S7.
The multiple of 3 is an example. The delaying module 23 may determine whether the current frame number is a multiple other than 3.
In the step S5, the delaying module 23 delays the exposure start time and the exposure end time calculated in the step S2 such that the centering time can be secured before the exposure start time. For example, when the current frame number is 3, the delaying module 23 delays the exposure start time and the exposure end time by 6 msec as shown in FIG. 8, since the blank time is only 4 msec in the frame 3 as shown in FIG. 7. As a result, a blank time of 10 msec is secured.
As another example, when the current frame number is 6, the delaying module 23 delays the exposure start time and the exposure end time by 2 msec, as shown in FIG. 8, since the blank time is 8 msec in the frame 6.
In a step S6, the application processor 20 sends the delayed exposure start time and the delayed exposure end time to the CIS 13. The CIS 13 sends a signal for activating the centering operation (i.e., OIS_EN=Low) to the OIS driver 12 based on the delayed exposure start time.
In the step S7, the application processor 20 sends the exposure start time and the exposure end time to the CIS 13. The CIS 13 sends a signal for activating the centering operation (i.e., OIS_EN=Low) to the OIS driver 12 when the exposure time is equal to or less than the threshold time (i.e., S3: No) .
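Putting the steps S1 to S7 together, the following is a minimal sketch of the exposure shift decision. The function name exposure_shift is a hypothetical label, the hardware interfaces (photometry, sending times to the CIS 13) are abstracted away, and the constants are the example values used in this description.

```python
# Sketch of the S3-S7 decision in FIG. 6; all times in milliseconds.
FRAME_RATE = 30.0
READOUT_MS = 5.0           # T_r
CENTERING_MS = 10.0        # T_c
ROLLING_SHUTTER_MS = 5.0   # T_s
PERIOD_MS = 1000.0 / FRAME_RATE  # T_p

def exposure_shift(frame_number: int, start_ms: float, end_ms: float):
    """Return the (possibly delayed) exposure start/end for the next frame."""
    exposure = end_ms - start_ms                          # S2
    threshold = PERIOD_MS - (READOUT_MS + CENTERING_MS)   # equation (4)
    if exposure <= threshold:                             # S3: No
        return start_ms, end_ms                           # S7: send as-is
    if frame_number % 3 != 0:                             # S4: No (3 is an example)
        return start_ms, end_ms                           # S7: send as-is
    blank = PERIOD_MS - (exposure + ROLLING_SHUTTER_MS)   # equations (1)-(3)
    delay = max(CENTERING_MS - blank, 0.0)                # S5: secure T_c
    return start_ms + delay, end_ms + delay               # S6: send delayed times

# Frame 3 of FIG. 7: the blank time is 4 msec, so the shift is 6 msec.
exposure = PERIOD_MS - ROLLING_SHUTTER_MS - 4.0  # T_e leaving a 4 msec blank
print(exposure_shift(3, 0.0, exposure))          # both times delayed by 6.0 msec
```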
As can be seen by comparing FIG. 7 and FIG. 8, according to the present disclosure, the centering operation can be regularly and periodically ensured, and thus the centering operation can be executed at high frequency. As a result, it is possible to shoot high-quality video even when shooting video with a smartphone while walking at night, for example.
According to one aspect of the present disclosure, it is possible to perform the centering operation without shortening the exposure time, that is, without lowering the S/N ratio.
Optionally, in the step S5, the delaying module 23 may not delay the exposure start time and the exposure end time when an amount of delaying the exposure start time and the exposure end time is greater than an upper limit value. This makes it possible to avoid shooting a jerky and unattractive video.
The upper limit value may be determined based on a frame rate of the video. For example, the higher the frame rate, the higher the upper limit.
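How the upper limit scales with the frame rate is not specified beyond this direction, so the sketch below assumes, purely for illustration, a cap proportional to the frame rate and skips the shift entirely when the required delay exceeds it.

```python
# Hypothetical frame-rate dependent cap on the exposure shift: skip the shift
# when the required delay exceeds the upper limit. The linear scaling
# (0.1 msec per fps) is an assumption of this illustration.
def apply_delay_cap(delay_ms: float, frame_rate_fps: float) -> float:
    upper_limit_ms = 0.1 * frame_rate_fps   # e.g. 3 msec at 30 fps, 6 at 60 fps
    return delay_ms if delay_ms <= upper_limit_ms else 0.0

print(apply_delay_cap(6.0, 30.0))  # 0.0: a 6 msec shift exceeds the 3 msec cap
print(apply_delay_cap(2.0, 30.0))  # 2.0: within the cap, so the shift is kept
```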
Optionally, in the step S5, the delaying module 23 may not delay the exposure start time and the exposure end time when the exposure time is greater than a second threshold time which is greater than the threshold time. Specifically, the second threshold time T_h2 is given by the following equation (5).
T_h2 = 1/F_r - 1/2 * (T_r + T_c)   ... (5)
where T_h2 is the second threshold time, F_r is the frame rate, T_r is the read out time, and T_c is the centering time.
If F_r, T_r and T_c are respectively 30 fps, 5 msec and 10 msec, T_h2 is approximately 25.5 msec. If the exposure time is longer than the second threshold time, a sufficient blank time cannot be secured even if the exposure start time is delayed.
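Equation (5) can likewise be checked in code. Note the rounding: with the exact period 1/30 s the value is about 25.8 msec, while the 25.5 msec above follows from rounding T_p to 33 msec.

```python
# T_h2 = 1/F_r - (T_r + T_c)/2, in milliseconds.
def second_threshold_ms(frame_rate_fps: float, readout_ms: float,
                        centering_ms: float) -> float:
    return 1000.0 / frame_rate_fps - (readout_ms + centering_ms) / 2

print(round(second_threshold_ms(30.0, 5.0, 10.0), 1))  # 25.8 (25.5 with T_p = 33)
```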
Optionally, in the step S5, the delaying module 23 may not delay the exposure start time and the exposure end time when a deviation of the anti-vibration lens from the center of the moving area is less than a predetermined value. For example, the predetermined value is one-third the distance between the center of the moving area and the edge of the moving area. The position of the anti-vibration lens can be detected by the Hall element in the OIS unit 11.
Optionally, in the step S5, the delaying module 23 may not delay the exposure start time and the exposure end time when a value indicating a vibration of the housing 60 is less than a predetermined value. The vibration of the housing 60 can be detected by the Gyro sensor 14.
This is because it is less important to perform the centering operation when the displacement of the anti-vibration lens and the vibration of the housing are small.
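The two optional skip conditions can be combined into a single predicate, as in the sketch below. The one-third ratio is the example given above, while the vibration threshold is a hypothetical placeholder, since the disclosure gives no numeric value for it.

```python
# Skip the exposure shift when centering matters little: the lens is already
# near the center, or the housing is nearly still. The gyro threshold value
# is a hypothetical placeholder.
def should_skip_shift(lens_deviation: float, max_deviation: float,
                      gyro_magnitude: float,
                      vibration_threshold: float = 0.05) -> bool:
    near_center = abs(lens_deviation) < max_deviation / 3.0
    nearly_still = gyro_magnitude < vibration_threshold
    return near_center or nearly_still

print(should_skip_shift(0.1, 1.0, 0.2))  # True: lens is within a third of center
```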
In the description of embodiments of the present disclosure, it is to be understood that terms such as “central, ” “longitudinal” , “transverse” , “length, ” “width, ” “thickness, ” “upper, ” “lower, ” “front, ” “rear, ” “left, ” “right, ” “vertical, ” “horizontal, ” “top, ” “bottom, ” “inner, ” “outer, ” “clockwise, ” and “counterclockwise” should be construed to refer to the orientation or the position as described or as shown in the drawings under discussion. These relative terms are only used to simplify description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or constructed or operated in a particular orientation. Thus, these terms cannot be constructed to limit the present disclosure.
In addition, terms such as “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, the feature defined with “first” and “second” may comprise one or more of this feature. In the description of the present disclosure, “a plurality of” means two or more than two, unless specified otherwise.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms “mounted, ” “connected, ” “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures ; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is “on” or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are contacted via an additional feature formed therebetween. Furthermore, a first feature “on, ” “above, ” or “on top of” a second feature may include an embodiment in which the first feature is right or obliquely “on, ” “above, ” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below, ” “under, ” or “on bottom of” a second feature may include an embodiment in which the first feature is right or obliquely “below, ” “under, ” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may be also applied.
Reference throughout this specification to “an embodiment,” “some embodiments,” “an exemplary embodiment,” “an example,” “a specific example” or “some examples” means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instruction execution system, device or equipment (such as the system based on computers, the system comprising processors or other systems capable of obtaining the instruction from the instruction execution system, device and equipment and executing the instruction), or to be used in combination with the instruction execution system, device and equipment. As to the specification, “the computer readable medium” may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CD-ROM). In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, because the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by the hardware, software, firmware or their combination. In the above embodiments, a  plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instruction execution system. For example, if it is realized by the hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
Those skilled in the art shall understand that all or parts of the steps in the above exemplifying method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs comprise one or a combination of the steps in the method embodiments of the present disclosure when run on a computer.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be read-only memories, magnetic disks, CD, etc. The storage medium may be transitory or non-transitory.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (17)

  1. A method of shooting a video comprising:
    acquiring a photometric value in a captured image;
    calculating an exposure start time and an exposure end time based on the photometric value; and
    performing an exposure shift process of delaying the exposure start time and the exposure end time such that a centering time for an anti-vibration lens can be secured before the exposure start time.
  2. The method of claim 1, wherein the exposure shift process is performed when an exposure time determined by the exposure start time and the exposure end time is greater than a threshold time.
  3. The method of claim 2, wherein the threshold time is based on a frame rate of the video, a readout time of a CMOS image sensor and the centering time.
  4. The method of claim 3, wherein the threshold time is given by equation (1):
    T_h = 1/F_r - (T_r + T_c)  ... (1)
    where T_h is the threshold time, F_r is the frame rate, T_r is the readout time and T_c is the centering time.
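Read in milliseconds, equation (1) subtracts the sensor readout and the centering interval from the frame period; whatever remains is the longest exposure that still leaves room for both. A minimal sketch under that assumption (function and parameter names are hypothetical):

```python
def threshold_ms(frame_rate_hz: float, readout_ms: float, centering_ms: float) -> float:
    """Equation (1): T_h = 1/F_r - (T_r + T_c), with the frame period
    expressed in milliseconds."""
    return 1000.0 / frame_rate_hz - (readout_ms + centering_ms)

def should_shift(exposure_ms: float, frame_rate_hz: float,
                 readout_ms: float, centering_ms: float) -> bool:
    # Claim 2: the exposure shift is performed when the exposure exceeds T_h.
    return exposure_ms > threshold_ms(frame_rate_hz, readout_ms, centering_ms)
```

For instance, at 30 fps with an assumed 8 ms readout and 5 ms centering time, T_h ≈ 33.3 - 13 ≈ 20.3 ms, so exposures longer than about 20 ms would trigger the shift.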
  5. The method of any one of claims 1 to 4, wherein the exposure shift process is performed every predetermined number of frames.
  6. The method of claim 5, wherein the predetermined number of frames is 3.
  7. The method of claim 5, wherein, when calculating the exposure start time and the exposure end time, an ISO sensitivity is increased to shorten the exposure time.
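Claims 5 to 7 suggest a per-frame cadence: on every Nth frame the exposure is shortened, compensated by a higher ISO, so the shift fits within the frame period. The sketch below is one way to read them together (N = 3 per claim 6; the names, millisecond units and the constant-brightness assumption are illustrative, not claim language):

```python
SHIFT_EVERY_N_FRAMES = 3  # claim 6

def plan_frame(frame_index: int, exposure_ms: float, iso: float,
               max_exposure_ms: float) -> tuple[float, float, bool]:
    """Return (exposure_ms, iso, do_shift) for this frame.

    On every Nth frame (claim 5) the exposure is capped at max_exposure_ms
    and the ISO is raised to keep exposure_ms * iso roughly constant
    (claim 7), so the shortened exposure leaves room for the shift.
    """
    if frame_index % SHIFT_EVERY_N_FRAMES != 0:
        return exposure_ms, iso, False  # ordinary frame, no shift
    if exposure_ms > max_exposure_ms:
        iso *= exposure_ms / max_exposure_ms  # brightness compensation
        exposure_ms = max_exposure_ms
    return exposure_ms, iso, True
```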
  8. The method of any one of claims 1 to 7, wherein the centering time is a time required to return the anti-vibration lens from an edge of a moving area of the anti-vibration lens to a center of the moving area.
  9. The method of any one of claims 1 to 7, wherein the centering time is a time based on a position of the anti-vibration lens.
  10. The method of any one of claims 1 to 9, wherein the exposure shift process is not performed when an amount of delaying the exposure start time and the exposure end time is greater than an upper limit value.
  11. The method of claim 10, wherein the upper limit value is based on a frame rate of the video.
  12. The method of any one of claims 2 to 11, wherein the exposure shift process is not performed when the exposure time is greater than a second threshold time that is greater than the threshold time.
  13. The method of claim 12, wherein the second threshold time is given by equation (2):
    T_h2 = 1/F_r - (T_r + T_c)/2  ... (2)
    where T_h2 is the second threshold time, F_r is the frame rate, T_r is the readout time and T_c is the centering time.
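Under the same millisecond convention, equation (2) halves the (T_r + T_c) term instead of subtracting it in full, so T_h2 is always greater than T_h, consistent with claim 12. A sketch combining both bounds (hypothetical names):

```python
def shift_window_ms(frame_rate_hz: float, readout_ms: float,
                    centering_ms: float) -> tuple[float, float]:
    """Return (T_h, T_h2): equations (1) and (2) in milliseconds."""
    frame_period_ms = 1000.0 / frame_rate_hz
    t_h = frame_period_ms - (readout_ms + centering_ms)         # equation (1)
    t_h2 = frame_period_ms - 0.5 * (readout_ms + centering_ms)  # equation (2)
    return t_h, t_h2

def shift_applies(exposure_ms: float, frame_rate_hz: float,
                  readout_ms: float, centering_ms: float) -> bool:
    # Claims 2 and 12 read together: shift only when T_h < exposure <= T_h2.
    t_h, t_h2 = shift_window_ms(frame_rate_hz, readout_ms, centering_ms)
    return t_h < exposure_ms <= t_h2
```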
  14. The method of any one of claims 1 to 13, wherein the exposure shift process is not performed when a deviation of the anti-vibration lens from a center of a moving area of the anti-vibration lens is less than a predetermined value.
  15. The method of any one of claims 1 to 14, wherein the exposure shift process is not performed when a value indicating a vibration of a housing that houses an OIS unit including the anti-vibration lens is less than a predetermined value.
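Claims 10, 14 and 15 are all negative gates on the same decision. One compact way to read them together (parameter names, units and thresholds are assumptions):

```python
def exposure_shift_allowed(delay_ms: float, delay_upper_limit_ms: float,
                           lens_deviation: float, deviation_threshold: float,
                           housing_vibration: float, vibration_threshold: float) -> bool:
    """Return False whenever any of the claimed guard conditions holds."""
    if delay_ms > delay_upper_limit_ms:          # claim 10: delay over the limit
        return False
    if lens_deviation < deviation_threshold:     # claim 14: lens already near center
        return False
    if housing_vibration < vibration_threshold:  # claim 15: housing essentially still
        return False
    return True
```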
  16. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a computer, implements the method according to any one of claims 1 to 15.
  17. An electronic device, comprising:
    an acquiring module configured to acquire a photometric value in a captured image;
    a calculating module configured to calculate an exposure start time and an exposure end time based on the photometric value; and
    a delaying module configured to delay the exposure start time and the exposure  end time such that a centering time for an anti-vibration lens can be secured before the exposure start time.