CN117135420B - Image synchronization method and related equipment thereof

Image synchronization method and related equipment thereof

Info

Publication number
CN117135420B
CN117135420B (application CN202310404131.5A / CN202310404131A)
Authority
CN
China
Prior art keywords
image
camera
rotation
angle
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310404131.5A
Other languages
Chinese (zh)
Other versions
CN117135420A (en)
Inventor
徐荣跃
陈国乔
邵涛
张力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202310404131.5A
Publication of CN117135420A
Application granted
Publication of CN117135420B
Active legal status
Anticipated expiration legal status


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/60: Control of cameras or camera modules
    • H04N23/68: Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681: Motion detection
    • H04N23/6812: Motion detection based on additional sensors, e.g. acceleration sensors
    • H04N23/682: Vibration or motion blur correction
    • H04N23/685: Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687: Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N5/00: Details of television systems
    • H04N5/04: Synchronising

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The application provides an image synchronization method and related equipment thereof, relating to the field of image processing, wherein the method comprises the following steps: starting a camera application program; displaying a first image acquired by a primary camera; receiving a first operation, wherein the first operation is used for indicating to change the angle of view; in response to a first operation, the movable camera rotates; determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image is the same as that of the first image, the angle of view of the fourth image is different from that of the first image, and the overlap ratio of the angle of view of the third image and the angle of view of the fourth image is greater than a first preset threshold; and displaying the fourth image. The application can realize the synchronization of the angles of view corresponding to the images shot by the two cameras, and further can realize smooth transition visually when the lenses are switched, thereby avoiding jump.

Description

Image synchronization method and related equipment thereof
Technical Field
The application relates to the field of image processing, in particular to an image synchronization method and related equipment thereof.
Background
With the widespread use of electronic devices, taking photos and videos with them has become part of people's daily lives. Taking a mobile phone as an example of an electronic device, it has become a trend to install a plurality of cameras on the electronic device. By installing a plurality of cameras on the electronic device, more photographing and video recording modes can be provided for users to select and use.
However, owing to hardware differences between the cameras and limitations of the installation process, the angles of view obtained when different cameras shoot the same scene are physically offset from one another, so the user easily perceives a visual jump when the two cameras are switched. A new method that achieves a visually smooth transition during lens switching is therefore needed.
Disclosure of Invention
The application provides an image synchronization method and related equipment, and by the image synchronization method, synchronization of view angles corresponding to images shot by two cameras can be realized, so that smooth transition can be realized visually during lens switching, and jump is avoided.
In a first aspect, there is provided an image synchronization method applied to an electronic device including a main camera and a movable camera, the method comprising:
Starting a camera application program;
Displaying a first image acquired by the primary camera;
receiving a first operation, wherein the first operation is used for indicating to change the angle of view;
in response to the first operation, the movable camera rotates;
determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image and the angle of view of the first image are the same, the angle of view of the fourth image and the angle of view of the first image are different, and the overlapping degree of the angle of view of the third image and the angle of view of the fourth image is larger than a first preset threshold;
And displaying the fourth image.
According to the image synchronization method provided by the embodiment of the application, while the main camera collects images and sends them for display, when the field of view (FOV) of the movable camera changes before and after it rotates in response to the first operation, the third image that the main camera could acquire if offset by the same FOV change is determined based on the fourth image collected by the movable camera after rotation. In this way, the phase difference between the two cameras can be kept consistent with the phase difference before rotation.
Subsequently, if other processing such as registration is performed on the third image acquired by the main camera and the fourth image acquired by the movable camera after rotation, the efficiency of the processing such as registration is also higher and the processing effect is also relatively better because the acquired two frames of images are relatively close.
After rotation, since the coincidence degree of the angles of view of the images collected by the main camera and the movable camera is greater than the first preset threshold, which is equivalent to the images shot by the two cameras being substantially aligned, a smooth transition of the displayed images without any visual jump can be ensured when the display is switched from the main camera to the movable camera.
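The patent does not specify how the field-angle coincidence degree is computed. One plausible way to evaluate it, assuming the two fields of view are approximated as axis-aligned rectangles on a common reference plane, is the intersection-over-union of those rectangles; the sketch below is only an illustration under that assumption, and the rectangle format (x, y, width, height) and the threshold value are placeholders rather than values from the patent.

```python
def fov_overlap_ratio(rect_a, rect_b):
    """Intersection-over-union of two axis-aligned FOV rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = rect_a
    bx, by, bw, bh = rect_b
    # Overlapping region
    ix = max(ax, bx)
    iy = max(ay, by)
    ix2 = min(ax + aw, bx + bw)
    iy2 = min(ay + ah, by + bh)
    inter = max(0.0, ix2 - ix) * max(0.0, iy2 - iy)
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

# Example: decide whether the third and fourth images count as "substantially aligned".
FIRST_PRESET_THRESHOLD = 0.95  # illustrative value, not taken from the patent
aligned = fov_overlap_ratio((0, 0, 4000, 3000), (12, -8, 4000, 3000)) > FIRST_PRESET_THRESHOLD
```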
With reference to the first aspect, in certain implementation manners of the first aspect, the determining, based on the second image acquired by the movable camera and the fourth image acquired by the rotated movable camera, the third image acquired by the main camera after rotation includes:
determining a corresponding rotation vector of the movable camera after rotation;
establishing a first relation based on the rotation vector and the coordinates of the image center points corresponding to the second image and the fourth image respectively;
Establishing a second relation based on the coordinates of the image center points corresponding to the first image, the second image and the fourth image respectively;
and determining the coordinates of the image center point of the third image acquired by the main camera by combining the first relational expression and the second relational expression.
In this implementation, a rotation vector corresponding to the image center point can be determined based on the rotation of the movable camera, and a relational expression between the coordinates of the image center points captured before and after the rotation of the movable camera can be established based on the rotation vector. To keep the FOV relationship between the movable camera and the main camera consistent before and after rotation, the offset of the image center point coordinates acquired before and after rotation is kept consistent, and the second relational expression is established based on this offset. Then, by combining the first relational expression and the second relational expression, the coordinates of the image center point of the third image acquired by the main camera can be determined.
With reference to the first aspect, in certain implementation manners of the first aspect, an image center point coordinate of the third image acquired by the main camera is determined by using an iterative method.
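The patent does not give the concrete form of the two relational expressions or of the iteration. The sketch below is only one plausible reading, assuming the first relational expression is a pixel homography H built from the rotation vector and the second relational expression keeps the offset between the two cameras' image centers equal to its pre-rotation value. Because a projective warp displaces different pixels by different amounts, the constant-offset constraint has to be re-evaluated at the current estimate, which is where an iterative method could come in.

```python
import numpy as np

def warp_point(H, p):
    """Apply a 3x3 homography H to a 2D pixel p and return the warped pixel."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

def solve_third_image_center(H, c1, c2, iters=20, tol=1e-4):
    """
    Fixed-point iteration for the image center c3 of the third image.

    H  : homography relating the movable camera's second image to its fourth image
         (first relational expression, built from the rotation vector)
    c1 : center of the first image (main camera, before rotation)
    c2 : center of the second image (movable camera, before rotation)

    The second relational expression is taken here to mean that the main camera's
    center must move by the displacement the rotation induces at the corresponding
    location in the movable camera's image.
    """
    c1 = np.asarray(c1, float)
    c2 = np.asarray(c2, float)
    baseline = c1 - c2                       # constant pre-rotation offset
    c3 = c1 + (warp_point(H, c2) - c2)       # initial guess: displacement evaluated at c2
    for _ in range(iters):
        p = c3 - baseline                    # corresponding point in the movable camera's image
        c3_new = c1 + (warp_point(H, p) - p)
        if np.linalg.norm(c3_new - c3) < tol:
            break
        c3 = c3_new
    return c3
```

With a rotation-only homography the displacement field varies slowly across the frame, so an iteration of this kind would typically settle within a few steps.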
With reference to the first aspect, in some implementation manners of the first aspect, a field angle overlapping ratio of the third image and the fourth image is greater than a first preset threshold, including:
The third image is identical to the fourth image.
In this implementation, the image acquired by the main camera after rotation is identical to that acquired by the movable camera; that is, the coordinate offset determined for the main camera is fully consistent with the offset of the image center point of the movable camera before and after rotation. When the cameras are switched subsequently, no visual jump is produced.
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes:
Receiving a second operation, wherein the second operation is used for indicating to change the angle of view;
in response to the second operation, the movable camera rotates;
Determining a seventh image acquired by the main camera based on the fourth image and a sixth image acquired by the rotated movable camera; the angle of view corresponding to the sixth image is different from the angle of view corresponding to the fourth image, and the angle of view coincidence degree corresponding to the seventh image and the sixth image is larger than a second preset threshold;
And displaying the seventh image.
In this implementation, while the movable camera collects images and sends them for display, when the FOV changes before and after the movable camera rotates in response to the second operation, the seventh image that the main camera could acquire if offset by the same FOV change is determined based on the sixth image collected by the movable camera after rotation. In this way, the phase difference between the two cameras can be kept consistent with the phase difference before rotation.
With reference to the first aspect, in certain implementation manners of the first aspect, determining, based on the fourth image and a sixth image acquired by the rotated movable camera, a seventh image acquired by the main camera includes:
Acquiring a sixth image acquired by the movable camera after rotation and a position vector of a rotating motor;
Determining a rotation vector corresponding to the sixth image based on the position vector;
Constructing a homography matrix based on the rotation vector;
Determining a mapping matrix between the fourth image and the sixth image based on the homography matrix;
and determining a seventh image acquired by the main camera based on the mapping matrix.
In this implementation, when the FOV changes before and after the movable camera rotates, the rotation vector corresponding to the image can be determined based on the position vector of the rotating motor in the movable camera, and the transformation relationship between the images before and after rotation, i.e., the homography matrix, can then be determined. From this matrix, a mapping matrix can be determined to represent the displacement between the images. To keep the FOV of the main camera consistent with that of the movable camera after rotation, the seventh image acquired after the main camera is offset by the same FOV can be determined based on this displacement.
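The pipeline above (motor position vector, then rotation vector, then homography, then mapping between the fourth and sixth images) can be sketched with the standard rotation-only homography H = K R K^(-1). The intrinsic matrix K, the Rodrigues conversion, and the way the motor position is turned into a rotation vector are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def rodrigues(rvec):
    """Convert a rotation vector (axis * angle, in radians) to a 3x3 rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = np.asarray(rvec, float) / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def rotation_homography(rvec, K_intr):
    """Pixel homography induced by a pure camera rotation (up to convention on direction)."""
    R = rodrigues(rvec)
    return K_intr @ R @ np.linalg.inv(K_intr)

# Illustrative intrinsics for the movable (tele) camera; the numbers are placeholders.
K_intr = np.array([[2800.0, 0.0, 2000.0],
                   [0.0, 2800.0, 1500.0],
                   [0.0, 0.0, 1.0]])
# Illustrative rotation vector assumed to be derived from the rotating motor's position.
rvec = np.array([0.0, np.deg2rad(1.5), 0.0])   # 1.5 degree pan about the y-axis
H = rotation_homography(rvec, K_intr)
```

In the patent's flow, the mapping between the fourth and sixth images would then be derived from such a matrix, and the same displacement applied to determine the seventh image for the main camera.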
With reference to the first aspect, in certain implementations of the first aspect, the first operation is a click operation for the first image.
With reference to the first aspect, in certain implementations of the first aspect, the second operation is a click operation for the fourth image.
With reference to the first aspect, in certain implementations of the first aspect, the second preset threshold is the same size as the first preset threshold.
In a second aspect, there is provided an electronic device comprising: one or more processors, memory, and a display screen; the memory is coupled with the one or more processors, the memory is for storing computer program code, the computer program code comprising computer instructions that the one or more processors call to cause the electronic device to perform:
Starting a camera application program;
Displaying a first image acquired by the primary camera;
receiving a first operation, wherein the first operation is used for indicating to change the angle of view;
in response to the first operation, the movable camera rotates;
determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image and the angle of view of the first image are the same, the angle of view of the fourth image and the angle of view of the first image are different, and the overlapping degree of the angle of view of the third image and the angle of view of the fourth image is larger than a first preset threshold;
And displaying the fourth image.
In a third aspect, an image synchronization apparatus is provided, comprising means for performing any one of the image synchronization methods of the first aspect.
In one possible implementation, when the image synchronization apparatus is an electronic device, the processing unit may be a processor and the input unit may be a communication interface; the electronic device may further comprise a memory for storing computer program code which, when executed by the processor, causes the electronic device to perform any of the methods of the first aspect.
In a fourth aspect, there is provided a chip system for application to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform any of the image synchronization methods of the first aspect.
In a fifth aspect, there is provided a computer readable storage medium storing computer program code which, when executed by an electronic device, causes the electronic device to perform any one of the image synchronization methods of the first aspect.
In a sixth aspect, there is provided a computer program product comprising: computer program code which, when run by an electronic device, causes the electronic device to perform any of the image synchronization methods of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for use in an electronic device of the present application;
FIG. 2 is a schematic diagram of a camera arrangement on a cell phone;
Fig. 3 is a schematic side view structure of a tele camera according to the related art;
FIG. 4 is a schematic top view of an OIS controller controlling movement of a lens in an x-axis direction;
FIG. 5 is a schematic top view of the OIS controller controlling movement of the lens in the y-axis direction;
fig. 6 is a schematic side view structure diagram corresponding to a tele camera according to an embodiment of the present application;
fig. 7 is a schematic side view of an OIS controller controlling lens movement in a tele camera according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a rotary motor assembly for controlling movement of a prism about an x-axis according to an embodiment of the present application;
FIG. 9 is a schematic side view of a rotary motor assembly for controlling movement of a prism about an x-axis according to an embodiment of the present application;
fig. 10 is a light path diagram corresponding to fig. 8 and 9;
FIG. 11 is a schematic diagram of a rotary motor assembly for controlling movement of a prism about a y-axis according to an embodiment of the present application;
FIG. 12 is a schematic front view of a rotary motor assembly for controlling movement of a prism about a y-axis according to an embodiment of the present application;
Fig. 13 is a light path diagram corresponding to fig. 11 and 12;
FIG. 14 is a schematic view of the location of the imaging point corresponding to FIG. 13;
FIG. 15 is a schematic view of a change in the angular range of view according to an embodiment of the present application;
FIG. 16 is a schematic diagram of two frames of images provided by an embodiment of the present application;
FIG. 17 is a schematic diagram of phase difference compensation provided by an embodiment of the present application;
FIG. 18 is a flowchart of a first image synchronization method according to an embodiment of the present application;
FIG. 19 is a schematic flow chart of determining a third image according to an embodiment of the present application;
fig. 20 is a schematic view of a view angle corresponding to a main camera cropped by a tele camera according to an embodiment of the present application;
FIG. 21 is a schematic view of an image center point provided by an embodiment of the present application;
FIG. 22 is a schematic diagram of an iterative method provided by an embodiment of the present application;
FIG. 23 is a flowchart of a second image synchronization method according to an embodiment of the present application;
FIG. 24 is a schematic flow chart of determining a seventh image according to an embodiment of the present application;
Fig. 25 is a schematic view of a view angle corresponding to a main camera cut by a tele camera according to an embodiment of the present application;
FIG. 26 is a schematic diagram of an image synchronization apparatus according to the present application;
fig. 27 is a schematic structural diagram of an electronic device according to the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms first and second and the like in the description and in the claims of embodiments of the application, are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first target object and the second target object, etc., are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
First, some terms in the embodiments of the present application are explained for easy understanding by those skilled in the art.
1. A movable camera, which may include a prism and a rotation (Scan) motor; the rotation motor may drive the prism to rotate about the x-axis, which may also be referred to as a nodding motion, and/or about the y-axis, which may also be referred to as a head-shaking (panning) motion.
2. Optical image stabilization (OIS), also referred to as optical anti-shake, refers to detecting shake of the electronic device with a motion sensor (e.g., a gyroscope or an accelerometer) during photo exposure; according to the shake data detected by the motion sensor, the OIS controller controls an OIS motor to push the lens or the image sensor so that the optical path remains as stable as possible during the entire exposure, thereby obtaining a clearly exposed image.
Optical anti-shake includes two modes: lens-shift optical anti-shake and sensor-shift (photosensitive element) optical anti-shake. The principle of lens-shift anti-shake is that a gyroscope sensor in the lens module detects tiny movements and transmits the signals to a microprocessor; the microprocessor immediately calculates the displacement to be compensated, and a compensation lens group then compensates according to the shake direction and displacement of the lens, effectively overcoming image blur caused by camera vibration. Sensor-shift anti-shake offsets the image sensor to achieve stabilization: the CCD is first mounted on a support that can move up, down, left and right; when the gyroscope sensor detects shake, parameters such as the shake direction, speed and movement amount are processed, and the movement of the CCD sufficient to cancel the shake is calculated.
Optionally, OIS controllers include two-axis and three-axis optical image stabilizers; the embodiments of the present application are described taking two-axis lens-shift OIS as an example and involve data on the two axes, which is not detailed further below.
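As a rough illustration of the lens-shift principle described above, the sketch below integrates gyroscope angular velocity over the exposure and converts the accumulated tilt into a lens displacement on the two OIS axes. The small-angle model (displacement approximately equal to focal length times the tangent of the accumulated angle) and all numeric values are assumptions for illustration; a real OIS controller runs a much faster closed loop in firmware.

```python
import math

def ois_lens_shift(gyro_samples, dt, focal_length_mm):
    """
    Integrate (wx, wy) angular-velocity samples (rad/s) over an exposure and
    return the lens shift (dx, dy) in millimetres needed to cancel the motion.
    """
    angle_x = angle_y = 0.0
    for wx, wy in gyro_samples:
        angle_x += wx * dt
        angle_y += wy * dt
    # Shift the lens opposite to the accumulated tilt (small-angle model).
    dx = -focal_length_mm * math.tan(angle_y)   # yaw moves the image horizontally
    dy = -focal_length_mm * math.tan(angle_x)   # pitch moves the image vertically
    return dx, dy

# 10 ms exposure sampled at 1 kHz with a slight hand tremor (illustrative numbers).
samples = [(0.02, -0.01)] * 10
print(ois_lens_shift(samples, dt=0.001, focal_length_mm=15.0))
```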
3. The optical axis, which is the direction in which the optical system conducts light, refers to the principal ray of the central field of view.
4. Angle of view (field of view): in an optical instrument, taking the lens of the instrument as the vertex, it is the angle formed by the two edges of the largest range through which the object image of the measured target can pass through the lens. The size of the angle of view determines the field of view of the optical instrument: the larger the angle of view, the larger the field of view and the smaller the optical magnification; objects beyond this angle are not collected by the lens. The shorter the focal length, the wider the horizontal field of view and the smaller the imaged subject; as the focal length increases, the horizontal field of view narrows and the subject appears larger.
5. Image coordinate system: in the present application, the image coordinate system may also be referred to as the standard coordinate system.
6. Reference phase difference (baseline): the distance between the optical centers of two cameras results in a distance between the center points of the images captured by the two cameras. The center of the lens included in a camera can be approximately regarded as its optical center.
7. Focal point: the parallel light passes through the lens and is collected into a point, which is called the focal point.
Focal plane: a plane perpendicular to the optical axis of the imaging system and containing the focal point of the imaging system.
An image plane: a plane perpendicular to the optical axis of the imaging system and containing the image points of the imaging system.
Image distance: distance of the optical center to the image plane.
Focal length: distance of the optical center to the focal plane.
Object distance: distance of the optical center to the object.
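For reference, these quantities are linked by the standard thin-lens (Gaussian) equation, a well-known relation that is not stated in the original text: with focal length f, image distance v and object distance u,

```latex
\frac{1}{f} = \frac{1}{v} + \frac{1}{u}
```

so as the object distance u becomes large, the image plane approaches the focal plane (v approaches f).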
The foregoing is a simplified description of the terminology involved in the embodiments of the present application, and is not described in detail below.
The hardware system of the electronic device according to the embodiment of the present application is described with reference to fig. 1.
The electronic device provided by the embodiment of the present application may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, or the like; the embodiment of the present application does not limit the specific type of the electronic device.
For convenience of explanation, fig. 1 illustrates a hardware system of an electronic device 100 as an example of a mobile phone.
Referring to fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
It should be noted that the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
It is understood that the controller may be a neural hub and command center of the electronic device 100. In practical application, the controller can generate operation control signals according to the instruction operation codes and the time sequence signals to complete instruction fetching and instruction execution control.
It should be noted that, a memory may be further provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In an embodiment of the present application, the processor 110 may perform: starting a camera application program; displaying a first image acquired by a primary camera; receiving a first operation, wherein the first operation is used for indicating to change the angle of view; in response to a first operation, the movable camera rotates; determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image is the same as that of the first image, the angle of view of the fourth image is different from that of the first image, and the overlap ratio of the angle of view of the third image and the angle of view of the fourth image is greater than a first preset threshold; and displaying the fourth image.
In some embodiments, the processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and a USB interface.
Illustratively, the charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging implementations, the charge management module 140 may receive a charging input of the wired charger through the USB interface 130. In some wireless charging implementations, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
Illustratively, the power management module 141 is configured to couple the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle times, and battery state of health (e.g., leakage, impedance). Alternatively, the power management module 141 may be provided in the processor 110, or the power management module 141 and the charge management module 140 may be provided in the same device.
Illustratively, the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and so on. The antennas 1 and 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other implementations, the antenna may be used in conjunction with a tuning switch.
Illustratively, the mobile communication module 150 may provide a solution for wireless communication applied on the electronic device 100, such as at least one of the following: a second generation (2G) mobile communication solution, a third generation (3G) mobile communication solution, a fourth generation (4G) mobile communication solution, and a fifth generation (5G) mobile communication solution.
The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some implementations, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some implementations, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
In addition, the modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some implementations, the modem processor may be a stand-alone device. In other implementations, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
Illustratively, the wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
Illustratively, the display screen 194 may be used to display images or video. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (Mini LED), a micro light-emitting diode (Micro LED), a micro OLED (Micro OLED), or a quantum dot light-emitting diode (QLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In addition, it should be noted that the electronic device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. The ISP can carry out algorithm optimization on noise, brightness and color of the image, and can optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard Red Green Blue (RGB), YUV, etc. format image signal. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In addition, the digital signal processor is used to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
In some implementations, the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
Among other things, the gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x-axis, y-axis, and z-axis) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B can also be used for scenes such as navigation and motion sensing games. The other sensors are not exemplified here, and the application is not limited thereto.
Taking a mobile phone with the hardware system as an example, a camera included in the mobile phone is described in detail below.
Fig. 2 shows a schematic diagram of a camera 193 arrangement on a cell phone.
Illustratively, the electronic device provided by the present application may include one or more cameras 193, where the one or more cameras 193 may be located on the front of the electronic device 100 or on the back of the electronic device 100. The camera 193 located on the front side of the electronic device 100 may be referred to as a front camera, and the camera located on the back side of the electronic device 100 may be referred to as a rear camera. In the present application, the camera may also be referred to as a camera module.
For example, in an embodiment of the present application, the electronic device 100 may include 5 cameras 193, the 5 cameras 193 including 2 front cameras and 3 rear cameras. Referring to fig. 2 (a) and (b), the 3 rear cameras are arranged in a row from top to bottom on the rear cover of the electronic device 100, and the 3 rear cameras are a main camera 1931, a telephoto camera 1932, and an ultra-wide angle camera 1933 in order of arrangement.
The focal length of the ultra-wide-angle camera 1933 is shorter than that of the main camera 1931, and the focal length of the tele camera 1932 is longer than that of the main camera 1931; the longer the focal length, the smaller the angle of view range. Therefore, the angle of view range of the ultra-wide-angle camera 1933 is larger than that of the main camera 1931, and the angle of view range of the tele camera 1932 is smaller than that of the main camera 1931, as shown in fig. 2 (c).
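The relationship stated above (longer focal length, narrower angle of view) follows from the standard pinhole formula FOV = 2 * arctan(d / 2f), with d the sensor dimension. The snippet below is a small numerical illustration; the sensor size and focal lengths are placeholder values, not the actual parameters of cameras 1931 to 1933.

```python
import math

def field_of_view_deg(sensor_mm, focal_mm):
    """Angle of view (degrees) of a pinhole camera along one sensor dimension."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

sensor_width = 8.0  # mm, illustrative
for name, f in [("ultra-wide 1933", 2.5), ("main 1931", 6.0), ("tele 1932", 18.0)]:
    print(f"{name}: f = {f} mm -> horizontal FOV ~ {field_of_view_deg(sensor_width, f):.1f} deg")
```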
It should be appreciated that the above is only an example, and that the electronic device 100 may also include other types of cameras, such as a black-and-white camera, a multispectral camera, etc., which are not limited by the present application.
From among the above cameras, the present application selects the tele camera 1932 as an example to explain the internal structure and the working principle of the movable camera.
Fig. 3 shows a schematic side view structure of a tele camera 1932 according to the related art.
Illustratively, the tele camera 1932 generally includes a lens, which may include 1 or more lenses, and a photosensitive element (also referred to as an image sensor) or the like, the lens being configured to perform imaging using a refractive principle of the lenses, and a plane in which the photosensitive element is disposed is parallel to a plane in which the lenses are disposed.
It should be understood that a "lens" as used herein may be understood as an integral lens, may include one or more lenses, and may be understood as a lens in a lens structure or a lens or optic used to make up the lens.
In addition, a filter (not shown in fig. 3) may be further included between the lens and the photosensitive element, and the filter is used for filtering out unwanted bands of light, so as to prevent the photosensitive element from generating false colors or ripples, thereby improving the effective resolution and color reproducibility thereof. Of course, the tele camera 1932 may include other structures, which are not limited in this disclosure, by way of example only.
In connection with this example, in order to perform optical anti-shake, the related art may add an OIS controller including an OIS motor (not shown in fig. 3) to a lens included in the tele camera 1932. The OIS controller is used for acquiring shake data, such as angular velocity, of the electronic device acquired by the gyro sensor, generating a control signal for controlling movement of the OIS motor according to the shake data acquired by the gyro sensor, and the OIS motor is used for pushing the lens to move under the control of the control signal so as to offset the displacement generated by shake.
In some embodiments, with continued reference to fig. 3, taking the coordinate system shown in fig. 3 as an example, the OIS motor in the OIS controller may push the lens to move left and right, i.e., in the x-axis direction, to counteract the displacement of the dither in the x-axis direction. Specifically, as shown in (a) of fig. 4, the dotted line position indicates the initial position of the lens, and the OIS motor may push the lens to move in the positive x-axis direction (x-direction) to cancel the displacement of the shake in the negative x-axis direction (-x-direction); or as shown in fig. 4 (b), the OIS motor may push the lens to move in the negative x-axis direction to cancel the displacement of the shake in the positive x-axis direction.
In other embodiments, with continued reference to fig. 3, taking the coordinate system shown in fig. 3 as an example, the OIS motor in the OIS controller may push the lens to telescope back and forth, i.e., move in the y-axis direction, to counteract the displacement of the shake in the y-axis direction. Specifically, as shown in (a) of fig. 5, the dotted line position indicates the initial position of the lens, and the OIS motor may push the lens to move in the y-axis positive direction (y-direction) to cancel the displacement of the shake in the y-axis negative direction (-y-direction); or as shown in fig. 5 (b), the OIS motor may push the lens to move in the negative y-axis direction to cancel the displacement of the shake in the positive y-axis direction.
It should be understood that the OIS controller may be configured to control the lens to move in both the x-axis direction and the y-axis direction in accordance with the compensation requirement, and the specific moving direction and distance may be determined according to the requirement, which is not limited in the embodiments of the present application.
In addition, since the angle range of view corresponding to the telephoto camera 1932 is smaller than that of the main camera, the wide-angle camera, and the like, the content of photographing is limited when the telephoto camera 1932 performs photographing. For example, when a user switches to the tele camera 1932 to track a target subject moving in a shooting scene, the target is lost once the target subject moves out of the field of view of the tele camera 1932; because the field of view of the tele camera is smaller, the probability of losing the target in the shooting process is very large, and the experience of a user is poor.
Therefore, in order to expand the angle of view range of the tele camera 1932, the present application adds a rotating motor assembly carrying a prism between the lens and the photosensitive element of the tele camera 1932, and changes the orientation of the photosensitive element from being parallel to the plane of the lens to being perpendicular to it; that is, in the present application, the plane of the photosensitive element is perpendicular to the plane of the lens.
Referring to fig. 6, fig. 6 shows a schematic side view structure of a tele camera 1932 according to the present application.
As shown in fig. 6, the tele camera 1932 comprises a lens, an OIS controller, a rotary motor assembly including a prism, and a photosensitive element, wherein a plane of the photosensitive element is perpendicular to a plane of a lens included in the lens; in addition, the prism in the rotary motor assembly is arranged in an inclined state, so that light rays emitted by the lens can be refracted onto the photosensitive element; on the basis, the rotating motor in the rotating motor component can also control the prism to rotate, so that the emergent light path of the lens is deviated, and the shooting angle range is enlarged. The rotary motor assembly may further include an electric motor for driving the rotary motor. Here, the rotary motor in the rotary motor assembly is not the same motor as the OIS motor in the OIS controller.
Referring to fig. 7 (a), an OIS motor in the OIS controller may push the lens to move in the x-axis direction, and the same manner as that shown in fig. 4 is not described here. Referring to fig. 7 (b), the OIS motor in the OIS controller may also push the lens to move in the y-axis direction, which is the same as the movement shown in fig. 5, and will not be described again.
In some embodiments, referring to fig. 8 (a) and (b), taking the coordinate system shown in fig. 8 (a) as an example, the rotation motor in the rotation motor assembly may control the prism to rotate up and down around the x-axis, i.e., nodding, to expand the field of view in the z-axis direction. Specifically, as shown in (a) of fig. 9, the dotted line position indicates the initial position of the prism, and the rotation motor may rotate the prism around the x-axis and in the z-axis forward direction to expand the field of view in the z-axis forward direction; or as shown in fig. 9 (b), the rotation motor may also rotate the prism around the x-axis and in the negative z-axis direction to expand the field of view in the negative z-axis direction.
Referring to fig. 10, fig. 10 shows a schematic view of the optical path corresponding to the prism shown in (a) of fig. 9 when it moves in the direction of the nodding. As shown in fig. 10, when the prism is not moving, the two incident light rays are reflected by the prism, and the imaging points on the photosensitive element are P and Q respectively. When the prism moves to the position of the prism 'in fig. 10 along the nodding direction shown in fig. 10, the same two incident light rays are reflected by the prism', and the imaging points on the photosensitive element are P 'and Q', respectively, wherein P 'corresponds to P and Q' corresponds to Q before and after the movement.
Here, since the nodding motion takes place within the optical path plane, the imaging points are displaced when the prism nods, and their displacements are almost equal. It follows that the length of line segment PQ before the nodding movement is substantially equal to the length of line segment P'Q' after it. Therefore, during a nodding movement the image does not shake in the y-axis direction, i.e., no image rotation problem occurs, and no other distortion is introduced.
In other embodiments, referring to fig. 11 (a) and (b), taking the coordinate system shown in fig. 11 (a) as an example, the rotation motor in the rotation motor assembly may control the prism to rotate left and right around the y-axis, i.e., to shake its head, to expand the field of view in the x-axis direction. Specifically, as shown in fig. 12 (a), the dotted line position indicates the initial position of the prism, and the rotation motor may rotate the prism around the y-axis toward the positive x-axis direction to expand the field of view in the positive x-axis direction; or, as shown in fig. 12 (b), the rotation motor may rotate the prism around the y-axis toward the negative x-axis direction to expand the field of view in the negative x-axis direction.
Referring to fig. 13, fig. 13 shows a schematic view of the optical path corresponding to the prism shown in fig. 12 (a) when it moves in the panning (head-shaking) direction. As shown in fig. 13, the incident light and the reflected light are symmetrical about the optical axis, and the optical axis rotates along with the rotation of the prism. When the prism is moved by a head-shaking motion, for example by a panning angle theta, the horizontal coordinate of the incidence point on the prism surface changes slightly, the optical path changes, and the reflection angle changes. A change in the reflection angle in turn causes the image to rotate by approximately the angle theta in the x-axis direction and by an angle delta (crosstalk) in the y-axis direction.
As shown in fig. 13, when the prism is not moving, an incident light ray is reflected by the prism, and its imaging point on the photosensitive element is P2. When the prism moves in the panning direction shown in fig. 13 to the position of the prism ', the same incident light ray is reflected by the prism ', and its imaging point on the photosensitive element is P2'. Here, before and after the prism shakes its head, P2 and P2' are coplanar, but the line segment P2P2' is neither parallel to the x-axis nor parallel to the y-axis; in other words, the imaging point moves obliquely in the xy plane before and after the movement.
Further, before the prism rotates, if one of the two incident light rays is O1 (assumed to be the center of the prism), the imaging point at the photosensitive element is P1; the other ray incidence point is O2 (assuming non-center of prism), and the corresponding imaging point at the photosensitive element is P2. After the prism is rotated by an angle theta, the imaging point of the incident point O1 on the photosensitive element is P1', and the imaging point of the incident point O2 on the photosensitive element is P2'. The rotation of the image at O1 is smaller and the rotation at O2 is larger due to the change of the optical path, so that the translation corresponding to the change of P1 to P1 'is different from the translation corresponding to the change of P2 to P2'. Thus, as shown in fig. 14, when the prism moves in the panning direction, the change of P1P2 to P1'P2' approximates to the double effect caused by the translation and rotation, and the imaging will visually generate image rotation, and the larger the angle of the prism panning, the more serious the image rotation problem. Here, the rotation in three axes, i.e., the x-axis, the y-axis, and the z-axis, is included, and may include the translational and perspective effects of the x-axis and the y-axis, and the rotational effect of the z-axis.
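The panning behaviour described above can be reproduced numerically with the plane-mirror reflection formula r = d - 2(d.n)n, treating the prism's reflecting face as a single plane mirror; this is a simplification of the real folded optics, and the geometry and angles below are illustrative assumptions. Rotating the face about the y-axis deflects the central and off-axis rays by slightly different amounts and adds a small out-of-plane component, which matches the translation-plus-rotation (and crosstalk) effect described above.

```python
import numpy as np

def reflect(d, n):
    """Reflect direction d off a plane mirror with unit normal n."""
    d = np.asarray(d, float)
    n = np.asarray(n, float)
    return d - 2.0 * np.dot(d, n) * n

def rotate_y(v, theta):
    """Rotate a 3D vector about the y-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]) @ np.asarray(v, float)

# Reflecting face initially at 45 degrees, folding a ray travelling along +y onto +z.
n0 = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)
theta = np.deg2rad(1.0)            # 1 degree head-shake (rotation about the y-axis)
n1 = rotate_y(n0, theta)

rays = {
    "center ray (O1)": np.array([0.0, 1.0, 0.0]),
    "off-axis ray (O2)": np.array([0.05, 1.0, 0.0]) / np.linalg.norm([0.05, 1.0, 0.0]),
}
for name, d in rays.items():
    before, after = reflect(d, n0), reflect(d, n1)
    swing = np.degrees(np.arccos(np.clip(np.dot(before, after), -1.0, 1.0)))
    crosstalk = np.degrees(after[1] - before[1])   # small-angle approx of the out-of-plane component
    print(f"{name}: deflection ~ {swing:.3f} deg, crosstalk ~ {crosstalk:.3f} deg")
```

The difference between the two printed deflections is the differential translation that appears visually as image rotation, and the nonzero crosstalk term corresponds to the delta angle mentioned above.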
For example, when shooting is performed with a telephoto camera as shown in fig. 6, as shown in fig. 15, an initial field angle range corresponding to the telephoto camera is FOV0, and the initial field angle range is rectangular; when the rotary motor component is started and then the prism is controlled to perform spot head movement and the movement amplitude is maximum, the field angle range corresponding to the tele camera can be changed upwards to FOV1 or downwards to FOV2; when the rotating motor component is started and then the prism is controlled to perform head shaking motion and the motion amplitude is maximum, the field angle range corresponding to the tele camera can be changed leftwards to FOV3 or rightwards to FOV4. When the rotary motor assembly is started, the control prism performs both nodding and panning (the sequence is not limited), and the corresponding field angle range of the tele camera can be changed from the furthest upper right angle to the FOV5, from the lower right angle to the FOV6, from the lower left angle to the FOV7, from the upper left angle to the FOV8, and the like.
If the angle of rotation of the prism is smaller when the motor component is rotated to control the prism to perform nodding motion and panning motion, the FOV which can be realized by the tele camera is within the visual field range formed by FOV3 to FOV 8. Further, by superimposing all FOVs, the maximum FOV that can be achieved by the tele camera can be obtained, as shown by FOV10 in (b) of fig. 15. The maximum FOV is far larger than the FOV0 in the initial field angle range, namely the expansion of the field angle range can be realized under the condition that the rotation motor component controls the prism to rotate by the long-focus camera.
Here, it should also be noted that when the rotating motor assembly controls the prism to perform the nodding motion, no image rotation problem occurs in the y-axis direction and no other distortion occurs, so the shapes of FOV1 and FOV2 remain substantially unchanged relative to FOV0, i.e., they remain upright rectangles. When the rotating motor assembly controls the prism to perform the panning motion, the image rotation problem occurs, so FOV3 to FOV8 change relative to FOV0, which can be regarded as a translation plus a rotation. For this reason, the images corresponding to FOV3 to FOV8 captured by the telephoto camera will all exhibit the image rotation problem.
Illustratively, fig. 16 shows two frames of images captured with the tele camera. The image in fig. 16 (a) is a playing card photographed before the rotating motor in the tele camera rotates; the characters and figures in the image are presented normally. The image in fig. 16 (b) is the playing card photographed after the rotating motor in the tele camera rotates; the characters and figures in the image exhibit the image rotation problem.
Taking the main camera and the above-mentioned tele camera as an example, due to hardware differences, limitations of the installation process, and the like, the two cameras are arranged at a certain distance from each other on the back cover of the mobile phone. Therefore, as shown in (c) of fig. 2, when both the main camera and the tele camera are opened and the tele camera is at its initial position and has not rotated, the center points of the FOVs of the two cameras are not at the same position when they shoot the same scene; that is, the optical centers of the two cameras do not coincide. As a result, the images acquired by the two cameras at the same zoom magnification also differ, and a certain reference phase difference (or parallax) exists between the images.
When the electronic device receives a zooming operation from the user on the display screen and switches between the two cameras while this reference phase difference exists, the image synchronization method provided by the related art compensates for the reference phase difference, so as to ensure that the image streams displayed on the display screen do not jump and transition smoothly during the switching.
Illustratively, fig. 17 shows a schematic diagram of phase difference compensation. As shown in fig. 17 (a), assume that the angle of view corresponding to the main camera is FOVa and that, with the tele camera not rotated at its initial position, the angle of view corresponding to the tele camera is FOVb. Ideally there would be no distance between the two cameras and their center points would coincide, so that when the zoom magnification is switched from below 3x to 3x or above in response to the user's zoom operation, the image displayed on the display screen is provided by the main camera before the switch and by the tele camera after the switch, and the image transmitted by the main camera before the switch should be substantially identical to the image corresponding to FOVb transmitted by the tele camera after the switch.
However, in reality, as shown in fig. 17 (b), because there is a distance between the main camera and the tele camera and their center points are located at different positions, the image corresponding to FOVa' displayed by the main camera at 3x before the switch differs from the image corresponding to FOVb displayed by the tele camera after the switch. Therefore, the related art determines the distance between the center points of FOVa and FOVb, acquires the image corresponding to FOVa from the main camera, and then crops out the portion that matches the position of FOVb as the image displayed by the main camera before the switch, so that the image stream displayed on the display screen remains smooth before and after the zoom switch.
However, as is clear from the description of fig. 15, when the tele camera rotates, its field of view can move greatly, and its overlap with the angle of view of the main camera will shrink or even disappear entirely. Consequently, on top of the reference phase difference, the difference between the image captured by the tele camera and the image captured by the main camera at the same magnification becomes greater, and the captured image content may even be completely different. In addition, when the tele camera rotates, the image it captures undergoes the FOV movement and at the same time exhibits the image rotation problem. Thus, the conventional image synchronization method provided by the related art is no longer applicable.
In view of this, an embodiment of the present application provides an image synchronization method. When the main camera (the first camera) is sending images for display and a FOV change is triggered in response to a first operation of the user, the position vector of the motor of the movable camera (the second camera) is determined by an algorithm, so as to determine the displacement of the images captured by the movable camera before and after rotation; an image that the main camera should output after being offset by the same displacement is then determined, so that the phase difference between the two cameras remains substantially consistent with that before the rotation. When the movable camera (the second camera) is sending images for display and a FOV change is triggered in response to a second operation of the user, the FOV offset of the main camera can likewise be calculated by an algorithm, so that the phase difference between the two cameras remains substantially consistent with that before the rotation.
An accurate image phase difference is then calculated by an image smoothing algorithm (SAT), and the cropping frame of the main camera is adjusted precisely, so that the images of the two cameras are substantially consistent. Therefore, when the cameras are switched, the images do not jump and a smooth transition can be achieved.
Two image synchronization methods provided in the embodiments of the present application are described in detail below with reference to fig. 18 to 22.
Both methods may be applied to the electronic device 100 provided above. The electronic device may include a main camera and a movable camera; the main camera may be the ultra-wide-angle camera 1933, and the movable camera may be the tele camera 1932. The construction and operating principle of the tele camera 1932 are shown in fig. 6 to 16.
Fig. 18 shows a schematic flowchart of a first image synchronization method provided by an embodiment of the present application.
As shown in fig. 18, the image synchronization method includes the following S210 to S260, and the following S210 to S260 are described one by one, respectively.
S210, detecting an operation of starting the camera application program, and responding to the operation to start the camera application program.
For example, the user may instruct the electronic device to open the camera application by clicking the icon of the "camera" application. Alternatively, when the electronic device is in the screen-locked state, the user may instruct the electronic device to open the camera application by a rightward sliding gesture on the display screen of the electronic device. Alternatively, when the electronic device is in the screen-locked state and the lock-screen interface includes an icon of the camera application, the user instructs the electronic device to open the camera application by clicking that icon. Alternatively, when the electronic device is running another application that has the permission to call the camera application, the user may instruct the electronic device to open the camera application by clicking the corresponding control. For example, while the electronic device is running an instant messaging application, the user may instruct the electronic device to open the camera application by selecting a control of the camera function.
It should be appreciated that the above is illustrative of the operation of opening a camera application; the camera application program can be started by voice indication operation or other operation indication electronic equipment; the present application is not limited in any way.
S220, displaying a first image acquired by the main camera.
Illustratively, after the camera application is started, the main camera and the movable camera can be started, and the main camera and the movable camera synchronously acquire image streams; and sending and displaying the image acquired by the main camera to a display screen for displaying.
The first image may be a preview image transmitted in a photographing mode, a video recording mode, or other modes, for example.
S230, receiving a first operation on the first image, wherein the first operation is used for indicating to change the angle of view.
For example, the electronic device may display a first image, and the first operation may indicate a click operation by the user on any one of the targets included in the first image.
It should be understood that the clicking operation for the first image may be understood as an operation for a display screen of the electronic device. The first operation may also be other operations, which the present application is not limited to.
Here, the changing of the angle of view may also be triggered in response to a sliding operation for the zoom control, or based on a related algorithm such as a target tracking algorithm, which is not limited in any way by the embodiment of the present application.
S240, responding to the first operation, and rotating the movable camera.
For example, the movable camera may be instructed to rotate according to the coordinate information of the detected click operation, so that the shooting object indicated by the click operation is included in the angle of view of the movable camera after rotation.
For example, the nth frame image includes a portrait and a landscape, and a click operation on the portrait in the nth frame image is detected; and controlling the movable camera to rotate from the initial position to the target position, wherein the field angle of the movable camera at the target position comprises a human image.
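Illustratively, the mapping from the detected click coordinates to a target rotation of the movable camera may be sketched as follows. This is a minimal example in Python assuming a simple linear pixel-to-angle model with an illustrative gain; the actual mapping used by the rotating motor assembly is not specified by the present application, and all names and values below are hypothetical.

```python
def tap_to_target_rotation(tap_xy, preview_size, fov_deg, gain=(1.0, 1.0)):
    """Map a click position on the displayed first image to target pan/tilt
    angles for the movable camera, so that the clicked subject falls within
    its angle of view after rotation.

    Assumptions: a linear pixel-to-angle model and an illustrative gain; the
    real mapping depends on the calibration of the rotating motor assembly.
    """
    tx, ty = tap_xy                      # click coordinates in pixels
    w, h = preview_size                  # preview width and height in pixels
    fov_x, fov_y = fov_deg               # horizontal / vertical FOV of the preview
    nx = tx / w - 0.5                    # offset from center, normalized to [-0.5, 0.5]
    ny = ty / h - 0.5
    pan_deg = nx * fov_x * gain[0]       # panning (head-shaking) angle
    tilt_deg = ny * fov_y * gain[1]      # nodding angle
    return pan_deg, tilt_deg

# Example: a click near the right edge of a 1920x1080 preview with a 20°x15° FOV.
print(tap_to_target_rotation((1800, 540), (1920, 1080), (20.0, 15.0)))
```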
Here, the movable camera includes a rotating motor assembly. After the first operation of the user is received, when the movable camera photographs the scene, the rotating motor in the assembly can control the prism to rotate, so that a change in FOV can be achieved.
S250, determining a third image acquired by the main camera after rotation based on the second image acquired by the movable camera and the fourth image acquired by the movable camera after rotation.
The angle of view corresponding to the second image is the same as that corresponding to the first image, the angle of view corresponding to the fourth image is different from that corresponding to the first image, and the coincidence degree of the angles of view corresponding to the third image and the fourth image is greater than a first preset threshold.
The second image is an image acquired when the movable camera is positioned at the initial position before rotating.
The first preset threshold may be set as required, for example, the first preset threshold may be 90% or 100%, which is not limited in the present application. The higher the field angle overlap ratio is, the more the third image and the fourth image are identical; conversely, the lower the angle of view overlap, the more different the third image and the fourth image.
For example, assume that the first image corresponds to a first angle of view and the fourth image corresponds to a second angle of view; the fourth image being different from the first image indicates that the second angle of view is different from the first angle of view. Further, assume that the third image corresponds to a third angle of view; if the coincidence degree of the angles of view of the third image and the fourth image is 100%, the third angle of view is the same as the second angle of view.
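Illustratively, the coincidence degree of two rectangular angle-of-view ranges may be evaluated as in the following minimal sketch; as an assumption, the coincidence degree is defined here as the intersection area divided by the area of the second rectangle, and the embodiment is not limited to this exact definition.

```python
def fov_coincidence(fov_a, fov_b):
    """Coincidence degree of two rectangular fields of view, each given as
    (x, y, width, height) in a common canvas coordinate system.

    Assumption: defined here as intersection area / area of fov_b.
    """
    ax, ay, aw, ah = fov_a
    bx, by, bw, bh = fov_b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))   # overlap width
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))   # overlap height
    return (ix * iy) / (bw * bh)

# Example: treat a coincidence above the first preset threshold (e.g. 0.9) as aligned.
aligned = fov_coincidence((0, 0, 1000, 750), (20, 10, 1000, 750)) > 0.9
```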
It should be understood that the angle of view of the movable camera is changed after rotation, so that the content of the images shot before and after rotation of the movable camera is changed.
And S260, displaying a fourth image.
Optionally, the method may further include: and registering the third image acquired by the main camera with the fourth image acquired by the movable camera.
The image synchronization method provided by this embodiment of the present application is applied to an electronic device having a main camera and a movable camera. In the case where the main camera is capturing images and sending them for display, when the FOV of the movable camera changes before and after it rotates in response to the first operation, the third image that the main camera can capture when offset by the same FOV is determined based on the fourth image captured by the movable camera after rotation. In this way, the phase difference between the two cameras can be kept consistent with that before the rotation.
Subsequently, if other processing such as registration is performed on the third image acquired by the main camera and the fourth image acquired by the movable camera after rotation, the efficiency of the processing such as registration is also higher and the processing effect is also relatively better because the acquired two frames of images are relatively close.
After the rotation, because the coincidence degree of the angles of view of the images captured by the main camera and the movable camera is greater than the first preset threshold, the images captured by the two cameras are substantially aligned; therefore, when the display is switched from the main camera to the movable camera, a smooth transition of the displayed images is ensured and no visual jump occurs.
In some embodiments, as shown in fig. 19, the above S250 may include S251 to S257.
S251, acquiring a first image acquired by the main camera and a second image acquired by the movable camera.
The second image is used for indicating images synchronously collected by the movable camera when the movable camera is positioned at the initial position and is not rotated. The first image and the second image correspond to the same FOV. The first image and the second image may be located in the RAW domain or in the YUV domain, etc.
It should be noted that "synchronization" here indicates that, for the same scene to be shot, the main camera captures the first image and the movable camera captures the second image at the same moment; or it indicates that, for the same scene to be shot, the first image and the second image are image frames at the same position in the image streams captured by the main camera and the movable camera, for example both are the Nth frame.
Here, the reference phase difference caused by the camera spatial distance is ignored, and the first image and the second image include substantially identical image contents.
Illustratively, the first Image may be Image1 shown in (a) in fig. 20; the second Image may be an Image2 shown in (b) of fig. 20, and Image2 may be an Image corresponding to FOV0 shown in (a) of fig. 15.
S252, determining a corresponding rotation vector after the movable camera rotates.
Optionally, step S252 may include: acquiring the rotation parameters according to the current Scan code corresponding to the movable camera after rotation, where the rotation parameters include a rotation vector and a translation vector.
It should be noted that, in the rotating motor assembly included in the movable camera, the movement of the prism is driven by the rotating motor, and there is a correspondence between the position of the rotating motor and the angle of the prism movement. The rotating motor itself is driven by a motor; when this motor drives the rotating motor to move the prism, it outputs digital signals, and each digital signal identifies one position of the rotating motor. The present application refers to the digital signal output by the motor as a Scan code; because of the correspondence between the position of the rotating motor and the angle of the prism movement, the Scan code also has a correspondence with the angle of the prism movement.
Thus, a position vector table can be constructed in advance and stored based on the corresponding relation between the Scan code and the angle of the prism motion. The position vector table is used for storing a plurality of Scan codes and rotation vectors and translation vectors corresponding to the Scan codes. The rotation vector is used to represent a rotation relationship between an image photographed by a prism having a certain angle of movement corresponding to the Scan code and an image photographed by a prism at an initial position.
The form thereof may be as shown in table 1, for example.
TABLE 1
Shooting point Scan code Rotation vector Translation vector
Point1 Code1 R1 T1
Point2 Code2 R2 T2
Point3 Code3 R3 T3
Point4 Code4 R4 T4
Point5 Code5 R5 T5
... ... ... ...
Point24 Code24 R24 T24
It should be understood that table 1 is only an example listed for better understanding of the technical solution of the present embodiment, and is not the only limitation of the present embodiment.
When the rotating motor controls the prism to rotate, the digital signal of the motor driving the rotating motor is the current Scan code; according to the current Scan code, a table lookup can be performed in the position vector table to find the corresponding rotation vector and translation vector, or the rotation vector and translation vector corresponding to the adjacent Scan code are found first, and then the rotation vector and translation vector corresponding to the current Scan code are calculated through interpolation. The determined rotation vector corresponding to the current Scan code may be referred to as the rotation parameter corresponding to the current Scan code.
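Illustratively, the table lookup and interpolation described above may be sketched as follows, assuming the position vector table is stored as a list of calibrated entries sorted by Scan code; the table contents, function name, and linear interpolation scheme are illustrative assumptions.

```python
import numpy as np

# Hypothetical position vector table: (Scan code, rotation vector r, translation vector t),
# sorted by Scan code.  Values are illustrative only.
POSITION_VECTOR_TABLE = [
    (100, np.array([0.000, 0.000, 0.000]), np.array([0.0, 0.0])),
    (200, np.array([0.010, 0.000, 0.002]), np.array([1.5, 0.0])),
    (300, np.array([0.020, 0.010, 0.004]), np.array([3.1, 0.8])),
]

def lookup_rotation_params(current_scan_code):
    """Return (rotation vector, translation vector) for the current Scan code,
    interpolating linearly between the two neighbouring calibrated codes."""
    codes = [c for c, _, _ in POSITION_VECTOR_TABLE]
    if current_scan_code <= codes[0]:
        return POSITION_VECTOR_TABLE[0][1], POSITION_VECTOR_TABLE[0][2]
    if current_scan_code >= codes[-1]:
        return POSITION_VECTOR_TABLE[-1][1], POSITION_VECTOR_TABLE[-1][2]
    hi = next(i for i, c in enumerate(codes) if c >= current_scan_code)
    c0, r0, t0 = POSITION_VECTOR_TABLE[hi - 1]
    c1, r1, t1 = POSITION_VECTOR_TABLE[hi]
    w = (current_scan_code - c0) / (c1 - c0)     # interpolation weight
    return (1 - w) * r0 + w * r1, (1 - w) * t0 + w * t1
```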
The rotation vector includes vector components in three axes, the x-axis, the y-axis, and the z-axis, wherein the rotation component in the z-axis is generally referred to as the image rotation angle.
S253, acquiring a fifth image acquired by the rotated movable camera.
The fifth image may be in the RAW domain or in the YUV domain, etc.
Illustratively, the fifth Image may be Image5, as shown in (b) of fig. 20.
Because the FOV of the movable camera differs after it rotates, the content of the image captured after rotation differs from that captured before rotation; in addition, the image captured after rotation is affected by image rotation and perspective, so it exhibits image rotation and perspective problems relative to the image captured before rotation.
S254, performing image rotation correction on the fifth image to obtain a fourth image.
The S254 may include: and carrying out image rotation correction on the fifth image by combining the image rotation angle determined from the rotation vector to obtain a fourth image.
The fourth image is used for indicating the image after the image rotation correction. The fourth image may be in the RAW domain or in the YUV domain, etc.
Here, the rotation correction may be calculated using a correlation rotation correction algorithm, which is not limited in the present application.
It should be noted that the fourth image is the image that the user expects to be captured after the movable camera rotates; however, because of the image rotation effect, the fourth image cannot be captured directly after the rotation. Only the fifth image, which contains image rotation, can be captured, and the image rotation is then removed from it to obtain the fourth image.
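Illustratively, the image rotation correction of S254 may be sketched as a rotation about the image center, assuming the image rotation angle is taken from the z-axis component of the rotation vector; the sign convention and the use of OpenCV's affine warp are assumptions, and the embodiment is not limited to this particular algorithm.

```python
import cv2

def correct_image_rotation(fifth_image, image_rotation_angle_deg):
    """Remove the image rotation from the frame captured after the prism rotates.

    The frame is rotated about its center by the negative of the image rotation
    angle (the z-axis component of the rotation vector, in degrees); the sign
    convention must match the calibration and is an assumption here."""
    h, w = fifth_image.shape[:2]
    center = (w / 2.0, h / 2.0)
    M = cv2.getRotationMatrix2D(center, -image_rotation_angle_deg, 1.0)  # 2x3 affine
    return cv2.warpAffine(fifth_image, M, (w, h))                        # the fourth image
```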
S255, establishing a first relational expression based on the rotation vector and the coordinates of the image center points corresponding to the second image and the fourth image respectively.
The image center point coordinates of the second image and the image center point coordinates of the fourth image satisfy the Rodrigues relationship.
The relationship between the image center point of the second image, acquired at the initial position before the movable camera rotates, and the image center point of the fourth image, obtained after rotation and image rotation correction, is used to represent the change of the FOV before and after the rotation of the movable camera.
S256, a second relational expression is established based on the coordinates of the center points of the images corresponding to the first image, the second image and the fourth image respectively.
It will be appreciated that, since the fourth image has undergone image rotation correction, only a simple translation resulting from the FOV change exists between the second image and the fourth image acquired before and after the rotation of the movable camera. Therefore, for the main camera to approach the FOV of the rotated movable camera, the same translation is required between the image to be acquired by the main camera and the first image, so that the main camera undergoes a FOV change synchronized with that of the movable camera. Accordingly, the second relational expression can be established from the coordinates of the image center points of the first image, the second image, and the fourth image.
S257, combining the first relation and the second relation, and determining a third image synchronously acquired by the main camera.
It should be noted that "synchronization" here indicates that, for the same scene to be shot, the main camera captures the third image at the same moment at which the rotated movable camera captures the fifth image or the fourth image; or it indicates that, for the same scene to be shot, the images are frames at the same position in the image streams captured by the main camera and the movable camera, for example both are the (N+1)th frame.
Optionally, the step S257 may include:
S2571, determining the coordinates of the image center point of the third image based on the first relation and the second relation.
The third image is an image which is simultaneously acquired by the main camera when the movable camera acquires the fifth image or the fourth image after the movable camera rotates. The offset of the third image relative to the first image needs to be substantially consistent with the offset of the fourth image relative to the second image.
S2572, determining the third image based on the coordinates of the image center point of the third image.
For example, the main camera may first capture a full-size original Image corresponding to the entire view angle range, then crop the full-size original Image based on the determined Image center point of the third Image, and obtain the third Image after cropping, where the third Image may be, for example, image3, as shown in (a) in fig. 20. In this way, the contents of the third Image3 and the fourth Image4 after the Image rotation correction can be basically ensured to be consistent.
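Illustratively, the cropping of the full-size original image around the determined image center point may be sketched as follows; the clamping of the crop window to the frame border is an assumption, and the names are illustrative only.

```python
def crop_around_center(full_frame, center_xy, out_w, out_h):
    """Crop an out_w x out_h window from the main camera's full-size frame so
    that its center lies at center_xy; the window is clamped to the frame
    borders (boundary handling is an assumption)."""
    H, W = full_frame.shape[:2]
    cx, cy = center_xy
    x0 = int(round(cx - out_w / 2))
    y0 = int(round(cy - out_h / 2))
    x0 = max(0, min(x0, W - out_w))      # keep the crop window inside the frame
    y0 = max(0, min(y0, H - out_h))
    return full_frame[y0:y0 + out_h, x0:x0 + out_w]
```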
Optionally, to improve the matching degree between the third image and the fourth image, the third image may further be registered based on the fourth image. For the registration, any one of the registration methods provided by the related art may be selected, which is not limited in this embodiment of the present application.
According to the image synchronization method provided by the embodiment of the application, the image rotation correction is carried out on the fifth image acquired after the movable camera rotates, so that the fourth image for eliminating the image rotation can be obtained; then, based on the image center points of the second image and the corrected fourth image acquired before the movable camera rotates, a first relation can be established by combining the rotation vectors, and the first relation represents the coordinate transformation relation of the image center points shot before and after the movable camera rotates.
In order to make the image acquired by the main camera substantially the same as the fourth image acquired after the movable camera rotates, the offset of the main camera's FOV must be kept the same as the offset of the movable camera's FOV from the second image to the fourth image. Thereby, a second relational expression can be established based on the coordinates of the image center points of the first image acquired by the main camera, the second image, and the fourth image. The second relational expression indicates that the FOV offset of the main camera is equal to the FOV offset of the movable camera before and after the rotation.
Then, by calculating the first relation and the second relation, a third image can be determined, and the purpose that after rotation, the image content collected by the main camera and the movable camera is basically kept consistent is achieved.
Fig. 20 shows a schematic view of a set of main cameras and corresponding angles of view of a tele camera.
For example, as shown in (a) of fig. 20, for a main camera, the corresponding image center point coordinates may be noted as pos1; if the main camera is used for shooting, the obtained Image should be Image1. The center point of Image1 is pos1.
As shown in fig. 20 (b), for the tele camera, when the rotating motor is at the initial position, the coordinate system extended outward on all four sides, with the corresponding image center point as the origin, is referred to as the SCAN CANVAS coordinate system. Note that the SCAN CANVAS coordinate system covers the movement stroke of the rotating motor and may indicate the maximum field angle FOV10 corresponding to the movable camera, as shown in (b) of fig. 15.
Based on the SCAN CANVAS coordinate system, when the rotation motor is located at the initial position, the position vector corresponding to the initial position may be denoted as (0, 0), and if the tele camera is shot at the initial position, the obtained Image is Image2, and the center point of the Image2 is the center point pos2 of the SCAN CANVAS coordinate system. The relationship between Image2 and SCAN CANVAS coordinate system is shown in fig. 20 (b).
When the movable camera rotates, that is, when the rotating motor controls the prism to rotate, the position vector of the rotating motor can be denoted as (r_x, r_y). In addition, considering that the image captured after the rotating motor rotates exhibits image rotation and perspective, and that the image rotation is related to the panning direction of the rotating motor, i.e., related to r_x, the rotation component of the captured image about the z-axis can be determined as r_z = α·r_x, where α is a fixed coefficient, which is not limited in the present application.
Thus, a rotation vector r = (r_x, r_y, r_z) corresponding to the image after the rotation of the rotating motor can be obtained. If the movable camera shoots after rotating, the captured image is shown as Image5 in the SCAN CANVAS coordinate system.
At this time, if the main camera is switched to the telephoto camera, the displayed image switches directly from Image1 captured by the main camera to Image5 captured by the rotated telephoto camera; or, if the telephoto camera is switched to the main camera, the displayed image switches directly from Image5 to Image1. Since the two frames contain completely different content, a serious jump will occur.
Therefore, the image synchronization method provided by the present application first solves the image rotation problem caused by the rotation of the tele camera by applying image rotation correction to the image captured by the tele camera. After correction, the image captured by the tele camera is equivalent to a simple translation of the image of the same magnification captured by the main camera, and its phase difference is equivalent to the sum of the translation distance and the reference phase difference; generally, the translation distance is much greater than the reference phase difference. Therefore, the offset between the image center point of the tele camera after rotation and rotation correction and the image center point that the main camera is expected to capture should be kept substantially consistent with the offset between the image center point captured at the initial position before the tele camera rotates and the image center point originally captured by the main camera.
Based on this idea, the first stage:
First, as shown in (b) of fig. 20, assume that the image center point of Image2, captured before the tele camera rotates, is pos2 = (u, v, 1), where u is half the width of Image2 (u = w/2), v is half the height of Image2 (v = h/2), and the unit is pixels.
The image captured after the tele camera rotates is Image5, in which image rotation and perspective appear during imaging; the image rotation is generally determined by r_z. Image5 is subjected to image rotation correction so that its image coordinate system coincides with the SCAN CANVAS coordinate system; the image obtained after the correction is, for example, Image4, whose image center point may be denoted as pos4.
Expressed as formulas: Image2 is rotated according to the rotation vector r to obtain Image5, and Image4 is obtained after Image5 undergoes image rotation correction; the transformation relationship of the image center point between Image2 and Image4 then satisfies the following formulas:
pos4 = K·R(r)·K⁻¹·pos2;
pos4 = (pos4_x, pos4_y);
where K·R(r)·K⁻¹ indicates the transformation of the image center point corresponding to the rotation of the movable camera; K denotes the standard camera intrinsic matrix, and K⁻¹ denotes its inverse; R(r) denotes the three-dimensional rotation matrix corresponding to the rotation vector r; θ is an intermediate variable of the Rodrigues transformation formula.
Substituting the coordinates of pos2 and the matrices K, K⁻¹, and R(r) into the above formula yields a set of relational expressions, where f = focal_length / pixel_size; focal_length indicates the physical focal length of the lens, f being the physical focal length in millimeters converted into a number of pixels; pixel_size indicates the image pixel size.
After the above relational expressions are rearranged, a first relational expression between the image center point pos4 of Image4 and the image center point pos2 of Image2 can be determined. The first relational expression gives the offsets of pos4 relative to pos2 on the x-axis and the y-axis of the image coordinate system, respectively.
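Illustratively, the first relational expression pos4 = K·R(r)·K⁻¹·pos2 may be evaluated as in the following minimal sketch, assuming a simplified intrinsic matrix K built only from f = focal_length/pixel_size with the principal point at the image center; the use of cv2.Rodrigues for the Rodrigues transformation is an assumption, not a statement about the embodiment's implementation.

```python
import cv2
import numpy as np

def center_after_rotation(pos2, rotation_vector, focal_length_mm, pixel_size_mm):
    """First relational expression: pos4 = K * R(r) * K^-1 * pos2.

    K is a simplified intrinsic matrix using only f = focal_length / pixel_size,
    with the principal point at the image center pos2 (an assumption); R(r) is
    obtained from the rotation vector via the Rodrigues transformation."""
    f = focal_length_mm / pixel_size_mm               # focal length in pixels
    u, v = float(pos2[0]), float(pos2[1])
    K = np.array([[f, 0, u],
                  [0, f, v],
                  [0, 0, 1]], dtype=np.float64)
    R, _ = cv2.Rodrigues(np.asarray(rotation_vector, dtype=np.float64).reshape(3, 1))
    p = K @ R @ np.linalg.inv(K) @ np.array([u, v, 1.0])
    return p[:2] / p[2]                               # pos4 in pixel coordinates
```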
And a second stage:
For example, as shown in fig. 21, in order to synchronize the image captured by the main camera with the image captured by the tele camera, Image3, which has the same offset relative to Image1 captured by the main camera, can be predicted based on the offset between pos2 and pos4 obtained before the rotation of the tele camera and after the rotation and image rotation correction.
Assuming that the Image center point coordinates of Image3 are pos3, the second relationship should be satisfied between the Image center point pos3 of Image3 and the Image center point pos1 of Image 1:
pos2-pos1 = pos4-pos3; pos1 to pos4 are two-dimensional vectors.
Or may also be written as: pos4-pos2 = pos3-pos1.
Based on the relation determined in the two stages, an Image3 with the center point of the Image being pos3 can be determined around the Image1 in the view angle range corresponding to the main camera.
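Illustratively, combining the two relational expressions, the center point pos3 of Image3 follows directly from pos3 = pos1 + (pos4 − pos2); the numerical values in the sketch below are illustrative only.

```python
import numpy as np

# Second relational expression: the main camera's crop center shifts by the same
# offset as the movable camera's center, i.e. pos3 = pos1 + (pos4 - pos2).
pos1 = np.array([2000.0, 1500.0])   # center of Image1 (main camera), illustrative
pos2 = np.array([960.0, 720.0])     # center of Image2 (movable camera, before rotation)
pos4 = np.array([1180.0, 650.0])    # center of Image4 (after rotation and correction)
pos3 = pos1 + (pos4 - pos2)         # center of Image3 to be cropped from the main camera
```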
It can be understood that, for a given zoom magnification, before any rotation the images captured by the main camera and the tele camera are Image1 and Image2, respectively. After the tele camera rotates, if the main camera keeps shooting it still obtains Image1, but the image captured by the tele camera becomes Image5, which is rotated and translated relative to Image2; if the tele camera performs rotation correction while acquiring the image, the image it captures becomes Image4. Although Image4 has the image rotation corrected relative to Image2, a certain offset remains.
Then, in order to keep the image captured by the main camera consistent with the image captured by the rotated tele camera, the image captured by the main camera is changed: based on the offset between Image4 and Image2, Image3, which has the same offset relative to Image1, is determined as the captured image of the main camera. In this way, even if the display is switched from the tele camera to the main camera, no jump occurs, because the captured image acquired by the main camera is synchronized with Image4 acquired by the tele camera.
Since the Image rotation problem does not greatly affect the coordinates of the Image center point, the Image center points of Image5 and Image4 may be regarded as the same point pos4.
Alternatively, on the basis of the above, since the first relation and the second relation cannot be directly solved, an iterative method can be selected for determination.
First step: as shown in fig. 22, assume that after the movable camera rotates, the center point of the acquired image is located at pos_f0 in SCAN CANVAS coordinates; that is, let the target coordinate of the image center point of Image4 be pos_f0.
Second step: substituting the optical parameters yields the position vector S corresponding to the rotating motor; substituting the rotation parameters then yields the rotation vector corresponding to the image and the corresponding rotation matrix R1.
Third step: substituting the center point coordinate pos_f0 into the rotation matrix R1 gives a new coordinate pos_f1, i.e., pos_f1 = R1(pos_f0).
Fourth step: record the difference between pos_f0 and pos_f1, i.e., determine the difference between the new coordinate of the image center point and the target coordinate: Δp = pos_f0 − pos_f1.
Fifth step: convert Δp into a corresponding scan code and superimpose it on the initial scan code.
For example, the scan sensitivities (g_x, g_y) corresponding to the x-axis and y-axis directions can be obtained from the image rotation parameters, and Δp can be converted with them into the rotation amount ΔS of the rotating motor,
where g_x indicates the correspondence between pixels and the rotation angle in the x-axis direction, g_y indicates the correspondence between pixels and the rotation angle in the y-axis direction, and ΔS indicates the amount of rotation of the rotating motor corresponding to an image center point offset of Δp.
Sixth step: take ΔS + S as the position vector of the rotating motor, continue to drive the rotating motor, and calculate a new rotation matrix R2 and the corresponding center point coordinate pos_f2, i.e., pos_f2 = R2(pos_f0).
Record the difference between pos_f0 and pos_f2, i.e., determine a new Δp = pos_f0 − pos_f2.
Seventh step: repeat the above steps until pos_fn − pos_f0 is smaller than the threshold.
It should be understood that the threshold may be set as desired, and the present application is not limited in this regard.
Therefore, by executing the steps, the accurate corresponding relation between the position vector of the rotary motor and the coordinate offset of the image center point can be determined. Based on the corresponding relation, the coordinates of the center point of the image after rotation can be determined by combining different position vectors of the rotating motor.
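Illustratively, the iterative procedure above may be sketched as follows; the helper callbacks stand in for the table lookup, the center-point mapping, and the scan-sensitivity conversion, and their exact forms, together with the tolerance and iteration limit, are assumptions rather than values specified by the embodiment.

```python
import numpy as np

def solve_motor_position(pos_f0, scan_to_rotation, apply_rotation, pixels_to_scan,
                         s_init, tol=0.5, max_iter=20):
    """Iteratively adjust the motor position vector S until the image center
    predicted from S lands on the target coordinate pos_f0.

    scan_to_rotation(S)   -> rotation matrix for motor position S (table lookup)
    apply_rotation(R, p)  -> center point coordinate after applying R to p
    pixels_to_scan(dp)    -> scan-code step for a pixel offset dp (scan sensitivity)
    The callbacks, tolerance and iteration limit are assumptions."""
    S = np.asarray(s_init, dtype=np.float64)
    for _ in range(max_iter):
        R = scan_to_rotation(S)
        pos_fn = apply_rotation(R, pos_f0)       # new center point for this S
        dp = np.asarray(pos_f0) - pos_fn         # residual against the target
        if np.linalg.norm(dp) < tol:             # pos_fn - pos_f0 below threshold
            break
        S = S + pixels_to_scan(dp)               # superimpose the step on the scan code
    return S
```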
Fig. 23 shows a schematic flow chart of a second image synchronization method provided by an embodiment of the present application.
As shown in fig. 23, the image synchronization method may include the following S310 to S350, and the following S310 to S350 are described one by one, respectively.
S310, displaying a fourth image acquired by the movable camera.
Illustratively, after the camera application is started, the main camera and the movable camera can be started, and the main camera and the movable camera synchronously acquire image streams; and sending and displaying the image acquired by the movable camera to a display screen for displaying.
The fourth image may be, for example, a preview image displayed in a photographing mode, a video recording mode, or other modes.
Alternatively, S310 is the same as S260, and the fourth image is the fourth image displayed in S260.
S320, receiving a second operation, wherein the second operation is used for indicating to change the angle of view.
For example, the electronic device may display the fourth image, and the second operation may indicate a click operation by the user on any one of the targets included in the fourth image.
It should be understood that the clicking operation for the fourth image may be understood as an operation for the display screen of the electronic device. The second operation may also be other operations, which the present application is not limited to.
Here, the change of the angle of view may also be triggered based on a related algorithm such as a target tracking algorithm, which is not limited in any way by the embodiment of the present application.
S330, responding to the second operation, and rotating the movable camera.
For the description of S330, reference may be made to the description of S240, which is not repeated here.
S340, determining a seventh image acquired by the main camera based on the fourth image and the sixth image acquired by the rotated movable camera. The angle of view corresponding to the sixth image is different from the angle of view corresponding to the fourth image, and the angle of view coincidence degree corresponding to the seventh image and the sixth image is larger than a second preset threshold value.
The second preset threshold may be set as needed, for example, the second preset threshold may be 90% or 100%, which is not limited in the present application. The higher the field angle overlap ratio is, the more the seventh image is identical to the sixth image; conversely, the lower the angle of view overlap, the more different the seventh image and the sixth image.
Optionally, the second preset threshold is the same size as the first preset threshold.
S350, displaying a seventh image.
Optionally, the method may further include: and registering the seventh image acquired by the main camera with the sixth image acquired by the movable camera.
The image synchronization method provided by this embodiment of the present application is applied to an electronic device having a main camera and a movable camera. In the case where the movable camera is capturing images and sending them for display, when the FOV of the movable camera changes before and after it rotates in response to the second operation, the seventh image that the main camera can capture when offset by the same FOV is determined based on the sixth image captured by the movable camera after rotation. In this way, the phase difference between the two cameras can be kept consistent with that before the rotation.
Subsequently, if other processing such as registration is performed on the seventh image acquired by the main camera and the sixth image acquired by the movable camera, the efficiency of the processing such as registration is also relatively high and the processing effect is relatively good because the acquired two frames of images are relatively close.
After the rotation, because the coincidence degree of the angles of view of the images captured by the main camera and the movable camera is greater than the second preset threshold, the images captured by the two cameras are substantially aligned; therefore, when the display is switched from the movable camera to the main camera, a smooth transition of the displayed images is ensured and no visual jump occurs.
In some embodiments, as shown in fig. 24, the above S340 may include S341 to S345.
Assume that, in the movable camera, the maximum movement of the prism controlled by the rotating motor is ±x in the panning direction and ±y in the nodding direction. If the calibration grid is 5×5, the coordinates of the four vertices (photographing nodes) in the scan motor coordinate system can be represented as shown in fig. 25.
S341, acquiring a sixth image acquired by the rotated movable camera and a position vector of the rotating motor.
S342, determining a rotation vector corresponding to the sixth image based on the position vector.
As shown in fig. 25, when the movable camera rotates to the target position, the position vector of the rotating motor corresponding to the target position is denoted as S_6.
The Image taken at this position is called Image6, and the center point coordinate on the Image is pos6.
After the position vector S_6 corresponding to the target position is obtained, the rotation parameters of the 4 photographing nodes around S_6 are acquired from the position vector table shown in Table 1, where the rotation parameters include a rotation vector and a translation vector.
Based on the rotation parameters of the 4 shooting nodes, a rotation vector and a translation vector corresponding to the target position can be obtained through interpolation.
For example, as shown in fig. 25, it may be determined that the four photographing nodes near S_6 are Point0, Point13, Point17, and Point18; then, based on the rotation vectors and translation vectors corresponding to Point0, Point13, Point17, and Point18, the rotation vector and the translation vector corresponding to S_6 are obtained by interpolation.
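Illustratively, the interpolation of the rotation parameters at S_6 from the surrounding photographing nodes may be sketched as follows; inverse-distance weighting is used here purely as an assumption, since the embodiment only states that the values are obtained through interpolation.

```python
import numpy as np

def interpolate_node_params(s6, nodes):
    """Interpolate the rotation and translation vectors at the target motor
    position s6 = (sx, sy) from the surrounding calibration nodes.

    `nodes` maps node position (x, y) -> (rotation vector, translation vector);
    inverse-distance weighting is an assumption."""
    sx, sy = s6
    weights, rs, ts = [], [], []
    for (nx, ny), (r, t) in nodes.items():
        d = np.hypot(sx - nx, sy - ny)
        weights.append(1.0 / max(d, 1e-6))       # closer nodes weigh more
        rs.append(np.asarray(r, dtype=np.float64))
        ts.append(np.asarray(t, dtype=np.float64))
    w = np.array(weights) / np.sum(weights)
    return w @ np.stack(rs), w @ np.stack(ts)    # (rotation vector, translation vector)
```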
S343, constructing a homography matrix H_6 from the rotation vector corresponding to the target position, where the homography matrix H_6 is used to project Image6 onto the SCAN CANVAS.
S344, determining, based on the homography matrix H_6, a mapping matrix between the fourth image and the sixth image acquired before and after the movable camera rotates.
Coordinate mapping is performed on the SCAN CANVAS so that the center point of Image4, captured at the initial position, remains at the center of Image6 captured after the rotation.
Substituting the image center point pos4 of Image4 into H_6 gives the corresponding center point coordinate in Image6, from which the corresponding mapping matrix T is constructed.
It should be appreciated that the mapping matrix may indicate the offset between the center point coordinates of Image6 and Image 4.
S345, based on the mapping matrix T, determining the coordinates of the image center point of the seventh image acquired by the main camera.
Illustratively, based on the mapping matrix T, the full-size original image captured by the main camera may be cropped, so that a seventh image substantially consistent with the sixth image FOV captured by the rotated movable camera may be obtained.
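Illustratively, S343 to S345 may be sketched as follows, assuming the homography is built as H6 = K·R(r6)·K⁻¹ with the same simplified intrinsic matrix as before; the resulting offset between the center points of Image4 and Image6 is then used to shift the main camera's crop center. This is an illustrative sketch, not the embodiment's exact construction.

```python
import cv2
import numpy as np

def seventh_image_center(pos4, rotation_vec_6, f, principal_point):
    """Sketch of S343-S345: build H6 = K * R(r6) * K^-1, push the center pos4 of
    Image4 through it, and use the resulting offset to shift the main camera's
    crop center.  The simplified intrinsic model is an assumption."""
    cx, cy = principal_point
    K = np.array([[f, 0, cx],
                  [0, f, cy],
                  [0, 0, 1]], dtype=np.float64)
    R6, _ = cv2.Rodrigues(np.asarray(rotation_vec_6, dtype=np.float64).reshape(3, 1))
    H6 = K @ R6 @ np.linalg.inv(K)                       # projects Image6 onto the canvas
    p = H6 @ np.array([pos4[0], pos4[1], 1.0])
    pos6 = p[:2] / p[2]                                  # mapped center point
    offset = pos6 - np.asarray(pos4, dtype=np.float64)   # offset between Image6 and Image4
    return pos6, offset    # shift the main camera's crop center by `offset`
```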
Fig. 26 is a schematic structural diagram of an image synchronization device according to an embodiment of the present application. The image synchronization apparatus 400 includes a display unit 410 and a processing unit 420.
The processing unit 420 is used for starting the camera application.
The display unit 410 is used for displaying a first image acquired by the main camera.
The processing unit 420 is further configured to receive a first operation, where the first operation is used to instruct to change the angle of view; in response to a first operation, the movable camera rotates; determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image is the same as that of the first image, the angle of view of the fourth image is different from that of the first image, and the overlap ratio of the angle of view of the third image and the angle of view of the fourth image is greater than a first preset threshold; and displaying the fourth image.
The image synchronization device 400 is embodied as a functional unit. The term "unit" herein may be implemented in software and/or hardware, without specific limitation.
For example, a "unit" may be a software program, a hardware circuit or a combination of both that implements the functions described above. The hardware circuitry may include Application Specific Integrated Circuits (ASICs), electronic circuits, processors (e.g., shared, proprietary, or group processors, etc.) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
Thus, the elements of the examples described in the embodiments of the present application can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Fig. 27 shows a schematic structural diagram of an electronic device provided by the present application. The dashed lines in fig. 27 indicate that the unit or the module is optional, and the electronic device 500 may be used to implement the image synchronization method described in the above method embodiment.
The electronic device 500 includes one or more processors 501, and the one or more processors 501 may support the electronic device 500 to implement the methods in the method embodiments. The processor 501 may be a general-purpose processor or a special-purpose processor. For example, the processor 501 may be a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or other programmable logic device such as discrete gates, transistor logic, or discrete hardware components.
The processor 501 may be configured to control the electronic device 500, execute a software program, and process data of the software program. The electronic device 500 may further comprise a communication unit 505 for enabling input (reception) and output (transmission) of signals.
For example, the electronic device 500 may be a chip, the communication unit 505 may be an input and/or output circuit of the chip, or the communication unit 505 may be a communication interface of the chip, which may be an integral part of a terminal device or other electronic device.
For another example, the electronic device 500 may be a terminal device, the communication unit 505 may be a transceiver of the terminal device, or the communication unit 505 may be a transceiver circuit of the terminal device.
The electronic device 500 may include one or more memories 502 having a program 504 stored thereon, the program 504 being executable by the processor 501 to generate instructions 503 such that the processor 501 performs the image synchronization method described in the above method embodiments according to the instructions 503.
Optionally, the memory 502 may also have data stored therein. Alternatively, the processor 501 may also read data stored in the memory 502, which may be stored at the same memory address as the program 504, or which may be stored at a different memory address than the program 504.
The processor 501 and the memory 502 may be provided separately or may be integrated together; for example, integrated on a System On Chip (SOC) of the terminal device.
Illustratively, the memory 502 may be used to store a related program 504 of the image synchronization method provided in the embodiment of the present application, and the processor 501 may be used to call the related program 504 of the image synchronization method stored in the memory 502 at the time of video processing, to perform the image synchronization method of the embodiment of the present application; for example, a camera application is started; displaying a first image acquired by a primary camera; receiving a first operation, wherein the first operation is used for indicating to change the angle of view; in response to a first operation, the movable camera rotates; determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image is the same as that of the first image, the angle of view of the fourth image is different from that of the first image, and the overlap ratio of the angle of view of the third image and the angle of view of the fourth image is greater than a first preset threshold; and displaying the fourth image.
The present application also provides a computer program product which, when executed by the processor 501, implements the image synchronization method according to any one of the method embodiments of the present application.
The computer program product may be stored in the memory 502, for example, the program 504, and the program 504 is finally converted into an executable object file capable of being executed by the processor 501 through preprocessing, compiling, assembling, and linking.
The application also provides a computer readable storage medium having stored thereon a computer program which when executed by a computer implements the image synchronization method according to any one of the method embodiments of the application. The computer program may be a high-level language program or an executable object program.
Optionally, the computer readable storage medium is, for example, the memory 502. The memory 502 may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working processes and technical effects of the apparatus and device described above may refer to corresponding processes and technical effects in the foregoing method embodiments, which are not described in detail herein.
In the several embodiments provided by the present application, the disclosed systems, devices, and methods may be implemented in other manners. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described apparatus embodiments are merely illustrative, the division of units is merely a logical function division, and there may be additional divisions in actual implementation, and multiple units or components may be combined or integrated into another system. In addition, the coupling between the elements or the coupling between the elements may be direct or indirect, including electrical, mechanical, or other forms of connection.
It should be understood that, in the various embodiments of the present application, the size of the sequence numbers of the processes does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
In addition, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is merely one association relationship describing the associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
In summary, the foregoing description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (11)

1. An image synchronization method, applied to an electronic device including a main camera and a movable camera, comprising:
Starting a camera application program;
Displaying a first image acquired by the primary camera;
receiving a first operation, wherein the first operation is used for indicating to change the angle of view;
in response to the first operation, the movable camera rotates;
determining a third image acquired by the main camera after rotation based on a second image acquired by the movable camera and a fourth image acquired by the movable camera after rotation, wherein the angle of view of the second image and the angle of view of the first image are the same, the angle of view of the fourth image and the angle of view of the first image are different, and the overlapping degree of the angle of view of the third image and the angle of view of the fourth image is larger than a first preset threshold;
And displaying the fourth image.
2. The method of claim 1, wherein the determining the third image captured by the main camera after rotation based on the second image captured by the movable camera and the fourth image captured by the movable camera after rotation comprises:
determining a corresponding rotation vector of the movable camera after rotation;
establishing a first relation based on the rotation vector and the coordinates of the image center points corresponding to the second image and the fourth image respectively;
Establishing a second relation based on the coordinates of the image center points corresponding to the first image, the second image and the fourth image respectively;
and determining the coordinates of the image center point of the third image acquired by the main camera by combining the first relational expression and the second relational expression.
3. The method of claim 2, wherein the coincidence degree of the angles of view corresponding to the third image and the fourth image being greater than the first preset threshold comprises:
The third image is identical to the fourth image.
4. A method according to any one of claims 1 to 3, further comprising:
Receiving a second operation, wherein the second operation is used for indicating to change the angle of view;
in response to the second operation, the movable camera rotates;
Determining a seventh image acquired by the main camera based on the fourth image and a sixth image acquired by the rotated movable camera; the angle of view corresponding to the sixth image is different from the angle of view corresponding to the fourth image, and the angle of view coincidence degree corresponding to the seventh image and the sixth image is larger than a second preset threshold;
And displaying the seventh image.
5. The method of claim 4, wherein the determining the seventh image captured by the main camera based on the fourth image and the sixth image captured by the rotated movable camera comprises:
Acquiring a sixth image acquired by the movable camera after rotation and a position vector of a rotating motor;
Determining a rotation vector corresponding to the sixth image based on the position vector;
Constructing a homography matrix based on the rotation vector;
Determining a mapping matrix between the fourth image and the sixth image based on the homography matrix;
and determining a seventh image acquired by the main camera based on the mapping matrix.
6. The method of claim 1, wherein the first operation is a click operation on the first image.
7. The method of claim 4, wherein the second operation is a click operation on the fourth image.
8. The method of claim 4, wherein the second preset threshold is equal to the first preset threshold.
9. An electronic device, comprising a processor and a memory, wherein
the memory is configured to store a computer program capable of running on the processor; and
the processor is configured to perform the image synchronization method according to any one of claims 1 to 8.
10. A chip system, applied to an electronic device, wherein the chip system comprises one or more processors configured to invoke computer instructions to cause the electronic device to perform the image synchronization method according to any one of claims 1 to 8.
11. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the image synchronization method according to any one of claims 1 to 8.
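
The procedures recited in claims 2 and 5 above reduce to a standard piece of multi-view geometry: a rotation vector (derived from the rotation motor's position) induces a homography between the views of the movable camera before and after rotation, and that homography can be used both to map image center points and to warp one view onto the other. The sketch below illustrates this idea in Python with NumPy and OpenCV; it is an illustrative reading rather than the patented implementation, and the function names, the intrinsic matrix K, and the example rotation angle are assumptions introduced here.

# --- illustrative sketch (Python, NumPy, OpenCV); all names are hypothetical ---
import numpy as np
import cv2

def rotation_homography(K, rvec):
    # Homography induced by a pure camera rotation: H = K * R * K^{-1}.
    # K: 3x3 camera intrinsic matrix; rvec: 3-vector axis-angle rotation,
    # e.g. derived from the rotation motor's position vector (cf. claim 5).
    R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64))
    return K @ R @ np.linalg.inv(K)

def map_center_point(H, image_shape):
    # Map the image center through H in homogeneous coordinates
    # (the kind of center-point relation used in claim 2).
    h, w = image_shape[:2]
    c = np.array([w / 2.0, h / 2.0, 1.0])
    p = H @ c
    return p[:2] / p[2]

def warp_view(image, H, out_size):
    # Warp one view onto the other, using the homography as the
    # mapping matrix between the pre- and post-rotation images.
    return cv2.warpPerspective(image, H, out_size)

# Example with made-up numbers: a 5-degree pan of the movable camera.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
rvec = np.array([0.0, np.deg2rad(5.0), 0.0])
H = rotation_homography(K, rvec)
print(map_center_point(H, (720, 1280)))   # predicted new center point
# --- end of sketch ---

Under these assumptions the homography plays the role of the mapping matrix between the fourth and sixth images in claim 5; how the main camera's output is then cropped or warped to obtain the seventh image is a design choice left to the implementation.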
CN202310404131.5A 2023-04-07 2023-04-07 Image synchronization method and related equipment thereof Active CN117135420B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310404131.5A CN117135420B (en) 2023-04-07 2023-04-07 Image synchronization method and related equipment thereof


Publications (2)

Publication Number Publication Date
CN117135420A CN117135420A (en) 2023-11-28
CN117135420B true CN117135420B (en) 2024-05-07

Family

ID=88851566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310404131.5A Active CN117135420B (en) 2023-04-07 2023-04-07 Image synchronization method and related equipment thereof

Country Status (1)

Country Link
CN (1) CN117135420B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111294517A (en) * 2020-03-03 2020-06-16 华为技术有限公司 Image processing method and mobile terminal
CN111835969A (en) * 2020-07-06 2020-10-27 海信视像科技股份有限公司 Interactive method for controlling angle of camera and display equipment
CN113556464A (en) * 2021-05-24 2021-10-26 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN115379118A (en) * 2022-08-26 2022-11-22 维沃移动通信有限公司 Camera switching method and device, electronic equipment and readable storage medium
CN115484375A (en) * 2021-05-31 2022-12-16 华为技术有限公司 Shooting method and electronic equipment
CN115550544A (en) * 2022-08-19 2022-12-30 荣耀终端有限公司 Image processing method and device
CN115802158A (en) * 2022-10-24 2023-03-14 荣耀终端有限公司 Method for switching cameras and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220025553A (en) * 2020-08-24 2022-03-03 삼성전자주식회사 Application processor, eletronic device including the same and operationg method of the electronic device


Similar Documents

Publication Publication Date Title
CN110663245B (en) Apparatus and method for storing overlapping regions of imaging data to produce an optimized stitched image
KR100994063B1 (en) Image processing device, image processing method and a computer readable storage medium having stored therein a program
CN110784651B (en) Anti-shake method and electronic equipment
WO2022262344A1 (en) Photographing method and electronic device
CN111064895B (en) Virtual shooting method and electronic equipment
WO2017206656A1 (en) Image processing method, terminal, and computer storage medium
CN113542600B (en) Image generation method, device, chip, terminal and storage medium
CN114982213B (en) Electronic device for providing camera preview and method thereof
CN113364976B (en) Image display method and electronic equipment
CN111226255A (en) Image processing apparatus, image capturing system, image processing method, and recording medium
CN114693569A (en) Method for fusing videos of two cameras and electronic equipment
CN115701125B (en) Image anti-shake method and electronic equipment
CN117135454A (en) Image processing method, device and storage medium
CN117135420B (en) Image synchronization method and related equipment thereof
CN116546316B (en) Method for switching cameras and electronic equipment
CN114531539B (en) Shooting method and electronic equipment
CN117136554A (en) Electronic apparatus including camera and control method thereof
CN117135456B (en) Image anti-shake method and electronic equipment
CN117135458A (en) Optical anti-shake method and related equipment
CN117135456A (en) Image anti-shake method and electronic equipment
CN117714863A (en) Shooting method and related equipment thereof
CN116320784B (en) Image processing method and device
US11991446B2 (en) Method of image stabilization and electronic device therefor
CN117714867A (en) Image anti-shake method and electronic equipment
CN117135459A (en) Image anti-shake method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant