CN114125197A - Mobile terminal and photographing method thereof - Google Patents


Info

Publication number: CN114125197A
Application number: CN202010904737.1A
Authority: CN (China)
Prior art keywords: image, mobile terminal, focusing, window, camera
Legal status: Pending (status assumed by Google; not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 闫三锋, 明志强, 高超
Assignee (original and current): Hisense Mobile Communications Technology Co Ltd
Application filed by Hisense Mobile Communications Technology Co Ltd
Priority to CN202010904737.1A
Publication of CN114125197A

Classifications

    • H04N 23/698 — Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture (H: Electricity → H04N: Pictorial communication, e.g. television → H04N 23/60: Control of cameras or camera modules)
    • H04M 1/0202 — Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets (H04M: Telephonic communication → H04M 1/02: Constructional features of telephone sets)
    • H04N 23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils (H04N 23/50: Constructional details)
    • H04N 23/80 — Camera processing pipelines; Components thereof (H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof)

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The present application discloses a mobile terminal and a photographing method thereof, relating to the field of terminal technologies. In response to a touch operation on a photographing control, the mobile terminal acquires multiple frames of initial images captured by the camera. Because each frame of initial image is captured after the camera has focused on the photographic subject inside one focusing window, the window image inside that focusing window is guaranteed to be sharp. The mobile terminal then crops the window image inside each focusing window directly from the corresponding initial image and composites the window images into the photographed image. Since the sharpest partial image does not need to be identified in and then cropped from each frame, the computational load of the mobile terminal is effectively reduced and its photographing efficiency improved, while the sharpness of the composited photographed image is preserved.

Description

Mobile terminal and photographing method thereof
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a mobile terminal and a photographing method thereof.
Background
A mobile terminal can control its camera to capture a full depth-of-field photographed image, in which every part of the image is sharp.
In the related art, to obtain a full depth-of-field photographed image, the mobile terminal gradually adjusts the object distance of the camera, that is, the distance between the camera lens and the photographic subject. After each adjustment, the mobile terminal obtains one frame captured by the camera, yielding multiple frames of photographed images. Then, for each frame, the mobile terminal identifies its sharpest partial image and crops that partial image from the frame. Finally, the mobile terminal composites the cropped sharpest partial images into a full depth-of-field photographed image.
However, in the related art, the efficiency of obtaining a full depth-of-field photographed image is low.
Disclosure of Invention
The present application provides a mobile terminal and a photographing method thereof, which can solve the problem in the related art of low efficiency in obtaining a full depth-of-field photographed image. The technical solutions are as follows:
in one aspect, a mobile terminal is provided, which includes a processor, a camera and a display screen; the processor is configured to:
in response to a touch operation on a photographing control, control the camera to focus on a photographic subject in each of a plurality of focusing windows and capture an initial image, where the positions of the plurality of focusing windows in an image coordinate system of the camera are different;
acquire the initial image captured after the camera focuses on the photographic subject in each focusing window, and obtain the window image located inside that focusing window in the initial image;
composite the acquired window images to obtain a photographed image;
and the display screen is configured to display the photographed image.
Optionally, the processor is configured to:
take the initial image with the highest sharpness among the multiple frames of initial images captured by the camera as a background image;
and composite the background image and the plurality of window images to obtain a target photographed image.
Optionally, the resolution of the camera is different from the resolution of the display screen of the mobile terminal; the processor is further configured to:
in response to a focus selection operation for a shooting interface, determining a position of each of a plurality of focuses in the shooting interface;
determining the position of one focusing window corresponding to each focus in the plurality of focuses in an image coordinate system of the camera based on the position of each focus in the shooting interface and the corresponding relation between the resolution of the image acquired by the camera and the resolution of the display screen;
and the central point of each focusing window is a focus corresponding to the focusing window.
Optionally, the processor is further configured to:
determining the position of a focusing window corresponding to each focus in the shooting interface based on the position of each focus in the plurality of focuses in the shooting interface;
and controlling the display screen to display the plurality of focusing windows on the shooting interface.
Optionally, the processor is configured to:
and controlling the camera to sequentially focus the shot object in each focusing window of the focusing windows according to the traversal sequence of the focusing windows and collecting an initial image.
Optionally, the processor is configured to:
adjusting the resolution of the photographed image, wherein the adjusted resolution of the photographed image is equal to the resolution of a display screen of the mobile terminal;
and the display screen is used for displaying the adjusted photographed image.
In another aspect, a photographing method of a mobile terminal is provided, the mobile terminal including a camera; the method includes:
responding to touch operation of a photographing control, controlling the camera to focus a photographing object in each focusing window of a plurality of focusing windows and collecting an initial image, wherein the positions of the focusing windows in an image coordinate system of the camera are different;
acquiring initial images acquired after the camera focuses on the shot object in each focusing window, and acquiring window images positioned in the focusing windows in the initial images;
synthesizing the acquired window images to obtain a photographed image;
and displaying the photographed image.
Optionally, compositing the acquired window images to obtain a photographed image includes:
taking the initial image with the highest sharpness among the multiple frames of initial images captured by the camera as a background image;
and compositing the background image and the plurality of window images to obtain a target photographed image.
Optionally, the resolution of the camera is different from the resolution of the display screen of the mobile terminal; before the controlling the camera to focus on the photographic subject in each of a plurality of focusing windows and acquiring an initial image, the method further comprises:
in response to a focus selection operation for a shooting interface, determining a position of each of a plurality of focuses in the shooting interface;
determining the position of one focusing window corresponding to each focus in the plurality of focuses in an image coordinate system of the camera based on the position of each focus in the shooting interface and the corresponding relation between the resolution of the image acquired by the camera and the resolution of the display screen;
and the central point of each focusing window is a focus corresponding to the focusing window.
Optionally, after the determining, in response to a focus selection operation for a shooting interface, a position of each of a plurality of focuses in the shooting interface, the method further includes:
determining the position of a focusing window corresponding to each focus in the shooting interface based on the position of each focus in the plurality of focuses in the shooting interface;
and displaying the plurality of focusing windows on the shooting interface.
In still another aspect, a mobile terminal is provided, which includes: the mobile terminal comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the processor executes the computer program to realize the photographing method of the mobile terminal.
In yet another aspect, a computer-readable storage medium is provided, in which a computer program is stored, which, when being executed by a processor, implements the photographing method of the mobile terminal according to the above aspect.
In still another aspect, a computer program product containing instructions is provided, which, when run on a computer, causes the computer to execute the photographing method of the mobile terminal according to the above aspect.
The technical solutions provided by the present application bring at least the following beneficial effects:
The present application provides a mobile terminal and a photographing method thereof. In response to a touch operation on a photographing control, the mobile terminal acquires multiple frames of initial images captured by the camera. Because each frame of initial image is captured after the camera has focused on the photographic subject inside one focusing window, the window image inside that focusing window is guaranteed to be sharp. The mobile terminal then crops the window image inside each focusing window directly from the corresponding initial image and composites the window images into the photographed image. Since the sharpest partial image does not need to be identified in and then cropped from each frame, the computational load of the mobile terminal is effectively reduced and its photographing efficiency improved, while the sharpness of the composited photographed image is preserved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a photographing method of a mobile terminal according to an embodiment of the present disclosure;
fig. 2 is a flowchart of another photographing method for a mobile terminal according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating distribution positions of a plurality of focusing windows in an image captured by a camera according to an embodiment of the present disclosure;
fig. 4 is a flowchart of a method for determining a position of a focusing window by a mobile terminal according to an embodiment of the present application;
fig. 5 is a schematic diagram of an initial image acquired after a camera focuses on a photographic object in a focusing window 0 according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of an initial image acquired after a camera focuses on a photographic object in a focusing window 1 according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of an initial image acquired after a camera focuses on a photographic object in the focusing window 2 according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of an initial image acquired after a camera focuses on a photographic object in the focusing window 3 according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a photographed image according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 11 is a block diagram of a software structure of a mobile terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The embodiment of the application provides a photographing method of a mobile terminal, and the photographing method can be applied to the mobile terminal. The mobile terminal may include a camera with an auto-focus function. Optionally, the mobile terminal may be a mobile phone, a tablet computer, or a wearable device; for example, a mobile phone. Referring to fig. 1, the method may include:
step 101, responding to a touch operation for the photographing control, controlling the camera to focus a photographing object in each focusing window of the plurality of focusing windows and collecting an initial image.
In response to the user's touch operation on the photographing control, the mobile terminal can control its camera to focus on the photographic subject in each of the plurality of focusing windows and, after focusing completes, capture an initial image, thereby obtaining multiple frames of initial images in one-to-one correspondence with the focusing windows. The multiple frames of initial images all have the same resolution, namely the resolution of the camera.
The positions of the plurality of focusing windows in the image coordinate system of the camera are different.
The mobile terminal controls the camera to focus on the photographic subject in each focusing window before capturing, so the sharpness of the window image inside the focusing window of each captured initial image is guaranteed, which in turn guarantees the sharpness of the full depth-of-field photographed image obtained from those window images.
And 102, acquiring initial images acquired after the camera focuses on the shot object in each focusing window, and acquiring window images positioned in the focusing windows in the initial images.
After the mobile terminal acquires the multiple frames of initial images, for each frame it can crop the window image located inside the corresponding focusing window from that frame, based on the focusing window's position in the image coordinate system.
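As an illustrative sketch (not part of the patent; it assumes frames are NumPy arrays and uses the upper-left/lower-right vertex representation of a focusing window described in this document), cropping a window image from an initial frame reduces to an array slice:

```python
import numpy as np

def crop_window(initial_image: np.ndarray, window) -> np.ndarray:
    """Crop the window image from one initial frame.

    `window` is ((x0, y0), (x1, y1)): the upper-left and lower-right
    vertices of the focusing window in the camera's image coordinate
    system. The names here are illustrative.
    """
    (x0, y0), (x1, y1) = window
    # Image rows correspond to y, columns to x.
    return initial_image[y0:y1, x0:x1]

# Example: a 1920x1080 frame and one focusing window.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
patch = crop_window(frame, ((100, 200), (500, 600)))
print(patch.shape)  # (400, 400, 3)
```

Because the window position is known in advance, this crop is the only per-frame work needed, which is the source of the efficiency gain over sharpness-search approaches.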
And 103, synthesizing the acquired window images to obtain a photographed image.
After the mobile terminal has obtained the window image from each frame of initial image, i.e., a plurality of window images, it can composite those window images to obtain the full depth-of-field photographed image.
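A minimal compositing sketch, under the same assumptions as above: the sharpest initial frame is used as the background (as the optional claims describe) and each frame's window crop is pasted at that window's position. The variance-of-Laplacian sharpness score is an assumption; the patent does not specify how sharpness is measured.

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Variance of a simple 4-neighbour Laplacian response.
    The metric choice is an assumption, not the patent's method."""
    g = img.mean(axis=2) if img.ndim == 3 else img.astype(float)
    lap = (-4 * g[1:-1, 1:-1] + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def composite(initial_images, windows):
    """Pick the sharpest frame as background, then paste each frame's
    window crop at its focusing window's position."""
    background = max(initial_images, key=sharpness).copy()
    for img, ((x0, y0), (x1, y1)) in zip(initial_images, windows):
        background[y0:y1, x0:x1] = img[y0:y1, x0:x1]
    return background

# Two synthetic frames; each contributes one window region.
imgs = [np.full((20, 20, 3), 10, np.uint8), np.full((20, 20, 3), 50, np.uint8)]
wins = [((0, 0), (5, 5)), ((10, 10), (15, 15))]
out = composite(imgs, wins)
print(out[12, 12, 0], out[18, 18, 0])  # 50 10
```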
And step 104, displaying the photographed image.
After the mobile terminal obtains the photographed image, it can display the image (also referred to as echoing the image back) on its display screen.
To sum up, the embodiment of the present application provides a photographing method for a mobile terminal. In response to a touch operation on a photographing control, the mobile terminal acquires multiple frames of initial images captured by the camera. Because each frame of initial image is captured after the camera has focused on the photographic subject inside one focusing window, the window image inside that focusing window is guaranteed to be sharp. The mobile terminal then crops the window image inside each focusing window directly from the corresponding initial image and composites the window images into the photographed image. Since the sharpest partial image does not need to be identified in and then cropped from each frame, the computational load of the mobile terminal is effectively reduced and its photographing efficiency improved, while the sharpness of the composited photographed image is preserved.
Fig. 2 is a flowchart of another photographing method of a mobile terminal according to an embodiment of the present application. The photographing method can be applied to a mobile terminal that includes a camera with an auto-focus function. Optionally, the mobile terminal may be a mobile phone, a tablet computer, or a wearable device; for example, a mobile phone. Referring to fig. 2, the method may include:
step 201, responding to the touch operation aiming at the photographing control, controlling the camera to focus the photographing object in each focusing window of the plurality of focusing windows and collecting an initial image.
After the camera of the mobile terminal is started, the mobile terminal can respond to the user's selection of a full depth-of-field photographing mode among a plurality of photographing modes and determine that the current photographing mode is the full depth-of-field photographing mode. The user can then touch the photographing control of the mobile terminal to photograph a full depth-of-field image. Correspondingly, in response to the user's touch operation on the photographing control, the mobile terminal controls its camera to focus on the photographic subject in each of the plurality of focusing windows and, after focusing completes, capture an initial image, obtaining multiple frames of initial images in one-to-one correspondence with the focusing windows.
The positions of the plurality of focusing windows in the image coordinate system of the camera are different; that is, their distribution positions in the images captured by the camera are different. The position of each focusing window in the image coordinate system may include the coordinates of the focusing window's upper-left vertex and of its lower-right vertex in that coordinate system. The contents of the multiple initial images are the same, but the sharpness of the same photographic subject may differ between them; that is, the camera's field of view remains unchanged while it focuses on the subject in each focusing window.
Alternatively, the mobile terminal may use an Auto Focus (AF) algorithm to control the camera to focus on the photographic subject in each focusing window. Any two focusing windows in the plurality of focusing windows in the image coordinate system of the camera can be not overlapped with each other or partially overlapped with each other. The sizes of any two focusing windows in the plurality of focusing windows may be different or the same.
For example, referring to fig. 3, fig. 3 shows a schematic diagram of the distribution positions of a plurality of focusing windows in an image captured by a camera. As can be seen from fig. 3, 8 focus windows from focus window 0 to focus window 7 are distributed in the image. Wherein, the focusing window 2 can partially overlap with the focusing window 1, and the focusing window 2 partially overlaps with the focusing window 3. The focusing window 0 and any one of the focusing windows 4 to 7 do not overlap with other focusing windows in the image. As shown in fig. 3, each of the focusing windows 0 to 7 has a different size from the other focusing windows.
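Whether two focusing windows overlap, as in the fig. 3 example, can be checked with a standard axis-aligned rectangle intersection test. A sketch (using the vertex representation described above; not part of the patent):

```python
def windows_overlap(a, b) -> bool:
    """True if two focusing windows, each ((x0, y0), (x1, y1)),
    partially overlap; windows that merely touch do not count."""
    (ax0, ay0), (ax1, ay1) = a
    (bx0, by0), (bx1, by1) = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

print(windows_overlap(((0, 0), (4, 4)), ((2, 2), (6, 6))))  # True
print(windows_overlap(((0, 0), (4, 4)), ((5, 5), (9, 9))))  # False
```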
The mobile terminal controls the camera to focus on the photographic subject in each focusing window and to capture the initial image only after focusing completes, so the window image inside the focusing window of each captured initial image has high sharpness, which effectively guarantees the sharpness of the full depth-of-field photographed image the mobile terminal subsequently obtains from those window images.
In an optional implementation, the position of each of the plurality of focusing windows in the camera's image coordinate system may be pre-stored in the mobile terminal. For example, based on a large number of full depth-of-field images captured of the same subject, a developer of the mobile terminal may determine which partial images must be sharp for the whole image to be sharp. The developer can then derive the positions of the plurality of focusing windows in the camera's image coordinate system from the positions of those partial images in the images captured by the camera, and write these positions into the mobile terminal before it ships. The mobile terminal can store the position of each focusing window in the camera's image coordinate system in a buffer of the mobile terminal, in the form of a list.
In this case, the user only needs to touch the photographing control, and the mobile terminal photographs the full depth-of-field image automatically in response to that touch operation. This effectively simplifies the user's operation, improves the photographing efficiency of the mobile terminal, and improves the user experience.
In another alternative implementation, the mobile terminal may determine a position of each of the plurality of focusing windows in an image coordinate system of the camera in response to a user operation. In this case, for a scenario where the resolution of the camera of the mobile terminal is different from the resolution of the display screen of the mobile terminal, referring to fig. 4, the process of the mobile terminal determining the position of each of the plurality of focusing windows may include:
in step 2011, in response to a focus selection operation for the shooting interface, a position of each of the plurality of focuses in the shooting interface is determined.
After the user of the mobile terminal touches the photographing control, the user can touch different positions of the shooting interface according to his or her needs, thereby choosing where the sharper partial images are to be photographed, i.e., determining the positions of the camera's focuses. Accordingly, in response to the user's focus selection operation on the shooting interface, i.e., the touch operations at those different positions, the mobile terminal can determine the position of each of the plurality of focuses in the shooting interface.
Step 2012, determining the position of a focusing window corresponding to the focus in the image coordinate system of the camera based on the position of each focus in the plurality of focuses in the shooting interface and the corresponding relationship between the resolution of the camera and the resolution of the display screen.
After determining the position of each focus in the plurality of focuses in the shooting interface, the mobile terminal may determine the position of one focusing window corresponding to each focus in the image coordinate system of the camera based on the position of each focus in the shooting interface and the corresponding relationship between the resolution of the camera and the resolution of the display screen. Then, the mobile terminal can also store the position of each focusing window in the image coordinate system of the camera so as to obtain the window image in each focusing window in the following process.
Wherein, the center point of each focusing window can be the focus corresponding to the focusing window. The mobile terminal may store the position of each focusing window in the image coordinate system of the camera in the form of a list into its buffer.
Optionally, the mobile terminal may determine the position of each focus in the image coordinate system of the camera based on the position of the focus in the shooting interface and the corresponding relationship between the resolution of the camera and the resolution of the display screen. Then, the mobile terminal may process the position of each focus in the image coordinate system of the camera by using a target detection algorithm to obtain the position of one focusing window in the image coordinate system of the camera, so as to obtain the positions of a plurality of focusing windows in the image coordinate system of the camera. Thereby, the efficiency of determining the positions of the plurality of focus windows in the image coordinates of the camera can be improved.
Or, the mobile terminal may first use a target detection algorithm to process the position of each of the multiple focuses in the shooting interface, so as to obtain the position of a focusing window in the shooting interface. Then, for the position of each focusing window in the shooting interface, the mobile terminal may determine the position of the focusing window in the image coordinate system of the camera based on the position of the focusing window in the shooting interface and the corresponding relationship between the resolution of the camera and the resolution of the display screen.
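Either way, the core operation is the same coordinate mapping. A hedged sketch follows (the per-axis linear scaling and the fixed window half-size are assumptions; the patent only states that the mapping follows from the correspondence between the two resolutions and that each focus is its window's center point):

```python
def focus_to_camera_coords(focus_xy, display_res, camera_res, half_size=100):
    """Map a tapped focus point from shooting-interface (display)
    coordinates into the camera's image coordinate system, then build
    a focusing window centered on the mapped point.

    `half_size` is an illustrative window half-width; the patent does
    not fix a window size.
    """
    dx, dy = display_res   # display width, height in pixels
    cx, cy = camera_res    # camera sensor width, height in pixels
    fx, fy = focus_xy
    # Scale each axis by the ratio between camera and display resolution.
    px = fx * cx / dx
    py = fy * cy / dy
    window = ((int(px - half_size), int(py - half_size)),
              (int(px + half_size), int(py + half_size)))
    return (px, py), window

# Example: a tap at the center of a 1080x2340 display, camera at 2x resolution.
center, win = focus_to_camera_coords((540, 1170), (1080, 2340), (2160, 4680))
print(center)  # (1080.0, 2340.0)
```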
In a scene in which the resolution of the camera of the mobile terminal is the same as the resolution of the display screen of the mobile terminal, the mobile terminal may determine a position of each of the plurality of focuses in the photographing interface in response to a focus selection operation for the photographing interface. Then, for each focus, the mobile terminal may directly determine the position of the focusing window corresponding to the focus in the shooting interface as the position of the focusing window in the image coordinate system of the camera.
It should be noted that, in a scenario where the mobile terminal determines the position of each of the plurality of focus windows in the image coordinate system of the camera in response to an operation by the user, after determining the position of each of the plurality of focuses in the shooting interface in response to a focus selection operation for the shooting interface, the mobile terminal may further determine the position of the focus window corresponding to each focus in the shooting interface based on the position of each of the plurality of focuses in the shooting interface. Then, the mobile terminal can display the plurality of focusing windows on the shooting interface based on the positions of the plurality of focusing windows, so that the user can know the positions of the plurality of focusing windows.
In the embodiment of the application, the mobile terminal can respond to the selection operation of a user for a target window mode in a plurality of window modes, and the positions of a plurality of focusing windows in an image coordinate system of the camera are determined by adopting a target mode indicated by the target window mode. The target mode is any one of the above optional implementation modes.
For example, after a camera of the mobile terminal is started, the mobile terminal may display the fixed window mode control and the flexible window mode control first. If the user clicks the flexible window mode control, the mobile terminal can respond to the touch operation of the user for the flexible window mode control, and can respond to the focus selection operation of the user for the shooting interface after the touch operation for the shooting control is subsequently received, so as to determine the positions of the multiple focusing windows. Then, the mobile terminal can control the camera to focus the shooting objects in the plurality of focusing windows respectively and collect the initial image.
If the user clicks the fixed window mode control, the mobile terminal can respond to the touch operation of the user for the fixed window mode control, and after the touch operation for the photographing control is subsequently received, the camera is controlled to focus the photographing object in each focusing window of the focusing windows and collect an initial image based on the positions of the focusing windows stored in advance.
In the embodiment of the application, the mobile terminal can control the camera to sequentially focus the shot object in each focusing window of the focusing windows and acquire the initial image according to the traversal sequence of the focusing windows. Wherein the traversal order may be pre-stored in the mobile terminal.
For example, please continue to refer to fig. 3, assuming that the traversal order of the eight focusing windows is focusing window 0 to focusing window 7, the mobile terminal may first focus the shooting object in the focusing window 0, and collect an initial image of one frame after the focusing is completed. Then, the mobile terminal can sequentially and respectively focus the shooting objects in the focusing windows 1 to 7, and acquire one frame of initial image after the focusing is finished, so as to obtain eight frames of initial images.
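The traversal described above can be sketched as follows; the `camera.focus`/`camera.capture` interface is hypothetical, standing in for the terminal's auto-focus (AF) pipeline:

```python
def capture_initial_images(camera, windows):
    """Traverse the focusing windows in their stored order: focus on
    the subject inside each window, then grab one frame."""
    frames = []
    for window in windows:       # traversal order: window 0, 1, ...
        camera.focus(window)     # auto-focus inside this window
        frames.append(camera.capture())
    return frames

class FakeCamera:
    """Stand-in for the terminal's camera; records focus calls."""
    def __init__(self):
        self.focused = []
    def focus(self, window):
        self.focused.append(window)
    def capture(self):
        return f"frame@{self.focused[-1]}"

cam = FakeCamera()
wins = [((0, 0), (2, 2)), ((3, 3), (5, 5))]
frames = capture_initial_images(cam, wins)
print(len(frames))  # 2
```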
It should be noted that the camera may include a Micro Controller Unit (MCU). The number of the focusing windows may be less than or equal to the upper limit of the number of focusing windows allowed by the processing capability of the MCU. For example, if the upper limit of the number is 8, the number of the focusing windows may be less than or equal to 8, for example, the number of the focusing windows may be 4, 5, 6, 7 or 8.
Optionally, the number of the focusing windows of the mobile terminal can be adjusted, so that the efficiency of obtaining the photographed image with the full depth of field can be adjusted accordingly. For example, the mobile terminal may display the simple window sub-mode and the fine window sub-mode after responding to a selection operation for the fixed window mode. Then, the mobile terminal can respond to the selection operation of the user for a target window sub-mode among the simple window sub-mode and the fine window sub-mode, and determine the number of the focusing windows to be the number indicated by the target window sub-mode.
The number of focusing windows indicated by the simple window sub-mode and the number of focusing windows indicated by the fine window sub-mode may be pre-stored in the mobile terminal, and the number of focusing windows indicated by the simple window sub-mode may be smaller than the number of focusing windows indicated by the fine window sub-mode.
In this embodiment, the mobile terminal may further include: processors and Image Signal Processors (ISPs). For each focusing window in the plurality of focusing windows, the camera can transmit the acquired initial image to the ISP after focusing the shot image in the focusing window is completed. The ISP, upon receiving the initial image, may store the initial image in a buffer of the mobile terminal. After determining that the initial image is stored in the buffer area, the processor may establish a correspondence between the position of each focusing window and the initial image corresponding to the focusing window, and store the correspondence.
For example, each frame of the initial image and the position of the focusing window corresponding to the frame of the initial image may be stored in a super-buffer list (super-buffer list). The super-buffer list may include two sub-lists with a corresponding relationship, one of the two sub-lists may be used for storing the initial image, and the other of the two sub-lists may be used for storing the position of the focusing window corresponding to the initial image.
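The super-buffer list with its two sub-lists kept in correspondence can be modelled with a minimal structure like the following; the class name and its methods are illustrative, not part of the original disclosure:

```python
class SuperBufferList:
    """Two sub-lists kept in step: one stores the initial images, the other
    stores the position of the focusing window each image corresponds to.
    Illustrative model of the super-buffer list; not an actual API."""

    def __init__(self):
        self.images = []     # sub-list 1: initial images, in capture order
        self.positions = []  # sub-list 2: focusing-window positions

    def append(self, image, window_position):
        self.images.append(image)
        self.positions.append(window_position)

    def entry(self, index):
        # The correspondence is positional: images[i] was captured after
        # focusing on the window located at positions[i].
        return self.images[index], self.positions[index]
```

The positional correspondence is what later lets the processor look up, for any frame, exactly where its in-focus region lies.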
For example, referring to fig. 5, it is assumed that the traversal order of the 4 focusing windows shown in fig. 5 is focusing window 0, focusing window 1, focusing window 2, and focusing window 3. As shown in fig. 5, after the camera focuses on the shooting object in the focusing window 0, the images in the focusing window 0 are clearer and the rest of the images are blurred in the acquired initial image.
Referring to fig. 6, fig. 6 is a schematic diagram showing an initial image acquired after a camera focuses on a photographic subject in the focusing window 1. As can be seen from fig. 6, in the initial image, the images in the focusing window 1 are clearer, and the rest of the images are blurred.
As shown in fig. 7, fig. 7 is a schematic diagram showing an initial image acquired after a camera focuses on a photographic subject in the focusing window 2. Referring to fig. 7, in the initial image, the image in the focusing window 2 is clearer, and the rest of the image is blurred.
Referring to fig. 8, fig. 8 is a schematic diagram showing an initial image acquired after a camera focuses on a photographic subject in the focusing window 3. Referring to fig. 8, in the initial image, the image in the focusing window 3 is clearer, and the rest of the image is blurred.
Step 202, acquiring an initial image acquired after the camera focuses on the shot object in each focusing window, and acquiring a window image in the focusing window in the initial image.
After the mobile terminal acquires the multiple frames of initial images, for each frame of initial image, the mobile terminal can intercept a window image located in a focusing window in the frame of initial image based on the position of the focusing window corresponding to the frame of initial image.
For the initial image acquired after the camera focuses on the shooting object in each focusing window, the mobile terminal can directly acquire the window image in the focusing window based on the position of the focusing window; this window image is the clearer partial image in that frame of initial image. The clear part of each frame of initial image therefore does not need to be identified, which effectively reduces the calculation amount of the mobile terminal, reduces the photographing energy consumption of the mobile terminal, and improves the photographing efficiency of the mobile terminal.
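Because the position of the focusing window is stored in advance, obtaining the window image is a plain array crop rather than a sharpness search. A minimal NumPy sketch, assuming the window position is stored as `(x, y, width, height)` in image coordinates (this format is an assumption, not stated in the disclosure):

```python
import numpy as np

def crop_window_image(initial_image, window):
    """Cut out the region inside a focusing window.

    `window` is assumed to be (x, y, width, height) in image coordinates.
    No sharpness detection is needed: the stored window position says
    exactly where the in-focus region of this frame lies.
    """
    x, y, w, h = window
    return initial_image[y:y + h, x:x + w]
```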
As can be seen from the descriptions of step 201 and step 202, in the embodiment of the present application, the mobile terminal may first determine the position of the clear partial image to be captured, i.e., the position of the focusing window, in the full-depth captured image, and store the position. Then, the mobile terminal can control the camera to focus the image in the focusing window, and acquire an initial image acquired after focusing. Then, the mobile terminal may directly obtain the window image in the focusing window based on the position of the focusing window in the initial image, that is, a clearer partial image in the initial image. Therefore, the mobile terminal does not need to identify clear partial images in each frame of initial image, on one hand, the calculation amount of the mobile terminal is reduced, the photographing energy consumption is reduced, and the photographing efficiency is improved. On the other hand, the problem of inaccurate identification can be avoided, so that the definition of the subsequently obtained full-depth photographed image can be ensured.
And step 203, synthesizing the acquired window images to obtain a photographed image.
After the mobile terminal acquires the window image in each frame of initial image, that is, acquires a plurality of window images, the plurality of window images can be synthesized, so that a photographed image with a full depth of field is obtained.
In the embodiment of the application, in the process of obtaining the photographed image, the mobile terminal may determine the background image first, and then may synthesize the background image with the plurality of window images to obtain the target photographed image. For example, the mobile terminal may replace the image in the background image located in the focusing window with the window image in the focusing window obtained in step 202 based on the position of each focusing window, so as to obtain the photographed image.
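The replacement described above can be sketched as follows: each window image is pasted over the background image at the stored position of its focusing window (the `(x, y, width, height)` window format is again an assumption):

```python
import numpy as np

def synthesize(background, window_images):
    """Paste each in-focus window image onto the background image.

    `window_images` maps an assumed (x, y, width, height) window position
    to the cropped window image captured after focusing on that window.
    """
    result = background.copy()
    for (x, y, w, h), patch in window_images.items():
        result[y:y + h, x:x + w] = patch  # replace the background region
    return result
```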
Optionally, the background image may be a last initial image acquired by the camera. That is, the mobile terminal may use the last frame of initial image collected by the camera as a background image, and then the mobile terminal may synthesize the background image with the plurality of window images to obtain a target photographed image. Therefore, the photographing efficiency of the mobile terminal can be ensured.
Or, the background image may be an initial image with the highest definition in a plurality of frames of initial images acquired by the camera, and the initial image is used as the background image. Then, the mobile terminal can synthesize the background image and the plurality of window images to obtain the target photographed image. Thus, the sharpness of the photographed image obtained by photographing can be ensured, and the display effect of the photographed image can be ensured.
For a scene in which the background image is the initial image with the highest definition among the multiple frames of initial images, before the multiple window images are synthesized to obtain the photographed image, the mobile terminal can also determine the definition of each frame of initial image in the multiple frames of initial images. Then, the mobile terminal can acquire the initial image with the highest definition from the multiple frames of initial images based on the definition of each frame of initial image.
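The disclosure does not name a definition (sharpness) metric; the variance of a Laplacian-like second difference is one common illustrative choice, since sharper images show stronger local contrast:

```python
import numpy as np

def sharpness(image):
    """Estimate image definition. The disclosure names no metric; the
    variance of a discrete Laplacian is an illustrative assumption."""
    img = image.astype(float)
    lap = (img[1:-1, 2:] + img[1:-1, :-2] + img[2:, 1:-1] + img[:-2, 1:-1]
           - 4.0 * img[1:-1, 1:-1])
    return lap.var()

def sharpest(initial_images):
    """Pick the initial image with the highest definition as the background."""
    return max(initial_images, key=sharpness)
```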
Optionally, after the mobile terminal synthesizes the acquired window images to obtain the photographed image, the synthesized photographed image may be subjected to image smoothing processing to avoid an unnatural transition between the window image in each focusing window in the synthesized photographed image and the background image, thereby effectively ensuring the display effect of the photographed image.
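The disclosure states only that image smoothing is applied; one illustrative possibility is to blur a thin band of pixels around each focusing-window border so the pasted window image blends into the background:

```python
import numpy as np

def smooth_transitions(image, windows, band=2):
    """Blur a thin band around each focusing-window border.

    The disclosure states only that smoothing is applied; this 3x3 box
    average restricted to an edge band is an illustrative assumption.
    `windows` holds assumed (x, y, width, height) tuples.
    """
    img = image.astype(float)
    h_img, w_img = img.shape
    # 3x3 box blur of the whole image via padded neighbour averaging
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(padded[dy:dy + h_img, dx:dx + w_img]
                  for dy in range(3) for dx in range(3)) / 9.0
    # mask covering a thin band around every window border
    mask = np.zeros(img.shape, dtype=bool)
    for x, y, w, h in windows:
        mask[max(y - band, 0):y + band, x:x + w] = True          # top edge
        mask[max(y + h - band, 0):y + h + band, x:x + w] = True  # bottom edge
        mask[y:y + h, max(x - band, 0):x + band] = True          # left edge
        mask[y:y + h, max(x + w - band, 0):x + w + band] = True  # right edge
    img[mask] = blurred[mask]  # smooth only the transition band
    return img
```

Pixels well inside a window and well outside it are untouched, so the in-focus detail obtained in step 202 is preserved.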
For example, after acquiring the window images in the focusing windows in the four frames of initial images shown in fig. 5 to 8, the mobile terminal may synthesize the four window images into the photographed image shown in fig. 9. Comparing fig. 9 with any of fig. 5 to 8, it can be seen that the window images in the focusing windows in fig. 9 are clearer.
And step 204, adjusting the resolution of the photographed image.
In the embodiment of the application, after the mobile terminal obtains the photographed image, the resolution of the photographed image can be adjusted, so that the adjusted resolution of the photographed image is equal to the resolution of the display screen of the mobile terminal, and the display effect of the photographed image is ensured.
Optionally, after the mobile terminal obtains the photographed image, if it is determined that the resolution of the camera is different from the resolution of the display screen of the mobile terminal, the resolution of the photographed image may be adjusted. If the resolution of the camera is higher than that of the display screen, the mobile terminal can compress the photographed image so that the resolution of the compressed photographed image is equal to that of the display screen. If the resolution of the camera is lower than that of the display screen, the mobile terminal can stretch the photographed image so that the stretched photographed image has the same resolution as that of the display screen.
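A minimal sketch of this resolution adjustment follows; nearest-neighbour sampling is used purely for illustration, whereas a real terminal would use a higher-quality scaler (typically in the ISP or GPU):

```python
import numpy as np

def fit_to_display(photo, display_shape):
    """Scale the photographed image to the display resolution.

    Nearest-neighbour index mapping is an illustrative assumption.
    Compression (camera resolution above the display) and stretching
    (camera resolution below the display) are both handled by the same
    mapping.
    """
    src_h, src_w = photo.shape[:2]
    dst_h, dst_w = display_shape
    if (src_h, src_w) == (dst_h, dst_w):
        return photo  # resolutions already match; no adjustment needed
    rows = np.arange(dst_h) * src_h // dst_h  # source row per display row
    cols = np.arange(dst_w) * src_w // dst_w  # source column per display column
    return photo[rows[:, None], cols]
```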
And step 205, displaying the adjusted photographed image.
After the resolution of the photographed image is adjusted, the mobile terminal can display the adjusted photographed image on a display screen of the mobile terminal.
It should be further noted that the order of the steps of the photographing method of the mobile terminal provided in the embodiment of the present application may be appropriately adjusted, and steps may also be added or removed according to the situation. For example, step 204 may be omitted when the resolution of the camera is the same as the resolution of the display screen. Any method that can be easily conceived by a person skilled in the art within the technical scope disclosed in the present application is covered by the protection scope of the present application, and thus the detailed description thereof is omitted.
To sum up, the embodiment of the present application provides a photographing method for a mobile terminal, where the mobile terminal can respond to a touch operation for a photographing control to obtain a multi-frame initial image collected by a camera. Because each frame of initial image is acquired after the camera focuses on the shot object in one focusing window, the window image in the focusing window in the frame of initial image can be ensured to be clearer. Then, the mobile terminal can directly intercept the window image positioned in the focusing window from each initial photographed image and synthesize the photographed image. Because the clearest partial image does not need to be identified from each frame of image and then intercepted, the calculation amount of the mobile terminal can be effectively reduced and the photographing efficiency of the mobile terminal can be improved on the premise of ensuring the definition of the photographed image obtained by synthesis.
Fig. 10 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application, where the mobile terminal may be configured to execute the photographing method according to the foregoing method embodiment. As shown in fig. 10, the mobile terminal may include: processor 1101, camera 121, and display screen 131. The processor 1101 may be configured to:
responding to touch operation aiming at the photographing control, controlling the camera to focus a photographing object in each focusing window of a plurality of focusing windows and acquiring an initial image, wherein the positions of the plurality of focusing windows in an image coordinate system of the camera are different;
acquiring initial images acquired after focusing of the shooting objects in each focusing window by the camera, and acquiring window images positioned in the focusing windows in the initial images;
synthesizing the acquired window images to obtain a photographed image;
the display screen can be used for displaying the photographed image.
Optionally, the processor 1101 may be configured to:
taking an initial image with highest definition in a plurality of frames of initial images collected by a camera as a background image; and synthesizing the background image and the plurality of window images to obtain a target photographed image.
Optionally, the resolution of the camera is different from the resolution of the display screen of the mobile terminal. The processor 1101 may also be configured to:
determining a position of each focus in the plurality of focuses in the shooting interface in response to a focus selection operation for the shooting interface; determining the position of a focusing window corresponding to each focus in an image coordinate system of a camera based on the position of each focus in a shooting interface and the corresponding relation between the resolution of an image acquired by the camera and the resolution of a display screen; and the central point of each focusing window is a focus corresponding to the focusing window.
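The coordinate mapping described above reduces to scaling the tapped focus point by the ratio between the camera image resolution and the display resolution, then centring a window on the result. A sketch in which the window size and the coordinate conventions are assumptions:

```python
def focus_to_window(focus_xy, display_res, image_res, win_size=(100, 100)):
    """Map a tapped focus point from display coordinates to a focusing
    window in the camera's image coordinate system.

    The focus becomes the window's centre point, as stated above.
    `win_size` (width, height in image pixels) is an illustrative default;
    the disclosure does not specify the window dimensions.
    """
    fx, fy = focus_xy
    disp_w, disp_h = display_res
    img_w, img_h = image_res
    # scale the focus point by the resolution ratio
    cx = fx * img_w / disp_w
    cy = fy * img_h / disp_h
    w, h = win_size
    # window position (top-left x, top-left y, width, height) with the
    # mapped focus as its centre point
    return (cx - w / 2, cy - h / 2, w, h)
```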
Optionally, the processor 1101 may further be configured to:
determining the position of a focusing window corresponding to each focus in the shooting interface based on the position of each focus in the plurality of focuses in the shooting interface; and controlling the display screen to display the plurality of focusing windows on the shooting interface.
Optionally, the processor 1101 may be configured to:
and controlling the camera to sequentially focus the shot object in each focusing window of the focusing windows according to the traversal sequence of the focusing windows and acquiring an initial image.
Optionally, the processor 1101 may be configured to:
adjusting the resolution of the photographed image, wherein the adjusted resolution of the photographed image is equal to the resolution of a display screen of the mobile terminal; the display screen can be used for displaying the adjusted photographed image.
To sum up, the embodiment of the present application provides a mobile terminal, and the mobile terminal can respond to the touch operation for the photographing control to acquire the multiframe initial images collected by the camera. Because each frame of initial image is acquired after the camera focuses on the shot object in one focusing window, the window image in the focusing window in the frame of initial image can be ensured to be clearer. Then, the mobile terminal can directly intercept the window image positioned in the focusing window from each initial photographed image and synthesize the photographed image. Because the clearest partial image does not need to be identified from each frame of image and then intercepted, the calculation amount of the mobile terminal can be effectively reduced and the photographing efficiency of the mobile terminal can be improved on the premise of ensuring the definition of the photographed image obtained by synthesis.
As shown in fig. 10, the mobile terminal 110 may further include: a display unit 130, a Radio Frequency (RF) circuit 150, an audio circuit 160, a wireless fidelity (Wi-Fi) module 170, a bluetooth module 180, and a power supply 190.
The camera 121 may be used to capture still pictures or video, among other things. An optical image of the object is formed through the lens and projected onto the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the processor 1101 for conversion into a digital picture signal.
The processor 1101 is a control center of the mobile terminal 110, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the mobile terminal 110 and processes data by running or executing software programs stored in the memory 140 and calling data stored in the memory 140. In some embodiments, processor 1101 may include one or more processing units; the processor 1101 may also integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a baseband processor, which mainly handles wireless communications. It will be appreciated that the baseband processor described above may not be integrated into the processor 1101. In the present application, the processor 1101 may run an operating system and an application program, may control a user interface to display, and may implement the photographing method of the mobile terminal provided in the embodiment of the present application. Additionally, processor 1101 is coupled to input unit and display unit 130.
The display unit 130 may be used to receive input numeric or character information and generate signal inputs related to user settings and function control of the mobile terminal 110, and optionally, the display unit 130 may also be used to display information input by the user or information provided to the user and a Graphical User Interface (GUI) of various menus of the mobile terminal 110. The display unit 130 may include a display screen 131 disposed on the front surface of the mobile terminal 110. The display screen 131 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 130 may be used to display various graphical user interfaces described herein.
The display unit 130 includes: a display screen 131 and a touch screen 132 disposed on the front of the mobile terminal 110. The display screen 131 may be used to display preview pictures. Touch screen 132 may collect touch operations on or near by the user, such as clicking a button, dragging a scroll box, and the like. The touch screen 132 may be covered on the display screen 131, or the touch screen 132 and the display screen 131 may be integrated to implement the input and output functions of the mobile terminal 110, and after the integration, the touch screen may be referred to as a touch display screen for short.
Memory 140 may be used to store software programs and data. The processor 1101 executes various functions of the mobile terminal 110 and data processing by executing software programs or data stored in the memory 140. The memory 140 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Memory 140 stores an operating system that enables mobile terminal 110 to operate. The memory 140 in the present application may store an operating system and various application programs, and may also store codes for executing the photographing method of the mobile terminal provided in the embodiment of the present application.
The RF circuit 150 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then deliver the received downlink data to the processor 1101 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
Audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between a user and mobile terminal 110. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161. The mobile terminal 110 may also be configured with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 162 converts the collected sound signal into an electrical signal, converts the electrical signal into audio data after being received by the audio circuit 160, and then outputs the audio data to the RF circuit 150 to be transmitted to, for example, another terminal or outputs the audio data to the memory 140 for further processing. In this application, the microphone 162 may capture the voice of the user.
Wi-Fi is a short-distance wireless transmission technology. Through the Wi-Fi module 170, the mobile terminal 110 may help a user to send and receive e-mails, browse webpages, access streaming media, and the like; the Wi-Fi module 170 provides wireless broadband internet access for the user.
And the Bluetooth module 180 is used for performing information interaction with other Bluetooth devices with Bluetooth modules through a Bluetooth protocol. For example, the mobile terminal 110 may establish a bluetooth connection with a wearable electronic device (e.g., a smart watch) also equipped with a bluetooth module through the bluetooth module 180, so as to perform data interaction.
Mobile terminal 110 also includes a power supply 190 (e.g., a battery) that powers the various components. The power supply may be logically coupled to the processor 1101 through a power management system to manage charging, discharging, and power consumption functions through the power management system. The mobile terminal 110 may also be configured with a power button for powering on and off the terminal, and locking the screen.
The mobile terminal 110 may include at least one sensor 1110, such as a motion sensor 11101, a distance sensor 11102, a fingerprint sensor 11103, and a temperature sensor 11104. Mobile terminal 110 may also be configured with other sensors such as gyroscopes, barometers, hygrometers, thermometers, and infrared sensors.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the mobile terminal and each device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 11 is a block diagram of a software structure of a mobile terminal according to an embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the android system is divided into four layers, an application layer, an application framework layer, an Android Runtime (ART) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 11, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 11, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, pictures, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the mobile terminal 110. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the terminal vibrates, and an indicator light flashes.
The android runtime comprises a core library and a virtual machine. The android runtime is responsible for scheduling and management of the android system.
The core library comprises two parts: one part comprises the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still picture files, etc. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, picture rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
An embodiment of the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are executed on a computer, the instructions cause the computer to execute the photographing method of the mobile terminal provided in the foregoing embodiment, for example, the method shown in fig. 1 or fig. 2.
The embodiment of the present application further provides a computer program product containing instructions, which when run on a computer, causes the computer to execute the method for photographing a mobile terminal provided in the above method embodiment, for example, the method shown in fig. 1 or fig. 2.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc. Further, the term "at least one" in the present application means one or more, and the term "a plurality" in the present application means two or more.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A mobile terminal is characterized in that the mobile terminal comprises a processor, a camera and a display screen; the processor is configured to:
responding to touch operation of a photographing control, controlling the camera to focus a photographing object in each focusing window of a plurality of focusing windows and collecting an initial image, wherein the positions of the focusing windows in an image coordinate system of the camera are different;
acquiring initial images acquired after the camera focuses on the shot object in each focusing window, and acquiring window images positioned in the focusing windows in the initial images;
synthesizing the acquired window images to obtain a photographed image;
the display screen is used for displaying the photographed image.
2. The mobile terminal of claim 1, wherein the processor is configured to:
taking the initial image with the highest definition in the multiple frames of initial images collected by the camera as a background image;
and synthesizing the background image and the plurality of window images to obtain a target photographing image.
3. The mobile terminal of claim 1, wherein a resolution of the camera is different from a resolution of a display screen of the mobile terminal; the processor is further configured to:
in response to a focus selection operation for a shooting interface, determining a position of each of a plurality of focuses in the shooting interface;
determining the position of one focusing window corresponding to each focus in the plurality of focuses in an image coordinate system of the camera based on the position of each focus in the shooting interface and the corresponding relation between the resolution of the image acquired by the camera and the resolution of the display screen;
and the central point of each focusing window is a focus corresponding to the focusing window.
4. The mobile terminal of claim 3, wherein the processor is further configured to:
determining the position of a focusing window corresponding to each focus in the shooting interface based on the position of each focus in the plurality of focuses in the shooting interface;
and controlling the display screen to display the plurality of focusing windows on the shooting interface.
5. The mobile terminal of any of claims 1 to 4, wherein the processor is configured to:
controlling the camera to focus on the photographic subject in each of the focusing windows in sequence, according to a traversal order of the plurality of focusing windows, and collecting an initial image for each focusing window.
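Claim 5 leaves the traversal order unspecified; a row-major order (top-to-bottom, then left-to-right) is one plausible choice. The `camera.focus` and `camera.capture` calls below stand in for a real camera driver and are purely hypothetical:

```python
def traversal_order(windows):
    """Order focusing windows top-to-bottom, then left-to-right — one
    plausible traversal order; the claim does not fix a particular one.

    windows: list of (x, y, w, h) rects in camera image coordinates
    """
    return sorted(windows, key=lambda r: (r[1], r[0]))

def capture_all(camera, windows):
    """Focus in each window in traversal order and collect one initial image."""
    images = []
    for win in traversal_order(windows):
        camera.focus(win)             # hypothetical driver call
        images.append(camera.capture())  # hypothetical driver call
    return images
```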
6. The mobile terminal of any of claims 1 to 4, wherein the processor is configured to:
adjusting the resolution of the photographed image so that the adjusted resolution of the photographed image is equal to the resolution of the display screen of the mobile terminal;
and the display screen is configured to display the adjusted photographed image.
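Claim 6 scales the photographed image to the display resolution. A nearest-neighbour resize in plain numpy illustrates the idea (a real terminal would likely use a filtered resampler; taking `display_res` as (width, height) is an assumption):

```python
import numpy as np

def resize_to_display(image, display_res):
    """Nearest-neighbour resize of the photographed image to the display
    resolution; display_res is (width, height)."""
    h, w = image.shape[:2]
    dw, dh = display_res
    # Index maps from destination pixels back to source pixels.
    rows = np.arange(dh) * h // dh
    cols = np.arange(dw) * w // dw
    return image[rows][:, cols]
```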
7. A photographing method of a mobile terminal, wherein the mobile terminal comprises a camera, and the method comprises:
in response to a touch operation on a photographing control, controlling the camera to focus on a photographic subject in each of a plurality of focusing windows and collecting an initial image, wherein the focusing windows are located at different positions in an image coordinate system of the camera;
acquiring the initial images collected after the camera focuses on the photographic subject in each focusing window, and obtaining, from the initial images, the window images located within the focusing windows;
synthesizing the acquired window images to obtain a photographed image;
and displaying the photographed image.
8. The method of claim 7, wherein synthesizing the acquired window images to obtain the photographed image comprises:
taking, as a background image, the initial image with the highest definition among the multiple frames of initial images collected by the camera;
and synthesizing the background image and the plurality of window images to obtain a target photographed image.
9. The method of claim 7, wherein a resolution of the camera is different from a resolution of a display screen of the mobile terminal; and before controlling the camera to focus on the photographic subject in each of the plurality of focusing windows and collecting an initial image, the method further comprises:
in response to a focus selection operation on a shooting interface, determining the position of each of a plurality of focuses in the shooting interface;
determining the position, in the image coordinate system of the camera, of the focusing window corresponding to each of the plurality of focuses, based on the position of each focus in the shooting interface and the correspondence between the resolution of images collected by the camera and the resolution of the display screen;
wherein the central point of each focusing window is the focus corresponding to that focusing window.
10. The method of claim 9, wherein after determining the position of each of the plurality of focuses in the shooting interface in response to the focus selection operation, the method further comprises:
determining the position, in the shooting interface, of the focusing window corresponding to each focus, based on the position of each of the plurality of focuses in the shooting interface;
and displaying the plurality of focusing windows on the shooting interface.
CN202010904737.1A 2020-09-01 2020-09-01 Mobile terminal and photographing method thereof Pending CN114125197A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010904737.1A CN114125197A (en) 2020-09-01 2020-09-01 Mobile terminal and photographing method thereof

Publications (1)

Publication Number Publication Date
CN114125197A (en) 2022-03-01

Family

ID=80360730

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010904737.1A Pending CN114125197A (en) 2020-09-01 2020-09-01 Mobile terminal and photographing method thereof

Country Status (1)

Country Link
CN (1) CN114125197A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104270560A (en) * 2014-07-31 2015-01-07 三星电子(中国)研发中心 Multi-point focusing method and device
CN104270573A (en) * 2014-10-27 2015-01-07 上海斐讯数据通信技术有限公司 Multi-touch focus imaging system and method, as well as applicable mobile terminal
CN104735350A (en) * 2015-03-02 2015-06-24 联想(北京)有限公司 Information processing method and electronic equipment
CN105516594A (en) * 2015-12-21 2016-04-20 广东欧珀移动通信有限公司 Image shooting method and apparatus
CN107277362A (en) * 2017-07-20 2017-10-20 三星半导体(中国)研究开发有限公司 Method and apparatus for generating high-definition image
CN108777765A (en) * 2018-08-07 2018-11-09 成都西纬科技有限公司 Obtain the method, apparatus and electronic equipment of full clear image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fu Jundong, Yao Xiaoming: "Real-Scene Image Stitching and Its Roaming Control Technology", Southwest Jiaotong University Press, pages 102-115 *

Similar Documents

Publication Publication Date Title
CN114205522B (en) Method for long-focus shooting and electronic equipment
CN111597000B (en) Small window management method and terminal
CN111367456A (en) Communication terminal and display method in multi-window mode
CN113329176A (en) Image processing method and related device applied to camera of intelligent terminal
EP4199499A1 (en) Image capture method, graphical user interface, and electronic device
CN116055857B (en) Photographing method and electronic equipment
CN113055585B (en) Thumbnail display method of shooting interface and mobile terminal
CN113038141A (en) Video frame processing method and electronic equipment
WO2023160230A1 (en) Photographing method and related device
CN114449171B (en) Method for controlling camera, terminal device, storage medium and program product
WO2021204103A1 (en) Picture preview method, electronic device, and storage medium
CN114125197A (en) Mobile terminal and photographing method thereof
CN113179362B (en) Electronic device and image display method thereof
CN111988530B (en) Mobile terminal and photographing method thereof
CN113129238B (en) Photographing terminal and image correction method
CN111479075B (en) Photographing terminal and image processing method thereof
CN116723382B (en) Shooting method and related equipment
CN114143456B (en) Photographing method and device
CN113259582B (en) Picture generation method and terminal
CN116347217B (en) Image processing method, device and storage medium
CN115334239A (en) Method for fusing photographing of front camera and photographing of rear camera, terminal equipment and storage medium
CN115802145A (en) Shooting method and electronic equipment
CN117135448A (en) Shooting method and electronic equipment
CN117632329A (en) Display method, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination