CN108513070B - Image processing method, mobile terminal and computer readable storage medium

Info

Publication number
CN108513070B
Authority
CN
China
Prior art keywords
image
preview
input
mobile terminal
preview image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810298125.5A
Other languages
Chinese (zh)
Other versions
CN108513070A (en)
Inventor
姚弟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810298125.5A
Publication of CN108513070A
Application granted
Publication of CN108513070B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an image processing method, a mobile terminal and a computer readable storage medium. The method is applied to the mobile terminal and comprises the following steps: under the condition that a shooting preview interface displays a first preview image, receiving a first input of a user sliding on a target area, wherein the first input is used for adjusting a shooting field of view; and responding to the first input, displaying a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input; the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal. Compared with the prior art, in which post-processing and correction of the image are needed, the method and the device can adjust the shooting field of view in a single direction more conveniently.

Description

Image processing method, mobile terminal and computer readable storage medium
Technical Field
Embodiments of the present invention relate to the field of communications technology, and in particular to an image processing method, a mobile terminal and a computer readable storage medium.
Background
At present, the photographing function of mobile terminals is increasingly favored by users, who often use it to record the small moments of everyday life. When taking a picture with a mobile terminal, the user may need to adjust the shooting field of view; in this case, the user generally adjusts the shooting field of view of the image by means of zoom technology (i.e., using a zoom lens). However, zoom technology can only adjust the entire shooting field of view; it cannot adjust the shooting field of view in a single direction individually. If the user needs to adjust the shooting field of view in a single direction individually, the user generally has to post-process and correct the image with image-editing software and the like after shooting is completed, so the operation of adjusting the shooting field of view in a single direction is cumbersome.
Disclosure of Invention
The embodiment of the invention provides an image processing method, a mobile terminal and a computer readable storage medium, which aim to solve the problem that the operation of adjusting a shooting field of view in a single direction is complex in the prior art.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, which is applied to a mobile terminal, and the method includes:
receiving a first input of a user sliding on a target area under the condition that a first preview image is displayed on a shooting preview interface, wherein the first input is used for adjusting a shooting field of view;
responding to the first input, and displaying a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input;
the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
In a second aspect, an embodiment of the present invention provides a mobile terminal, where the mobile terminal includes:
the receiving module is used for receiving a first input of a user sliding on a target area under the condition that a shooting preview interface displays a first preview image, wherein the first input is used for adjusting a shooting field of view;
the display module is configured to respond to the first input and display a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input;
the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method described above.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the image processing method described above.
In the embodiment of the invention, if the user needs to adjust the shooting field of view in a single direction individually, the user can perform a sliding first input in the target area of the mobile terminal while the mobile terminal is in the shooting preview mode, where the operation direction of the first input can be consistent with the direction in which the shooting field of view needs to be adjusted. In this way, the mobile terminal can respond to the first input and display a second preview image on the shooting preview interface, wherein the second preview image is obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input. Thus, the shooting field of view of the preview image is successfully adjusted in the direction in which it needs to be adjusted. Therefore, during shooting, the user only needs to perform a sliding first input in the target area and ensure that the operation direction of the first input is consistent with the direction in which the shooting field of view needs to be adjusted, and the mobile terminal can adjust the shooting field of view in a single direction according to the user's requirements. Compared with the prior art, in which post-processing and correction of the image are needed, the embodiment of the invention can therefore adjust the shooting field of view in a single direction more conveniently.
Drawings
FIG. 1 is a flow chart of an image processing method provided by an embodiment of the invention;
FIG. 2 is a schematic diagram of a back touch interface of an image processing method according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a preview interface of a shooting process according to an embodiment of the present invention;
FIG. 4 is a second schematic diagram of a preview interface of the image processing method according to the embodiment of the present invention;
fig. 5 is a third schematic diagram of a shooting preview interface of the image processing method according to the embodiment of the present invention;
FIG. 6 is a fourth schematic diagram of a preview interface of the image processing method according to the embodiment of the present invention;
FIG. 7 is a fifth schematic diagram of a preview interface of the image processing method according to the embodiment of the present invention;
FIG. 8 is a sixth schematic view of a preview interface of the image processing method according to the embodiment of the present invention;
FIG. 9 is a seventh schematic diagram of a preview interface of the image processing method according to the embodiment of the present invention;
fig. 10 is an eighth schematic diagram of a preview interface of the image processing method according to the embodiment of the present invention;
fig. 11 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 12 is a schematic diagram of a hardware structure of another mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First, an image processing method according to an embodiment of the present invention will be described.
It should be noted that the image processing method provided by the embodiment of the present invention is applied to a mobile terminal. Specifically, the mobile terminal may be any device having a communication function, such as a computer, a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, and the like.
Referring to fig. 1, a flowchart of an image processing method according to an embodiment of the present invention is shown. As shown in fig. 1, the method is applied to a mobile terminal, and includes the following steps:
step 101, receiving a first input of a user sliding on a target area under the condition that a shooting preview interface displays a first preview image, wherein the first input is used for adjusting a shooting field of view.
The target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
In the case where the target area includes a preset area, the preset area may include the entire photographing preview interface or include a portion of the photographing preview interface. In the case that the preset area includes a part of the shooting preview interface, both the size of the preset area and the setting position of the preset area in the shooting preview interface may be determined according to an actual situation, which is not limited in the embodiment of the present invention.
In the case that the target area includes a back touch area, the target area may be a rectangular touch area surrounded by a dashed line frame 200 in fig. 2, and the touch area may be disposed at a center position of the back of the mobile terminal. Of course, the shape and the setting position of the back touch area are not limited to the situation shown in fig. 2, and may be determined according to the actual situation, which is not limited in this embodiment of the present invention. It should be noted that, in the case that the target area is the back touch area, the user can conveniently perform the first input of sliding on the target area only by using one hand.
In the embodiment of the present invention, before step 101, a user may operate the mobile terminal to open a camera application of the mobile terminal. At this moment, the mobile terminal enters a shooting preview mode, the mobile terminal collects a first preview image, and the first preview image is displayed on a shooting preview interface. In the case where the photographing preview interface displays the first preview image, the mobile terminal may periodically or aperiodically detect whether the first input of the user sliding on the target area is received. If the first input is detected, it may be considered that the user wishes to adjust the photographing field of view through the first input, and at this time, the mobile terminal performs the subsequent step 102.
Step 102, responding to the first input, and displaying a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input.
It should be noted that the operation direction of the first input may be in various cases. Specifically, the operation direction of the first input may be a direction as shown by an arrow 201 in fig. 2, or a direction as shown by an arrow 301 in fig. 3, or a direction as shown by any one of arrows 401 to 408 in fig. 4, such that the operation direction of the first input is away from the center point of the target area (i.e., the geometric center of the target area). Of course, the operation direction of the first input may also be the direction shown by the arrow 501 in fig. 5, or the direction shown by any one of the arrows 601 to 608 in fig. 6, so that the operation direction of the first input is toward the center point of the target area.
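By way of illustration only, and not as part of the original disclosure, the distinction between an input whose operation direction deviates from the center point of the target area and one directed toward it could be made roughly as follows; the Kotlin type and function names (Point, isAwayFromCenter) are hypothetical.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Hypothetical helper: decides whether a sliding input moves away from or toward the
// geometric centre of the target area, by comparing start and end distances to the centre.
fun isAwayFromCenter(start: Point, end: Point, center: Point): Boolean {
    val startDist = hypot(start.x - center.x, start.y - center.y)
    val endDist = hypot(end.x - center.x, end.y - center.y)
    return endDist > startDist // growing distance => operation direction deviates from the centre
}
```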
It should be noted that the specific implementation of step 102 is various, and for clarity of layout, the following example is provided.
It is understood that, in the case that the photographing preview interface displays the second preview image, if the user is satisfied with the second preview image, the user may click a photographing button of the mobile terminal to cause the mobile terminal to save the second preview image. Thereafter, the user may operate the mobile terminal to exit the camera application.
In the embodiment of the invention, if the user needs to adjust the shooting field of view in a single direction individually, the user can perform a sliding first input in the target area of the mobile terminal while the mobile terminal is in the shooting preview mode, where the operation direction of the first input can be consistent with the direction in which the shooting field of view needs to be adjusted. In this way, the mobile terminal can respond to the first input and display a second preview image on the shooting preview interface, wherein the second preview image is obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input. Thus, the shooting field of view of the preview image is successfully adjusted in the direction in which it needs to be adjusted. Therefore, during shooting, the user only needs to perform a sliding first input in the target area and ensure that the operation direction of the first input is consistent with the direction in which the shooting field of view needs to be adjusted, and the mobile terminal can adjust the shooting field of view in a single direction according to the user's requirements. Compared with the prior art, in which post-processing and correction of the image are needed, the embodiment of the invention can therefore adjust the shooting field of view in a single direction more conveniently.
It should be noted that there are various specific implementations of step 102; two embodiments are described below by way of example.
In a first embodiment, step 102 includes:
and acquiring a first length of the operation track of the first input under the condition that the operation direction of the first input deviates from the central point of the target area.
In the embodiment of the invention, under the condition that the first input of the user sliding on the target area is received, the mobile terminal can periodically determine and record the sliding contact point of the first input on the target area, determine the operation track of the first input according to the recorded sliding contact points, and obtain the first length of the operation track of the first input.
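As a minimal sketch only (not part of the original disclosure), the first length could be approximated from the recorded sliding contact points as shown below; TouchPoint and firstLength are hypothetical names.

```kotlin
import kotlin.math.hypot

data class TouchPoint(val x: Float, val y: Float)

// Hypothetical sketch: the first length is approximated by summing the distances
// between consecutive sliding contact points recorded while the first input lasts.
fun firstLength(track: List<TouchPoint>): Float =
    track.zipWithNext { a, b -> hypot(b.x - a.x, b.y - a.y) }.sum()
```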
Based on the first length, the first preview image is enlarged to generate a first intermediate image.
Optionally, the enlarging the first preview image based on the first length to generate a first intermediate image includes:
and determining the target magnification corresponding to the first length.
In the embodiment of the present invention, the mobile terminal may store a correspondence between length and magnification in advance, and the length and the magnification may be positively correlated. Based on this correspondence, the mobile terminal can very conveniently determine the target magnification corresponding to the first length.
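Purely as an illustrative sketch (the patent only requires a positive correlation), such a length-to-magnification rule might be a clamped linear mapping; the constants fullSlidePx and maxFactor below are assumptions, not values from the disclosure.

```kotlin
// Hypothetical rule mapping the first length (in pixels) to a target magnification.
// Only the positive correlation is required by the description; a clamped linear rule is assumed.
fun targetMagnification(lengthPx: Float, fullSlidePx: Float = 800f, maxFactor: Float = 2f): Float {
    val t = (lengthPx / fullSlidePx).coerceIn(0f, 1f)
    return 1f + t * (maxFactor - 1f) // 1.0x for no slide, up to maxFactor for a full-length slide
}
```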
And according to the operation direction of the first input, enlarging the first preview image at the target magnification to generate a first intermediate image.
Specifically, in a case where the operation direction of the first input is the direction indicated by the arrow 301 in fig. 3 (i.e., the oblique upper-right direction), the mobile terminal may perform the equal-scale enlargement processing on the first preview image at the target magnification along the oblique upper-right direction, thereby generating the first intermediate image. In a case where the operation direction of the first input is the direction indicated by the arrow 402 in fig. 4 (i.e., the horizontal right direction), the mobile terminal may perform enlargement processing of the length direction of the first preview image at the target magnification in the horizontal right direction, thereby generating the first intermediate image. In a case where the operation direction of the first input is the direction indicated by the arrow 404 in fig. 4 (i.e., the vertically downward direction), the mobile terminal may perform the enlargement processing of the height direction of the first preview image at the target magnification in the vertically downward direction, thereby generating the first intermediate image.
It is emphasized that no matter in which direction the first preview image is enlarged, no change in the shooting field of view is caused, and therefore, the first intermediate image and the first preview image have the same shooting field of view.
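The direction-dependent enlargement described above could be sketched as follows; this is illustrative only, the ScaleAxis and enlargePreview names are hypothetical, and Android's Bitmap API is assumed as the image container.

```kotlin
import android.graphics.Bitmap

// Hypothetical sketch of the direction-dependent enlargement: a diagonal slide scales both
// axes equally, a horizontal slide stretches only the width, and a vertical slide only the
// height. As noted above, this step alone does not change the shooting field of view.
enum class ScaleAxis { BOTH, WIDTH_ONLY, HEIGHT_ONLY }

fun enlargePreview(first: Bitmap, magnification: Float, axis: ScaleAxis): Bitmap {
    val w = if (axis == ScaleAxis.HEIGHT_ONLY) first.width else (first.width * magnification).toInt()
    val h = if (axis == ScaleAxis.WIDTH_ONLY) first.height else (first.height * magnification).toInt()
    return Bitmap.createScaledBitmap(first, w, h, true) // filtered scaling -> first intermediate image
}
```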
And cutting a first sub-image which exceeds the display range of the shooting preview interface in the first intermediate image to obtain a second sub-image.
Specifically, in the case that the operation direction of the first input is the oblique upper-right direction, the first intermediate image is obtained by performing equal-scale enlargement processing on the first preview image, so that a first sub-image (for example, the sub-image 702 in fig. 7) in the first intermediate image may exceed the display range of the shooting preview interface, and the mobile terminal may remove the sub-image 702 from the first intermediate image through a cropping operation to obtain a second sub-image (for example, the sub-image 701 in fig. 7). Since the sub-image 702 is removed by the cropping operation, the shooting field of view of the second sub-image (i.e., the sub-image 701) is smaller than that of the first preview image in the oblique upper-right direction.
In the case that the operation direction of the first input is the vertically downward direction, the first intermediate image is obtained by enlarging the first preview image in the height direction, so that a first sub-image (specifically, the sub-image 802 in fig. 8) in the first intermediate image may exceed the display range of the shooting preview interface, and the mobile terminal may remove the sub-image 802 from the first intermediate image through a cropping operation to obtain a second sub-image (e.g., the sub-image 801 in fig. 8). Since the sub-image 802 is removed by the cropping operation, the shooting field of view of the second sub-image (i.e., the sub-image 801) is smaller than that of the first preview image in the vertically downward direction.
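A minimal, non-authoritative sketch of the cropping step is given below; cropToPreview is a hypothetical helper, and which corner of the intermediate image is discarded (the crop origin) would in practice depend on the slide direction.

```kotlin
import android.graphics.Bitmap

// Hypothetical cropping step: only the part of the enlarged first intermediate image that
// still fits the display range of the shooting preview interface is kept.
fun cropToPreview(intermediate: Bitmap, previewW: Int, previewH: Int,
                  originX: Int = 0, originY: Int = 0): Bitmap {
    val w = minOf(previewW, intermediate.width - originX)
    val h = minOf(previewH, intermediate.height - originY)
    return Bitmap.createBitmap(intermediate, originX, originY, w, h) // the second sub-image
}
```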
And performing interpolation processing on the second sub-image to generate and display a second preview image.
Specifically, when performing interpolation processing on the second sub-image, the mobile terminal may generate the gray values of unknown pixel points from the gray values of known pixel points in the second sub-image, so as to better preserve the resolution and definition of the generated second preview image.
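As an illustrative sketch of such interpolation (the disclosure does not fix a particular interpolation method, so bilinear interpolation on a grayscale array is assumed here):

```kotlin
// Hypothetical sketch of the interpolation step on a grayscale array: unknown pixel values in
// the upscaled result are filled in from the four nearest known pixels of the second sub-image.
fun bilinearResize(src: Array<FloatArray>, dstH: Int, dstW: Int): Array<FloatArray> {
    val srcH = src.size
    val srcW = src[0].size
    return Array(dstH) { y ->
        FloatArray(dstW) { x ->
            val fy = y * (srcH - 1).toFloat() / maxOf(dstH - 1, 1)
            val fx = x * (srcW - 1).toFloat() / maxOf(dstW - 1, 1)
            val y0 = fy.toInt(); val y1 = minOf(y0 + 1, srcH - 1)
            val x0 = fx.toInt(); val x1 = minOf(x0 + 1, srcW - 1)
            val dy = fy - y0; val dx = fx - x0
            // weighted average of the four known neighbours
            src[y0][x0] * (1 - dx) * (1 - dy) + src[y0][x1] * dx * (1 - dy) +
            src[y1][x0] * (1 - dx) * dy + src[y1][x1] * dx * dy
        }
    }
}
```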
It is to be noted that the interpolation process for the second sub-image does not cause a change in the photographing field of view, and therefore, the photographing field of view of the second sub-image is the same as that of the second preview image. In addition, since the photographing field of view of the second sub-image is smaller than that of the first preview image in the first input operation direction, the photographing field of view of the second preview image is smaller than that of the first preview image in the first input operation direction. It can be seen that in the first embodiment, through the first input deviating from the center point of the target area, the mobile terminal can conveniently achieve reduction of the shooting field of view in a single direction, so that sub-images unsatisfactory for users in preview images can be removed.
In addition, it should be noted that, in the case of positive correlation between the length and the magnification, the larger the value of the first length, the more significant the magnification effect of the first intermediate image with respect to the first preview image, and thus, the larger the area of the first sub-image removed by the mobile terminal through the cropping operation, and accordingly, the more significant the degree of reduction of the shooting field of view in a single direction.
In a second embodiment, step 102 comprises:
in the case that the operation direction of the first input is toward the center point of the target area, a first length of the operation trajectory of the first input is acquired.
Similar to the first embodiment, the mobile terminal may also determine the operation track of the first input according to the sliding contact point of the first input on the target area, and obtain the first length of the operation track of the first input.
The first preview image is reduced based on the first length to generate a second intermediate image.
Optionally, performing reduction processing on the first preview image based on the first length to generate a second intermediate image, including:
and determining the target reduction multiple corresponding to the first length.
In the embodiment of the invention, the corresponding relation between the length and the reduction multiple can be stored in the mobile terminal in advance, and the length and the reduction multiple can be positively correlated. Therefore, based on the corresponding relation, the mobile terminal can very conveniently determine the target reduction multiple corresponding to the first length.
And according to the first input operation direction, carrying out reduction processing on the first pre-image by the target reduction multiple to generate a second intermediate image.
Specifically, in a case where the operation direction of the first input is the direction indicated by the arrow 501 in fig. 5 (i.e., the oblique lower left direction), the mobile terminal may perform the equal scaling-down process on the first preview image by the target reduction factor along the oblique lower left direction, thereby generating the second intermediate image. In a case where the operation direction of the first input is the direction indicated by the arrow 602 in fig. 6 (i.e., the horizontal left direction), the mobile terminal may perform reduction processing in the length direction of the first preview image by the target reduction factor in the horizontal left direction, thereby generating the second intermediate image. In a case where the operation direction of the first input is the direction indicated by the arrow 604 in fig. 6 (i.e., the vertically upward direction), the mobile terminal may perform reduction processing for the height direction of the first preview image by the target reduction factor in the vertically upward direction, thereby generating the second intermediate image.
It is emphasized that the reduction process performed on the first preview image in any direction does not cause a change in the imaging field of view, and therefore the second intermediate image is identical to the imaging field of view of the first preview image.
A third sub-image is acquired.
The third sub-image is a preview image of a second area, except for a first area where the second intermediate image is located, in a shooting preview interface acquired by a camera of the mobile terminal; the display area of the shooting preview interface is composed of a first area and a second area. It is easy to see that the field of view of the third sub-image is different from the field of view of the second intermediate image.
And carrying out image synthesis on the second intermediate image and the third sub-image to generate and display a second preview image.
Specifically, when the operation direction of the first input is the oblique lower-left direction, the second intermediate image is obtained by performing equal-scale reduction processing on the first preview image, so that the second intermediate image (for example, the intermediate image 901 in fig. 9) cannot fill the entire shooting preview interface, and a partial area is left vacant in the upper and right portions of the shooting preview interface. At this time, the mobile terminal may acquire a third sub-image (e.g., the sub-image 902 shown in fig. 9) for display in the vacated area in the upper and right portions of the shooting preview interface. Thereafter, the mobile terminal may perform image synthesis on the intermediate image 901 and the sub-image 902 to generate and display a second preview image. Since the second preview image adds the shooting field of view of the sub-image 902 to that of the first preview image, it can be considered that the second preview image adds a certain range of shooting field of view in the upper and right directions on the basis of the first preview image, and thus the shooting field of view of the second preview image is larger than that of the first preview image in the oblique lower-left direction.
When the operation direction of the first input is the vertically upward direction, the second intermediate image is obtained by reducing the height direction of the first preview image, and thus the second intermediate image (for example, the intermediate image 1001 in fig. 10) cannot be displayed over the entire shooting preview interface, and a partial area is left in the lower portion of the shooting preview interface. At this time, the mobile terminal may acquire a third sub-image (e.g., sub-image 1002 shown in fig. 10) for display in an area vacated in a lower portion of the photographing preview interface. Thereafter, the mobile terminal may image-synthesize the intermediate image 1001 and the sub-image 1002 to generate and display a second preview image. It should be noted that, since the second preview image is added with the sub-image 1002 on the basis of the first preview image, it can be considered that the second preview image is added with a certain range of shooting field of view below on the basis of the first preview image, and therefore, the shooting field of view of the second preview image is larger than that of the first preview image in the vertical upward direction.
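A minimal, assumption-laden sketch of this composition step is shown below; composeSecondPreview is a hypothetical helper, Android's Bitmap/Canvas APIs are assumed, and the layout matches the vertically-upward case of fig. 10 (other directions would place the two parts differently).

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Hypothetical composition for the second embodiment: the shrunken second intermediate image and
// the newly captured third sub-image are drawn into one bitmap the size of the preview surface.
fun composeSecondPreview(previewW: Int, previewH: Int,
                         secondIntermediate: Bitmap, thirdSub: Bitmap): Bitmap {
    val out = Bitmap.createBitmap(previewW, previewH, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(out)
    canvas.drawBitmap(secondIntermediate, 0f, 0f, null)                        // first area
    canvas.drawBitmap(thirdSub, 0f, secondIntermediate.height.toFloat(), null) // second area
    return out // the second preview image
}
```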
It can be seen that, in the second embodiment, the mobile terminal can conveniently realize the increase of the shooting field of view in a single direction through the first input towards the central point of the target area. In addition, it is to be noted that, in the case where there is a positive correlation between the length and the reduction factor, the larger the value of the first length, the more significant the reduction effect of the second intermediate image with respect to the first preview image, and thus, the larger the third sub-image, and accordingly, the more significant the degree of increase in the shooting field of view in a single direction.
Optionally, the mobile terminal comprises a first camera; the first preview image is a preview image acquired by the first camera. In particular, in order to guarantee the effect of the mobile terminal in capturing images, the first camera is not generally a wide-angle camera, but is a general camera having a smaller field angle than the wide-angle camera.
Acquiring a third sub-image comprising:
and acquiring a preview image of the second area acquired after the first camera moves in the direction opposite to the first input operation direction.
The mobile terminal may store a correspondence between length and moving distance in advance. In this way, the mobile terminal can determine the moving distance (denoted as S) corresponding to the first length according to this correspondence, and control the first camera to move by S in the direction opposite to the operation direction of the first input. Thereafter, the mobile terminal continues to acquire the preview image through the moved first camera, so that the mobile terminal can acquire the preview image of the second area and perform image synthesis on the preview image of the second area and the second intermediate image to obtain the second preview image.
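By way of a rough sketch only: the length-to-distance rule, the CameraDriver interface and its methods below are all invented stand-ins for the terminal's actual camera-control calls, which the disclosure does not specify.

```kotlin
// Hypothetical sketch of the single-camera variant: the moving distance S is derived from the
// first length (a linear rule is assumed) and the first camera is shifted by S opposite to the
// slide direction before the second-area preview is captured.
interface CameraDriver {
    fun move(directionDeg: Float, distanceMm: Float)
    fun capturePreviewOfSecondArea(): IntArray // pixel buffer of the newly visible area
}

fun captureThirdSubImage(driver: CameraDriver, firstLengthPx: Float,
                         slideDirectionDeg: Float, pxPerMm: Float = 100f): IntArray {
    val s = firstLengthPx / pxPerMm                    // moving distance S from the stored correspondence
    driver.move((slideDirectionDeg + 180f) % 360f, s)  // opposite to the operation direction of the first input
    return driver.capturePreviewOfSecondArea()
}
```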
In the embodiment of the invention, the mobile terminal can conveniently obtain the second preview image by moving the first camera.
Optionally, the mobile terminal includes a first camera and a second camera, where the first camera is a non-wide-angle camera and the second camera is a wide-angle camera; the first preview image is a preview image acquired by the first camera;
acquiring a third sub-image comprising:
and acquiring a preview image of the second area acquired by the second camera.
It should be noted that, when the mobile terminal is in the shooting preview mode, the second camera may be kept in the activated state all the time, so that, similar to the first camera, the second camera may also capture a preview image in real time. In a case where the mobile terminal responds to the first input of the user and performs reduction processing on the first preview image to obtain the second intermediate image, the mobile terminal may obtain a preview image of the second region based on the second camera and perform image synthesis on the preview image of the second region and the second intermediate image to obtain the second preview image.
Of course, the second camera may also not be kept in the activated state while the mobile terminal is in the shooting preview mode. In that case, after the first preview image is reduced to obtain the second intermediate image, the mobile terminal then starts the second camera so as to acquire the preview image through the second camera and finally obtain the second preview image.
In the embodiment of the invention, the mobile terminal can conveniently obtain the second preview image by simultaneously arranging the two cameras.
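As an illustrative sketch of the dual-camera variant (not part of the original disclosure): mapToWideFrame is a made-up mapping from preview coordinates to wide-angle frame coordinates, and a real device would need calibration between the two cameras for such a mapping.

```kotlin
import android.graphics.Bitmap
import android.graphics.Rect

// Hypothetical sketch: the wide-angle second camera stays active and the third sub-image is
// simply cut out of its latest frame at the position corresponding to the second area.
fun thirdSubFromWideCamera(wideFrame: Bitmap, secondAreaInPreview: Rect,
                           mapToWideFrame: (Rect) -> Rect): Bitmap {
    val r = mapToWideFrame(secondAreaInPreview)
    return Bitmap.createBitmap(wideFrame, r.left, r.top, r.width(), r.height())
}
```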
It should be noted that, in the embodiment of the present invention, a plurality of (i.e., at least two) first cameras may be disposed in the mobile terminal at the same time, and the shooting fields of view of these first cameras are different. When the mobile terminal is in the shooting preview mode, the first cameras can respectively collect and store corresponding preview images. The user can then control the mobile terminal to perform image synthesis on the preview images acquired by all the first cameras at a later stage, so as to obtain an image with a larger shooting field of view.
Optionally, in a case where the photographing preview interface displays the first preview image, before receiving a first input of a user sliding on the target area, the method further includes:
receiving touch input of a user on a target area;
specifically, in the case that the target area is a preset area of a shooting preview interface of the mobile terminal, the touch input of the user on the target area may be a slide-pinch operation on the shooting preview interface; in the case where the target area is a back touch area of the mobile terminal, the touch input of the user on the target area may be a sliding circling operation on the back touch area. Of course, the type of the touch input is not limited to the two cases listed above, and may be determined according to actual situations, which is not limited in this embodiment of the present invention.
And responding to the touch input, and entering a view field adjusting mode.
In the embodiment of the invention, the mobile terminal can adjust the shooting view field only in the view field adjusting mode.
In response to the first input, displaying a second preview image on a capture preview interface, including:
and in response to the first input, displaying a second preview image on the shooting preview interface under the condition of the field of view adjusting mode.
In the embodiment of the invention, under the condition that the mobile terminal is in the shooting preview mode, if the user has the requirement of adjusting the shooting view field in a single direction, the user can firstly perform touch input on the target area so as to enable the mobile terminal to enter the view field adjusting mode.
In this way, when the first input of the user on the target area is received, the mobile terminal may not respond to the first input immediately, but first determine whether it is in the field-of-view adjustment mode. If the determination result is yes, the user can be considered to have made the first input because the shooting field of view needs to be adjusted, and the mobile terminal therefore responds to the first input to adjust the shooting field of view. If the determination result is no, the user is not considered to have made the first input because the shooting field of view needs to be adjusted; the first input is more likely a mis-touch, so the mobile terminal does not respond to it, thereby avoiding the loss of system resources caused by responding to erroneous input.
In the embodiment of the invention, the mobile terminal responds to the first input of the user only under the condition of being in the field-of-view adjusting mode, so that the mobile terminal can be effectively prevented from responding to the incorrect input.
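The optional mode logic above could be sketched, purely for illustration, as a small controller; the TouchGesture and FieldOfViewController names are hypothetical, and gesture recognition itself is assumed to happen elsewhere.

```kotlin
// Hypothetical sketch: a pinch on the preview interface or a circling slide on the back touch
// area switches the terminal into the field-of-view adjustment mode, and a sliding first input
// is only acted on while that mode is active; otherwise it is treated as a mis-touch.
enum class TouchGesture { PINCH_ON_PREVIEW, CIRCLE_ON_BACK, SLIDE, OTHER }

class FieldOfViewController(private val adjustFieldOfView: (TouchGesture) -> Unit) {
    private var inFovAdjustMode = false

    fun onTouchInput(gesture: TouchGesture) {
        when (gesture) {
            TouchGesture.PINCH_ON_PREVIEW, TouchGesture.CIRCLE_ON_BACK -> inFovAdjustMode = true
            TouchGesture.SLIDE -> if (inFovAdjustMode) adjustFieldOfView(gesture) // show second preview image
            else -> Unit // ignore other input to avoid wasting resources on mis-touches
        }
    }
}
```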
In summary, compared with the situation that post-processing and correction of images are needed in the prior art, the embodiment of the invention can more conveniently realize adjustment of the shooting field of view in a single direction.
The following describes a mobile terminal provided in an embodiment of the present invention.
Referring to fig. 11, a schematic structural diagram of a mobile terminal 1100 according to an embodiment of the present invention is shown. As shown in fig. 11, the mobile terminal 1100 includes:
the receiving module 1101 is configured to receive a first input of a user sliding on a target area under the condition that a shooting preview interface displays a first preview image, where the first input is used for adjusting a shooting field of view;
a display module 1102, configured to respond to the first input and display a second preview image on the shooting preview interface, where the second preview image is a preview image obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input;
the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
Optionally, the display module 1102 includes:
the first obtaining submodule is used for obtaining a first length of an operation track of the first input under the condition that the operation direction of the first input deviates from the central point of the target area;
the first processing submodule is used for carrying out amplification processing on the first preview image based on the first length to generate a first intermediate image;
the cutting sub-module is used for cutting a first sub-image which exceeds the display range of the shooting preview interface in the first intermediate image to obtain a second sub-image;
and the second processing submodule is used for carrying out interpolation processing on the second sub-image to generate and display a second preview image.
Optionally, the display module 1102 includes:
the second obtaining submodule is used for obtaining the first length of the operation track of the first input under the condition that the operation direction of the first input faces the central point of the target area;
the third processing submodule is used for carrying out reduction processing on the first preview image based on the first length to generate a second intermediate image;
a third obtaining sub-module for obtaining a third sub-image;
the synthesis submodule is used for carrying out image synthesis on the second intermediate image and the third sub-image to generate and display a second preview image;
the third sub-image is a preview image of a second area, except for a first area where the second intermediate image is located, in a shooting preview interface acquired by a camera of the mobile terminal; the display area of the shooting preview interface is composed of a first area and a second area.
Optionally, the mobile terminal comprises a first camera; the first preview image is a preview image acquired by the first camera;
the third obtaining submodule is specifically configured to:
and acquiring a preview image of the second area acquired after the first camera moves in the direction opposite to the first input operation direction.
Optionally, the mobile terminal includes a first camera and a second camera, where the first camera is a non-wide-angle camera and the second camera is a wide-angle camera; the first preview image is a preview image acquired by the first camera;
the third obtaining submodule is specifically configured to:
and acquiring a preview image of the second area acquired by the second camera.
It should be noted that, the mobile terminal 1100 provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the foregoing method embodiment, and details are not described here to avoid repetition. In the photographing process, the user only needs to perform sliding first input in the target area, and the operation direction of the first input is ensured to be consistent with the direction of the photographing field of view to be adjusted, so that the mobile terminal 1100 can adjust the photographing field of view in a single direction according to the requirements of the user. Therefore, compared with the situation that the post-processing and the correction of the image are needed in the prior art, the embodiment of the invention can more conveniently realize the adjustment of the shooting view field in a single direction.
Referring to fig. 12, a schematic diagram of a hardware structure of a mobile terminal 1200 according to an embodiment of the present invention is shown. As shown in fig. 12, mobile terminal 1200 includes, but is not limited to: radio frequency unit 1201, network module 1202, audio output unit 1203, input unit 1204, sensor 1205, display unit 1206, user input unit 1207, interface unit 1208, memory 1209, processor 1210, and power source 1211. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 12 is not intended to be limiting of the mobile terminal 1200, and that the mobile terminal 1200 may include more or less components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal 1200 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 1210 is configured to:
under the condition that the shooting preview interface displays a first preview image, receiving a first input of a user sliding on a target area, wherein the first input is used for adjusting a shooting field of view; responding to the first input, and displaying a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting the shooting field of view, on the basis of the shooting field of view of the first preview image, according to the operation direction of the first input; the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
In the embodiment of the present invention, during the photographing process, the user only needs to perform the sliding first input in the target area, and it is ensured that the operation direction of the first input is consistent with the direction of the photographing field of view that needs to be adjusted, and the mobile terminal 1200 can adjust the photographing field of view in a single direction according to the requirements of the user. Therefore, compared with the situation that the post-processing and the correction of the image are needed in the prior art, the embodiment of the invention can more conveniently realize the adjustment of the shooting view field in a single direction.
Optionally, the processor 1210 is further configured to: under the condition that the operation direction of the first input deviates from the central point of the target area, acquiring a first length of an operation track of the first input; based on the first length, carrying out amplification processing on the first preview image to generate a first intermediate image; cutting a first sub-image which exceeds the display range of the shooting preview interface in the first intermediate image to obtain a second sub-image; and performing interpolation processing on the second sub-image to generate and display a second preview image.
Optionally, the processor 1210 is further configured to: under the condition that the operation direction of the first input faces the central point of the target area, acquiring a first length of an operation track of the first input; performing reduction processing on the first preview image based on the first length to generate a second intermediate image; acquiring a third sub-image; carrying out image synthesis on the second intermediate image and the third sub-image to generate and display a second preview image; the third sub-image is a preview image of a second area, except for a first area where the second intermediate image is located, in a shooting preview interface acquired by a camera of the mobile terminal, and a display area of the shooting preview interface is composed of the first area and the second area.
Optionally, the mobile terminal comprises a first camera; the first preview image is a preview image acquired by the first camera; a processor 1210 further configured to: and acquiring a preview image of the second area acquired after the first camera moves in the direction opposite to the first input operation direction.
Optionally, the mobile terminal includes a first camera and a second camera, where the first camera is a non-wide-angle camera and the second camera is a wide-angle camera; the first preview image is a preview image acquired by the first camera; a processor 1210 further configured to: and acquiring a preview image of the second area acquired by the second camera.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1201 may be used for receiving and sending signals during information transmission and reception or during a call, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 1210; in addition, the uplink data is transmitted to the base station. Typically, the radio frequency unit 1201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1201 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal 1200 provides the user with wireless broadband internet access, such as helping the user send and receive e-mails, browse web pages, and access streaming media, etc., through the network module 1202.
The audio output unit 1203 may convert audio data received by the radio frequency unit 1201 or the network module 1202 or stored in the memory 1209 into an audio signal and output as sound. Also, the audio output unit 1203 may also provide audio output related to a specific function performed by the mobile terminal 1200 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1203 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1204 is used to receive audio or video signals. The input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042, and the graphics processor 12041 processes image data of a still picture or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1206. The image frames processed by the graphics processor 12041 may be stored in the memory 1209 (or other storage medium) or transmitted via the radio frequency unit 1201 or the network module 1202. The microphone 12042 can receive sound and can process such sound into audio data. In the case of the telephone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 1201.
The mobile terminal 1200 also includes at least one sensor 1205, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 12061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 12061 and/or backlight when the mobile terminal 1200 moves to the ear. As one of the motion sensors, the accelerometer sensor may detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when stationary, and may be used to identify the posture of the mobile terminal 1200 (e.g., horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (e.g., pedometer, tapping); the sensors 1205 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., and will not be described further herein.
The display unit 1206 is used to display information input by the user or information provided to the user. The Display unit 1206 may include a Display panel 12061, and the Display panel 12061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1207 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal 1200. Specifically, the user input unit 1207 includes a touch panel 12071 and other input devices 12072. The touch panel 12071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 12071 (e.g., operations by a user on or near the touch panel 12071 using a finger, a stylus, or any suitable object or attachment). The touch panel 12071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1210, receives a command from the processor 1210, and executes the command. In addition, the touch panel 12071 may be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 1207 may include other input devices 12072 in addition to the touch panel 12071. In particular, the other input devices 12072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 12071 can be overlaid on the display panel 12061, and when the touch panel 12071 receives a touch operation thereon or nearby, the touch operation is transmitted to the processor 1210 to determine the type of the touch event, and then the processor 1210 provides a corresponding visual output on the display panel 12061 according to the type of the touch event. Although the touch panel 12071 and the display panel 12061 are shown as two separate components in fig. 12 to implement the input and output functions of the mobile terminal 1200, in some embodiments, the touch panel 12071 and the display panel 12061 may be integrated to implement the input and output functions of the mobile terminal 1200, which is not limited herein.
The interface unit 1208 is an interface for connecting an external device to the mobile terminal 1200. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1208 may be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within mobile terminal 1200 or may be used to transmit data between mobile terminal 1200 and external devices.
The memory 1209 may be used to store software programs as well as various data. The memory 1209 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1209 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 1210 is a control center of the mobile terminal 1200, connects various parts of the entire mobile terminal 1200 using various interfaces and lines, and performs various functions of the mobile terminal 1200 and processes data by running or executing software programs and/or modules stored in the memory 1209 and calling data stored in the memory 1209, thereby monitoring the mobile terminal 1200 as a whole. Processor 1210 may include one or more processing units; preferably, the processor 1210 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 1210.
The mobile terminal 1200 may also include a power source 1211 (e.g., a battery) for powering the various components, and the power source 1211 may be logically connected to the processor 1210 through a power management system that may be configured to manage charging, discharging, and power consumption.
In addition, the mobile terminal 1200 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 1210, a memory 1209, and a computer program stored in the memory 1209 and executable on the processor 1210. When executed by the processor 1210, the computer program implements each process of the above image processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above image processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
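Although the embodiments do not publish source code, the following minimal NumPy sketch (an editor's illustration of one possible reading; the function names, the gain constant, and the centred-enlargement simplification are assumptions, not the patent's implementation) walks through the zoom-in branch of such a computer program: the first preview image is enlarged in proportion to the first length of the slide, the first sub-image falling outside the display range is cropped away, and the remaining second sub-image is resampled to the preview resolution to form the second preview image.

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Nearest-neighbour resampling; stands in for the interpolation step."""
    h, w = img.shape[:2]
    rows = np.clip(np.arange(out_h) * h // out_h, 0, h - 1)
    cols = np.clip(np.arange(out_w) * w // out_w, 0, w - 1)
    return img[rows[:, None], cols]

def zoom_in_preview(first_preview, track_length, gain=0.002):
    """Generate the second preview image for a slide away from the centre.

    first_preview: H x W x 3 array, the first preview image
    track_length:  first length of the operation track, in pixels
    gain:          assumed mapping from track length to enlargement factor
    """
    h, w = first_preview.shape[:2]
    scale = 1.0 + gain * track_length

    # First intermediate image: the enlarged preview (same field of view).
    enlarged = resize_nearest(first_preview, int(h * scale), int(w * scale))

    # Crop away the first sub-image that exceeds the display range; what
    # remains is the centred, display-sized second sub-image.
    top = (enlarged.shape[0] - h) // 2
    left = (enlarged.shape[1] - w) // 2
    second_sub = enlarged[top:top + h, left:left + w]

    # Interpolation processing: resample the second sub-image to the preview
    # resolution (a pass-through here because of the centred-crop shortcut).
    return resize_nearest(second_sub, h, w)

if __name__ == "__main__":
    first = np.zeros((1080, 1920, 3), dtype=np.uint8)
    print(zoom_in_preview(first, track_length=300).shape)  # (1080, 1920, 3)
```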
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. An image processing method applied to a mobile terminal, characterized by comprising:
under the condition that a first preview image is displayed on a shooting preview interface, receiving a first input of a user sliding on a target area, wherein the first input is used for adjusting a shooting field-of-view angle;
in response to the first input, displaying a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting, on the basis of the shooting field-of-view angle of the first preview image, the shooting field-of-view angle in the manner corresponding to the operation direction of the first input;
wherein the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
2. The method of claim 1, wherein displaying a second preview image on the shooting preview interface in response to the first input comprises:
acquiring a first length of an operation track of the first input under the condition that the operation direction of the first input points away from a central point of the target area;
enlarging the first preview image along the operation direction of the first input based on the first length to generate a first intermediate image, wherein the shooting field-of-view angle of the first intermediate image is the same as the shooting field-of-view angle of the first preview image;
cropping away, from the first intermediate image, a first sub-image that exceeds the display range of the shooting preview interface, to obtain a second sub-image;
and performing interpolation processing on the second sub-image to generate and display the second preview image.
3. The method of claim 1, wherein displaying a second preview image on the shooting preview interface in response to the first input comprises:
acquiring a first length of an operation track of the first input under the condition that the operation direction of the first input points toward a central point of the target area;
reducing the first preview image along the operation direction of the first input based on the first length to generate a second intermediate image, wherein the shooting field-of-view angle of the second intermediate image is the same as that of the first preview image;
acquiring a third sub-image;
performing image synthesis on the second intermediate image and the third sub-image to generate and display the second preview image;
wherein the third sub-image is a preview image, acquired by a camera of the mobile terminal, of a second area of the shooting preview interface other than a first area where the second intermediate image is located; and the display area of the shooting preview interface consists of the first area and the second area.
4. The method of claim 3, wherein the mobile terminal comprises a first camera; the first preview image is a preview image acquired by the first camera;
the acquiring of the third sub-image comprises:
and acquiring a preview image of the second area acquired after the first camera moves in the direction opposite to the first input operation direction.
5. The method of claim 3, wherein the mobile terminal comprises a first camera and a second camera, wherein the first camera is a non-wide angle camera and the second camera is a wide angle camera; the first preview image is a preview image acquired by the first camera;
the acquiring of the third sub-image comprises:
and acquiring a preview image of the second area acquired by the second camera.
6. A mobile terminal, characterized in that the mobile terminal comprises:
the receiving module is used for receiving a first input of a user sliding on a target area under the condition that a first preview image is displayed on a shooting preview interface, wherein the first input is used for adjusting a shooting field-of-view angle;
the display module is used for displaying, in response to the first input, a second preview image on the shooting preview interface, wherein the second preview image is a preview image obtained by adjusting, on the basis of the shooting field-of-view angle of the first preview image, the shooting field-of-view angle in the manner corresponding to the operation direction of the first input;
wherein the target area comprises a preset area in the shooting preview interface or a back touch area of the mobile terminal.
7. The mobile terminal of claim 6, wherein the display module comprises:
the first obtaining submodule is used for obtaining a first length of the operation track of the first input under the condition that the operation direction of the first input points away from the central point of the target area;
the first processing submodule is used for enlarging the first preview image along the operation direction of the first input based on the first length to generate a first intermediate image, wherein the shooting field-of-view angle of the first intermediate image is the same as the shooting field-of-view angle of the first preview image;
the cutting submodule is used for cropping away, from the first intermediate image, a first sub-image that exceeds the display range of the shooting preview interface, to obtain a second sub-image;
and the second processing submodule is used for performing interpolation processing on the second sub-image to generate and display the second preview image.
8. The mobile terminal of claim 6, wherein the display module comprises:
the second obtaining submodule is used for obtaining a first length of the operation track of the first input under the condition that the operation direction of the first input points toward the central point of the target area;
the third processing submodule is used for reducing the first preview image along the operation direction of the first input based on the first length to generate a second intermediate image, wherein the shooting field-of-view angle of the second intermediate image is the same as that of the first preview image;
the third obtaining submodule is used for obtaining a third sub-image;
the synthesis submodule is used for performing image synthesis on the second intermediate image and the third sub-image to generate and display the second preview image;
wherein the third sub-image is a preview image, acquired by a camera of the mobile terminal, of a second area of the shooting preview interface other than a first area where the second intermediate image is located; and the display area of the shooting preview interface consists of the first area and the second area.
9. The mobile terminal of claim 8, wherein the mobile terminal comprises a first camera; the first preview image is a preview image acquired by the first camera;
the third obtaining submodule is specifically configured to:
and acquiring a preview image of the second area acquired after the first camera moves in the direction opposite to the first input operation direction.
10. The mobile terminal of claim 8, wherein the mobile terminal comprises a first camera and a second camera, wherein the first camera is a non-wide angle camera and the second camera is a wide angle camera; the first preview image is a preview image acquired by the first camera;
the third obtaining submodule is specifically configured to:
and acquiring a preview image of the second area acquired by the second camera.
11. A mobile terminal, characterized in that it comprises a processor, a memory, and a computer program stored on said memory and executable on said processor, wherein said computer program, when executed by said processor, implements the steps of the image processing method according to any one of claims 1 to 5.
12. A computer-readable storage medium, characterized in that a computer program is stored thereon, wherein the computer program, when executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 5.
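For the reverse operation described in claims 3 and 5, the sketch below, offered only as an editor's illustration under the same assumptions as the earlier example (names and the gain constant are hypothetical, and the wide-angle frame is assumed to be already registered to the display coordinates), reduces the first preview image to a second intermediate image and composites it with a third sub-image taken from a wide-angle second camera covering the second area of the shooting preview interface:

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Nearest-neighbour resampling used when reducing the first preview."""
    h, w = img.shape[:2]
    rows = np.clip(np.arange(out_h) * h // out_h, 0, h - 1)
    cols = np.clip(np.arange(out_w) * w // out_w, 0, w - 1)
    return img[rows[:, None], cols]

def zoom_out_preview(first_preview, wide_preview, track_length, gain=0.002):
    """Generate the second preview image for a slide toward the centre.

    first_preview: H x W x 3 first preview image (non-wide-angle camera)
    wide_preview:  H x W x 3 frame from the wide-angle camera, assumed to be
                   already mapped onto the display coordinates
    """
    h, w = first_preview.shape[:2]
    scale = max(0.1, 1.0 - gain * track_length)
    new_h, new_w = int(h * scale), int(w * scale)

    # Second intermediate image: the reduced first preview (same field of view).
    reduced = resize_nearest(first_preview, new_h, new_w)

    # Third sub-image: the part of the wide-angle frame covering the second
    # area, i.e. everything outside the first area occupied by the reduced
    # image. Compositing the two yields the second preview image.
    composite = wide_preview.copy()
    top, left = (h - new_h) // 2, (w - new_w) // 2
    composite[top:top + new_h, left:left + new_w] = reduced
    return composite

if __name__ == "__main__":
    first = np.full((1080, 1920, 3), 200, dtype=np.uint8)
    wide = np.full((1080, 1920, 3), 50, dtype=np.uint8)
    print(zoom_out_preview(first, wide, track_length=400).shape)
```

The single-camera variant of claim 4 would instead obtain the third sub-image after the first camera moves opposite to the slide direction; the compositing step itself is unchanged.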
CN201810298125.5A 2018-04-04 2018-04-04 Image processing method, mobile terminal and computer readable storage medium Active CN108513070B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810298125.5A CN108513070B (en) 2018-04-04 2018-04-04 Image processing method, mobile terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810298125.5A CN108513070B (en) 2018-04-04 2018-04-04 Image processing method, mobile terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108513070A CN108513070A (en) 2018-09-07
CN108513070B true CN108513070B (en) 2020-09-04

Family

ID=63380520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810298125.5A Active CN108513070B (en) 2018-04-04 2018-04-04 Image processing method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108513070B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2016252993B2 (en) 2015-04-23 2018-01-04 Apple Inc. Digital viewfinder user interface for multiple cameras
US10009536B2 (en) 2016-06-12 2018-06-26 Apple Inc. Applying a simulated optical effect based on data received from multiple camera sensors
DK180859B1 (en) 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
US11112964B2 (en) 2018-02-09 2021-09-07 Apple Inc. Media capture lock affordance for graphical user interface
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US10645294B1 (en) 2019-05-06 2020-05-05 Apple Inc. User interfaces for capturing and managing visual media
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
CN109474786B (en) * 2018-12-24 2021-07-23 维沃移动通信有限公司 Preview image generation method and terminal
CN109918007A (en) * 2019-01-25 2019-06-21 努比亚技术有限公司 A kind of displaying method of terminal, terminal and computer readable storage medium
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
CN113518148A (en) * 2019-05-06 2021-10-19 苹果公司 User interface for capturing and managing visual media
CN110719409B (en) * 2019-11-07 2021-06-08 维沃移动通信有限公司 Continuous shooting method and electronic equipment
CN111050073B (en) * 2019-12-26 2022-01-28 维沃移动通信有限公司 Focusing method and electronic equipment
CN111147754B (en) * 2019-12-31 2021-06-29 维沃移动通信有限公司 Image processing method and electronic device
CN113676709B (en) * 2020-05-14 2023-10-27 聚好看科技股份有限公司 Intelligent projection equipment and multi-screen display method
US11054973B1 (en) 2020-06-01 2021-07-06 Apple Inc. User interfaces for managing media
US11212449B1 (en) 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
CN112565589B (en) * 2020-11-13 2023-03-31 北京爱芯科技有限公司 Photographing preview method and device, storage medium and electronic equipment
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180070010A1 (en) * 2016-09-02 2018-03-08 Altek Semiconductor Corp. Image capturing apparatus and image zooming method thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102023788A (en) * 2009-09-15 2011-04-20 宏碁股份有限公司 Control method for touch screen display frames
CN107509037A (en) * 2014-11-28 2017-12-22 广东欧珀移动通信有限公司 The method and terminal taken pictures using different angle of visual field cameras
CN105847663A (en) * 2015-06-24 2016-08-10 维沃移动通信有限公司 Adjusting method and adjusting device for displayed image on shooting device
CN107800953A (en) * 2016-09-02 2018-03-13 聚晶半导体股份有限公司 The method of image acquiring device and its zoomed image
CN107820002A (en) * 2016-09-12 2018-03-20 安讯士有限公司 Improved monitoring camera direction control

Also Published As

Publication number Publication date
CN108513070A (en) 2018-09-07

Similar Documents

Publication Publication Date Title
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
CN108668083B (en) Photographing method and terminal
CN108495029B (en) Photographing method and mobile terminal
CN108471498B (en) Shooting preview method and terminal
CN108989672B (en) Shooting method and mobile terminal
CN109688322B (en) Method and device for generating high dynamic range image and mobile terminal
CN110602401A (en) Photographing method and terminal
CN110913139B (en) Photographing method and electronic equipment
CN109474786B (en) Preview image generation method and terminal
CN107749046B (en) Image processing method and mobile terminal
CN110602389B (en) Display method and electronic equipment
CN110769174B (en) Video viewing method and electronic equipment
CN109819166B (en) Image processing method and electronic equipment
CN109413333B (en) Display control method and terminal
CN110830713A (en) Zooming method and electronic equipment
CN108174110B (en) Photographing method and flexible screen terminal
CN108804628B (en) Picture display method and terminal
CN108881721B (en) Display method and terminal
WO2019076373A1 (en) Photographing method, mobile terminal and computer-readable storage medium
CN110769156A (en) Picture display method and electronic equipment
CN110798621A (en) Image processing method and electronic equipment
CN110944114B (en) Photographing method and electronic equipment
CN110536005B (en) Object display adjustment method and terminal
CN110290263B (en) Image display method and mobile terminal
CN109104573B (en) Method for determining focusing point and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant