CN111127595B - Image processing method and electronic equipment - Google Patents

Image processing method and electronic equipment

Info

Publication number
CN111127595B
Authority
CN
China
Prior art keywords
image
images
target
editing
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911319652.0A
Other languages
Chinese (zh)
Other versions
CN111127595A (en)
Inventor
杨蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911319652.0A
Publication of CN111127595A
Priority to PCT/CN2020/136731 (WO2021121253A1)
Application granted
Publication of CN111127595B
Legal status: Active (current)
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10: File systems; File servers
    • G06F 16/16: File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F 16/168: Details of user interfaces specifically adapted to file systems, e.g. browsing and visualisation, 2d or 3d GUIs
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present invention provide an image processing method and an electronic device, which are applied to the field of communication and are intended to solve the problem that the process of sharing images on an electronic device is cumbersome. The method includes: displaying a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N image areas in the first editing interface; receiving a first input; in response to the first input, editing at least one of the N first images to obtain a target image array; receiving a second input; and sending the target image array in response to the second input, where N is a positive integer less than or equal to M. The method can be used, in particular, in the process of sharing multiple images from a gallery application through a communication application.

Description

Image processing method and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an image processing method and electronic equipment.
Background
Currently, with the popularity of social networks, image sharing has become a normal part of everyday life, and attractive image layout and image style are pursued by more and more users.
When sharing images with an electronic device such as a mobile phone or a tablet computer, a user who wants to retouch and lay out the images before posting them to a social platform generally needs to control the electronic device to open a gallery application to select the required images, edit the images through a dedicated retouching application, and save the edited images. The user may then share the images through a communication application, arranged in a three-grid, six-grid, or nine-grid layout.
However, before the one or more images can be shared in the communication application, the user may still need to adjust their arrangement order within the communication application, which makes the process of sharing images on the electronic device cumbersome.
Disclosure of Invention
The embodiments of the present invention provide an image processing method and an electronic device, so as to solve the problem that the process of sharing images on an electronic device is cumbersome.
To solve the above technical problem, the embodiments of the present invention are implemented as follows:
In a first aspect, an embodiment of the present invention provides an image processing method applied to an electronic device. The method includes: displaying a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N image areas in the first editing interface; receiving a first input; in response to the first input, editing at least one of the N first images to obtain a target image array; receiving a second input; and sending the target image array in response to the second input; where N is a positive integer less than or equal to M.
In a second aspect, an embodiment of the present invention further provides an electronic device, including a display module, a receiving module, an editing module, and a sending module. The display module is configured to display a first editing interface, where the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface. The receiving module is configured to receive a first input. The editing module is configured to edit, in response to the first input received by the receiving module, at least one of the N first images displayed by the display module to obtain a target image array. The receiving module is further configured to receive a second input. The sending module is configured to send, in response to the second input received by the receiving module, the target image array obtained by the editing module; where N is a positive integer less than or equal to M.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program implementing the steps of the image processing method according to the first aspect when executed by the processor.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method according to the first aspect.
In the embodiments of the present invention, a first editing interface including M image areas indicated by the target arrangement template may be displayed, where N first images are displayed on N of the M image areas. Subsequently, at least one of the N first images may be edited through the first input to obtain a target image array, and the target image array may then be sent through the second input. Therefore, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images in the target image array, which are edited in the first editing interface according to the target arrangement template, can be treated as a whole and selected and sent quickly and conveniently. The electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, without editing them through a dedicated third-party editing application, so the user does not need to open multiple applications and can control the electronic device to edit the N first images quickly and conveniently within the gallery application. In addition, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images one by one from the gallery application and edit and send them in real time. That is, through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to edit and share images quickly and conveniently.
Drawings
Fig. 1 is a schematic diagram of a possible architecture of an android operating system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of display content of an electronic device according to an embodiment of the present invention;
FIG. 4 is a second schematic diagram of display content of an electronic device according to an embodiment of the present invention;
FIG. 5 is a third schematic diagram of display content of an electronic device according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a display of an electronic device according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of display content of an electronic device according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of display content of an electronic device according to an embodiment of the present invention;
FIG. 9 is a diagram of a display content of an electronic device according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of display content of an electronic device according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a possible electronic device according to an embodiment of the present invention;
fig. 12 is a schematic hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In this document, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. "Plurality" means two or more.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or more advantageous than other embodiments or designs. Rather, such words are intended to present related concepts in a concrete fashion.
The terms "first", "second", and the like in the description and in the claims are used to distinguish between different objects rather than to describe a particular order of the objects. For example, the first input and the second input are used to distinguish between different inputs, not to describe a particular order of inputs.
With the image processing method provided by the embodiments of the present invention, a first editing interface including M image areas indicated by the target arrangement template may be displayed, where N first images are displayed on N of the M image areas. Subsequently, at least one of the N first images may be edited through the first input to obtain a target image array, and the target image array may then be sent through the second input. Therefore, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images in the target image array, which are edited in the first editing interface according to the target arrangement template, can be treated as a whole and selected and sent quickly and conveniently. The electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, without editing them through a dedicated third-party editing application, so the user does not need to open multiple applications and can control the electronic device to edit the N first images quickly and conveniently within the gallery application. In addition, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images one by one from the gallery application and edit and send them in real time. That is, through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to edit and share images quickly and conveniently.
The electronic device in the embodiment of the invention can be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook or a personal digital assistant (personal digital assistant, PDA), etc.; the non-mobile electronic device may be a personal computer (personal computer, PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiment of the present invention is not particularly limited.
It should be noted that, in the image processing method provided in the embodiment of the present invention, the execution body may be an electronic device, or a central processing unit (Central Processing Unit, CPU) of the electronic device, or a control module in the electronic device for executing the image processing method. In the embodiment of the invention, an image processing method executed by an electronic device is taken as an example, and the image processing method provided by the embodiment of the invention is described.
The electronic device in the embodiments of the present invention may be an electronic device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The software environment to which the image processing method provided by the embodiment of the invention is applied is described below by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, respectively: an application program layer, an application program framework layer, a system runtime layer and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third party application programs) in the android operating system.
The application framework layer is the framework of applications, and developers can develop applications based on the application framework layer while following the development principles of that framework, for example system applications such as a system settings application, a system chat application, and a system camera application, as well as third-party applications such as a third-party settings application, a third-party camera application, and a third-party chat application.
The system runtime layer includes libraries (also referred to as system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of the android operating system, and belongs to the bottommost layer of the software hierarchy of the android operating system. The kernel layer provides core system services and a driver related to hardware for the android operating system based on a Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image processing method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image processing method may be operated based on the android operating system shown in fig. 1. Namely, the processor or the electronic device can realize the image processing method provided by the embodiment of the invention by running the software program in the android operating system.
The image processing method provided by the embodiment of the present invention is described in detail below with reference to the flowchart of the image processing method shown in fig. 2. In which, although the logical order of the image processing method provided by the embodiments of the present invention is shown in a method flowchart, in some cases, the steps shown or described may be performed in an order different from that herein. For example, the image processing method shown in fig. 2 may include S201 to S205:
S201, the electronic equipment displays a first editing interface, wherein the first editing interface comprises M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface.
Wherein a first image is displayed on each of the N image areas, the first images displayed on different image areas being different. Specifically, N is a positive integer less than or equal to M.
Optionally, in an embodiment of the present invention, the first editing interface may be an interface in a gallery application in an electronic device.
Optionally, the N image areas in the first editing interface are N image areas in the M image areas.
It is understood that, in the case where the electronic device automatically fills images into the M image areas, the electronic device may first fill images into the image areas whose display positions come earlier among the M image areas, and then fill images into the image areas whose display positions come later.
Filling an image area means displaying an image on that image area. In the embodiments of the present invention, to clearly illustrate the relationship between image areas and images, different descriptions are used in different places.
The target arrangement template may be, for example, a three-grid, six-grid, or nine-grid template; of course, it may also be another template, which is not specifically limited in the embodiments of the present invention.
It will be appreciated that the target arrangement template is specifically used to indicate the display positions and display sizes of the M image areas.
For example, when the target arrangement template is a three-grid template, it includes 3 image areas. In this case, the target arrangement template indicates that the 3 image areas are arranged in one row and three columns on the screen of the electronic device, sequentially from left to right, and that the 3 image areas have the same display size.
For another example, when the target arrangement template is a six-grid template, it includes 6 image areas. In this case, the target arrangement template indicates that the 6 image areas are arranged in two rows and three columns on the screen of the electronic device, sequentially from left to right and from top to bottom, and that the 6 image areas have the same display size.
For another example, when the target arrangement template is a nine-grid template, it includes 9 image areas. In this case, the target arrangement template indicates that the 9 image areas are arranged in three rows and three columns on the screen of the electronic device, sequentially from left to right and from top to bottom, and that the 9 image areas have the same display size.
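As an illustrative sketch only (not part of the patent text), an arrangement template that indicates the display positions and equal display sizes of its M image areas might be modeled roughly as follows in Kotlin; the names ArrangementTemplate and ImageArea, and the pixel sizes, are hypothetical:

```kotlin
// Hypothetical sketch: an arrangement template indicating the display position
// (row, column) and display size of each of its M image areas.
data class ImageArea(val row: Int, val column: Int, val widthPx: Int, val heightPx: Int)

data class ArrangementTemplate(val rows: Int, val columns: Int, val cellSizePx: Int) {
    // M image areas, ordered left to right and top to bottom, all the same display size.
    val areas: List<ImageArea> =
        (0 until rows * columns).map { index ->
            ImageArea(
                row = index / columns,
                column = index % columns,
                widthPx = cellSizePx,
                heightPx = cellSizePx
            )
        }
}

// The templates described above differ only in their row and column counts.
val threeGrid = ArrangementTemplate(rows = 1, columns = 3, cellSizePx = 360)
val sixGrid = ArrangementTemplate(rows = 2, columns = 3, cellSizePx = 360)
val nineGrid = ArrangementTemplate(rows = 3, columns = 3, cellSizePx = 360)
```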
In the following embodiments, the image processing method provided by the embodiments of the present invention is described by taking a nine-grid template as the target arrangement template as an example.
S202, the electronic device receives a first input.
Illustratively, the first input is for triggering editing of images on M image areas indicated by the target arrangement template. Wherein the first input may be a user input over M image areas.
It should be noted that the screen of the electronic device provided by the embodiments of the present invention may be a touch screen, and the touch screen may be configured to receive an input from a user and, in response to the input, display content corresponding to the input to the user. The first input may be a touch-screen input, a fingerprint input, a gravity input, a key input, or the like. A touch-screen input is an input such as a press input, a long-press input, a sliding input, a click input, or a hover input (an input by the user near the touch screen) on the touch screen of the electronic device. A fingerprint input is an input such as a fingerprint swipe, a fingerprint long press, a fingerprint single click, or a fingerprint double click on the fingerprint recognizer of the electronic device. A gravity input is an input such as shaking the electronic device in a specific direction or shaking it a specific number of times. A key input is an input such as a single-click input, a double-click input, a long-press input, or a combined-key input on a key of the electronic device such as the power key, a volume key, or the Home key. The manner of the first input is not specifically limited in the embodiments of the present invention and may be any implementable manner. For example, the first input is a drag input by the user on the images on the M image areas.
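As a hedged illustration of how such touch-screen inputs might be distinguished on Android (not taken from the patent; the function and callback names are hypothetical), a sketch using the platform GestureDetector could look like this:

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Hypothetical sketch: distinguish click and long-press inputs on an image area view
// using Android's GestureDetector; other input types (fingerprint, gravity, key)
// would be handled by their own listeners.
fun listenForInputs(context: Context, imageAreaView: View, onTap: () -> Unit, onHold: () -> Unit) {
    val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onSingleTapUp(e: MotionEvent): Boolean {
            onTap() // treat as a click input on the image area
            return true
        }
        override fun onLongPress(e: MotionEvent) {
            onHold() // treat as a long-press input on the image area
        }
    })
    imageAreaView.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
}
```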
S203, responding to the first input, and editing at least one image in the N first images by the electronic equipment to obtain a target image array.
Alternatively, the electronic device may edit some or all of the N first images.
The images in the target image array are images arranged according to the target arrangement template.
Obviously, the target image array includes the above-mentioned N first images.
S204, the electronic device receives a second input.
Specifically, the second input may be an input by which the user triggers the communication application to send the target image array.
Optionally, the first editing interface may include a sharing control, which is used to trigger the electronic device to send the target image array through the communication application. The second input may be an input to a sharing control in the first editing interface.
Similarly, the description of the input manner of the second input may refer to the related description of the input manner of the first input in the above embodiment, which is not repeated herein. For example, the second input is a click input to a sharing control in the target editing interface.
S205, the electronic device transmits the target image array in response to the second input.
Specifically, the electronic device may transmit the target image array through a communication application therein.
Optionally, the image processing method provided by the embodiment of the present invention may further include S206 after S205 described above:
S206, the electronic device displays the target image array on the sending interface according to the target arrangement template.
The sending interface is used for displaying images to be sent or sent by the electronic equipment.
In this way, the electronic device can intuitively show the user the images in the sent target image array, for example by intuitively showing the order in which the images in the target image array are arranged according to the target arrangement template.
Optionally, the electronic device may display a plurality of sharing identifiers on the screen in response to the second input, where each sharing identifier is used to indicate an application or a plug-in of an image to be shared. Specifically, after the user inputs the sharing identifier for indicating the communication application in the plurality of sharing identifiers, the electronic device may display a sending interface of the communication application, automatically fill the target image array into the sending interface, and further trigger the communication application to send the target image array by the user.
Alternatively, the second input may include an input by the user that triggers the electronic device to select the communication application (e.g., an input that triggers the electronic device to display a transmission interface of the communication application), and an input that triggers the electronic device to transmit the target image array through the communication application.
When the electronic device sends the target image array through the communication application, the target image array may be sent to a social platform through the communication application, or sent to a corresponding communication object (i.e., one or more contacts in the communication application).
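As a minimal sketch (assuming the images of the target image array are available as content URIs; the function name and chooser title are hypothetical and not from the patent), handing the array to a communication application chosen by the user could rely on Android's standard multi-image share intent:

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical sketch: pass the images of the target image array (in template order)
// to a communication application via Android's multi-image share intent.
fun shareTargetImageArray(context: Context, imageUris: ArrayList<Uri>) {
    val intent = Intent(Intent.ACTION_SEND_MULTIPLE).apply {
        type = "image/*"
        putParcelableArrayListExtra(Intent.EXTRA_STREAM, imageUris)
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    // The chooser corresponds to the sharing identifiers described above; the user
    // picks the communication application that receives the array.
    context.startActivity(Intent.createChooser(intent, "Share image array"))
}
```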
For example, as shown in (a) of fig. 3, the editing interface of the gallery application displayed by the electronic device includes image areas P1 to P9, that is, the target arrangement template is a nine-grid template. Images 1 to 9 are displayed in image areas P1 to P9, respectively. Specifically, image areas P1 to P9 are arranged in three rows and three columns, images 1 to 9 are arranged in image areas P1 to P9 in that order, and the display sizes of image areas P1 to P9 are the same. In this case, the target image array is images 1 to 9 arranged in three rows and three columns, and M equals 9.
Images 1 to 9 may all be custom images selected by the user through the electronic device, or images 1 to 9 may include some custom images selected by the user through the electronic device together with some preset fill images. That is, the N first images are all or some of images 1 to 9.
For example, image 2, image 4, image 5, image 6, and image 8 in images 1-9 may be the above-described N first images, i.e., N is equal to 5. At this time, the first input is the user input of image 2, image 4, image 5, image 6, and image 8.
Specifically, the editing interface shown in (a) of fig. 3 further includes a sharing control S1. After the user inputs the sharing control S1 (denoted as input 1), as shown in (b) of fig. 3, the electronic device may display a sending interface of the communication application, where the sending interface includes images 1 to 9 arranged in three rows and three columns. The sending interface further includes a sending control Y. Subsequently, after the user inputs the sending control Y (denoted as input 2), the electronic device may send images 1 to 9, arranged in three rows and three columns, to the social platform through the communication application. In this case, the second input includes input 1 and input 2.
It should be noted that, with the image processing method provided by the embodiments of the present invention, a first editing interface including M image areas indicated by the target arrangement template may be displayed, where N first images are displayed on N of the M image areas. Subsequently, at least one of the N first images may be edited through the first input to obtain a target image array, and the target image array may then be sent through the second input. Therefore, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images in the target image array, which are edited in the first editing interface according to the target arrangement template, can be treated as a whole and selected and sent quickly and conveniently. The electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, without editing them through a dedicated third-party editing application, so the user does not need to open multiple applications and can control the electronic device to edit the N first images quickly and conveniently within the gallery application. In addition, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images one by one from the gallery application and edit and send them in real time. That is, through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to edit and share images quickly and conveniently.
Optionally, the image processing method provided by the embodiment of the present invention may further include S207 before S203, and accordingly S203 may be implemented through S203a:
S207, the electronic device displays K preset fill images on K image areas in the first editing interface.
Wherein, each image area in the K image areas displays a preset filling image, and the preset filling images displayed on different image areas are different.
Specifically, the electronic device may display K preset filler images on K image areas provided in the gallery application.
S203a, the electronic device edits at least one image of the N first images, and edits at least one image of the K preset filling images to obtain a target image array.
K is a positive integer, and the sum of K and N is M; that is, N is smaller than M.
By way of example, in connection with fig. 3, the K preset fill images described above may include image 1, image 3, image 7, and image 9, i.e., K equals 4.
Optionally, the preset filling image may be user-defined or preset.
For example, the preset fill image may be a blank image or an image including a preset pattern (e.g., a star pattern).
Optionally, the first editing interface may include a "fill" control, which is used to trigger the electronic device to fill the preset fill image in the target arrangement template. Specifically, the "filling" control is used for triggering the electronic device to display one or more different preset filling images in the target editing interface, so that a user can select preset filling images to be filled into the target arrangement template from the one or more different preset filling images.
It can be appreciated that, when the user triggers the electronic device to arrange images in the target arrangement template, the user may want the arranged images to have a certain visual interest, for example blank images at the four corners of the nine-grid so that the images in the nine-grid present a cross shape. In addition, when the electronic device arranges images in the target arrangement template, the number of images custom-selected by the user may be too small to fill all the image areas in the target arrangement template, so the user needs the electronic device to fill preset fill images into the idle image areas of the target arrangement template that are not yet filled with images, for example to fill a preset fill image into the last idle image area in the target arrangement template.
In the embodiment of the invention, the electronic equipment can provide the preset filling image aiming at the target arrangement template, so that a user does not need to find a specific image (such as a blank image) in a large number of images in the gallery application and then fill the specific image into the image area in the target arrangement template, and the user can quickly and conveniently trigger the electronic equipment to fill the preset filling image into the target arrangement template.
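The fill behaviour described above can be sketched, under the assumption that the M cells are tracked as a simple slot list in display order (the names below are hypothetical and not from the patent), as follows:

```kotlin
// Hypothetical sketch: fill every idle area of an M-cell template with a preset fill
// image so that the N user-selected images plus the K fill images total M.
// Slots are indexed 0..M-1 in display order (left to right, top to bottom).
fun <T> fillIdleAreas(slots: MutableList<T?>, presetFill: T): Int {
    var filled = 0
    for (i in slots.indices) {
        if (slots[i] == null) {   // an idle image area with no image yet
            slots[i] = presetFill
            filled++
        }
    }
    return filled // K, the number of preset fill images that were added
}
```

For the cross-shaped nine-grid example, only the four corner slots would be left idle before this step, so the blank fill image lands exactly at those corners.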
Optionally, the user may trigger the electronic device to edit images according to an arrangement template through an input on a single image or on multiple images in the gallery application. For example, the image processing method provided by the embodiment of the present invention further includes S208 to S211 before S201:
s208, the electronic device receives a third input under the condition that P second images are displayed.
Wherein, each image area in the P image areas displays a second image, the second images displayed in different image areas are different, and P is a positive integer less than or equal to N.
In the embodiment of the invention, the electronic device can display P second images in the gallery application.
Similarly, for the description of the input manner of the third input, reference may be made to the description related to the input manner of the first input in the above embodiment, which is not repeated in the embodiments of the present invention.
Illustratively, P is equal to 1. In the case where the electronic device displays an image in the gallery application, i.e., displays an image preview interface, the user may make a long press input (i.e., a third input) on one image in the gallery application.
Optionally, when the user performs an input on one or more images in the gallery application, the electronic device may display a "delete" control and an "add multiple images to template" control on the screen. The "delete" control is used to trigger the electronic device to delete the selected images. The "add multiple images to template" control is used to trigger the electronic device to display the target editing interface, so as to trigger the electronic device to edit the selected images according to a certain arrangement template.
S209, responding to a third input, the electronic device displays at least one arrangement template identifier, wherein one arrangement template identifier is used for indicating one image arrangement template, and one image arrangement template comprises a plurality of image areas.
Specifically, the electronic device may provide multiple arrangement templates, such as the above three-grid, six-grid, and nine-grid templates, and the at least one arrangement template identifier may be identifiers of the three-grid, six-grid, and nine-grid templates, respectively.
S210, the electronic device receives a fourth input of a target arrangement template identifier in the at least one arrangement template identifier.
Wherein the target arrangement template identification is used for indicating the target arrangement template.
Similarly, for the description of the input manner of the fourth input, reference may be made to the description related to the input manner of the first input in the above embodiment, which is not repeated in the embodiments of the present invention.
Illustratively, the fourth input is an input by the user on the nine-grid template identifier.
S211, in response to the fourth input, the electronic device displays a second editing interface, where the second editing interface includes M image areas indicated by the target arrangement template, and P second images are displayed on P image areas in the second editing interface.
Specifically, the second editing interface and the first editing interface are the same editing interface displayed by the electronic device at different moments, that is, the second editing interface is an editing interface in the gallery application.
The P second images are images in the N first images, and P is a positive integer smaller than or equal to N.
Optionally, in the embodiment of the present invention, an add mark may be displayed on an image area that is idle on the target arrangement template in the second editing interface, where the add mark is used to indicate that an image is not displayed on the image area and trigger the electronic device to fill the image for the image area.
For example, as shown in (a) of fig. 4, after the user performs an input (denoted as input 3) on image 1 in the image preview interface of the gallery application displayed by the electronic device, the electronic device may display a "delete" control S3 and an "add multiple images to template" control S4 on the image preview interface, as shown in (b) of fig. 4. After the user inputs the "add multiple images to template" control S4 (denoted as input 4), the electronic device may display a template selection interface including a "three-grid" control S5, a "six-grid" control S6, a "nine-grid" control S7, and an expansion control S8, as shown in (c) of fig. 4. The "three-grid" control S5, the "six-grid" control S6, and the "nine-grid" control S7 are used to indicate a three-grid arrangement template, a six-grid arrangement template, and a nine-grid arrangement template, respectively. In addition, the expansion control S8 is used to trigger the electronic device to select other arrangement templates provided by the electronic device, and to save arrangement templates customized by the user. In this case, the third input may include input 3 and input 4.
Subsequently, after the user inputs the "nine-grid" control S7 shown in (c) of fig. 4, the editing interface displayed by the electronic device, as shown in (d) of fig. 4, includes image areas P1 to P9, where image 1 is displayed on image area P1 and no images are displayed on image areas P2 to P9. In this case, image area P1 is the P image areas and image 1 is the P second images, that is, P equals 1.
In the embodiment of the invention, the electronic equipment can provide at least one arrangement template identifier, so that a user inputs the target arrangement template identifier in the at least one arrangement template identifier, and triggers the electronic equipment to start editing N first images according to the target arrangement template in the second editing interface. In addition, the electronic equipment can provide a plurality of arrangement template identifiers, so that the electronic equipment can edit images selected by a user in different arrangement templates, and the diversity of arrangement and image editing of the electronic equipment is improved.
It can be appreciated that, after the user triggers the electronic device to start displaying images on the image areas in the target arrangement template, that is, filling images into those image areas, the user may also trigger the electronic device to add images to the idle image areas in the target arrangement template, so as to select the N first images. Optionally, the image processing method provided by the embodiment of the present invention may further include S212 after S211 and before S201, and accordingly S201 may be implemented through S201a:
S212, the electronic device receives Q fifth inputs.
Similarly, for the description of the input manner of each fifth input, reference may be made to the description related to the input manner of the first input in the above embodiment, which is not repeated in the embodiments of the present invention.
Illustratively, each of the Q fifth inputs is used to trigger the electronic device to fill a user-defined image into an idle image area in the second editing interface.
Optionally, during one fifth input, after the user performs an input on an idle image area in the second editing interface (e.g., an input on the plus sign in the idle image area) (denoted as input 5), the electronic device may display an image selection list. Subsequently, the user performs an input of selecting an image from the image selection list (denoted as input 6), so that the electronic device determines that image as the image to be filled into the idle image area. Specifically, the image selection list may be a list that displays images in the gallery application in the usual manner, for example an image selection list that displays a set of the most recently captured images in the gallery application. In this case, one fifth input may include the two inputs, input 5 and input 6.
S201a, the electronic equipment responds to Q fifth inputs, a first editing interface is displayed, and Q third images are displayed on Q image areas in the first editing interface.
The Q image areas are image areas other than the P image areas among the N image areas, and the Q third images are images other than the P second images among the N first images.
Specifically, one third image is displayed on each of the Q image areas, and the third images displayed on different image areas are different.
For example, in connection with (d) in fig. 4, after the user inputs the plus sign flag in the image area P2 shown in (a) in fig. 5 (i.e., the above-described input 5), the electronic device may display an image selection list including the images 2 to 9 as shown in (b) in fig. 5. Subsequently, after the user inputs the image 2 shown in (b) of fig. 5 (i.e., the above-described input 6), the electronic apparatus may display the image 2 on the image area P2 in the editing interface and display no image on the image areas P3 to P9 as shown in (c) of fig. 5.
Similarly, for the description of how the electronic device fills an image into each of the image areas P3 to P9 shown in (c) of fig. 5, reference may be made to the example of fig. 5, which is not repeated in the embodiment of the present invention. For example, after the user triggers the electronic device to fill images 3 to 9 into the image areas P3 to P9 shown in (c) of fig. 5, respectively, the electronic device may display the editing interface shown in (a) of fig. 3.
In the embodiment of the invention, after the user triggers the electronic equipment to start displaying the image on the image area indicated by the target arrangement template, the user can trigger the electronic equipment to add the image to the idle image area indicated by the target arrangement template, so that the flexibility of the electronic equipment to select and arrange the image according to the target arrangement template is improved.
In the embodiments of the present invention, when the electronic device displays images in the target arrangement template, a list of images in the gallery application can also be displayed, so that the user can view the images in the gallery application in real time, for example making it convenient for the user to add images from the gallery application to the image areas in the target arrangement template. Optionally, the target editing interface further includes a target image list. The target image list includes the Q third images, and one fifth input is an input of dragging one third image from the target image list to one image area.
The target image list may be a list of thumbnails of images in the gallery application. The user may trigger the electronic device to update the thumbnails displayed in the target image list, for example to display thumbnails of the most recently captured images, by sliding the target image list left and right.
Alternatively, the target image list may be located below the image areas in the target arrangement template, such as below the nine-grid template.
Illustratively, in connection with (d) of fig. 4, as shown in fig. 6, the electronic device displays a target image list 61 below the image areas P1 to P9 in the editing interface, the target image list 61 including a thumbnail of image 2. Subsequently, the user performs an input of dragging the thumbnail of image 2 from the target image list 61 to the image area P2 (i.e., one fifth input), triggering the electronic device to display the editing interface shown in (c) of fig. 5, that is, to display image 2 on the image area P2.
In the embodiments of the present invention, the electronic device can display the target image list in an editing interface (such as the first editing interface or the second editing interface), so that while the electronic device arranges images in the target arrangement template, the user can conveniently and intuitively view the images in the gallery application through the target image list. Moreover, by dragging an image thumbnail from the target image list to an image position in the target arrangement template, the user can quickly and conveniently trigger the electronic device to fill the image into the target arrangement template, which improves the speed and convenience with which the electronic device selects and arranges images.
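As a hedged sketch of the drag interaction just described (the view types, callback, and clip label are hypothetical; only the standard Android drag-and-drop APIs are assumed), dragging a thumbnail out of the target image list onto an image area could be wired up as follows:

```kotlin
import android.content.ClipData
import android.view.DragEvent
import android.view.View
import android.widget.ImageView

// Hypothetical sketch: drag a thumbnail out of the target image list and drop it
// onto an idle image area, which then displays the dragged image.
fun startThumbnailDrag(thumbnail: ImageView, imageUriString: String) {
    val clip = ClipData.newPlainText("imageUri", imageUriString)
    thumbnail.startDragAndDrop(clip, View.DragShadowBuilder(thumbnail), null, 0)
}

fun acceptDropsIntoArea(imageArea: ImageView, onImageDropped: (String) -> Unit) {
    imageArea.setOnDragListener { _, event ->
        when (event.action) {
            DragEvent.ACTION_DROP -> {
                val uri = event.clipData.getItemAt(0).text.toString()
                onImageDropped(uri) // fill this image area with the dropped image
                true
            }
            else -> true // accept the drag during all other drag events
        }
    }
}
```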
In the embodiments of the present invention, in a scenario where the electronic device arranges images in the target arrangement template, the user may need to set the images filled into the target arrangement template, for example setting the size and the hue of the images. Optionally, S213 may be further included before S203a, and accordingly S203a may be implemented through S203c:
S213, the electronic device receives a sixth input.
Similarly, for the description of the input manner of the sixth input, reference may be made to the description related to the input manner of the first input in the above embodiment, which is not repeated here.
S203c, responding to a sixth input, the electronic device edits at least one image of the N first images by executing a target operation, and edits at least one image of the K preset filling images to obtain a target image array.
Wherein the target operation includes any one of the following operations 1 to 4:
Operation 1: combining the images of different image areas in the target arrangement template.
When the target operation is operation 1, the sixth input may be a user input on the images of different image areas, for example a continuous sliding input across the images on the different image areas.
Illustratively, in connection with fig. 3, after the user makes a sixth input to images 3 and 6 shown in fig. 3 (a), the electronic device may combine images 3 and 6 into image 10 as shown in fig. 7.
In this way, the diversity of the images that the electronic device arranges in the target arrangement template can be improved, improving the user experience.
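A rough sketch of such a combination (assuming the two cell images are available as bitmaps; the function name is hypothetical and this is not the patent's implementation) could merge two vertically adjacent cells into a single image spanning both areas:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Hypothetical sketch of operation 1: merge the images of two vertically adjacent
// image areas into a single image that spans both areas.
fun combineVertically(top: Bitmap, bottom: Bitmap): Bitmap {
    val width = maxOf(top.width, bottom.width)
    val combined = Bitmap.createBitmap(width, top.height + bottom.height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(combined)
    canvas.drawBitmap(top, 0f, 0f, null)
    canvas.drawBitmap(bottom, 0f, top.height.toFloat(), null)
    return combined
}
```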
Operation 2: splitting the image of the target image area in the target arrangement template.
When the target operation is operation 2, the sixth input may be a user input on the image of the target image area, for example a long-press input on the image displayed on one image area.
Optionally, the press duration of the sixth input corresponds to the number of images into which the image on the image area is divided. For example, when the press duration of the sixth input is within a certain range (such as between 1 second and 1.5 seconds), the sixth input is used to trigger the electronic device to divide the image on one image area into two images; when the press duration of the sixth input is within another range (such as between 1.5 seconds and 2.5 seconds), the sixth input is used to trigger the electronic device to divide the image on one image area into three images.
For example, in connection with fig. 3, in the case where the user performs a press input on image 3 shown in (a) of fig. 3 and the press duration is 1.2 seconds, the electronic device may divide image 3 into two images arranged one above the other.
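The duration-to-split mapping and the split itself could be sketched as follows (the concrete duration ranges mirror the example above; the function names are hypothetical and not from the patent):

```kotlin
import android.graphics.Bitmap

// Hypothetical sketch of operation 2: map the long-press duration to a piece count,
// then split the image of one area into vertically stacked pieces.
fun splitCountForPress(durationSeconds: Double): Int = when {
    durationSeconds in 1.0..1.5 -> 2
    durationSeconds > 1.5 && durationSeconds <= 2.5 -> 3
    else -> 1 // outside the configured ranges: leave the image unsplit
}

fun splitVertically(source: Bitmap, pieces: Int): List<Bitmap> {
    // Integer division may drop a few remainder rows at the bottom of the source.
    val pieceHeight = source.height / pieces
    return (0 until pieces).map { i ->
        Bitmap.createBitmap(source, 0, i * pieceHeight, source.width, pieceHeight)
    }
}
```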
Operation 3: adjusting the image area in which an image in the target arrangement template is located.
Wherein, when the target operation is operation 3, the sixth input may be an input that the user drags the image over a different image area.
Illustratively, in connection with fig. 3, the user may drag the image 3 shown in fig. 3 (a) from the image area P3 over the image area P6 to trigger the electronic device to display the image 6 on the image area P3 and display the image 3 on the image area P6.
Operation 4: performing the operation indicated by a target function, where the target function is used to set some or all of the images in the target arrangement template.
Optionally, in the embodiment of the present invention, the electronic device may display one or more function controls on the editing interface (such as the first editing interface and the second editing interface), so as to support the user to trigger the electronic device to execute the corresponding target function through each function control. Wherein one of the functionality controls corresponds to one of the target functionalities.
In addition, the electronic device may display the one or more functionality controls in a one-level menu or display the one or more functionality controls in a multi-level menu. For example, the electronic device may display an expansion control in the upper right corner of the editing interface in the gallery application, the next level of expansion control being one or more functionality controls, i.e., the expansion control is used to expand the one or more functionality controls.
Optionally, in the embodiments of the present invention, the target function is any one of the following: cropping an image, rotating an image, adjusting image hue, sharpening an image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling an image into an idle image area among the M image areas indicated by the target arrangement template.
By way of example, the image filter may be a style filter such as a portrait filter, a landscape filter, or a food filter. In addition, the image filter may further include image special effects, such as displaying a specific pattern such as a rainbow, a heart, or an apple in the image display area.
Optionally, when some of the images in the target arrangement template are selected by the user, the electronic device may perform the operation indicated by the target function on those images, such as adding a filter to them. When all of the images in the target arrangement template are selected by the user, the electronic device may perform the operation indicated by the target function on all of the images, such as adding a filter to all of the images as a whole, for example adding a filter containing a rainbow pattern across all of the images as a whole.
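To illustrate applying one effect across all the images as a whole, here is a sketch under the assumption that the nine cell bitmaps are already scaled to the same square size; the saturation-based color filter stands in for whatever filter the user picks and is not the patent's rainbow filter:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint

// Hypothetical sketch: compose the cells of a 3x3 template into one bitmap and draw
// them with a single color filter, so one effect spans all nine images as a whole.
fun composeAndFilter(cells: List<Bitmap>, cellSize: Int, saturation: Float): Bitmap {
    require(cells.size == 9) { "expects one bitmap per cell of the nine-grid" }
    val grid = Bitmap.createBitmap(cellSize * 3, cellSize * 3, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(grid)
    val paint = Paint().apply {
        colorFilter = ColorMatrixColorFilter(ColorMatrix().apply { setSaturation(saturation) })
    }
    cells.forEachIndexed { index, cell ->
        val x = (index % 3) * cellSize.toFloat()
        val y = (index / 3) * cellSize.toFloat()
        canvas.drawBitmap(cell, x, y, paint) // each cell assumed pre-scaled to cellSize
    }
    return grid
}
```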
For example, referring to fig. 3, as shown in fig. 8, the editing interface displayed by the electronic device includes an editing control K1, a filter control K2, a fill control K3, a sharing control K4, and an expansion control K5. The editing control K1 is used to provide functions of cropping images, rotating images, adjusting image hue, sharpening images, adjusting image color temperature, adjusting image saturation, adjusting image brightness, and adjusting image contrast. The filter control K2 is used to provide a filter-adding function. The fill control K3 is used to trigger the electronic device to fill a preset fill image into an idle image area in the target arrangement template. The sharing control K4 is used to trigger the electronic device to send the target image array through the communication application. The expansion control K5 is used to provide other image-setting functions, such as a function of switching the current arrangement template.
Illustratively, after the user inputs the filter control K2 shown in fig. 8, the electronic device may display one or more filter template identifiers below the target arrangement template, such as a rainbow filter template identifier 91, and display a filter expansion control 92 (for displaying more filters), as shown in (a) of fig. 9. Subsequently, after the user inputs the rainbow filter template identifier 91, the electronic device may display, as shown in (b) of fig. 9, a rainbow filter added to images 1 to 9 as a whole. This improves the convenience of setting the same filter for multiple images on the electronic device.
Illustratively, after the user inputs the fill control K3 shown in fig. 8, as shown in (a) of fig. 10, the electronic device may display one or more fill effect identifiers below the target arrangement template, such as a fill effect identifier 10a (indicating a fill effect in which preset fill images with a star pattern are added at the four corners of the nine-grid), and also display a fill effect expansion control 10b (for displaying more fill effects). Subsequently, after the user inputs the fill effect identifier 10a, as shown in (b) of fig. 10, the electronic device may display preset fill images with a star pattern in image areas P1, P3, P7, and P9, respectively, and display images 2, 4, 5, 6, and 8 in image area P2, image areas P4 to P6, and image area P8, respectively.
In this way, the electronic device can provide various ways of editing the images in the target arrangement template, for example triggering different operations on the images through function controls with different functions, which facilitates triggering the electronic device to edit the images according to the target arrangement template.
Optionally, the image processing method provided by the embodiment of the present invention may further include S214 after S203 described above:
S214, the electronic device saves at least one of the following: the target image array, a target filter of the target image array.
The target filter comprises at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value and a contrast value of the target image array.
In this way, the electronic device can save the target image array and the target filter of the target image array, so that the user can conveniently retrieve the saved target image array and target filter later, and view or reuse them.
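Purely as an illustration, the target filter described in S214 could be represented and saved as sketched below; the dataclass fields and the JSON serialization are assumptions, not part of the claimed method.

```python
# A minimal sketch, assuming Python's dataclasses and JSON for persistence.
from dataclasses import dataclass, asdict
import json

@dataclass
class TargetFilter:
    hue: float = 0.0                # hue value of the target image array
    sharpen: float = 0.0            # sharpening value
    color_temperature: float = 0.0  # color temperature value
    saturation: float = 1.0         # saturation value
    brightness: float = 1.0         # brightness value
    contrast: float = 1.0           # contrast value

def save_target_filter(path, target_filter):
    """Persist the target filter so it can be viewed or reused later."""
    with open(path, "w") as f:
        json.dump(asdict(target_filter), f)
```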
Fig. 11 is a schematic diagram of a possible structure of an electronic device according to an embodiment of the present invention. As shown in fig. 11, the electronic device 11 includes: a display module 11a, a receiving module 11b, an editing module 11c, and a transmitting module 11d. The display module 11a is configured to display a first editing interface, where the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface; the receiving module 11b is configured to receive a first input; the editing module 11c is configured to edit at least one image of the N first images displayed by the display module 11a in response to the first input received by the receiving module 11b, to obtain a target image array; the receiving module 11b is further configured to receive a second input; the transmitting module 11d is configured to transmit the target image array obtained by the editing module 11c in response to the second input received by the receiving module 11b; wherein N is a positive integer less than or equal to M.
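The cooperation of the four modules can be pictured with the simplified sketch below, which assumes plain Python objects; the class, method, and parameter names are hypothetical stand-ins for modules 11a to 11d rather than an actual implementation.

```python
# A simplified sketch of how the modules in Fig. 11 could cooperate; the
# injected display, receiver, editor and sender objects are hypothetical.
class ElectronicDevice:
    def __init__(self, display, receiver, editor, sender):
        self.display = display    # module 11a: shows the editing interfaces
        self.receiver = receiver  # module 11b: receives user inputs
        self.editor = editor      # module 11c: edits images into the target array
        self.sender = sender      # module 11d: sends the target image array

    def edit_and_send(self, target_template, first_images):
        # Display the first editing interface with the N first images laid out
        # on N of the M image areas indicated by the target arrangement template.
        self.display.show_first_editing_interface(target_template, first_images)
        first_input = self.receiver.receive()               # first input
        target_array = self.editor.edit(first_images, first_input)
        second_input = self.receiver.receive()              # second input
        if second_input is not None:
            self.sender.send(target_array)                  # send the array
        return target_array
```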
Optionally, the display module 11a is further configured to display the target image array on the transmission interface according to the target arrangement template after the transmission module 11d transmits the target image array; the sending interface is used for displaying images to be sent or sent by the electronic equipment.
Optionally, the display module 11a is further configured to display K preset filling images on K image areas in the first editing interface before the editing module 11c edits at least one image of the N first images to obtain the target image array; the editing module 11c is specifically configured to edit at least one image of the N first images, and edit at least one image of the K preset filling images, so as to obtain a target image array; wherein K is a positive integer, and the sum of K and N is M.
Optionally, the receiving module 11b is further configured to receive a third input in a case where P second images are displayed, before the display module 11a displays the first editing interface; the display module 11a is further configured to display at least one arrangement template identifier in response to the third input received by the receiving module 11b, where one arrangement template identifier is used to indicate one image arrangement template, and one image arrangement template is used to indicate a plurality of image areas; the receiving module 11b is further configured to receive a fourth input of a target arrangement template identifier among the at least one arrangement template identifier; the display module 11a is further configured to display a second editing interface in response to the fourth input received by the receiving module 11b, where the second editing interface includes the M image areas indicated by the target arrangement template, and the P second images are displayed on P image areas in the second editing interface; the target arrangement template identifier is used for indicating the target arrangement template, the P second images are images among the N first images, and P is a positive integer less than or equal to N.
Optionally, P is less than N; the receiving module 11b is further configured to receive Q fifth inputs after the display module 11a displays the second editing interface and before the display module 11a displays the first editing interface; the display module 11a is specifically configured to display the first editing interface in response to the Q fifth inputs, where Q third images are displayed on Q image areas in the first editing interface; the Q image areas are image areas other than the P image areas among the N image areas, and the Q third images are images other than the P second images among the N first images.
Optionally, the receiving module 11b is further configured to receive a sixth input before the editing module 11c edits at least one image of the N first images and edits at least one image of the K preset filling images to obtain the target image array; the editing module 11c is specifically configured to, in response to the sixth input received by the receiving module 11b, edit at least one image of the N first images and edit at least one image of the K preset filling images by performing a target operation, to obtain the target image array; the target operation includes any one of the following: merging images of different image areas among the M image areas, splitting an image of a target image area among the M image areas, adjusting the image area in which an image among the M image areas is located, and performing an operation indicated by the target function; the target function is used for editing an image in some or all of the M image areas, and the target image area is any one of the M image areas.
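As a rough illustration of the "merge" and "split" target operations, the sketch below assumes each image area is described by a rectangle (left, top, right, bottom) in template coordinates; the helper functions are illustrative only and are not the claimed implementation.

```python
# A minimal sketch of merging and splitting image areas, assuming each image
# area is a rectangle (left, top, right, bottom) in template coordinates.
def merge_areas(area_a, area_b):
    """Merge two image areas into a single area covering both rectangles."""
    return (min(area_a[0], area_b[0]), min(area_a[1], area_b[1]),
            max(area_a[2], area_b[2]), max(area_a[3], area_b[3]))

def split_area(area, columns=2):
    """Split a target image area into equal-width sub-areas."""
    left, top, right, bottom = area
    width = (right - left) / columns
    return [(left + i * width, top, left + (i + 1) * width, bottom)
            for i in range(columns)]

# Example: merge the two top-left areas of a 3x3 template, then split one area.
merged = merge_areas((0, 0, 1, 1), (1, 0, 2, 1))   # -> (0, 0, 2, 1)
halves = split_area((0, 0, 3, 1), columns=2)       # -> two equal-width sub-areas
```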
Optionally, the target function is any one of the following: cropping an image, rotating an image, adjusting image hue, sharpening an image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling an image into an idle image area among the M image areas.
Optionally, the electronic device 11 further includes a storage module; the storage module is configured to save at least one of the following after the editing module 11c edits at least one image of the N first images and edits at least one image of the K preset filling images to obtain the target image array: the target image array, a target filter of the target image array; the target filter includes at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value, and a contrast value of the target image array.
The electronic device 11 provided in the embodiment of the present invention can implement each process implemented by the electronic device in the above embodiment of the method, and in order to avoid repetition, a description is omitted here.
The electronic device provided by the embodiment of the present invention can display the first editing interface including the M image areas indicated by the target arrangement template, where N first images are displayed on N image areas among the M image areas. Subsequently, at least one image of the N first images may be edited through the first input to obtain the target image array. Further, the target image array may be sent through the second input. Therefore, in a case where the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images edited in the first editing interface according to the target arrangement template can be treated as a whole, so that the images in the target image array can be selected and sent quickly and conveniently. In addition, the electronic device can edit the N first images according to the target arrangement template in the first editing interface of a gallery application, without editing the N first images through a dedicated third-party editing application, so the user does not need to open multiple applications and can control the electronic device to edit the N first images quickly and conveniently within the gallery application. Moreover, since the electronic device can send the target image array from the gallery application through a communication application, there is no need to control the communication application to call images one by one from the gallery application for real-time editing and sending. That is, the user can trigger the electronic device to edit and share images quickly and conveniently through the integrated operation of the gallery application and the communication application.
Fig. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention. The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the electronic device structure shown in fig. 12 does not constitute a limitation on the electronic device, and the electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 110 is configured to control the display unit 106 to display a first editing interface, where the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface.
The processor 110 is further configured to control the user input unit 107 to receive a first input.
The processor 110 is further configured to edit at least one image of the N first images displayed by the display unit 106 in response to the first input received by the user input unit 107, to obtain a target image array.
The processor 110 is further configured to control the user input unit 107 to receive a second input.
The processor 110 is further configured to control the radio frequency unit 101 to transmit the target image array in response to the second input received by the user input unit 107; wherein N is a positive integer less than or equal to M.
The electronic device provided by the embodiment of the present invention can display the first editing interface including the M image areas indicated by the target arrangement template, where N first images are displayed on N image areas among the M image areas. Subsequently, at least one image of the N first images may be edited through the first input to obtain the target image array. Further, the target image array may be sent through the second input. Therefore, in a case where the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images edited in the first editing interface according to the target arrangement template can be treated as a whole, so that the images in the target image array can be selected and sent quickly and conveniently. In addition, the electronic device can edit the N first images according to the target arrangement template in the first editing interface of a gallery application, without editing the N first images through a dedicated third-party editing application, so the user does not need to open multiple applications and can control the electronic device to edit the N first images quickly and conveniently within the gallery application. Moreover, since the electronic device can send the target image array from the gallery application through a communication application, there is no need to control the communication application to call images one by one from the gallery application for real-time editing and sending. That is, the user can trigger the electronic device to edit and share images quickly and conveniently through the integrated operation of the gallery application and the communication application.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be configured to receive and send signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and delivers the downlink data to the processor 110 for processing, and it sends uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 102, such as helping the user to send and receive e-mail, browse web pages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the electronic device 100. The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used for receiving an audio or video signal. The input unit 104 may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and then output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for recognizing the posture of the electronic device (such as switching between landscape and portrait modes, related games, magnetometer posture calibration) and for vibration recognition related functions (such as a pedometer and tapping); the sensor 105 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (e.g., operations performed by the user on or near the touch panel 1071 using any suitable object or accessory such as a finger or a stylus). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. Further, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 110 to determine the type of touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of touch event. Although in fig. 12, the touch panel 1071 and the display panel 1061 are two independent components for implementing the input and output functions of the electronic device, in some embodiments, the touch panel 1071 may be integrated with the display panel 1061 to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 108 is an interface to which an external device is connected to the electronic apparatus 100. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and an external device.
Memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area that may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and a storage data area; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 109, and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically coupled to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules, which are not shown, and will not be described herein.
Preferably, the embodiment of the present invention further provides an electronic device, including a processor 110, a memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110, where the computer program when executed by the processor 110 implements each process of the foregoing method embodiment, and the same technical effects are achieved, and for avoiding repetition, details are not repeated herein.
The embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the processes of the above method embodiment are implemented and the same technical effects can be achieved; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising several instructions for causing an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are to be protected by the present application.

Claims (18)

1. An image processing method applied to an electronic device, the method comprising:
displaying a first editing interface, wherein the first editing interface comprises M image areas indicated by a target arrangement template, and N first images are displayed on N image areas in the first editing interface;
receiving a first input;
editing at least one image in the N first images in response to the first input to obtain a target image array;
receiving a second input;
transmitting the target image array in response to the second input;
before the editing at least one image of the N first images to obtain the target image array, the method further includes:
displaying K preset filling images on K image areas in the first editing interface; the preset filling image is a blank image or an image comprising a preset pattern;
wherein N is a positive integer less than or equal to M; K is a positive integer, and the sum of K and N is M.
2. The method of claim 1, wherein after the transmitting the target image array, the method further comprises:
displaying the target image array on a transmission interface according to the target arrangement template;
the sending interface is used for displaying images to be sent or sent by the electronic equipment.
3. The method according to claim 1 or 2, wherein the editing at least one image of the N first images to obtain a target image array comprises:
and editing at least one image in the N first images, and editing at least one image in the K preset filling images to obtain the target image array.
4. The method of claim 1, wherein prior to the displaying the first editing interface, the method further comprises:
receiving a third input in case of displaying the P second images;
displaying at least one arrangement template identification in response to the third input, one arrangement template identification for indicating one image arrangement template and one image arrangement template for indicating a plurality of image areas;
receiving a fourth input of a target arrangement template identifier of the at least one arrangement template identifier;
in response to the fourth input, displaying a second editing interface, wherein the second editing interface comprises the M image areas indicated by the target arrangement template, and the P second images are displayed on P image areas in the second editing interface;
the target arrangement template identifier is used for indicating the target arrangement template, the P second images are images in the N first images, and P is a positive integer less than or equal to N.
5. The method of claim 4, wherein P is less than N;
after the displaying the second editing interface and before the displaying the first editing interface, the method further includes:
receiving Q fifth inputs;
the displaying a first editing interface includes:
in response to the Q fifth inputs, displaying the first editing interface with Q third images displayed on Q image areas in the first editing interface;
wherein the Q image areas are image areas other than the P image areas among the N image areas; the Q third images are images other than the P second images among the N first images.
6. The method according to claim 3, wherein before the editing at least one image of the N first images and editing at least one image of the K preset filling images to obtain the target image array, the method further comprises:
receiving a sixth input;
the editing at least one image of the N first images and editing at least one image of the K preset filling images to obtain the target image array comprises:
in response to the sixth input, editing at least one image of the N first images by performing a target operation, and editing at least one image of the K preset filling images, to obtain the target image array;
wherein the target operation comprises any one of the following: merging images of different image areas among the M image areas, splitting an image of a target image area among the M image areas, adjusting an image area in which an image among the M image areas is located, and performing an operation indicated by the target function;
the target function is used for editing images in part or all of the M image areas, wherein the target image area is any one of the M image areas.
7. The method of claim 6, wherein the target function is any one of the following: cropping an image, rotating an image, adjusting image hue, sharpening an image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling an image into an idle image area among the M image areas.
8. The method of claim 7, wherein after the editing at least one image of the N first images and editing at least one image of the K preset filling images to obtain the target image array, the method further comprises:
saving at least one of the following: the target image array, a target filter of the target image array;
the target filter comprises at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value and a contrast value of the target image array.
9. An electronic device, the electronic device comprising: the device comprises a display module, a receiving module, an editing module and a sending module;
the display module is used for displaying a first editing interface, the first editing interface comprises M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface;
The receiving module is used for receiving a first input;
the editing module is used for responding to the first input received by the receiving module and editing at least one image in the N first images displayed by the display module to obtain a target image array;
the receiving module is also used for receiving a second input;
the sending module is used for responding to the second input received by the receiving module and sending the target image array obtained by the editing module;
the display module is further configured to display K preset filling images on K image areas in the first editing interface before the editing module edits at least one image of the N first images to obtain the target image array; the preset filling image is a blank image or an image comprising a preset pattern;
wherein N is a positive integer less than or equal to M; k is a positive integer, and the sum of K and N is M.
10. The electronic device of claim 9, wherein the display module is further configured to display the target image array at a transmission interface according to the target arrangement template after the transmission module transmits the target image array;
The sending interface is used for displaying images to be sent or sent by the electronic equipment.
11. The electronic device according to claim 9 or 10, wherein the editing module is specifically configured to edit at least one image of the N first images and edit at least one image of the K preset filler images, so as to obtain the target image array.
12. The electronic device of claim 9, wherein the receiving module is further configured to receive a third input in the case of displaying P second images before the display module displays the first editing interface;
the display module is further configured to display at least one arrangement template identifier in response to the third input received by the receiving module, where one arrangement template identifier is used to indicate one image arrangement template, and one image arrangement template is used to indicate multiple image areas;
the receiving module is further configured to receive a fourth input of a target arrangement template identifier in the at least one arrangement template identifier;
the display module is further configured to display a second editing interface in response to the fourth input received by the receiving module, wherein the second editing interface comprises the M image areas indicated by the target arrangement template, and P second images are displayed on P image areas in the second editing interface;
The target arrangement template identifier is used for indicating the target arrangement template, the P second images are images in the N first images, and P is a positive integer less than or equal to N.
13. The electronic device of claim 12, wherein P is less than N;
the receiving module is further configured to receive Q fifth inputs after the display module displays the second editing interface and before the display module displays the first editing interface;
the display module is specifically configured to respond to the Q fifth inputs, display the first editing interface, and display Q third images on Q image areas in the first editing interface;
wherein the Q image areas are image areas other than the P image areas among the N image areas; the Q third images are images other than the P second images among the N first images.
14. The electronic device of claim 11, wherein
the receiving module is further configured to receive a sixth input before the editing module edits at least one image of the N first images and edits at least one image of the K preset filling images to obtain the target image array;
The editing module is specifically configured to respond to the sixth input received by the receiving module, edit at least one image of the N first images by executing a target operation, and edit at least one image of the K preset filling images, so as to obtain the target image array;
wherein the target operation comprises any one of the following: merging images of different image areas among the M image areas, splitting an image of a target image area among the M image areas, adjusting an image area in which an image among the M image areas is located, and performing an operation indicated by the target function;
the target function is used for editing images in part or all of the M image areas, wherein the target image area is any one of the M image areas.
15. The electronic device of claim 14, wherein the target function is any one of the following: cropping an image, rotating an image, adjusting image hue, sharpening an image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling an image into an idle image area among the M image areas.
16. The electronic device of claim 15, wherein the electronic device further comprises: a storage module;
the storage module is configured to save at least one of the following after the editing module edits at least one image of the N first images and edits at least one image of the K preset filling images to obtain the target image array: the target image array, a target filter of the target image array;
the target filter comprises at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value and a contrast value of the target image array.
17. An electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps of the image processing method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 8.
CN201911319652.0A 2019-12-19 2019-12-19 Image processing method and electronic equipment Active CN111127595B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911319652.0A CN111127595B (en) 2019-12-19 2019-12-19 Image processing method and electronic equipment
PCT/CN2020/136731 WO2021121253A1 (en) 2019-12-19 2020-12-16 Image processing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911319652.0A CN111127595B (en) 2019-12-19 2019-12-19 Image processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN111127595A CN111127595A (en) 2020-05-08
CN111127595B true CN111127595B (en) 2023-11-03

Family

ID=70500252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911319652.0A Active CN111127595B (en) 2019-12-19 2019-12-19 Image processing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN111127595B (en)
WO (1) WO2021121253A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111127595B (en) * 2019-12-19 2023-11-03 维沃移动通信有限公司 Image processing method and electronic equipment
CN111866379A (en) * 2020-07-03 2020-10-30 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN112312022B (en) * 2020-10-30 2022-04-15 维沃移动通信有限公司 Image processing method, image processing apparatus, electronic device, and storage medium
CN113888549A (en) * 2021-09-29 2022-01-04 乐美科技股份私人有限公司 Picture generation method and device, electronic equipment and storage medium
CN114500844A (en) * 2022-01-28 2022-05-13 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN114430460A (en) * 2022-01-28 2022-05-03 维沃移动通信有限公司 Shooting method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881844A (en) * 2015-06-29 2015-09-02 北京金山安全软件有限公司 Picture combination method and device and terminal equipment
CN105955607A (en) * 2016-04-22 2016-09-21 北京小米移动软件有限公司 Content sharing method and apparatus
CN106407365A (en) * 2016-09-08 2017-02-15 北京小米移动软件有限公司 Picture sharing method and apparatus
CN110084871A (en) * 2019-05-06 2019-08-02 珠海格力电器股份有限公司 Image typesetting process and device, electric terminal
CN110147190A (en) * 2018-06-29 2019-08-20 腾讯科技(深圳)有限公司 Image processing method and electric terminal
CN110490808A (en) * 2019-08-27 2019-11-22 腾讯科技(深圳)有限公司 Picture joining method, device, terminal and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120082401A1 (en) * 2010-05-13 2012-04-05 Kelly Berger System and method for automatic discovering and creating photo stories
CN103337086B (en) * 2013-06-17 2015-11-25 北京金山安全软件有限公司 picture editing method and device for mobile terminal
CN104168417B (en) * 2014-05-20 2019-09-13 腾讯科技(深圳)有限公司 Image processing method and device
CN105320695B (en) * 2014-07-31 2020-01-17 腾讯科技(深圳)有限公司 Picture processing method and device
US20180095653A1 (en) * 2015-08-14 2018-04-05 Martin Hasek Device, method and graphical user interface for handwritten interaction
CN111127595B (en) * 2019-12-19 2023-11-03 维沃移动通信有限公司 Image processing method and electronic equipment


Also Published As

Publication number Publication date
CN111127595A (en) 2020-05-08
WO2021121253A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
CN111127595B (en) Image processing method and electronic equipment
CN111061574B (en) Object sharing method and electronic device
CN110891144B (en) Image display method and electronic equipment
CN109525874B (en) Screen capturing method and terminal equipment
WO2019149028A1 (en) Application download method and terminal
CN110874147B (en) Display method and electronic equipment
US11658932B2 (en) Message sending method and terminal device
US20220179549A1 (en) Screen capturing method and terminal device
CN109859307B (en) Image processing method and terminal equipment
CN111010523B (en) Video recording method and electronic equipment
CN111159449B (en) Image display method and electronic equipment
US11165950B2 (en) Method and apparatus for shooting video, and storage medium
CN110865745A (en) Screen capturing method and terminal equipment
CN111064848B (en) Picture display method and electronic equipment
CN110908554B (en) Long screenshot method and terminal device
CN109710151B (en) File processing method and terminal equipment
CN108804628B (en) Picture display method and terminal
CN111383175A (en) Picture acquisition method and electronic equipment
CN110913261A (en) Multimedia file generation method and electronic equipment
CN111124231B (en) Picture generation method and electronic equipment
CN108881742B (en) Video generation method and terminal equipment
CN110750200A (en) Screenshot picture processing method and terminal equipment
CN109166164B (en) Expression picture generation method and terminal
CN110703972A (en) File control method and electronic equipment
CN111178306B (en) Display control method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant