CN111127595A - Image processing method and electronic device - Google Patents

Image processing method and electronic device

Info

Publication number: CN111127595A
Authority: CN (China)
Prior art keywords: image, images, target, input, editing
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201911319652.0A
Other languages: Chinese (zh)
Other versions: CN111127595B (en)
Inventor: 杨蕾
Current Assignee: Vivo Mobile Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Vivo Mobile Communication Co Ltd

Events:
- Application filed by Vivo Mobile Communication Co Ltd
- Priority to CN201911319652.0A
- Publication of CN111127595A
- Priority to PCT/CN2020/136731 (WO2021121253A1)
- Application granted
- Publication of CN111127595B
- Legal status: Active
- Anticipated expiration

Classifications

    • G06T11/60: Editing figures and text; combining figures or text (under G06T11/00, 2D [Two Dimensional] image generation)
    • G06F16/168: Details of user interfaces specifically adapted to file systems, e.g. browsing and visualisation, 2D or 3D GUIs (under G06F16/16, file or folder operations)
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI], based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

An embodiment of the invention provides an image processing method and an electronic device, applied to the field of communications, to solve the problem that the process of sharing images on an electronic device is cumbersome. The method includes: displaying a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N of the image areas; receiving a first input; editing at least one of the N first images in response to the first input to obtain a target image array; receiving a second input; and sending the target image array in response to the second input, where N is a positive integer less than or equal to M. The method is particularly applicable to sharing multiple images from the gallery application through a communication application.

Description

Image processing method and electronic device
Technical Field
Embodiments of the invention relate to the technical field of communications, and in particular to an image processing method and an electronic device.
Background
Social networks are now prevalent, image sharing has become part of people's daily lives, and attractive image layouts and image styles are pursued by more and more users.
When a user wants to retouch images, typeset them, and post them to a social platform, the user generally needs to control the electronic device to open a gallery application to select the required images, edit them through a dedicated retouching application, and then save the edited images. The user can then share the images, arranged in a layout such as a three-grid, six-grid, or nine-grid, through a communication application.
However, before the one or more images can be shared in the communication application, the user may need to adjust their arrangement order in the layout, which makes the process of sharing images on the electronic device cumbersome.
Disclosure of Invention
Embodiments of the invention provide an image processing method and an electronic device, to solve the problem that the process of sharing images on an electronic device is cumbersome.
To solve the above technical problem, the embodiments of the present invention are implemented as follows:
In a first aspect, an embodiment of the present invention provides an image processing method applied to an electronic device. The method includes: displaying a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N of the image areas; receiving a first input; editing at least one of the N first images in response to the first input to obtain a target image array; receiving a second input; and sending the target image array in response to the second input, where N is a positive integer less than or equal to M.
In a second aspect, an embodiment of the present invention further provides an electronic device, which includes a display module, a receiving module, an editing module, and a sending module. The display module is configured to display a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N of the image areas. The receiving module is configured to receive a first input. The editing module is configured to edit, in response to the first input received by the receiving module, at least one of the N first images displayed by the display module to obtain a target image array. The receiving module is further configured to receive a second input, and the sending module is configured to send, in response to the second input received by the receiving module, the target image array obtained by the editing module, where N is a positive integer less than or equal to M.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image processing method according to the first aspect.
In the embodiments of the present invention, a first editing interface including M image areas indicated by a target arrangement template may be displayed, where N first images are displayed on N of the M image areas. Subsequently, at least one of the N first images may be edited through the first input to obtain the target image array, and the target image array may then be sent through the second input. Therefore, when the user needs to share the images in the target image array, there is no need to select the images one by one, edit them in real time, and then send them; the images edited in the first editing interface according to the target arrangement template can be selected and sent quickly and conveniently as a whole. For example, the electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, without editing them through a dedicated third-party retouching application, so that the user can quickly and conveniently control the electronic device to edit the N first images in the gallery application without opening multiple applications. In addition, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to make the communication application call up images from the gallery application one by one to edit and send them in real time. That is, through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to edit and share images quickly and conveniently.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible Android operating system according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
Fig. 3 is a first schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 4 is a second schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 5 is a third schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 6 is a fourth schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 7 is a fifth schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 8 is a sixth schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 9 is a seventh schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 10 is an eighth schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of a possible electronic device according to an embodiment of the present invention;
Fig. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
It should be noted that "/" herein means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. "Plurality" means two or more.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or more advantageous than other embodiments or designs. Rather, these words are intended to present related concepts in a concrete fashion.
The terms "first" and "second" in the description and claims of the present invention are used to distinguish between different objects, not to describe a particular order of the objects. For example, the first input and the second input are used to distinguish different inputs, rather than to describe a particular order of inputs.
With the image processing method provided by the embodiments of the present invention, a first editing interface including M image areas indicated by the target arrangement template may be displayed, with N first images displayed on N of the M image areas. At least one of the N first images may then be edited through the first input to obtain the target image array, and the target image array may be sent through the second input. As a result, when the user wants to share the images in the target image array, there is no need to select, edit, and send the images one by one: the image array edited in the first editing interface according to the target arrangement template can be selected and sent as a whole, quickly and conveniently, through the integrated operation of the gallery application and the communication application.
The electronic device in the embodiment of the invention can be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile electronic device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
It should be noted that, in the image processing method provided by the embodiments of the present invention, the execution subject may be an electronic device, a Central Processing Unit (CPU) of the electronic device, or a control module in the electronic device for executing the image processing method. In the embodiments of the present invention, the image processing method is described by taking an electronic device executing the method as an example.
The electronic device in the embodiments of the present invention may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The following describes a software environment to which the image processing method provided by the embodiments of the present invention is applied, taking the Android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible Android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the Android operating system includes 4 layers: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, the Linux kernel layer).
The application layer includes the various applications (both system applications and third-party applications) in the Android operating system.
The application framework layer is the framework of applications, and developers can develop applications based on it while complying with its development principles: for example, system applications such as a system settings application, a system chat application, and a system camera application, as well as third-party applications such as a third-party settings application, a third-party camera application, and a third-party chat application.
The system runtime layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the resources required by the Android operating system, and the runtime environment provides its software environment.
The kernel layer is the operating system layer of the Android operating system and the lowest layer of the Android software stack. Based on the Linux kernel, it provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, a developer may develop, based on the system architecture shown in fig. 1, a software program implementing the image processing method provided by the embodiments of the present invention, so that the method can run on the Android operating system shown in fig. 1. That is, the processor or the electronic device can implement the image processing method by running this software program in the Android operating system.
The following describes the image processing method provided by the embodiments of the present invention in detail with reference to the flowchart shown in fig. 2. Although a logical order is shown in the flowchart, in some cases the steps may be performed in an order different from the one described here. For example, the image processing method shown in fig. 2 may include S201 to S205:
S201: The electronic device displays a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N of the image areas in the first editing interface.
Each of the N image areas displays one first image, and different image areas display different first images. Specifically, N is a positive integer less than or equal to M.
Optionally, in this embodiment of the present invention, the first editing interface may be an interface in a gallery application in the electronic device.
Optionally, the N image areas in the first editing interface are N of the M image areas.
It can be understood that, when the electronic device automatically fills the M image areas with images, it may fill the image areas at the front display positions first, and then fill those at the rear display positions.
It should be noted that filling an image into an image area means displaying the image on that area. In the embodiments of the present invention, different descriptions are used in different places to clearly illustrate the relationship between image areas and images.
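The front-to-back automatic filling behaviour described above can be sketched as follows. This is an illustrative sketch, not part of the patent; the function name `fill_areas` and its signature are hypothetical:

```python
def fill_areas(num_areas, images):
    """Fill image areas front to back: image areas at earlier display
    positions (lower indices) receive images first; any remaining
    areas stay empty (None) until filled later, e.g. by preset fillers."""
    areas = [None] * num_areas
    for i, img in enumerate(images[:num_areas]):
        areas[i] = img
    return areas

# Five user-selected images placed into a nine-area template:
print(fill_areas(9, ["img1", "img2", "img3", "img4", "img5"]))
```

Here `num_areas` plays the role of M and `len(images)` the role of N, with N less than or equal to M.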
For example, the target arrangement template may be a three-grid, a six-grid, a nine-grid, or the like; other templates may of course also be used, which is not specifically limited in the embodiments of the present invention.
It can be understood that the target arrangement template specifically indicates the display positions and display sizes of the M image areas.
For example, when the target arrangement template is a three-grid, it includes 3 image areas. In this case, the template indicates that the 3 image areas are arranged on the screen of the electronic device in one row and three columns, from left to right, and that the 3 image areas have the same display size.
For another example, when the target arrangement template is a six-grid, it includes 6 image areas, arranged on the screen in two rows and three columns, from left to right and top to bottom, all with the same display size.
As another example, when the target arrangement template is a nine-grid, it includes 9 image areas, arranged on the screen in three rows and three columns, from left to right and top to bottom, all with the same display size.
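The row-and-column geometry these templates indicate can be sketched as follows (illustrative Python; the template names and the `grid_positions` helper are hypothetical, not from the patent):

```python
def grid_positions(template):
    """Return the (row, col) display position of each image area,
    arranged left to right and top to bottom, for the named template."""
    layouts = {
        "three-grid": (1, 3),  # one row, three columns
        "six-grid": (2, 3),    # two rows, three columns
        "nine-grid": (3, 3),   # three rows, three columns
    }
    rows, cols = layouts[template]
    return [(r, c) for r in range(rows) for c in range(cols)]

print(grid_positions("nine-grid"))  # nine positions, three rows by three columns
```

In a real implementation each area would also carry a display size; since all areas in one template share the same size, only the positions are modelled here.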
In the following embodiments, the image processing method provided by the embodiments of the present invention is described by taking a nine-grid as the target arrangement template.
S202: The electronic device receives a first input.
Illustratively, the first input is used to trigger editing of the images on the M image areas indicated by the target arrangement template. The first input may be a user input on the M image areas.
It should be noted that the screen of the electronic device provided in the embodiments of the present invention may be a touch screen, which can receive an input from the user and, in response, display corresponding content to the user. The first input may be a touch-screen input, a fingerprint input, a gravity input, a key input, or the like. A touch-screen input is an input such as a press input, long-press input, slide input, click input, or hover input (an input by the user near the touch screen) on the touch screen of the electronic device. A fingerprint input is an input such as a slide, long press, single click, or double click on the fingerprint recognizer of the electronic device. A gravity input is an input such as shaking the electronic device in a specific direction or a specific number of times. A key input is a single-click input, double-click input, long-press input, combination-key input, or the like on a key of the electronic device, such as the power key, a volume key, or the Home key. The embodiments of the present invention do not specifically limit the manner of the first input, which may be any realizable manner. For example, the first input is the user's drag input on an image in the M image areas.
S203: In response to the first input, the electronic device edits at least one of the N first images to obtain a target image array.
Optionally, the electronic device may edit some or all of the N first images.
The images in the target image array are images arranged according to the target arrangement template.
Obviously, the target image array includes the N first images.
S204: The electronic device receives a second input.
Specifically, the second input may be an input by which the user triggers the communication application to send the target image array.
Optionally, the first editing interface may include a sharing control for triggering the electronic device to send the target image array through a communication application, and the second input may be an input on this sharing control.
Similarly, for the input mode of the second input, reference may be made to the description of the input mode of the first input above; details are not repeated here. For example, the second input is a click input on the sharing control in the first editing interface.
S205: In response to the second input, the electronic device sends the target image array.
Specifically, the electronic device may send the target image array through a communication application installed on it.
Optionally, after S205, the image processing method provided by the embodiments of the present invention may further include S206:
S206: The electronic device displays the target image array on a sending interface according to the target arrangement template.
The sending interface is used to display images to be sent or already sent by the electronic device.
In this way, the electronic device can visually present the sent images in the target image array to the user, for example the order in which the images are arranged according to the target arrangement template.
Optionally, in response to the second input, the electronic device may display a plurality of sharing identifiers on the screen, each indicating an application or plug-in through which the images can be shared. Specifically, after the user selects the sharing identifier indicating the communication application, the electronic device may display the sending interface of the communication application and automatically fill the target image array into it; the user then triggers the communication application to send the target image array.
Alternatively, the second input may include an input by which the user triggers the electronic device to select the communication application (e.g., an input triggering it to display the sending interface of the communication application) and an input triggering the electronic device to send the target image array through the communication application.
Sending the target image array through the communication application may mean that the electronic device sends it to a social platform, or sends it to a corresponding communication object (i.e., one or more contacts in the communication application).
Illustratively, as shown in fig. 3 (a), the editing interface of the gallery application displayed by the electronic device includes image areas P1 to P9, i.e., the target arrangement template is a nine-grid. Images 1 to 9 are displayed in image areas P1 to P9, respectively. Specifically, image areas P1 to P9 are arranged in three rows and three columns, images 1 to 9 are arranged in that order within them, and image areas P1 to P9 all have the same display size. At this time, the target image array is images 1 to 9 arranged in three rows and three columns, and M equals 9.
Images 1 to 9 may all be custom images the user controls the electronic device to select, or they may include some user-selected custom images together with some preset filler images. That is, the N first images are all or some of images 1 to 9.
For example, image 2, image 4, image 5, image 6, and image 8 among images 1 to 9 may be the N first images, i.e., N equals 5. In this case, the first input is the user's input on image 2, image 4, image 5, image 6, and image 8.
Specifically, the editing interface shown in fig. 3 (a) further includes a sharing control S1. After the user performs an input on the sharing control S1 (denoted as input 1), as shown in fig. 3 (b), the electronic device may display the sending interface of the communication application, which includes images 1 to 9 arranged in three rows and three columns, together with a sending control Y. After the user then performs an input on the sending control Y (denoted as input 2), the electronic device may send images 1 to 9, arranged in three rows and three columns, to the social platform through the communication application. In this case, the second input includes input 1 and input 2.
It should be noted that the image processing method provided by the embodiments of the present invention displays a first editing interface including the M image areas indicated by the target arrangement template, edits at least one of the N first images through the first input to obtain the target image array, and sends the target image array through the second input. As described above, the user therefore does not need to select, edit, and send images one by one or open multiple applications; through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to edit and share images quickly and conveniently.
Optionally, the image processing method provided in the embodiment of the present invention may further include S207 before S203, and the corresponding S203 may be implemented by S203 a:
S207, the electronic device displays K preset filling images on K image areas in the first editing interface.
Each of the K image areas displays one preset filling image, and the preset filling images displayed on different image areas are different.
Specifically, the electronic device may display the K preset filling images on K image areas provided in the gallery application.
S203a, the electronic device edits at least one of the N first images, and edits at least one of the K preset filling images, to obtain the target image array.
Wherein K is a positive integer, and the sum of K and N is M; that is, N is less than M.
For example, in conjunction with fig. 3, the K preset filling images may include image 1, image 3, image 7, and image 9; that is, K equals 4.
Optionally, the preset filling image may be user-defined or preset.
For example, the preset filling image may be a blank image or an image including a preset pattern (e.g., a star pattern).
Optionally, the first editing interface may include a "fill" control, which is used to trigger the electronic device to fill a preset filling image into the target arrangement template. Specifically, the "fill" control triggers the electronic device to display one or more different preset filling images in the target editing interface, so that the user can select, from them, the preset filling image to be filled into the target arrangement template.
It can be understood that, when the user triggers the electronic device to arrange images in the target arrangement template, the user may want the arranged images to have a certain visual interest; for example, the four corners of the nine-grid may be blank images so that the images in the nine-grid form a cross shape. In addition, when the number of images selected by the user is less than the number of image areas in the target arrangement template, the user may need the electronic device to fill preset filling images into the free image areas of the target arrangement template that are not yet filled with images, for example, into the last free image area in the target arrangement template.
In the embodiment of the invention, the electronic device can provide preset filling images for the target arrangement template, so that the user does not need to search for a specific image (such as a blank image) among a large number of images in the gallery application and then fill it into an image area of the target arrangement template. Therefore, the user can quickly and conveniently trigger the electronic device to fill preset filling images into the target arrangement template.
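The relationship between the user-selected images and the preset filling images (the sum of K and N is M) can be illustrated with a minimal Python sketch. All function and data names here are hypothetical, chosen for illustration only, and do not reflect the claimed implementation.

```python
def fill_template(first_images, template_size, preset_fillers):
    """Place the N user-selected first images into an M-area arrangement
    template and fill the remaining K = M - N free areas with distinct
    preset filling images. Areas are ordered left-to-right, top-to-bottom."""
    n = len(first_images)
    k = template_size - n  # K = M - N, so K + N = M
    if k > len(preset_fillers):
        raise ValueError("not enough preset filling images for the free areas")
    # The first K fillers occupy the K free areas, one per area.
    return list(first_images) + preset_fillers[:k]

# Nine-grid (M = 9) with five user images (N = 5) and four star fillers (K = 4).
areas = fill_template(["img1", "img2", "img4", "img5", "img6"], 9,
                      ["star1", "star2", "star3", "star4"])
```

Here `fill_template` simply appends the fillers; the actual on-screen positions of the filler areas (e.g., the four corners of the nine-grid) would depend on the selected fill effect.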
Alternatively, the user may trigger the electronic device to enter an image editing mode by touching one or more images in the gallery application. Illustratively, the image processing method provided by the embodiment of the present invention further includes S208 to S211 before the above S201:
S208, the electronic device receives a third input when P second images are displayed.
Each of the P image areas displays one second image, the second images displayed on different image areas are different, and P is a positive integer less than or equal to N.
In the embodiment of the invention, the electronic device can display the P second images in the gallery application.
Similarly, for the description of the input mode of the third input, reference may be made to the description of the input mode of the first input in the foregoing embodiment, and details are not repeated here in the embodiment of the present invention.
Illustratively, P is equal to 1. In a case where the electronic device displays an image in the gallery application, that is, displays an image preview interface, the user may perform a long press input (that is, a third input) on one image in the gallery application.
Optionally, when the user inputs one or more images in the gallery application, the electronic device may display a "delete" control and an "add multiple images to template" control on the screen. The "delete" control is used to trigger the electronic device to delete the selected image. The "add multiple images to template" control is used to trigger the electronic device to display the target editing interface, so as to edit the selected images through a certain arrangement template.
S209, in response to the third input, the electronic device displays at least one arrangement template identifier, one arrangement template identifier indicating one image arrangement template, and one image arrangement template includes a plurality of image areas.
Specifically, the electronic device may provide a plurality of arrangement templates, such as the three-grid, six-grid, and nine-grid, and the at least one arrangement template identifier may include an identifier of each of these templates.
S210, the electronic device receives a fourth input of a target arrangement template identifier in the at least one arrangement template identifier.
The target arrangement template identifier is used to indicate the target arrangement template.
Similarly, for the description of the input mode of the fourth input, reference may be made to the description of the input mode of the first input in the foregoing embodiment, and details are not repeated here in the embodiment of the present invention.
Illustratively, the fourth input is a user input on the identifier of the nine-grid.
S211, in response to the fourth input, the electronic device displays a second editing interface, where the second editing interface includes the M image areas indicated by the target arrangement template, and P second images are displayed on P image areas in the second editing interface.
Specifically, the second editing interface and the first editing interface are the same editing interface displayed by the electronic device at different times, that is, the second editing interface is an editing interface in the gallery application.
The P second images are images in the N first images, and P is a positive integer less than or equal to N.
Optionally, in this embodiment of the present invention, a plus sign may be displayed on a free image area on the target arrangement template in the second editing interface, and is used to indicate that no image is displayed on the image area and trigger the electronic device to fill the image area with an image.
Illustratively, as shown in fig. 4 (a), after the user inputs (denoted as input 3) image 1 in the image preview interface displayed by the gallery application of the electronic device, the electronic device may display a "delete" control S3 and an "add multiple images to template" control S4 on the image preview interface, as shown in fig. 4 (b). After the user inputs (denoted as input 4) the "add multiple images to template" control S4, the electronic device may display a template selection interface including a "three-grid" control S5, a "six-grid" control S6, a "nine-grid" control S7, and an expansion control S8, as shown in fig. 4 (c). The "three-grid" control S5, the "six-grid" control S6, and the "nine-grid" control S7 indicate the three-grid, six-grid, and nine-grid arrangement templates, respectively. In addition, the expansion control S8 is used to trigger the electronic device to select other arrangement templates provided by the electronic device and to save arrangement templates customized by the user. At this time, the third input may include input 3 and input 4.
Subsequently, after the user inputs the "nine-grid" control S7 shown in fig. 4 (c), as shown in fig. 4 (d), the electronic device displays an editing interface including image areas P1 to P9, where image 1 is displayed on image area P1 and no images are displayed on image areas P2 to P9. In this case, image area P1 constitutes the above P image areas and image 1 constitutes the above P second images; that is, the number of P image areas is 1.
In the embodiment of the present invention, the electronic device may provide at least one arrangement template identifier, so that the user can input the target arrangement template identifier among the at least one arrangement template identifier to trigger the electronic device to start editing the N first images in the second editing interface according to the target arrangement template. In addition, since the electronic device can provide a plurality of arrangement template identifiers, it can edit the images selected by the user in different arrangement templates, which improves the diversity of the image arrangement and editing performed by the electronic device.
It will be appreciated that after the user triggers the electronic device to start displaying images on image areas in the target arrangement template, that is, filling images into image areas of the target arrangement template, the user may also trigger the electronic device to add images to free image areas in the target arrangement template so as to complete the selection of the N first images. Optionally, the image processing method according to the embodiment of the present invention may further include S212 after S211 and before S201, and the corresponding S201 may be implemented by S201a:
S212, the electronic device receives Q fifth inputs.
Similarly, for the description of the input mode of each fifth input, reference may be made to the description of the input mode of the first input in the foregoing embodiment, and details of this description are not repeated in the embodiment of the present invention.
Illustratively, each of the Q fifth inputs is used to trigger the electronic device to fill a user-selected image into a free image area of the target arrangement template in the second editing interface.
Optionally, in the process of one fifth input, after the user inputs a free image area of the target arrangement template (for example, inputs the plus sign identifier in the free image area) (denoted as input 5), the electronic device may display an image selection list. Subsequently, the user inputs an image selected from the image selection list (denoted as input 6), so that the electronic device determines that image as the image to be filled into the free image area. Specifically, the image selection list may be an ordinary list of images in the gallery application, such as a list displaying a group of recently captured images in the gallery application. At this time, one fifth input may include both input 5 and input 6.
S201a, in response to the Q fifth inputs, the electronic device displays the first editing interface, where Q third images are displayed on Q image areas in the first editing interface.
The Q image areas are the image areas other than the P image areas among the N image areas, and the Q third images are the images other than the P second images among the N first images; that is, the sum of P and Q is N.
Specifically, one third image is displayed on each of the Q image areas, and the third images displayed on different image areas are different.
Illustratively, in conjunction with fig. 4 (d), after the user inputs the plus sign identifier in image area P2 shown in fig. 5 (a) (that is, the above input 5), the electronic device may display an image selection list including images 2 to 9, as shown in fig. 5 (b). Subsequently, after the user inputs image 2 shown in fig. 5 (b) (that is, the above input 6), the electronic device may display image 2 on image area P2 in the editing interface, with no images displayed on image areas P3 to P9, as shown in fig. 5 (c).
Similarly, for the description of the electronic device filling an image into each of image areas P3 to P9 shown in fig. 5 (c), reference may be made to the example in fig. 5, which is not repeated here in this embodiment of the present invention. Illustratively, after the user triggers the electronic device to sequentially fill images 3 to 9 into image areas P3 to P9 shown in fig. 5 (c), the electronic device may display the editing interface shown in fig. 3 (a).
In the embodiment of the invention, after the user triggers the electronic device to start displaying images on the image areas indicated by the target arrangement template, the user can also trigger the electronic device to add images to the free image areas indicated by the target arrangement template, which improves the flexibility of selecting and arranging images according to the target arrangement template.
In the embodiment of the invention, while the electronic device displays images in the target arrangement template, the electronic device may also display a list of images in the gallery application, so as to support the user in viewing the images in the gallery application in real time, for example, making it convenient for the user to add images from the gallery application to image areas in the target arrangement template. Optionally, the target editing interface further includes a target image list, where the target image list includes the Q third images, and one fifth input is an input of dragging one third image from the target image list to one image area.
The target image list may be a list of thumbnails of images in the gallery application. A leftward slide input on the target image list may trigger the electronic device to update the thumbnails displayed in the target image list, for example, updating them to thumbnails of the most recently captured images.
Optionally, the target image list may be located below the image areas of the target arrangement template, for example, below the nine-grid template.
Illustratively, in conjunction with fig. 4 (d), as shown in fig. 6, the electronic device displays a target image list 61 including a thumbnail of image 2 below image areas P1 to P9 in the editing interface. Subsequently, the user's input of dragging the thumbnail of image 2 from the target image list 61 to image area P2 (that is, one fifth input) triggers the electronic device to display the editing interface shown in fig. 5 (c), that is, to display image 2 on image area P2.
In the embodiment of the invention, the electronic device can display the target image list in the editing interface (such as the first editing interface or the second editing interface), so that while the electronic device arranges images in the target arrangement template, the user can conveniently and intuitively view the images in the gallery application through the target image list. Moreover, by dragging image thumbnails from the target image list onto image areas in the target arrangement template, the user can quickly and conveniently trigger the electronic device to fill images into the target arrangement template, which improves the speed and convenience of selecting and arranging images.
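The drag-from-list interaction described above can be modeled with a short Python sketch. The class and method names (`TemplateEditor`, `drag_from_list`) are hypothetical, introduced only to illustrate the state change; the real interface logic is not specified at this level in the text.

```python
class TemplateEditor:
    """Sketch of the editing interface state: M image areas (None marks a
    free area showing a plus sign) plus a target image list of gallery
    thumbnails displayed below the template."""

    def __init__(self, m):
        self.areas = [None] * m          # M image areas, initially free
        self.target_image_list = []      # thumbnails from the gallery

    def drag_from_list(self, image, area_index):
        """One fifth input: drag a thumbnail from the target image list
        onto a free image area, filling that area with the image."""
        if self.areas[area_index] is not None:
            raise ValueError("image area is already filled")
        self.areas[area_index] = image
        self.target_image_list.remove(image)

# Nine-grid with image 1 already placed (P = 1) and images 2-9 in the list.
editor = TemplateEditor(9)
editor.areas[0] = "image1"
editor.target_image_list = [f"image{i}" for i in range(2, 10)]
editor.drag_from_list("image2", 1)   # fills image area P2, as in fig. 6
```

Whether a dragged thumbnail is removed from the list, as done here, is an assumption for clarity; the text only requires that the image appear on the target image area.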
In the embodiment of the present invention, in a scenario where the electronic device arranges images in the target arrangement template, the user may need to set the images filled into the target arrangement template, such as setting their size and tone. Optionally, S213 may further be included before S203b, and the corresponding S203b may be implemented by S203c:
S213, the electronic device receives a sixth input.
Similarly, for the description of the input mode of the sixth input, reference may be made to the description of the input mode of the first input in the foregoing embodiment, and details are not repeated here.
S203c, in response to the sixth input, the electronic device edits at least one of the N first images by performing a target operation, and edits at least one of the K preset filling images, to obtain the target image array.
Wherein the target operation comprises any one of the following operations 1 to 4:
Operation 1: merging the images of different image areas in the target arrangement template.
When the target operation is operation 1, the sixth input may be a user input on the images in the different image areas, such as a continuous slide input across the images on those image areas.
Illustratively, in conjunction with fig. 3, after the user performs the sixth input on image 3 and image 6 shown in fig. 3 (a), as shown in fig. 7, the electronic device may merge image 3 and image 6 into image 10.
In this way, the diversity of the images arranged in the target arrangement template by the electronic device can be improved, improving the user experience.
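Operation 1 can be sketched in a few lines of Python. The representation (a flat list of area contents, with the merged image occupying both source areas) is an assumption made for illustration; the text does not specify how a merged image spans its areas.

```python
def merge_images(areas, i, j, merged):
    """Operation 1: merge the images of two image areas into one image.
    The merged image then occupies both areas (sketch; indices zero-based)."""
    areas = list(areas)       # leave the caller's list untouched
    areas[i] = areas[j] = merged
    return areas

# Fig. 7 example: merging image 3 (area P3) and image 6 (area P6) into image 10.
grid = merge_images([f"image{i}" for i in range(1, 10)], 2, 5, "image10")
```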
Operation 2: splitting the image of a target image area in the target arrangement template.
When the target operation is operation 2, the sixth input may be a user input on the image in the target image area, such as a long press input on that image.
Optionally, the pressing duration of the sixth input corresponds to the number of images into which the image of the image area is split. For example, when the pressing duration of the sixth input is within a first duration range (for example, between 1 second and 1.5 seconds), the sixth input triggers the electronic device to split the image of one image area into two images; when the pressing duration is within a second duration range (for example, between 1.5 seconds and 2.5 seconds), the sixth input triggers the electronic device to split the image into three images.
For example, in conjunction with fig. 3, when the user performs a press input on image 3 shown in fig. 3 (a) with a pressing duration of 1.2 seconds, the electronic device may split image 3 into two images arranged one above the other.
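The duration-to-split mapping just described can be expressed as a small Python function. The thresholds are the example values from the text (1 to 1.5 seconds for two images, 1.5 to 2.5 seconds for three); they are illustrative, not normative, and a press outside the configured ranges is assumed to cause no split.

```python
def split_count(press_seconds):
    """Map the pressing duration of the sixth input to the number of
    images the pressed image area is split into."""
    if 1.0 <= press_seconds < 1.5:
        return 2   # first duration range: split into two images
    if 1.5 <= press_seconds < 2.5:
        return 3   # second duration range: split into three images
    return 1       # outside the configured ranges: leave the image whole
```

With this mapping, the 1.2-second press in the example above yields a split into two images.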
Operation 3: adjusting the image areas where the images in the target arrangement template are located.
When the target operation is operation 3, the sixth input may be an input in which the user drags an image from one image area to a different image area.
Illustratively, in conjunction with fig. 3, the user may drag image 3 shown in fig. 3 (a) from image area P3 onto image area P6 to trigger the electronic device to display image 6 on image area P3 and image 3 on image area P6.
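The example of dragging image 3 onto image area P6 amounts to swapping the contents of two image areas, which can be sketched as follows (a plain list model with zero-based indices, assumed for illustration):

```python
def swap_areas(areas, src, dst):
    """Operation 3: dragging the image from area src onto area dst
    exchanges the two images, adjusting where each image is located."""
    areas = list(areas)
    areas[src], areas[dst] = areas[dst], areas[src]
    return areas

# Drag image 3 (area P3, index 2) onto area P6 (index 5) in a nine-grid.
grid = swap_areas([f"image{i}" for i in range(1, 10)], 2, 5)
```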
Operation 4: performing an operation indicated by a target function, where the target function is used to set part or all of the images in the target arrangement template.
Optionally, in this embodiment of the present invention, the electronic device may display one or more function controls on the editing interface (e.g., the first editing interface and the second editing interface), so as to support a user to trigger the electronic device to execute a corresponding target function through each function control. Wherein one function control corresponds to one target function.
In addition, the electronic device may display the one or more function controls in a single-level menu or in a multi-level menu. For example, the electronic device can display an expansion control in the upper right corner of the editing interface in the gallery application, where the next level under the expansion control contains the one or more function controls; that is, the expansion control is used for expanding the one or more function controls.
Optionally, in the embodiment of the present invention, the target function is any one of the following: cutting an image, rotating the image, adjusting image tone, sharpening the image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling the image in a free image area in M image areas indicated by the target arrangement template.
Illustratively, the image filter may be a style filter, such as a people filter, a landscape filter, or a food filter. In addition, the image filter may also include image special effects, such as displaying specific patterns (for example, a rainbow, a heart, or an apple) in the image display area.
Optionally, when part of the images in the target arrangement template are selected by the user, the electronic device may perform the operation indicated by the target function on that part of the images, such as adding a filter to them. When all of the images in the target arrangement template are selected, the electronic device may perform the operation on all of the images, for example, adding a filter covering all of the images as a whole, such as a filter including a rainbow pattern.
Illustratively, referring to fig. 3, as shown in fig. 8, the editing interface displayed by the electronic device includes an editing control K1, a filter control K2, a fill control K3, a sharing control K4, and an expansion control K5. The editing control K1 provides the functions of cropping the image, rotating the image, adjusting image tone, sharpening the image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, and adjusting image contrast. The filter control K2 provides the filter-adding function. The fill control K3 is used to trigger the electronic device to fill preset filling images into free image areas in the target arrangement template. The sharing control K4 is used to trigger the electronic device to send the target image array through the communication application. The expansion control K5 provides other functions for setting images, such as a function of switching the current arrangement template.
Illustratively, after the user inputs the filter control K2 shown in fig. 8, the electronic device may display one or more filter template identifiers below the target arrangement template, such as a rainbow filter template identifier 91 and a filter expansion control 92 (for presenting more filters), as shown in fig. 9 (a). Subsequently, after the user inputs the rainbow filter template identifier 91, the electronic device may add a rainbow filter to the entirety of images 1 to 9, as shown in fig. 9 (b). This improves the convenience of applying the same filter to a plurality of images.
Illustratively, after the user inputs the fill control K3 shown in fig. 8, as shown in fig. 10 (a), the electronic device may display one or more fill effect identifiers below the target arrangement template, such as a fill effect identifier 10a (indicating a fill effect in which preset filling images of a star pattern are added at the four corners of the nine-grid) and a fill effect expansion control 10b (for showing more fill effects). Subsequently, after the user inputs the fill effect identifier 10a, as shown in fig. 10 (b), the electronic device may display preset filling images of the star pattern in image areas P1, P3, P7, and P9, while displaying images 2, 4, 5, 6, and 8 in image areas P2, P4 to P6, and P8, respectively.
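The fill effect of identifier 10a (star fillers at the four corners of the nine-grid, so the remaining images form a cross) can be sketched directly. The function name and filler label are hypothetical; the corner positions follow from a 3x3 grid laid out left-to-right, top-to-bottom.

```python
def apply_corner_fill(images, filler="star"):
    """Place a preset star-pattern filling image at the four corners of a
    nine-grid; `images` supplies the five images for the non-corner areas
    (P2, P4, P5, P6, P8), in order."""
    corners = {0, 2, 6, 8}          # P1, P3, P7, P9 in zero-based indexing
    it = iter(images)
    return [filler if i in corners else next(it) for i in range(9)]

# The fig. 10 (b) example: images 2, 4, 5, 6, and 8 form a cross shape.
grid = apply_corner_fill(["image2", "image4", "image5", "image6", "image8"])
```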
In this way, the electronic device can provide various modes of editing the images in the target arrangement template, for example, performing different operations on the images through function controls with different functions, which increases the diversity of the image editing that the user can trigger the electronic device to perform according to the target arrangement template.
Optionally, the image processing method provided in the embodiment of the present invention may further include S214 after S203:
S214, the electronic device saves at least one of the following: the target image array, and a target filter of the target image array.
The target filter includes at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value, and a contrast value of the target image array.
Therefore, the electronic device can save the target image array and the target filter of the target image array, so that the user can subsequently obtain the saved target image array and target filter, which is convenient for viewing or reusing them.
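The saved target filter of S214 can be modeled as a simple record of the values the text enumerates. The field names and the dictionary layout are illustrative assumptions, not the claimed storage format.

```python
from dataclasses import dataclass, asdict

@dataclass
class TargetFilter:
    """Target filter of a target image array (S214): one value per
    adjustable quantity listed in the text. Defaults of 0.0 mean
    'unchanged'; the value ranges are an assumption."""
    hue: float = 0.0
    sharpening: float = 0.0
    color_temperature: float = 0.0
    saturation: float = 0.0
    brightness: float = 0.0
    contrast: float = 0.0

# Save both the target image array and its target filter together.
saved = {
    "target_image_array": [f"image{i}" for i in range(1, 10)],
    "target_filter": asdict(TargetFilter(brightness=0.1, contrast=0.05)),
}
```

Storing the filter separately from the array, as here, lets a later session reapply the same adjustments to other images.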
Fig. 11 is a schematic diagram of a possible structure of an electronic device according to an embodiment of the present invention. As shown in fig. 11, the electronic device 11 includes: a display module 11a, a receiving module 11b, an editing module 11c, and a sending module 11d. The display module 11a is configured to display a first editing interface, where the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface; the receiving module 11b is configured to receive a first input; the editing module 11c is configured to, in response to the first input received by the receiving module 11b, edit at least one of the N first images displayed by the display module 11a to obtain a target image array; the receiving module 11b is further configured to receive a second input; the sending module 11d is configured to send, in response to the second input received by the receiving module 11b, the target image array obtained by the editing module 11c; where N is a positive integer less than or equal to M.
Optionally, the display module 11a is further configured to display the target image array on the sending interface according to the target arrangement template after the sending module 11d sends the target image array; the sending interface is used for displaying images to be sent or sent by the electronic equipment.
Optionally, the display module 11a is further configured to, before the editing module 11c edits at least one of the N first images to obtain the target image array, display K preset filling images on K image areas in the first editing interface; the editing module 11c is specifically configured to edit at least one of the N first images, and edit at least one of the K preset filling images to obtain a target image array; wherein K is a positive integer, and the sum of K and N is M.
Optionally, the receiving module 11b is further configured to receive a third input when P second images are displayed, before the display module 11a displays the first editing interface; the display module 11a is further configured to display at least one arrangement template identifier in response to the third input received by the receiving module 11b, where one arrangement template identifier is used to indicate one image arrangement template, and one image arrangement template includes a plurality of image areas; the receiving module 11b is further configured to receive a fourth input on a target arrangement template identifier in the at least one arrangement template identifier; the display module 11a is further configured to, in response to the fourth input received by the receiving module 11b, display a second editing interface, where the second editing interface includes the M image areas indicated by the target arrangement template, and P second images are displayed on P image areas in the second editing interface; the target arrangement template identifier is used to indicate the target arrangement template, the P second images are images among the N first images, and P is a positive integer less than or equal to N.
Optionally, P is less than N; the receiving module 11b is further configured to receive Q fifth inputs after the display module 11a displays the second editing interface and before the display module displays the first editing interface; the display module 11a is specifically configured to display the first editing interface in response to the Q fifth inputs, where Q third images are displayed on Q image areas in the first editing interface; the Q image areas are the image areas other than the P image areas among the N image areas, and the Q third images are the images other than the P second images among the N first images.
Optionally, the receiving module 11b is further configured to receive a sixth input before the editing module 11c edits at least one of the N first images and at least one of the K preset filling images to obtain the target image array; the editing module 11c is specifically configured to, in response to the sixth input received by the receiving module 11b, edit at least one of the N first images by performing a target operation, and edit at least one of the K preset filling images, to obtain the target image array; where the target operation includes any one of the following: merging the images of different image areas among the M image areas, splitting the image of a target image area among the M image areas, adjusting the image areas where the images in the M image areas are located, and performing an operation indicated by a target function; the target function is used to edit the images in part or all of the M image areas, and the target image area is any one of the M image areas.
Optionally, the target function is any one of the following: cutting an image, rotating the image, adjusting image tone, sharpening the image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling the image in a free image area in the M image areas.
Optionally, the electronic device 11 further includes a saving module, configured to, after the editing module 11c edits at least one of the N first images and at least one of the K preset filling images to obtain the target image array, save at least one of the following: the target image array, and a target filter of the target image array; where the target filter includes at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value, and a contrast value of the target image array.
The electronic device 11 provided in the embodiment of the present invention can implement each process implemented by the electronic device in the above method embodiments, and is not described here again to avoid repetition.
The electronic device provided by the embodiment of the present invention can display a first editing interface including the M image areas indicated by the target arrangement template, where N first images are displayed on N of the M image areas. Subsequently, at least one of the N first images may be edited through the first input to obtain the target image array, and the target image array may then be sent through the second input. Therefore, when the user needs to share the images in the target image array, the user does not need to select, edit, and send the images one by one in real time; instead, the images edited in the first editing interface according to the target arrangement template can be quickly and conveniently selected and sent as a whole. For example, the electronic device may edit the N first images according to the target arrangement template in the first editing interface of the gallery application, without resorting to a dedicated third-party cropping application, so that the user can quickly and conveniently control the electronic device to edit the N first images in the gallery application without opening multiple applications. In addition, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images from the gallery application one by one to edit and send them in real time. That is, through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to quickly and conveniently edit and share images.
Fig. 12 is a schematic diagram of the hardware structure of an electronic device 100 according to an embodiment of the present invention. The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the configuration shown in Fig. 12 does not constitute a limitation of the electronic device: the electronic device may include more or fewer components than shown, may combine some components, or may arrange the components differently. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 110 is configured to control the display unit 106 to display a first editing interface, where the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on N image areas in the first editing interface.
The processor 110 is further configured to control the user input unit 107 to receive a first input.
The processor 110 is further configured to edit at least one of the N first images displayed by the display unit 106 in response to the first input received by the user input unit 107, so as to obtain the target image array.
The processor 110 is further configured to control the user input unit 107 to receive a second input.
The processor 110 is further configured to control the radio frequency unit 101 to transmit the target image array in response to a second input received by the user input unit 107; wherein N is a positive integer less than or equal to M.
The electronic device provided by the embodiment of the invention can display the first editing interface comprising the M image areas indicated by the target arrangement template, where N first images are displayed on N of the M image areas. Subsequently, at least one of the N first images may be edited through the first input to obtain the target image array, and the target image array may then be sent through the second input. Therefore, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the target image array obtained by editing in the first editing interface according to the target arrangement template can be quickly and conveniently selected and sent as a whole. For example, the electronic device may edit the N first images according to the target arrangement template in the first editing interface of the gallery application, without editing them through a dedicated third-party cropping application, so the user can quickly and conveniently control the electronic device to edit the N first images in the gallery application without opening multiple applications. In addition, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to make the communication application call up images from the gallery application one by one to edit and send them in real time. That is, through the integrated operation of the gallery application and the communication application, the user can trigger the electronic device to edit and share images quickly and conveniently.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used to receive and send signals during message transmission or a call; specifically, it delivers downlink data received from a base station to the processor 110 for processing, and sends uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. The audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (e.g., a call-signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone-call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, or another sensor. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally along three axes) and detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the electronic device (for example, portrait/landscape switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
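The accelerometer-based posture identification mentioned above can be sketched with a simple dominant-axis rule. This is an illustrative assumption, not the device's actual algorithm; `classify_posture` and its comparison logic are hypothetical.

```python
def classify_posture(ax: float, ay: float, az: float) -> str:
    """Classify device posture from static 3-axis accelerometer readings.
    When the device is stationary, the readings approximate the gravity
    vector; gravity dominating the y-axis suggests portrait orientation,
    gravity dominating the x-axis suggests landscape. Illustrative only."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would also debounce near-diagonal readings and handle the face-up case (gravity along z), which this sketch ignores.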
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (for example, operations performed on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel, among other types. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event; the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 12 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement these functions; this is not limited herein.
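The division of labor described above, where the touch controller converts raw readings into point coordinates and the processor determines the event type, can be sketched as follows. All names, the grid-to-pixel conversion, and the event taxonomy are illustrative assumptions, not the actual firmware.

```python
from typing import Dict, List, Tuple

def to_coordinates(raw: Dict[str, int], pitch_px: int = 10) -> Tuple[int, int]:
    """Touch-controller role: convert a raw sensor cell (row/column of the
    detection grid) into touch-point coordinates. The grid pitch is a
    hypothetical parameter."""
    return (raw["col"] * pitch_px, raw["row"] * pitch_px)

def classify_event(points: List[Tuple[int, int]]) -> str:
    """Processor role: decide the touch-event type from a coordinate trace.
    Single sample -> tap; moving trace -> swipe; stationary trace -> long-press."""
    if len(points) == 1:
        return "tap"
    return "swipe" if points[0] != points[-1] else "long-press"
```

The processor would then map the classified event to a visual output on the display panel, e.g., highlighting the image area under a tap in the first editing interface.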
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the electronic device (such as audio data and a phonebook). Further, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power source 111 (such as a battery) for supplying power to each component, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 110, a memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110, where the computer program, when executed by the processor 110, implements each process of the foregoing method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (18)

1. An image processing method applied to an electronic device, the method comprising:
displaying a first editing interface, wherein the first editing interface comprises M image areas indicated by a target arrangement template, and N first images are displayed on N image areas in the first editing interface;
receiving a first input;
editing at least one image of the N first images in response to the first input, to obtain a target image array;
receiving a second input;
transmitting the target image array in response to the second input;
wherein N is a positive integer less than or equal to M.
2. The method of claim 1, wherein after the sending of the target image array, the method further comprises:
displaying the target image array on a sending interface according to the target arrangement template;
the sending interface is used for displaying images to be sent or sent by the electronic equipment.
3. The method according to claim 1 or 2, wherein before the editing at least one image of the N first images to obtain the target image array, the method further comprises:
displaying K preset filling images on K image areas in the first editing interface;
the editing at least one of the N first images to obtain a target image array includes:
editing at least one image in the N first images, and editing at least one image in the K preset filling images to obtain the target image array;
wherein K is a positive integer, and the sum of K and N is M.
4. The method of claim 1, wherein prior to displaying the first editing interface, the method further comprises:
receiving a third input in a case where P second images are displayed;
displaying at least one arrangement template identifier in response to the third input, one arrangement template identifier indicating one image arrangement template, one image arrangement template indicating a plurality of image regions;
receiving a fourth input to a target arrangement template identifier of the at least one arrangement template identifier;
in response to the fourth input, displaying a second editing interface, the second editing interface including the M image regions indicated by the target arrangement template, the P second images being displayed on P image regions in the second editing interface;
the target arrangement template identifier is used for indicating the target arrangement template, the P second images are images in the N first images, and P is a positive integer smaller than or equal to N.
5. The method of claim 4, wherein P is less than N;
after the displaying the second editing interface and before the displaying the first editing interface, the method further comprises:
receiving Q fifth inputs;
the displaying a first editing interface includes:
in response to the Q fifth inputs, displaying the first editing interface, wherein Q third images are displayed on Q image areas in the first editing interface;
wherein the Q image areas are the image areas of the N image areas other than the P image areas; and the Q third images are the images of the N first images other than the P second images.
6. The method according to claim 3, wherein before editing at least one of the N first images and at least one of the K preset filler images to obtain the target image array, the method further comprises:
receiving a sixth input;
the editing at least one image of the N first images and at least one image of the K preset filling images to obtain the target image array comprises:
in response to the sixth input, editing at least one image of the N first images and at least one image of the K preset filling images by performing a target operation, to obtain the target image array;
wherein the target operation comprises any one of: merging images of different image areas in the M image areas, splitting an image of a target image area in the M image areas, adjusting the image area in which an image of the M image areas is located, and performing an operation indicated by a target function;
the target function is used for editing images in part or all of the M image areas, and the target image area is any one of the M image areas.
7. The method of claim 6, wherein the target function is any one of: cropping an image, rotating an image, adjusting image hue, sharpening an image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling an image into a free image area of the M image areas.
8. The method according to claim 7, wherein after editing at least one of the N first images and at least one of the K preset filler images to obtain the target image array, the method further comprises:
saving at least one of: the target image array, a target filter of the target image array;
wherein the target filter includes at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value, a contrast value of the target image array.
9. An electronic device, characterized in that the electronic device comprises: the device comprises a display module, a receiving module, an editing module and a sending module;
the display module is used for displaying a first editing interface, the first editing interface comprises M image areas indicated by a target arrangement template, and N first images are displayed on the N image areas in the first editing interface;
the receiving module is used for receiving a first input;
the editing module is used for responding to the first input received by the receiving module and editing at least one image in the N first images displayed by the display module to obtain a target image array;
the receiving module is further used for receiving a second input;
the sending module is configured to send the target image array obtained by the editing module in response to the second input received by the receiving module;
wherein N is a positive integer less than or equal to M.
10. The electronic device of claim 9, wherein the display module is further configured to display the target image array on a sending interface according to the target arrangement template after the sending module sends the target image array;
the sending interface is used for displaying images to be sent or already sent by the electronic device.
11. The electronic device according to claim 9 or 10, wherein the display module is further configured to display K preset filler images on K image areas in the first editing interface before the editing module edits at least one of the N first images to obtain the target image array;
the editing module is specifically configured to edit at least one of the N first images, and edit at least one of the K preset filling images to obtain the target image array;
wherein K is a positive integer, and the sum of K and N is M.
12. The electronic device of claim 9, wherein the receiving module is further configured to receive a third input when the P second images are displayed before the displaying module displays the first editing interface;
the display module is further configured to display at least one arrangement template identifier in response to the third input received by the receiving module, one arrangement template identifier being used to indicate one image arrangement template, and one image arrangement template being used to indicate a plurality of image areas;
the receiving module is further configured to receive a fourth input of a target arrangement template identifier of the at least one arrangement template identifier;
the display module is further configured to display a second editing interface in response to the fourth input received by the receiving module, where the second editing interface includes the M image regions indicated by the target arrangement template, and the P second images are displayed on P image regions in the second editing interface;
the target arrangement template identifier is used for indicating the target arrangement template, the P second images are images in the N first images, and P is a positive integer smaller than or equal to N.
13. The electronic device of claim 12, wherein P is less than N;
the receiving module is further configured to receive Q fifth inputs after the display module displays the second editing interface and before the display module displays the first editing interface;
the display module is specifically configured to display the first editing interface in response to the Q fifth inputs, where Q third images are displayed on Q image areas in the first editing interface;
wherein the Q image areas are the image areas of the N image areas other than the P image areas; and the Q third images are the images of the N first images other than the P second images.
14. The electronic device of claim 11,
the receiving module is further configured to receive a sixth input before the editing module edits at least one of the N first images and edits at least one of the K preset filling images to obtain the target image array;
the editing module is specifically configured to, in response to the sixth input received by the receiving module, edit at least one of the N first images by performing a target operation, and edit at least one of the K preset filling images to obtain the target image array;
wherein the target operation comprises any one of: merging images of different image areas in the M image areas, splitting an image of a target image area in the M image areas, adjusting the image area in which an image of the M image areas is located, and performing an operation indicated by a target function;
the target function is used for editing images in part or all of the M image areas, and the target image area is any one of the M image areas.
15. The electronic device of claim 14, wherein the target function is any one of: cropping an image, rotating an image, adjusting image hue, sharpening an image, adjusting image color temperature, adjusting image saturation, adjusting image brightness, adjusting image contrast, adding an image filter, and filling an image into a free image area of the M image areas.
16. The electronic device of claim 15, further comprising: a storage module;
the storage module is configured to, after the editing module edits at least one of the N first images and edits at least one of the K preset filling images to obtain the target image array, store at least one of the following images: the target image array, a target filter of the target image array;
wherein the target filter includes at least one of a hue value, a sharpening value, a color temperature value, a saturation value, a brightness value, a contrast value of the target image array.
17. An electronic device, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image processing method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 8.
CN201911319652.0A 2019-12-19 2019-12-19 Image processing method and electronic equipment Active CN111127595B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911319652.0A CN111127595B (en) 2019-12-19 2019-12-19 Image processing method and electronic equipment
PCT/CN2020/136731 WO2021121253A1 (en) 2019-12-19 2020-12-16 Image processing method and electronic device


Publications (2)

Publication Number Publication Date
CN111127595A true CN111127595A (en) 2020-05-08
CN111127595B CN111127595B (en) 2023-11-03

Family

ID=70500252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911319652.0A Active CN111127595B (en) 2019-12-19 2019-12-19 Image processing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN111127595B (en)
WO (1) WO2021121253A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866379A (en) * 2020-07-03 2020-10-30 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN112312022A (en) * 2020-10-30 2021-02-02 维沃移动通信有限公司 Image processing method, image processing apparatus, electronic device, and storage medium
WO2021121253A1 (en) * 2019-12-19 2021-06-24 维沃移动通信有限公司 Image processing method and electronic device
CN113888549A (en) * 2021-09-29 2022-01-04 乐美科技股份私人有限公司 Picture generation method and device, electronic equipment and storage medium

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
CN114430460A (en) * 2022-01-28 2022-05-03 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN114500844A (en) * 2022-01-28 2022-05-13 维沃移动通信有限公司 Shooting method and device and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120082401A1 (en) * 2010-05-13 2012-04-05 Kelly Berger System and method for automatic discovering and creating photo stories
US20140372919A1 (en) * 2013-06-17 2014-12-18 Beijing Kingsoft Internet Security Software Co., Ltd Method, apparatus and mobile terminal for editing an image
CN104881844A (en) * 2015-06-29 2015-09-02 北京金山安全软件有限公司 Picture combination method and device and terminal equipment
CN105955607A (en) * 2016-04-22 2016-09-21 北京小米移动软件有限公司 Content sharing method and apparatus
CN106407365A (en) * 2016-09-08 2017-02-15 北京小米移动软件有限公司 Picture sharing method and apparatus
CN110084871A (en) * 2019-05-06 2019-08-02 珠海格力电器股份有限公司 Image typesetting process and device, electric terminal
CN110147190A (en) * 2018-06-29 2019-08-20 腾讯科技(深圳)有限公司 Image processing method and electric terminal
CN110490808A (en) * 2019-08-27 2019-11-22 腾讯科技(深圳)有限公司 Picture joining method, device, terminal and storage medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN104168417B (en) * 2014-05-20 2019-09-13 腾讯科技(深圳)有限公司 Image processing method and device
CN105320695B (en) * 2014-07-31 2020-01-17 腾讯科技(深圳)有限公司 Picture processing method and device
US20180095653A1 (en) * 2015-08-14 2018-04-05 Martin Hasek Device, method and graphical user interface for handwritten interaction
CN111127595B (en) * 2019-12-19 2023-11-03 维沃移动通信有限公司 Image processing method and electronic equipment





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant