CN112748829A - Picture editing method, device, equipment and storage medium - Google Patents

Picture editing method, device, equipment and storage medium

Info

Publication number
CN112748829A
Authority
CN
China
Prior art keywords
picture
color
sticker
target
edited
Prior art date
Legal status
Pending
Application number
CN202010676536.0A
Other languages
Chinese (zh)
Inventor
钟媛
胡文玥
刘佳卉
李静秋
陈柯辰
程功凡
黄泽彪
Current Assignee
Tencent Cyber Tianjin Co Ltd
Original Assignee
Tencent Cyber Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Cyber Tianjin Co Ltd filed Critical Tencent Cyber Tianjin Co Ltd
Priority to CN202010676536.0A
Publication of CN112748829A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application discloses a picture editing method, device, equipment and storage medium, wherein the method includes: acquiring a picture to be edited and extracting at least one picture characteristic color from the picture to be edited; generating a picture sticker set according to the at least one picture characteristic color and outputting the picture sticker set to a picture editing user interface for selection by the user, wherein the picture sticker set includes at least one sticker color and at least one sticker style; determining the target sticker color and the target sticker style selected by the user based on the picture editing user interface, and applying the target sticker color to the target sticker style to generate a target sticker; and adding the target sticker to the picture to be edited to generate a target picture. With the embodiments of the application, the background color of a sticker can be changed according to the characteristic colors of the picture, color conflicts between the sticker and the picture can be avoided, and the applicability is high.

Description

Picture editing method, device, equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a picture editing method, apparatus, device, and storage medium.
Background
With the development of image processing technology, people can edit pictures very conveniently in daily life. When editing a picture, people often add stickers to highlight its content or to make it more interesting and attractive. In the course of research and practice, the inventors of the present application found that in the prior art the stickers added to a picture are fixed, and their colors often clash with the colors of the picture, making the edited picture look cluttered; the applicability of such stickers is therefore poor.
Disclosure of Invention
The embodiments of the application provide a picture editing method, apparatus, device, and storage medium, which allow the background color of a sticker to be changed according to the characteristic colors of a picture, avoid color conflicts between the sticker and the picture, and have high applicability.
In a first aspect, an embodiment of the present application provides a method for editing a picture, where the method includes:
acquiring a picture to be edited, and extracting at least one picture characteristic color from the picture to be edited;
generating a picture sticker set according to the at least one picture characteristic color, and outputting the picture sticker set to a picture editing user interface for selection by a user, wherein the picture sticker set comprises at least one sticker color and at least one sticker style;
determining a target sticker color and a target sticker style selected by the user based on the picture editing user interface, and applying the target sticker color to the target sticker style to generate a target sticker;
and adding the target sticker to the picture to be edited to generate a target picture.
With reference to the first aspect, in a possible implementation manner, the generating a picture sticker set according to the at least one picture characteristic color includes:
applying the at least one picture characteristic color, as a sticker color, to at least one preset sticker style to obtain at least one candidate picture sticker, and generating the picture sticker set according to the at least one candidate picture sticker;
wherein a candidate picture sticker in the picture sticker set is obtained by combining one sticker color with one sticker style.
The picture sticker set further comprises at least one preset sticker color, and the at least one preset sticker color is used for being combined with the at least one sticker style to obtain at least one candidate picture sticker.
With reference to the first aspect, in a possible implementation manner, the determining, based on the picture editing user interface, the target sticker color and the target sticker style selected by the user includes:
acquiring a user operation instruction based on the picture editing user interface, and determining the target sticker color and the target sticker style selected by the user according to the user operation instruction.
With reference to the first aspect, in a possible implementation manner, the extracting at least one picture characteristic color from the picture to be edited includes:
and determining at least two initial characteristic colors from the picture to be edited, and determining the color weight of each initial characteristic color in the at least two initial characteristic colors.
And determining at least one initial characteristic color from the at least two initial characteristic colors as the picture characteristic color of the picture to be edited according to the color weight of each initial characteristic color and the color distance between the initial characteristic colors.
With reference to the first aspect, in a possible implementation manner, the determining at least two initial characteristic colors from the picture to be edited includes:
and acquiring at least two pixel points with the color saturation greater than or equal to a preset saturation threshold from the picture to be edited as an initial clustering center.
And clustering the pixel points in the picture to be edited according to the initial clustering centers to obtain the colors of at least two clustering center pixel points, and taking the colors of the at least two clustering center pixel points as at least two initial characteristic colors.
With reference to the first aspect, in one possible implementation manner, the determining, from the at least two initial characteristic colors, at least one initial characteristic color as the picture characteristic color of the picture to be edited according to the color weight of each initial characteristic color and the color distance between the initial characteristic colors includes:
at least one weighting parameter of each initial characteristic color is determined, and a weighting coefficient of each weighting parameter of each initial characteristic color is determined, wherein the weighting parameters comprise color area, saturation, brightness or hue.
And calculating the color weight of each initial characteristic color according to the weighting parameters of each initial characteristic color and the weighting coefficient of each weighting parameter, and sorting the initial characteristic colors in descending order of their color weights.
And calculating the color distances between the initial characteristic colors, and for any two initial characteristic colors whose color distance is smaller than a preset color distance threshold, removing the one with the lower color weight, to obtain at least one initial characteristic color as a picture characteristic color of the picture to be edited.
In a second aspect, an embodiment of the present application provides a picture editing apparatus, including:
the picture acquisition module is used for acquiring a picture to be edited;
the color processing module is used for extracting at least one characteristic color from the picture to be edited;
the sticker set module is used for generating a picture sticker set according to the at least one picture characteristic color;
the sticker display module is used for outputting the picture sticker set to a picture editing user interface for selection by a user, determining a target sticker color and a target sticker style selected by the user, and applying the target sticker color to the target sticker style to generate a target sticker;
and the picture generation module is used for adding the target sticker to the picture to be edited to generate a target picture.
With reference to the second aspect, in a possible implementation manner, the sticker set module includes a characteristic color filling unit, configured to apply the at least one picture characteristic color, as a sticker color, to at least one preset sticker style to obtain at least one candidate picture sticker, and to generate the picture sticker set according to the at least one candidate picture sticker;
wherein a candidate picture sticker in the picture sticker set is obtained by combining one sticker color with one sticker style.
With reference to the second aspect, in a possible implementation manner, the sticker set module further includes a preset color filling unit, configured to combine at least one preset sticker color with the at least one sticker style to obtain at least one candidate picture sticker.
With reference to the second aspect, in a possible implementation manner, the sticker display module further includes a sticker confirmation unit, configured to obtain a user operation instruction, and determine the target sticker color and the target sticker style selected by the user according to the user operation instruction.
In combination with the second aspect, in a possible implementation manner, the color processing module further includes an initial clustering unit, configured to obtain, from the to-be-edited picture, at least two pixel points whose color saturation is greater than or equal to a preset saturation threshold as initial clustering centers, cluster the pixel points in the to-be-edited picture according to the initial clustering centers to obtain colors of the at least two clustering center pixel points, and use the colors of the at least two clustering center pixel points as at least two initial characteristic colors.
With reference to the second aspect, in a possible implementation manner, the color processing module further includes a color sorting unit and a color filtering unit, where:
the color sorting unit is configured to determine at least one weighting parameter of each initial characteristic color and a weighting coefficient of each weighting parameter, where the weighting parameters include color area, saturation, lightness, or hue, to calculate the color weight of each initial characteristic color from its weighting parameters and their weighting coefficients, and to sort the initial characteristic colors in descending order of their color weights;
the color screening unit is configured to calculate the color distances between the initial characteristic colors, and for any two initial characteristic colors whose color distance is smaller than a preset color distance threshold, to remove the one with the lower color weight, so as to obtain at least one initial characteristic color as a picture characteristic color of the picture to be edited.
In a third aspect, an embodiment of the present application provides a picture editing apparatus, which includes a processor and a memory, where the processor and the memory are connected to each other. The memory is configured to store a computer program that supports the terminal to execute the method provided by the first aspect and/or any one of the possible implementation manners of the first aspect, where the computer program includes program instructions, and the processor is configured to call the program instructions to execute the method provided by the first aspect and/or any one of the possible implementation manners of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, where the computer program is executed by a processor to implement the method provided by the first aspect and/or any one of the possible implementation manners of the first aspect.
In the embodiments of the application, the background color of a sticker can be edited by extracting the characteristic colors of the picture to be edited, so that the sticker background blends better with the picture, color conflicts between the sticker background and the picture are avoided, and a cluttered result is prevented; this enhances practicability while increasing the attractiveness and flexibility of picture editing.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can also be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a picture editing method according to an embodiment of the present application;
fig. 3 is a scene schematic diagram of picture editing according to an embodiment of the present application;
fig. 4 is an interface schematic diagram of a picture editing method provided in the embodiment of the present application;
fig. 5 is another interface schematic diagram of a picture editing method provided in an embodiment of the present application;
FIG. 6 is a schematic flow chart of the image feature color extraction provided in the embodiment of the present application;
fig. 7 is another schematic flowchart of a picture editing method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a picture editing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of another picture editing apparatus provided in the embodiment of the present application;
fig. 10 is a schematic structural diagram of an apparatus provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application. As shown in fig. 1, the network architecture may include a cloud server 2000 and a user terminal cluster. The user terminal cluster may include a plurality of user terminals, as shown in fig. 1, specifically a user terminal 3000a, a user terminal 3000b, ..., and a user terminal 3000n. As shown in fig. 1, the user terminal 3000a, the user terminal 3000b, ..., and the user terminal 3000n may each establish a data connection with the cloud server 2000 under certain data interaction conditions, so as to be able to perform network connection with the cloud server 2000.
For ease of understanding, in this embodiment one user terminal may be selected from the plurality of user terminals shown in fig. 1 as the target user terminal; for example, the user terminal 3000a shown in fig. 1 may be used as the target user terminal. The target user terminal may be an intelligent terminal with a picture editing function, such as a smartphone, a tablet computer, a desktop computer, or a smart television. Each terminal can edit pictures in the client of an application that includes a picture editing function, where the pictures referred to in picture editing may be network pictures downloaded from the Internet, pictures stored in a social application, pictures stored in the user's album application, or pictures taken by a camera (e.g., a rear camera) of the terminal.
The application and picture targeted by a picture editing operation may differ from terminal to terminal and are determined by the user behavior on that terminal; the user behavior may be operations such as clicking, shooting, browsing, or downloading performed by the user at the current moment in the client of an application with a picture editing function.
Referring to fig. 2, fig. 2 is a flow chart illustrating a picture editing method according to an embodiment of the present disclosure. The picture editing method shown in fig. 2 includes the steps of:
s101: and acquiring a picture to be edited, and extracting at least one picture characteristic color from the picture to be edited.
Referring to fig. 3, fig. 3 is a schematic view of a picture editing scene according to an embodiment of the present application. As shown in fig. 3, the scene takes the terminal 3000a in the embodiment corresponding to fig. 1 as an example. Applications such as application A, application B, and application C may be displayed to the user on the terminal; the user can access any of these applications by clicking on it, and in the process of accessing it trigger a picture editing operation on any picture in that application. Optionally, when the user triggers a picture editing operation on a picture in application A, application B, or application C, the picture the user attempts to acquire is designated as the picture to be edited, and the picture editing operation is performed on it.
In some feasible embodiments, after the terminal obtains the picture to be edited, the terminal may analyze the picture in a preset feature space, for example, analyze the color vector of each pixel point of the picture in the RGB color space, to obtain a feature vector for each pixel point in the picture to be edited. A preset clustering algorithm is then used, with a plurality of clustering centers generated randomly among the pixel points of the picture to be edited; for convenience of description, this embodiment takes the k-means algorithm with 16 clustering centers as an example. The terminal can calculate the distance from each pixel point to each clustering center and assign each pixel point to the category represented by its closest clustering center. The terminal can then compute the arithmetic mean of the pixel points in each category and mark the pixel point closest to that mean as the new clustering center of the category. The terminal checks whether any clustering center has changed; if so, all pixel points are clustered again with the new clustering centers, until no clustering center changes or the distance between each new clustering center and the old one is smaller than a set threshold, at which point the algorithm terminates with 16 final clustering centers, and the colors represented by the clustering-center pixel points are extracted as the picture characteristic colors of the picture to be edited. The distance from a pixel point to a clustering center can be computed with the Euclidean distance, Manhattan distance, Chebyshev distance, cosine distance, or other formulas, as determined by the actual application scenario, without limitation here.
In some feasible implementations, because a picture to be edited that is too large makes the clustering process too slow, the resolution of the picture to be edited can be reduced: the terminal down-samples the picture to a preset size before extracting the characteristic colors. This improves the speed and efficiency of the clustering algorithm without affecting the characteristic-color extraction result. A minimal sketch of this step follows.
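The clustering step above can be sketched in Python as follows, assuming Pillow and scikit-learn as illustrative library choices; note that scikit-learn's k-means keeps the centroid mean directly rather than snapping to the nearest pixel as described above, so this is a simplified sketch of the idea rather than the exact procedure of the embodiment, and the file name is hypothetical.

# Simplified sketch of S101: downsample the picture, run k-means with 16
# clusters over RGB pixel vectors, and read the cluster-center colors back as
# candidate characteristic colors. Library choices are assumptions.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

def extract_initial_colors(path, max_side=256, n_colors=16):
    img = Image.open(path).convert("RGB")
    img.thumbnail((max_side, max_side))           # downsample to speed up clustering
    pixels = np.asarray(img, dtype=np.float32).reshape(-1, 3)
    km = KMeans(n_clusters=n_colors, n_init=4, random_state=0).fit(pixels)
    return km.cluster_centers_.astype(np.uint8)   # 16 RGB cluster-center colors

if __name__ == "__main__":
    print(extract_initial_colors("photo_to_edit.jpg"))   # hypothetical file name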
S102: and generating a picture paster set according to the at least one picture characteristic color, and outputting the picture paster set to a picture editing user interface for a user to select.
In some possible implementations, referring to fig. 4, fig. 4 is an interface schematic diagram of the picture editing method provided in an embodiment of the present application. When the user accesses application A to obtain a picture to be edited, the picture characteristic colors are extracted from the picture to be edited and the picture sticker set is generated; the terminal can then generate a request asking application A to display the generated picture sticker set, so that the picture sticker set is presented on the picture editing user interface for the user to select from. The terminal can determine the target sticker color and target sticker style selected by the user by collecting the user's click operations on the sticker set area 200c.
In some possible embodiments, the picture sticker set includes a plurality of candidate picture stickers, each formed from one of at least one sticker color and one of at least one sticker style. That is, the picture sticker set may include candidate picture stickers formed by applying the picture characteristic colors to the same sticker style, or candidate picture stickers formed by applying the picture characteristic colors to several sticker styles. The terminal may determine the target sticker style selected by the user by collecting the user's click operations on the sticker-style portion of the sticker set area 200c. Likewise, the terminal may capture the user's click operations on the sticker-color portion of the sticker set area 200c, as shown by the solid line portion in fig. 4, to determine the target sticker color selected by the user. As the user switches the clicked sticker color, the color of the picture stickers in the sticker set area 200c changes accordingly, as shown by the dotted line portion in fig. 4.
In some possible embodiments, after the characteristic colors of the picture to be edited are extracted in response to a picture editing operation triggered by the user, the extracted picture characteristic colors may be too close to the colors of the picture to be edited and therefore unsuitable as sticker colors. The terminal may therefore configure a sticker-color parameter in the request for generating the picture sticker set sent by application A, so that the sticker colors also include some commonly used colors as preset sticker colors. Specifically, the terminal may obtain the request for generating the picture sticker set sent by application A, determine the sticker-color generation request carried in it, determine from that request the preset sticker colors to be called, and display the preset sticker colors in the sticker set area 200c. That is, the sticker colors may include the picture characteristic colors as well as preset sticker colors. The terminal can determine the target sticker color and target sticker style selected by the user by collecting the user's click operations on the sticker set area 200c. A sketch of assembling such a candidate sticker set is given below.
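As an illustration of how the candidate sticker set might be assembled, the following sketch pairs every extracted characteristic color, plus a few preset colors, with every sticker style; the names CandidateSticker and build_sticker_set and the example preset colors are assumptions for the sketch, not part of the described method.

# Sketch of S102: every characteristic color plus a few preset colors is
# paired with every sticker style to form the candidate picture stickers.
from dataclasses import dataclass
from itertools import product

@dataclass
class CandidateSticker:
    style: str       # identifier of the sticker style/template
    color: tuple     # (R, G, B) background color applied to the style

PRESET_COLORS = [(255, 255, 255), (0, 0, 0)]   # example "commonly used" preset colors

def build_sticker_set(feature_colors, styles):
    colors = list(feature_colors) + PRESET_COLORS
    return [CandidateSticker(style=s, color=tuple(c)) for s, c in product(styles, colors)]

# Example usage with two assumed characteristic colors and two assumed styles:
candidates = build_sticker_set([(210, 90, 60), (40, 120, 200)], ["label", "banner"])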
S103: and determining the target sticker color and the target sticker style selected by the user based on the picture editing user interface, and applying the target sticker color selection to the target sticker style to generate a target sticker.
In some possible implementations, when the user selects the target sticker color and the target sticker style by clicking the sticker set area and the target sticker is generated, the display interface of the terminal is as shown in fig. 5, where fig. 5 is another interface schematic diagram of the picture editing method provided in an embodiment of the present application.
In some possible embodiments, after the target sticker color and the target sticker style are selected, if the target sticker style is one to which text can be added, the text editing area 400b may be output to the picture editing user interface displayed by the terminal for the user to edit. The user can select the sticker text color and input the sticker text by clicking the text editing area 400b. The terminal can detect the user operation instruction to determine the sticker text color selected by the user and/or the sticker text input by the user, generate the corresponding sticker text, and display the corresponding target sticker in the sticker adjustment area 400c.
In some possible embodiments, the terminal may obtain the sticker text color selected by the user by collecting the user's operations on the text editing area 400b. The selectable sticker text colors include the picture characteristic colors and preset candidate text colors. The terminal may determine the sticker text color selected by the user by collecting the user's click operations on the text-color portion of the text editing area 400b, as shown by the solid line portion in fig. 5; as the user switches the clicked text color, the terminal changes the text color of the picture sticker accordingly, as shown by the dotted line portion in fig. 5. Alternatively, the text color may be generated automatically according to the lightness of the sticker color selected by the user. For example, when the lightness of the sticker color is greater than or equal to a preset value, the sticker color is relatively bright and the text color is automatically set to black; when the lightness of the sticker color is smaller than the preset value, the sticker color is relatively dark and the text color is automatically set to white. A minimal sketch of this rule is shown below.
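A minimal sketch of the lightness rule above, assuming the HSV value component in the 0-1 range and an example threshold of 0.5 (the embodiment does not fix a specific value):

# Sketch of the automatic text-color rule: bright sticker color -> black text,
# dark sticker color -> white text. The 0.5 threshold is an assumed example.
import colorsys

def auto_text_color(sticker_rgb, value_threshold=0.5):
    r, g, b = (c / 255.0 for c in sticker_rgb)
    _, _, v = colorsys.rgb_to_hsv(r, g, b)
    return (0, 0, 0) if v >= value_threshold else (255, 255, 255)

print(auto_text_color((250, 240, 200)))   # bright background -> black text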
S104: and adding the target paster to the picture to be edited to generate a target picture.
As shown in fig. 5, in some possible embodiments, after the terminal generates the target sticker, the user may adjust it on the display interface, for example by enlarging, shrinking, mirroring, or rotating it. The user operation instruction is obtained based on the picture editing user interface: if the user needs to adjust the target sticker during picture editing, the user can select the target sticker to be adjusted by clicking the sticker adjustment area 400c in the to-be-edited picture display area 400a, and enlarge, shrink, mirror, or rotate the target sticker in the corresponding area. After detecting the user-triggered adjustment of the target sticker, the terminal may generate a request asking application A to adjust the target sticker. After the user confirms that the adjustment is finished, a request confirming the generation of the target picture can be sent to application A by clicking confirm, and the target picture is generated. A compositing sketch is given below.
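The adjustment and compositing step could look roughly like the following sketch, which scales, mirrors, and rotates an RGBA sticker and alpha-composites it onto the picture; Pillow and the function name apply_sticker are illustrative assumptions, since the embodiment does not specify a compositing API.

# Sketch of S104: transform the sticker image and paste it onto the picture.
from PIL import Image, ImageOps

def apply_sticker(base_path, sticker_path, position, scale=1.0, angle=0.0, mirror=False):
    base = Image.open(base_path).convert("RGBA")
    sticker = Image.open(sticker_path).convert("RGBA")
    if mirror:
        sticker = ImageOps.mirror(sticker)                 # mirror the sticker
    w, h = sticker.size
    sticker = sticker.resize((int(w * scale), int(h * scale)))  # enlarge or shrink
    sticker = sticker.rotate(angle, expand=True)           # rotate by the chosen angle
    base.alpha_composite(sticker, dest=position)           # position = (x, y) in the base picture
    return base.convert("RGB")                             # the target picture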
In the embodiments of the application, the background color of a sticker can be edited by extracting the characteristic colors of the picture to be edited, so that the sticker background blends better with the picture, color conflicts between the sticker background and the picture are avoided, and a cluttered result is prevented; this enhances practicability while increasing the attractiveness and flexibility of picture editing.
The following describes in detail the extraction of the characteristic color of the picture in the picture editing method provided by the present application with reference to fig. 6. Referring to fig. 6, fig. 6 is a schematic flow chart of extracting a characteristic color of a picture according to an embodiment of the present application. The method for extracting the characteristic color of the picture as shown in FIG. 6 comprises the following steps:
s201: at least two pixel points with the color saturation greater than or equal to a preset saturation threshold are obtained from the picture to be edited and serve as initial clustering centers.
In some feasible embodiments, after the terminal acquires the picture to be edited, analyzing the information of the picture to be edited in a preset feature space, for example, analyzing the color vector of each pixel point of the picture in the RGB color space, to obtain the feature vector of each pixel point in the picture to be edited. A preset clustering algorithm is utilized to randomly generate a plurality of clustering centers in the pixel points of the picture to be edited, and for convenience of expression, in this embodiment, a k-means algorithm and 16 clustering centers are taken as an example for description.
Since the characteristic colors finally extracted with randomly generated clustering centers have a certain randomness, in some feasible embodiments the clustering centers can be initialized deliberately to ensure that the colors extracted each time are stable.
In some feasible embodiments, the terminal can analyze the pixel information of the picture to be edited in different color spaces to optimize the selection of the initial clustering centers. For example, because the vividness of a pixel's color is difficult to quantify in the RGB color space, the HSV (Hue, Saturation, Value) color space may be used instead: the terminal sorts the pixel points of the picture to be edited by saturation, selects the saturation of the 16th pixel point as the saturation threshold, and takes the pixel points whose saturation is greater than or equal to the threshold as the initial clustering centers, so that the most vividly colored pixel points in the picture to be edited serve as the initial clustering centers.
In some feasible embodiments, the terminal may also set a saturation threshold according to practical applications, and select 16 pixel points of which the saturation is greater than or equal to the threshold in the picture to be edited as initial clustering centers, so as to use the pixel points with the most vivid color in the picture to be edited as the initial clustering centers.
In some feasible implementation manners, besides analyzing color characteristics such as hue, saturation and lightness of each pixel point of the picture to be edited, the terminal can also add information such as spatial position of the pixel points to jointly construct a feature space, and increase the feature vector dimension of the pixel points, so that the clustering result is more accurate.
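A sketch of this saturation-based initialization is given below, under the assumption that the seed colors are then passed to k-means as its initial centers (for example KMeans(init=centers, n_init=1) in scikit-learn); the helper name and library choices are illustrative.

# Sketch of S201: take the 16 most saturated pixel colors as initial
# clustering centers so the most vivid colors seed the clustering.
import numpy as np
from PIL import Image

def init_cluster_centers(path, k=16):
    rgb_img = Image.open(path).convert("RGB")
    rgb = np.asarray(rgb_img).reshape(-1, 3)
    sat = np.asarray(rgb_img.convert("HSV")).reshape(-1, 3)[:, 1]
    order = np.argsort(sat)[::-1]              # most saturated pixels first
    return rgb[order[:k]].astype(np.float32)   # pass to KMeans(init=centers, n_init=1)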
S202: and clustering the pixel points in the picture to be edited according to the initial clustering center to obtain the color of the pixel points in the clustering center, and taking the color of the pixel points in the clustering center as an initial characteristic color.
In some feasible embodiments, the terminal may calculate the distance from each pixel point in the picture to be edited to each clustering center and assign the pixel point to the category represented by its closest clustering center. The arithmetic mean of the pixel points in each category is then computed in turn, and the pixel point closest to that mean in each category is marked as the new clustering center. The terminal can check whether any clustering center has changed; if so, all pixel points are clustered again with the new clustering centers, until no clustering center changes or the distance between each new clustering center and the old one is smaller than a set threshold, at which point the algorithm terminates with 16 final clustering centers, and the colors represented by the clustering-center pixel points are extracted as the initial characteristic colors of the picture to be edited. The distance from a pixel point to a clustering center can be computed with the Euclidean distance, Manhattan distance, Chebyshev distance, cosine distance, or other formulas, as determined by the actual application scenario, without limitation here.
S203: at least one weighting parameter for each of the initial characteristic colors is determined, and a weighting coefficient for each weighting parameter for each of the initial characteristic colors is determined.
In some possible embodiments, after obtaining the initial characteristic color, the terminal may obtain a more desirable characteristic color by weighting the initial characteristic color, for example, obtain a characteristic color with a larger color block area or a characteristic color with a higher brightness in the picture to be edited. Illustratively, the hue weight coefficient may be set to 2, the saturation weight coefficient to 1, the lightness weight coefficient to 0.5, and the area weight coefficient to 5000.
S204: and calculating the color weight of each initial characteristic color according to the weighting parameter of each initial characteristic color and the weighting coefficient of each weighting parameter, and sequencing the initial characteristic colors according to the order of the color weight of each initial characteristic color from high to low.
In some possible embodiments, after obtaining the weighting parameters and the weighting coefficients, the terminal may calculate the color weight of each initial characteristic color according to a preset weighting formula. For example, the color weights of the initial characteristic colors are calculated according to formula one, and the 16 initial characteristic colors are sorted according to the order of the obtained color weights from high to low.
Formula one: color weight = area weight coefficient × area ratio + saturation weight coefficient × saturation + lightness weight coefficient × lightness + hue weight coefficient × hue.
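As an illustration, formula one with the example coefficients above can be computed as follows; treating hue, saturation, and lightness as HSV components in the 0-1 range and the area ratio as the fraction of the picture's pixels in the cluster is an assumption made for the sketch.

# Sketch of S204: compute the color weight of one initial characteristic color
# using the example coefficients (hue 2, saturation 1, lightness 0.5, area 5000).
import colorsys

COEFF = {"hue": 2.0, "saturation": 1.0, "lightness": 0.5, "area": 5000.0}

def color_weight(rgb, area_ratio, coeff=COEFF):
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    return (coeff["area"] * area_ratio + coeff["saturation"] * s
            + coeff["lightness"] * v + coeff["hue"] * h)

# Rank the 16 initial characteristic colors from high to low weight:
# ranked = sorted(zip(colors, areas), key=lambda ca: color_weight(*ca), reverse=True)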
S205: and calculating the color distance between the initial characteristic colors, and removing the initial characteristic color with smaller color weight from the two initial characteristic colors with the color distance smaller than a preset color distance threshold value to obtain at least one initial characteristic color as the picture characteristic color of the picture to be edited.
In some possible embodiments, after the terminal sorts the 16 initial characteristic colors from high to low according to the color weights, in order to make the picture characteristic colors representative and more comprehensive, the initial characteristic colors may be screened and similar colors may be removed, so as to obtain the picture characteristic colors.
In some possible embodiments, the terminal may calculate the color distances between the 16 initial characteristic colors, select the first-ranked initial characteristic color as the first picture characteristic color, and remove from the remaining 15 initial characteristic colors those whose color distance to it is smaller than the threshold; the second-ranked remaining initial characteristic color is then taken as the second picture characteristic color, and the initial characteristic colors whose color distance to it is smaller than the threshold are removed from the rest, and so on, until a preset number of picture characteristic colors is obtained or all initial characteristic colors have been filtered. A minimal sketch of this filtering follows.
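The filtering described above can be sketched as a greedy pass over the weight-ranked colors; the Euclidean RGB distance, the threshold of 60, and the cap of 5 kept colors are assumed example values, not taken from the embodiment.

# Sketch of S205: keep a color only if it is far enough from every color already kept.
import numpy as np

def filter_similar(ranked_colors, dist_threshold=60.0, max_colors=5):
    kept = []
    for color in ranked_colors:                      # ranked from high to low color weight
        c = np.asarray(color, dtype=np.float32)
        if all(np.linalg.norm(c - np.asarray(k, dtype=np.float32)) >= dist_threshold
               for k in kept):
            kept.append(tuple(color))
        if len(kept) == max_colors:
            break
    return kept                                      # the picture characteristic colors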
Further, please refer to fig. 7, and fig. 7 is another schematic flow chart of the picture editing method according to the embodiment of the present application. The picture editing method shown in fig. 7 includes the steps of:
s301: at least two pixel points with the color saturation greater than or equal to a preset saturation threshold are obtained from the picture to be edited and serve as initial clustering centers.
S302: and clustering the pixel points in the picture to be edited according to the initial clustering center to obtain the color of the pixel points in the clustering center, and taking the color of the pixel points in the clustering center as an initial characteristic color.
S303: at least one weighting parameter for each of the initial characteristic colors is determined, and a weighting coefficient for each weighting parameter for each of the initial characteristic colors is determined.
S304: and calculating the color weight of each initial characteristic color according to the weighting parameter of each initial characteristic color and the weighting coefficient of each weighting parameter, and sequencing the initial characteristic colors according to the order of the color weight of each initial characteristic color from high to low.
S305: and calculating the color distance between the initial characteristic colors, and removing the initial characteristic color with smaller color weight from the two initial characteristic colors with the color distance smaller than a preset color distance threshold value to obtain at least one initial characteristic color as the picture characteristic color of the picture to be edited.
The implementation manner executed in steps S301 to S305 can refer to steps S201 to S205, which are not described herein again.
S306: and generating a picture paster set according to the at least one picture characteristic color, and outputting the picture paster set to a picture editing user interface for a user to select.
In some possible embodiments, when the user obtains a picture to be edited by accessing application A, the picture characteristic colors are extracted from the picture to be edited and the picture sticker set is generated; the terminal may then generate a request asking application A to display the generated picture sticker set, so that the picture sticker set is presented on the picture editing user interface for the user to select from. The terminal can determine the target sticker color and target sticker style selected by the user by collecting the user's click operations on the sticker set area 200c.
In some possible embodiments, the picture sticker set includes a plurality of candidate picture stickers, each formed from one of at least one sticker color and one of at least one sticker style. That is, the picture sticker set may include candidate picture stickers formed by applying the picture characteristic colors to the same sticker style, or candidate picture stickers formed by applying the picture characteristic colors to several sticker styles. The terminal can determine the target sticker style selected by the user by collecting the user's click operations on the sticker-style portion of the sticker set area, and the target sticker color by collecting the user's click operations on the sticker-color portion of the sticker set area, as shown by the solid line portion in fig. 4. As the user switches the clicked sticker color, the color of the picture stickers in the sticker set area changes accordingly, as shown by the dotted line portion in fig. 4.
In some possible embodiments, after the characteristic colors of the picture to be edited are extracted in response to a picture editing operation triggered by the user, the extracted picture characteristic colors may be too close to the colors of the picture to be edited and therefore unsuitable as sticker colors. The terminal may therefore configure a sticker-color parameter in the request for generating the picture sticker set sent by application A, so that the sticker colors also include some commonly used colors as preset sticker colors. Specifically, the terminal may obtain the request for generating the picture sticker set sent by application A, determine the sticker-color generation request carried in it, determine from that request the preset sticker colors to be called, and display the preset sticker colors in the sticker set area. That is, the sticker colors may include the picture characteristic colors as well as preset sticker colors. The terminal can determine the target sticker color and target sticker style selected by the user by collecting the user's click operations on the sticker set area.
S307: and determining the target sticker color and the target sticker style selected by the user based on the picture editing user interface, and applying the target sticker color selection to the target sticker style to generate a target sticker.
In some possible embodiments, after the target sticker color and the target sticker style are selected, if the target sticker style is one to which text can be added, the text editing area may be output to the picture editing user interface displayed by the terminal for the user to edit. The user can select the text color and input the text by clicking the text editing area. The terminal can detect the user operation instruction to determine the sticker text color selected by the user and/or the sticker text input by the user, generate the corresponding sticker text, and display the corresponding target sticker in the sticker adjustment area.
In some possible embodiments, the terminal may obtain the sticker text color selected by the user by collecting the user's operations on the text editing area. The selectable sticker text colors include the picture characteristic colors and preset candidate text colors. The terminal may determine the sticker text color selected by the user by collecting the user's click operations on the text-color portion of the text editing area, as shown by the solid line portion in fig. 5; as the user switches the clicked text color, the text color of the picture sticker changes accordingly, as shown by the dotted line portion in fig. 5. Alternatively, the text color may be generated automatically according to the lightness of the sticker color selected by the user. For example, when the lightness of the sticker color is greater than or equal to a preset value, the sticker color is relatively bright and the text color is automatically set to black; when the lightness of the sticker color is smaller than the preset value, the sticker color is relatively dark and the text color is automatically set to white.
S308: and adding the target paster to the picture to be edited to generate a target picture.
In some possible embodiments, after the terminal generates the target sticker, the user may adjust it on the display interface, for example by enlarging, shrinking, mirroring, or rotating it. If the user needs to adjust the target sticker during picture editing, the user can select the target sticker to be adjusted by clicking the sticker adjustment area 400c in the to-be-edited picture display area 400a, and enlarge, shrink, mirror, or rotate the target sticker in the corresponding area; after detecting the user-triggered adjustment of the target sticker, the terminal generates a request asking application A to adjust the target sticker. After the user confirms that the adjustment is finished, a request confirming the generation of the target picture can be sent to application A by clicking confirm, and the target picture is generated.
In the embodiments of the application, the picture can be compressed before the characteristic colors of the picture to be edited are extracted, which speeds up clustering; initializing the clustering centers improves the accuracy of extracting the initial characteristic colors and favors vivid colors, which better matches how human eyes perceive the picture to be edited; and removing similar colors from the initial characteristic colors makes the picture characteristic colors representative and more comprehensive. This enhances practicability while preventing a cluttered interface or cumbersome operation caused by displaying too many characteristic colors to the user.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a picture editing apparatus according to an embodiment of the present application. For convenience of description, the picture editing apparatus shown in fig. 8 is referred to as a terminal in this embodiment, and includes:
the image obtaining module 401 is configured to obtain an image to be edited.
Applications such as application A, application B, and application C are displayed to the user on the terminal; the user can access any of these applications by clicking on it, and in the process of accessing it trigger a picture editing operation on any picture in that application. Optionally, when the user triggers a picture editing operation on a picture in application A, application B, or application C, the picture the user attempts to acquire is designated as the picture to be edited, and the picture editing operation is performed on it.
In some feasible embodiments, because a picture to be edited that is too large makes the clustering process too slow, the resolution of the picture to be edited can be reduced by the picture acquisition module 401: the picture is down-sampled to a preset size before the characteristic colors are extracted. This improves the speed and efficiency of the clustering algorithm without changing the characteristic-color extraction result.
A color processing module 402, configured to extract at least one characteristic color from the picture to be edited.
In some feasible embodiments, after the picture to be edited is acquired by the picture acquisition module 401, the color processing module 402 analyzes the picture in a preset feature space, for example, the color vector of each pixel point in the RGB color space, to obtain a feature vector for each pixel point in the picture to be edited. A preset clustering algorithm is then used, with a plurality of clustering centers generated randomly among the pixel points of the picture to be edited; for convenience of description, this embodiment takes the k-means algorithm with 16 clustering centers as an example. The distance from each pixel point to each clustering center is calculated and each pixel point is assigned to the category represented by its closest clustering center. The arithmetic mean of the pixel points in each category is then computed in turn, and the pixel point closest to that mean in each category is marked as the new clustering center. If any clustering center has changed, all pixel points are clustered again with the new clustering centers, until no clustering center changes or the distance between each new clustering center and the old one is smaller than a set threshold, at which point the algorithm terminates with 16 final clustering centers, and the colors represented by the clustering-center pixel points are extracted as the picture characteristic colors of the picture to be edited. The distance from a pixel point to a clustering center can be computed with the Euclidean distance, Manhattan distance, Chebyshev distance, cosine distance, or other formulas, as determined by the actual application scenario, without limitation here.
A sticker set module 403, configured to generate a picture sticker set according to the at least one picture characteristic color.
In some possible embodiments, the sticker set module 403 further includes a characteristic color filling unit 4031, configured to apply the at least one picture characteristic color, as a sticker color, to at least one preset sticker style to obtain at least one candidate picture sticker, and to generate the picture sticker set according to the at least one candidate picture sticker;
wherein a candidate picture sticker in the picture sticker set is obtained by combining one sticker color with one sticker style.
In some possible embodiments, the sticker set module 403 further includes a preset color filling unit 4032, configured to combine at least one preset sticker color with the at least one sticker style to obtain at least one candidate picture sticker.
When the user obtains a picture to be edited by accessing application A and the picture characteristic colors are obtained through the color processing module 402, the sticker set module 403 applies the picture characteristic colors, as sticker colors, to the background of the sticker styles to generate the picture sticker set, so that the picture sticker set is presented on the picture editing user interface for the user to select from.
In some possible embodiments, after the picture characteristic colors of the picture to be edited are extracted in response to a picture editing operation triggered by the user, the characteristic colors extractable from different pictures to be edited may be too similar and therefore unsuitable as sticker colors. The terminal may therefore configure a sticker-color parameter in the request for generating the picture sticker set sent by application A, so that the sticker colors also include some commonly used colors as preset sticker colors. Specifically, the terminal may obtain the request for generating the picture sticker set sent by application A, determine the sticker-color generation request carried in it, determine from that request the preset sticker colors to be called, and display those preset sticker colors in the sticker set area. That is, the sticker set module 403 may be used to generate candidate picture stickers in which the picture characteristic colors are applied to a single sticker style, or candidate picture stickers in which the picture characteristic colors are applied to multiple sticker styles.
A sticker display module 404, configured to output the picture sticker set to a picture editing user interface for selection by the user, determine the target sticker color and the target sticker style selected by the user, and apply the target sticker color to the target sticker style to generate a target sticker.
In some possible embodiments, the sticker display module 404 further includes a sticker confirmation unit 4041, configured to obtain a user operation instruction, and determine the target sticker color and the target sticker style selected by the user according to the user click operation instruction in the sticker collection area.
In some possible embodiments, the sticker display module 404 further includes a text generation unit, configured to obtain the sticker text color selected by the user from among the picture characteristic colors and preset candidate text colors, or to generate the text color automatically according to the lightness of the selected target sticker sample color. After the sticker display module 404 obtains, through the sticker confirmation unit 4041, the target sticker sample color and the target sticker style selected by the user, if the target sticker style is one to which text can be added, a text editing interface may be output to the picture editing user interface for the user to edit. The user can select the sticker text color and input the sticker text by clicking the text editing area. When the corresponding user operation instruction is detected, the sticker text color selected by the user and/or the sticker text input by the user are determined, so that the corresponding sticker text is generated, and the sticker display module 404 displays the corresponding target sticker in the sticker adjustment area.
In some possible embodiments, the sticker text color selected by the user may be obtained by collecting the user's click operation in the text editing area, the selectable sticker text colors including the picture characteristic colors and preset candidate text colors. Alternatively, the text color may be generated automatically according to the lightness of the sticker sample color selected by the user. For example, when the lightness of the sticker sample color is greater than or equal to a preset value, the sticker sample color is relatively bright and the text color is automatically determined to be black; when the lightness of the sticker sample color is less than the preset value, the sticker sample color is relatively dark and the text color is automatically determined to be white.
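A minimal sketch of the lightness rule just described, assuming the lightness of a sticker sample color is taken as its largest RGB channel (the HSV value) and the preset value is 0.5; both choices are assumptions for illustration, as the application does not fix the formula or the threshold.

    def auto_text_color(sticker_sample_rgb, preset_value=0.5):
        # Lightness here is the HSV "value" of the sticker sample color, scaled to [0, 1].
        lightness = max(sticker_sample_rgb) / 255.0
        # Bright sticker sample color -> black text; dark sticker sample color -> white text.
        return (0, 0, 0) if lightness >= preset_value else (255, 255, 255)

    # Example: a pale yellow background gets black text, a dark navy background gets white text.
    assert auto_text_color((250, 240, 180)) == (0, 0, 0)
    assert auto_text_color((20, 30, 80)) == (255, 255, 255)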
A picture generating module 405, configured to add the target sticker to the picture to be edited to generate a target picture.
In some possible embodiments, after the target sticker is generated, the user may adjust the target sticker on the display interface, for example by enlarging, shrinking, mirroring or rotating it. A user operation instruction is obtained based on the picture editing user interface: if the user needs to adjust a target sticker during picture editing, the target sticker to be adjusted can be selected by clicking the sticker adjustment area in the display area of the picture to be edited, and operations such as enlarging, shrinking, mirroring or rotating the target sticker can be performed in the corresponding area, so that after detecting the sticker adjustment operation triggered by the user, the terminal generates a request for adjusting the target sticker and sends it to application A. After the user confirms the adjustment of the target sticker, a request to confirm generation of the target picture can be sent to application A by clicking confirm, thereby generating the target picture.
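The enlarging, shrinking, mirroring and rotating operations mentioned above could be realized on the terminal roughly as in the following sketch using the Pillow library; the adjust_sticker name and its parameters are chosen here for illustration only.

    from PIL import Image, ImageOps

    def adjust_sticker(sticker, scale=1.0, mirror=False, angle=0.0):
        # Enlarge or shrink the target sticker.
        w, h = sticker.size
        out = sticker.resize((max(1, int(w * scale)), max(1, int(h * scale))))
        # Mirror the target sticker horizontally if requested.
        if mirror:
            out = ImageOps.mirror(out)
        # Rotate the target sticker, expanding the canvas so corners are not clipped.
        if angle:
            out = out.rotate(angle, expand=True)
        return out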
In the embodiment of the present application, the background color of the sticker is obtained by extracting the characteristic colors of the picture to be edited, so that the sticker background fits the picture to be edited more closely, avoiding color conflicts between the sticker background and the picture and the resulting visual clutter, and increasing both the attractiveness and the freedom of picture editing while enhancing practicability.
Referring to fig. 9, fig. 9 is another schematic structural diagram of a picture editing apparatus according to an embodiment of the present application. The picture editing apparatus shown in fig. 9 includes:
the picture obtaining module 501 is configured to obtain a picture to be edited, and the function of the picture obtaining module is similar to that of the picture obtaining module 401, which is not described herein again.
The color processing module 502 is configured to extract at least one characteristic color from the picture to be edited.
In some feasible embodiments, after the picture to be edited is obtained by the picture obtaining module 501, the color processing module 502 extracts the picture characteristic colors in the same manner as the color processing module 402 described above, i.e., by analyzing the feature vector of each pixel point of the picture to be edited in a preset feature space such as the RGB color space and clustering the pixel points with a preset clustering algorithm, for example k-means with 16 clustering centers, which is not described herein again.
Since the picture characteristic colors finally extracted by randomly generating the clustering centers have a certain randomness, in some possible embodiments the color processing module 502 further includes an initial clustering unit 5021 for initializing the clustering centers, so as to ensure that the colors extracted each time are stable.
In some feasible embodiments, the pixel point information of the picture to be edited can be analyzed in different color spaces so as to optimize the selection of the initial clustering centers. For example, because the vividness of a pixel point's color is difficult to obtain numerically in the RGB color space, the HSV (Hue, Saturation, Value) color space may be used to analyze the pixel points of the picture to be edited: the pixel points are sorted by saturation, the saturation of the 16th-ranked pixel point is selected as the saturation threshold, and the pixel points whose saturation is greater than or equal to this threshold are extracted as the initial clustering centers, so that the most vividly colored pixel points in the picture to be edited serve as the initial clustering centers.
In some feasible embodiments, the saturation threshold may instead be set according to the practical application, and 16 pixel points whose saturation is greater than or equal to the threshold are selected from the picture to be edited as the initial clustering centers, so that the most vividly colored pixel points in the picture to be edited serve as the initial clustering centers.
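A minimal sketch of this initialization, assuming Pillow, NumPy and scikit-learn: the 16 most saturated pixel points are taken as the initial clustering centers and passed to k-means. The function name is illustrative, and in practice duplicate initial centers among the top-saturation pixels would need to be filtered out.

    import numpy as np
    from PIL import Image
    from sklearn.cluster import KMeans

    def extract_colors_with_saturation_init(path, k=16):
        img = Image.open(path).convert("RGB")
        rgb = np.asarray(img, dtype=np.float64).reshape(-1, 3)
        # The HSV saturation channel measures how vivid each pixel point is.
        sat = np.asarray(img.convert("HSV"), dtype=np.float64).reshape(-1, 3)[:, 1]
        # The k most saturated pixel points become the initial clustering centers.
        centers = rgb[np.argsort(sat)[::-1][:k]]
        km = KMeans(n_clusters=k, init=centers, n_init=1).fit(rgb)
        return km.cluster_centers_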
In some feasible implementation manners, besides color characteristics such as the hue, saturation and lightness of each pixel point of the picture to be edited, information such as the spatial position of the pixel points can be added to jointly construct the feature space, increasing the dimension of the pixel feature vectors so that the clustering result is more accurate.
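As an illustration, the per-pixel feature vector could be extended with normalized spatial coordinates roughly as follows; the 0.5 position weight is an assumption, used only to keep position from dominating the color channels.

    import numpy as np
    from PIL import Image

    def pixel_features_with_position(path, position_weight=0.5):
        img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
        h, w, _ = img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Scale coordinates to the same 0..255 range as the color channels,
        # then damp them so position only gently influences the clustering.
        pos = np.stack([xs / max(w - 1, 1), ys / max(h - 1, 1)], axis=-1) * 255.0 * position_weight
        # Each pixel point now has a 5-dimensional feature vector: R, G, B, x, y.
        return np.concatenate([img, pos], axis=-1).reshape(-1, 5)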
In some possible embodiments, the color processing module 502 further includes a color sorting unit 5022, configured to determine at least one weighting parameter of each initial characteristic color and the weighting coefficient of each weighting parameter, where the weighting parameters include color area, saturation, lightness or hue; to calculate the color weight of each initial characteristic color according to at least two weighting parameters of that color and the weighting coefficients of those parameters; and to sort the initial characteristic colors in descending order of color weight.
In some possible embodiments, after obtaining the initial characteristic colors, the initial characteristic colors may be weighted by the color sorting unit 5022 to obtain more desirable characteristic colors, such as characteristic colors with larger color patch areas or characteristic colors with higher lightness in the picture to be edited. Illustratively, the hue weight coefficient may be set to 2, the saturation weight coefficient to 1, the lightness weight coefficient to 0.5, and the area weight coefficient to 5000.
In some possible embodiments, after obtaining the weighting parameters and the weighting coefficients, the color sorting unit 5022 may calculate the color weights of the initial characteristic colors according to a preset weighting formula. For example, the color weights of the initial characteristic colors are calculated according to formula two below, and the 16 initial characteristic colors are sorted in descending order of the obtained color weights.
Formula two: color weight = area weight coefficient × area ratio + saturation weight coefficient × saturation + lightness weight coefficient × lightness + hue weight coefficient × hue.
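Formula two could be written as the following sketch, using the illustrative coefficients given above (hue 2, saturation 1, lightness 0.5, area 5000); the value ranges of the inputs (area ratio in [0, 1] and the other parameters normalized to [0, 1]) are assumptions, as the application does not fix them.

    def color_weight(area_ratio, saturation, lightness, hue,
                     w_area=5000.0, w_sat=1.0, w_light=0.5, w_hue=2.0):
        # Formula two: a weighted sum of the weighting parameters of one initial characteristic color.
        return (w_area * area_ratio + w_sat * saturation
                + w_light * lightness + w_hue * hue)

    # Sort 16 initial characteristic colors, given as (area_ratio, saturation, lightness, hue)
    # tuples, in descending order of color weight.
    def sort_by_weight(initial_colors):
        return sorted(initial_colors, key=lambda c: color_weight(*c), reverse=True)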
In some possible embodiments, the color processing module 502 further includes a color filtering unit 5023, configured to calculate the color distances between the initial characteristic colors and, for any two initial characteristic colors whose color distance is smaller than a preset color distance threshold, remove the one with the smaller color weight, so as to obtain at least one initial characteristic color as a picture characteristic color of the picture to be edited.
In some possible embodiments, after the 16 initial characteristic colors are sorted from high to low according to their color weights by the color sorting unit 5022, the initial characteristic colors may be screened and similar colors removed by the color filtering unit 5023, so that the resulting picture characteristic colors are representative and more comprehensive.
In some possible embodiments, the color distances between the 16 initial characteristic colors may be calculated. The first-ranked initial characteristic color is selected as the first picture characteristic color, and any of the remaining 15 initial characteristic colors whose color distance to it is smaller than the threshold is removed; the initial characteristic color now ranked second is taken as the second picture characteristic color, and any remaining initial characteristic color whose color distance to it is smaller than the threshold is deleted; and so on, until a preset number of picture characteristic colors has been obtained or all initial characteristic colors have been screened, thereby obtaining the picture characteristic colors.
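A minimal sketch of this screening step: the colors are assumed to be RGB tuples already sorted by color weight, and the Euclidean distance in RGB space with a threshold of 60 stands in for the preset color distance threshold, which this application does not fix; the cap of 8 kept colors is likewise an assumption.

    import math

    def screen_similar_colors(ranked_colors, distance_threshold=60.0, max_colors=8):
        # ranked_colors: initial characteristic colors sorted by color weight, highest first.
        picture_colors = []
        for color in ranked_colors:
            # Keep a color only if it is far enough from every color already kept.
            if all(math.dist(color, kept) >= distance_threshold for kept in picture_colors):
                picture_colors.append(color)
            if len(picture_colors) == max_colors:
                break
        return picture_colors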
And a sticker collection module 503, configured to generate a picture sticker set according to the at least one picture characteristic color.
In some possible embodiments, the sticker collection module 503 further includes a characteristic color filling unit 5031, configured to apply the at least one picture characteristic color to at least one preset sticker sample color to obtain at least one alternative picture sticker, and to generate the picture sticker set according to the at least one alternative picture sticker;
wherein an alternative picture sticker in the picture sticker set is obtained by combining one sticker sample color with one sticker style.
In some possible embodiments, the sticker collection module 503 further includes a preset color filling unit 5032, configured to combine at least one preset sticker color with the at least one sticker style to obtain at least one alternative picture sticker.
When a user obtains a picture to be edited by accessing application A and a picture characteristic color has been obtained by the color processing module 502, the sticker collection module 503 applies the picture characteristic color, as a sticker sample color, to the background color of a sticker to generate the picture sticker set, which is presented in the picture editing user interface on the terminal for the user to select. In some possible embodiments, after the picture characteristic colors of the picture to be edited are extracted in response to a picture editing operation triggered by the user, the extractable picture characteristic colors of different pictures to be edited may be too similar to one another to be suitable as the sticker sample colors of a picture sticker. The terminal may therefore configure a sticker sample color parameter in the request for generating the picture sticker set sent by application A, so that the sticker sample colors also include some commonly used colors as preset sticker sample colors. Specifically, the terminal may obtain the request for generating the picture sticker set sent by application A, determine the sticker sample color identifier carried in that request, determine, according to the sticker sample color identifier, the preset sticker sample color data called by the request, and then display the preset sticker sample colors indicated by the preset sticker sample color data in the sticker collection area. That is, the sticker collection module 503 may be used to generate alternative picture stickers in which the picture characteristic colors are applied to one sticker style, or to generate alternative picture stickers in which the picture characteristic colors are applied to multiple sticker styles.
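As an illustration of applying a picture characteristic color to the background of a sticker style, the following sketch assumes the sticker style is an RGBA image whose transparent regions are where the background color should show through; the function name and this representation are assumptions for illustration, not part of this application.

    from PIL import Image

    def fill_sticker_background(sticker_style_rgba, characteristic_rgb):
        # Paint a solid background in the picture characteristic color,
        # then composite the sticker style pattern on top of it.
        background = Image.new("RGBA", sticker_style_rgba.size, tuple(characteristic_rgb) + (255,))
        return Image.alpha_composite(background, sticker_style_rgba)

    # Example: build one alternative picture sticker per extracted characteristic color.
    # style = Image.open("style.png").convert("RGBA")
    # candidates = [fill_sticker_background(style, c) for c in characteristic_colors]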
A sticker display module 504, configured to output the picture sticker set to the picture editing user interface for selection by the user, determine the target sticker sample color and the target sticker style selected by the user, and apply the target sticker sample color to the target sticker style to generate a target sticker.
In some possible embodiments, the sticker display module 504 further includes a sticker confirmation unit 5041, configured to obtain a user operation instruction and determine the target sticker sample color and the target sticker style selected by the user according to the user operation instruction. The sticker confirmation unit 5041 may determine the target sticker style selected by the user by collecting the user's click operation on the sticker style portion of the sticker collection area, and may determine the target sticker sample color selected by the user by collecting the user's click operation on the sticker sample color portion of the sticker collection area, as shown by the solid line portion in fig. 4. As the user switches the click operation on the sticker sample color portion, the colors of the picture stickers in the sticker collection area change accordingly, as shown by the dotted line portion in fig. 4.
In some possible embodiments, the sticker display module 504 further includes a text generation unit, configured to obtain the sticker text color selected by the user from among the picture characteristic colors and preset candidate text colors, or to generate the text color automatically according to the lightness of the selected target sticker sample color. After the sticker display module 504 obtains, through the sticker confirmation unit 5041, the target sticker sample color and the target sticker style selected by the user, if the target sticker style is one to which text can be added, a text editing interface may be output to the picture editing user interface for the user to edit. The user can select the sticker text color and input the sticker text by clicking the text editing area.
In some possible embodiments, the sticker text color selected by the user may be obtained by collecting the user's click operation in the text editing area, the selectable sticker text colors including the picture characteristic colors and preset candidate text colors. The text generation unit may determine the sticker text color selected by the user by collecting the user's click operation on the sticker text color portion of the text editing area, as shown by the solid line portion in fig. 5; as the user switches the click operation for selecting the sticker text color, the text color of the picture sticker changes accordingly, as shown by the dotted line portion in fig. 5. Alternatively, the text color may be generated automatically according to the lightness of the sticker sample color selected by the user. For example, when the lightness of the sticker sample color is greater than or equal to a preset value, the sticker sample color is relatively bright and the text color is automatically determined to be black; when the lightness of the sticker sample color is less than the preset value, the sticker sample color is relatively dark and the text color is automatically determined to be white.
And a picture generating module 505, configured to add the target sticker to the picture to be edited to generate a target picture.
In some possible embodiments, after the target sticker is generated, the user may adjust the target sticker on the display interface, for example by enlarging, shrinking, mirroring or rotating it, and then confirm generation of the target picture, in the same manner as described above for the picture generating module 405, which is not described herein again.
In the embodiment of the present application, the picture can be compressed before the characteristic colors of the picture to be edited are extracted, so as to speed up clustering; by initializing the clustering centers, the accuracy of extracting the initial characteristic colors is improved and their vividness is enhanced, which better matches how human eyes perceive the picture to be edited; and by removing similar colors among the initial characteristic colors, the picture characteristic colors become representative and more comprehensive. Practicability is thereby enhanced while preventing the interface from becoming cluttered, or the operation cumbersome, because too many characteristic colors are displayed to the user.
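A sketch of the compression step mentioned here, assuming Pillow; 256 pixels on the longer side is an illustrative bound, small enough to make clustering fast while leaving the dominant colors essentially unchanged.

    from PIL import Image

    def compress_for_color_extraction(path, max_side=256):
        # Downscale the picture to be edited before clustering so that far fewer
        # pixel points need to be processed; aspect ratio is preserved.
        img = Image.open(path).convert("RGB")
        img.thumbnail((max_side, max_side))
        return img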
Referring to fig. 10, fig. 10 is a schematic structural diagram of an apparatus provided in an embodiment of the present application. As shown in fig. 10, the apparatus 1000 in this embodiment may include a processor 1001, a network interface 1004 and a memory 1005, and may further include a user interface 1003 and at least one communication bus 1002. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and a keyboard (Keyboard), and optionally may also include a standard wired interface and a standard wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, for example at least one disk memory; optionally, it may also be at least one storage device located remotely from the processor 1001. As shown in fig. 10, the memory 1005, as a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the device 1000 shown in fig. 10, the network interface 1004 may provide network communication functions; the user interface 1003 is an interface for providing a user with input; and the processor 1001 may be used to invoke a device control application stored in the memory 1005 to implement:
and acquiring a picture to be edited, and extracting at least one picture characteristic color from the picture to be edited.
And generating a picture sticker set according to the at least one picture characteristic color, and outputting the picture sticker set to a picture editing user interface for selection by a user, wherein the picture sticker set includes at least one sticker sample color and at least one sticker style.
And determining the target sticker sample color and the target sticker style selected by the user based on the picture editing user interface, and applying the target sticker sample color to the target sticker style to generate a target sticker.
And adding the target sticker to the picture to be edited to generate a target picture.
In some possible embodiments, the processor 1001 is configured to generate a picture sticker set according to the at least one picture characteristic color, and includes:
and applying the at least one picture characteristic color to at least one preset sticker sample color to obtain at least one alternative picture sticker, and generating a picture sticker set according to the at least one alternative picture sticker.
Wherein an alternative picture sticker in the picture sticker set is obtained by combining one sticker sample color with one sticker style.
The picture sticker set further comprises at least one preset sticker color, and the at least one preset sticker color is used for being combined with the at least one sticker style to obtain at least one alternative picture sticker.
In some possible embodiments, the processor 1001 is further configured to determine the target sticker sample color and the target sticker style selected by the user based on the picture editing user interface, including:
And acquiring a user operation instruction based on the picture editing user interface, and determining the target sticker sample color and the target sticker style selected by the user according to the user operation instruction.
In some possible embodiments, the processor 1001 is further configured to extract at least one picture characteristic color from the picture to be edited, including:
and determining at least two initial characteristic colors from the picture to be edited, and determining the color weight of each initial characteristic color in the at least two initial characteristic colors.
And determining at least one initial characteristic color from the at least two initial characteristic colors as the picture characteristic color of the picture to be edited according to the color weight of each initial characteristic color and the color distance between the initial characteristic colors.
In some possible embodiments, the processor 1001 is further configured to determine at least two initial characteristic colors from the picture to be edited, including:
and acquiring at least two pixel points with the color saturation greater than or equal to a preset saturation threshold from the picture to be edited as an initial clustering center.
And clustering the pixel points in the picture to be edited according to the initial clustering centers to obtain the colors of at least two clustering center pixel points, and taking the colors of the at least two clustering center pixel points as at least two initial characteristic colors.
In some possible embodiments, the processor 1001 is further configured to determine at least one initial characteristic color from the at least two initial characteristic colors as the picture characteristic color of the picture to be edited according to the color weight of each initial characteristic color and the color distance between each initial characteristic color, and includes:
at least one weighting parameter of each initial characteristic color is determined, and a weighting coefficient of each weighting parameter of each initial characteristic color is determined, wherein the weighting parameters comprise color area, saturation, brightness or hue.
And calculating the color weight of each initial characteristic color according to at least two weighting parameters of each initial characteristic color and the weighting coefficient of each weighting parameter, and sequencing the initial characteristic colors according to the order of the color weight of each initial characteristic color from high to low.
And calculating the color distance between the initial characteristic colors, and removing the initial characteristic color with smaller color weight from the two initial characteristic colors with the color distance smaller than a preset color distance threshold value to obtain at least one initial characteristic color as the picture characteristic color of the picture to be edited.
It should be understood that in some possible embodiments, the processor 1001 may be a Central Processing Unit (CPU), and the processor may be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), field-programmable gate arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The memory may include both read-only memory and random access memory, and provides instructions and data to the processor. The portion of memory may also include non-volatile random access memory. For example, the memory may also store device type information.
In a specific implementation, the device 1000 may execute, through each built-in functional module thereof, the implementation manners provided in each step in fig. 2, fig. 6, and/or fig. 7, which may specifically refer to the implementation manners provided in each step, and are not described herein again.
an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and is executed by a processor to implement the method provided in each step in fig. 2, fig. 6, and/or fig. 7, which may specifically refer to implementation manners provided in each step, and are not described herein again.
The computer readable storage medium may be an internal storage unit of the picture editing device provided in any of the foregoing embodiments, for example a hard disk or memory of an electronic device. The computer readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Memory Card (SMC), a Secure Digital (SD) card, or a flash card provided on the electronic device. The computer readable storage medium may further include a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), and the like. Further, the computer readable storage medium may include both an internal storage unit and an external storage device of the electronic device. The computer readable storage medium is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
The terms "first", "second", and the like in the claims and in the description and drawings of the present application are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments. The term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be embodied in electronic hardware, computer software, or combinations of both, and that the components and steps of the examples have been described in a functional general in the foregoing description for the purpose of illustrating clearly the interchangeability of hardware and software. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not to be construed as limiting the scope of the present application, so that the present application is not limited thereto, and all equivalent variations and modifications can be made to the present application.

Claims (10)

1. A picture editing method, characterized in that the method comprises:
acquiring a picture to be edited, and extracting at least one picture characteristic color from the picture to be edited;
generating a picture sticker set according to the at least one picture characteristic color, and outputting the picture sticker set to a picture editing user interface for selection by a user, wherein the picture sticker set comprises at least one sticker sample color and at least one sticker style;
determining a target sticker sample color and a target sticker style selected by the user based on the picture editing user interface, and applying the target sticker sample color to the target sticker style to generate a target sticker;
and adding the target sticker to the picture to be edited to generate a target picture.
2. The method of claim 1, wherein generating a set of picture stickers from the at least one picture characteristic color comprises:
applying the at least one picture characteristic color to at least one preset sticker sample color to obtain at least one alternative picture sticker, and generating a picture sticker set according to the at least one alternative picture sticker;
wherein an alternative picture sticker in the picture sticker set is obtained by combining one sticker sample color with one sticker style.
3. The method of claim 2, wherein the picture sticker set further includes at least one preset sticker color for combining with the at least one sticker style to obtain at least one alternative picture sticker.
4. The method of any of claims 1-3, wherein the determining the target sticker sample color and the target sticker style selected by the user based on the picture editing user interface comprises:
acquiring a user operation instruction based on the picture editing user interface, and determining the target sticker sample color and the target sticker style selected by the user according to the user operation instruction.
5. The method according to claim 4, wherein the extracting at least one picture characteristic color from the picture to be edited comprises:
determining at least two initial characteristic colors from the picture to be edited, and determining the color weight of each initial characteristic color in the at least two initial characteristic colors;
and determining at least one initial characteristic color from the at least two initial characteristic colors as the picture characteristic color of the picture to be edited according to the color weight of each initial characteristic color and the color distance between the initial characteristic colors.
6. The method according to claim 5, wherein the determining at least two initial characteristic colors from the picture to be edited comprises:
acquiring at least two pixel points with color saturation greater than or equal to a preset saturation threshold from the picture to be edited as an initial clustering center;
clustering the pixel points in the picture to be edited according to the initial clustering centers to obtain the colors of at least two clustering center pixel points, and taking the colors of the at least two clustering center pixel points as at least two initial characteristic colors.
7. The method according to claim 5, wherein the determining at least one initial characteristic color from the at least two initial characteristic colors as the picture characteristic color of the picture to be edited according to the color weight of each initial characteristic color and the color distance between each initial characteristic color comprises:
determining at least one weighting parameter of each initial characteristic color, and determining a weighting coefficient of each weighting parameter of each initial characteristic color, wherein the weighting parameters comprise color area, saturation, brightness or hue;
calculating the color weight of each initial characteristic color according to the at least two weighting parameters of each initial characteristic color and the weighting coefficient of each weighting parameter, and sequencing each initial characteristic color according to the order of the color weight of each initial characteristic color from high to low;
and calculating the color distance between the initial characteristic colors, and removing the initial characteristic color with smaller color weight from the two initial characteristic colors with the color distance smaller than a preset color distance threshold value to obtain at least one initial characteristic color as the picture characteristic color of the picture to be edited.
8. A picture editing apparatus, comprising:
the picture acquisition module is used for acquiring a picture to be edited;
the color processing module is used for extracting at least one picture characteristic color from the picture to be edited;
the sticker collection module is used for generating a picture sticker set according to the at least one picture characteristic color;
the sticker display module is used for outputting the picture sticker set to a picture editing user interface for selection by a user, determining a target sticker sample color and a target sticker style selected by the user, and applying the target sticker sample color to the target sticker style to generate a target sticker;
and the picture generation module is used for adding the target sticker to the picture to be edited to generate a target picture.
9. A picture editing apparatus comprising a processor and a memory, the processor and the memory being interconnected;
the memory for storing a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method of any one of claims 1 to 7.
CN202010676536.0A 2020-07-14 2020-07-14 Picture editing method, device, equipment and storage medium Pending CN112748829A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010676536.0A CN112748829A (en) 2020-07-14 2020-07-14 Picture editing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010676536.0A CN112748829A (en) 2020-07-14 2020-07-14 Picture editing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112748829A true CN112748829A (en) 2021-05-04

Family

ID=75645229

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010676536.0A Pending CN112748829A (en) 2020-07-14 2020-07-14 Picture editing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112748829A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010109651A (en) * 2008-10-30 2010-05-13 Furyu Kk Sticker photo creation apparatus, method, and program
CN102722880A (en) * 2011-03-29 2012-10-10 阿里巴巴集团控股有限公司 Image main color identification method and apparatus thereof, image matching method and server
CN106528187A (en) * 2015-09-09 2017-03-22 腾讯科技(深圳)有限公司 Method and device for determining occupied background color
US20180137660A1 (en) * 2016-11-11 2018-05-17 Microsoft Technology Licensing, Llc Responsive customized digital stickers
CN108924440A (en) * 2018-08-01 2018-11-30 Oppo广东移动通信有限公司 Paster display methods, device, terminal and computer readable storage medium
CN110198437A (en) * 2018-02-27 2019-09-03 腾讯科技(深圳)有限公司 Processing method, device, storage medium and the electronic device of image
CN110580729A (en) * 2018-06-11 2019-12-17 阿里巴巴集团控股有限公司 image color matching method and device and electronic equipment
CN110780961A (en) * 2019-10-15 2020-02-11 深圳创维-Rgb电子有限公司 Method for adjusting character color of application interface, storage medium and terminal equipment
CN111161377A (en) * 2019-12-26 2020-05-15 北京猎豹网络科技有限公司 Method and device for adding characters into picture, electronic equipment and storage medium


Similar Documents

Publication Publication Date Title
US10956784B2 (en) Neural network-based image manipulation
CN107993191B (en) Image processing method and device
US9396560B2 (en) Image-based color palette generation
US9245350B1 (en) Image-based color palette generation
CN111654635A (en) Shooting parameter adjusting method and device and electronic equipment
CN107621966B (en) Graphical user interface display method and device and terminal equipment
CN109087376B (en) Image processing method, image processing device, storage medium and electronic equipment
CN107613202B (en) Shooting method and mobile terminal
CN108551519B (en) Information processing method, device, storage medium and system
CN107368550B (en) Information acquisition method, device, medium, electronic device, server and system
CN110555171B (en) Information processing method, device, storage medium and system
CN106844302B (en) Electronic book page display method and device and terminal equipment
JP2005151282A (en) Apparatus and method of image processing, and program
CN110084871B (en) Image typesetting method and device and electronic terminal
CN114529490B (en) Data processing method, device, equipment and readable storage medium
CN108494996A (en) Image processing method, device, storage medium and mobile terminal
CN110266926B (en) Image processing method, image processing device, mobile terminal and storage medium
CN112200844A (en) Method, device, electronic equipment and medium for generating image
CN111476154A (en) Expression package generation method, device, equipment and computer readable storage medium
CN110837571A (en) Photo classification method, terminal device and computer readable storage medium
CN108804652B (en) Method and device for generating cover picture, storage medium and electronic device
CN112748829A (en) Picture editing method, device, equipment and storage medium
CN111274145A (en) Relationship structure chart generation method and device, computer equipment and storage medium
CN108010038B (en) Live-broadcast dress decorating method and device based on self-adaptive threshold segmentation
CN113837181B (en) Screening method, screening device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40048360

Country of ref document: HK

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination