CN114782558A - Image processing method, apparatus, device, storage medium, and program product - Google Patents


Info

Publication number
CN114782558A
CN114782558A (application CN202210359872.1A)
Authority
CN
China
Prior art keywords
image
cluster
target
centers
target pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210359872.1A
Other languages
Chinese (zh)
Inventor
唐炎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bigo Technology Pte Ltd
Original Assignee
Bigo Technology Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bigo Technology Pte Ltd filed Critical Bigo Technology Pte Ltd
Priority to CN202210359872.1A priority Critical patent/CN114782558A/en
Publication of CN114782558A publication Critical patent/CN114782558A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image


Abstract

The application discloses an image processing method, apparatus, device, storage medium, and program product, relating to the field of image processing. The method comprises the following steps: acquiring a target image, wherein the target image comprises target pixel points; acquiring a color space, wherein the color space comprises at least two cluster centers; classifying each target pixel point to its corresponding cluster center based on the spatial distance between the target pixel point and the at least two cluster centers; acquiring the pixel mean of the target pixel points clustered to the at least two cluster centers, and iteratively updating the at least two cluster centers; and, in response to the updated cluster centers meeting the update condition, determining the theme color of the target image based on the updated at least two cluster centers. With this method, the iteratively updated cluster centers allow the theme color of the target image to be determined more accurately, the cost of extracting the image theme color is reduced, and the user experience is improved. The method and the device can be applied to various scenarios such as image processing and artificial intelligence.

Description

Image processing method, apparatus, device, storage medium, and program product
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image processing method, an image processing apparatus, an image processing device, a storage medium, and a program product.
Background
When image content is displayed on a user interface, different theme color schemes need to be designed for different image content, or a theme color needs to be extracted from the image and used as the background of the user interface, in order to give the interface a harmonious, consistent appearance and ensure a good user experience.
In the related art, the image is usually compared against the official palette support library of the terminal's operating system to obtain its theme color; alternatively, the terminal transmits the image to a web back end (server), which obtains the theme color using an image theme color extraction algorithm.
However, these approaches place high requirements on the terminal's operating system: when the system does not provide an official palette, the terminal has to transmit the image to the server, and obtaining the theme color on the server not only increases the server's computation cost but also increases the user's waiting time, harming human-computer interaction efficiency.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, image processing equipment, a storage medium and a program product. The technical scheme is as follows:
according to an aspect of an embodiment of the present application, there is provided an image processing method including:
acquiring a target image, wherein the target image comprises target pixel points;
acquiring a color space, wherein the color space comprises at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space;
based on the spatial distance between the target pixel point and the at least two cluster centers in the color space, clustering the target pixel point by taking the at least two cluster centers as clustering centers;
acquiring a pixel mean value of a target pixel point clustered by the at least two cluster centers, and performing iterative updating on the at least two cluster centers;
and determining a target pixel value as a theme color corresponding to the target image based on the updated at least two cluster centers in response to the updated at least two cluster centers meeting the updating condition.
According to an aspect of an embodiment of the present application, there is provided an image processing apparatus including:
the image acquisition module is used for acquiring a target image, and the target image comprises target pixel points;
the device comprises a space acquisition module, a color space acquisition module and a color matching module, wherein the color space comprises at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space;
the clustering module is used for clustering the target pixel points by taking the at least two cluster centers as clustering centers on the basis of the spatial distance between the target pixel points and the at least two cluster centers in the color space;
the updating module is used for acquiring the pixel mean value of the target pixel point clustered by the at least two cluster centers and performing iterative updating on the at least two cluster centers;
and the color determining module is used for responding to the condition that the updated at least two cluster centers meet the updating condition, and determining a target pixel value based on the updated at least two cluster centers as the theme color corresponding to the target image.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the image processing method according to any one of the embodiments described above.
In another aspect, there is provided a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the image processing method as described in any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to execute the image processing method described in any of the above embodiments.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
The target pixel points of the target image are classified to their corresponding cluster centers based on the spatial distance, in the color space, between each target pixel point and the cluster centers; the cluster centers are then iteratively updated according to the pixel mean of the target pixel points clustered to each center; and once the update condition is met, the theme color of the target image is determined based on the updated cluster centers. With this method, the terminal can complete the acquisition of the image theme color without depending on a back-end server. Through the iterative update process, the updated cluster centers cluster the target pixel points of the target image more accurately, so that a target pixel value, determined according to the number of pixels clustered to each updated cluster center, serves as the theme color of the target image. This reduces the cost of extracting the image theme color, and the obtained theme color can give the interface a harmonious, consistent viewing effect, improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 is a flow chart of an image processing method provided by an exemplary embodiment of the present application;
FIG. 3 is a schematic illustration of a color space provided by an exemplary embodiment of the present application;
FIG. 4 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 5 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of an image pre-processing operation provided by an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of clustering with cluster centers provided by an exemplary embodiment of the present application;
FIG. 8 is a schematic diagram of clustering with cluster centers provided by another exemplary embodiment of the present application;
FIG. 9 is a schematic interface diagram of an image processing method provided in an exemplary embodiment of the present application;
fig. 10 is a block diagram of an image processing apparatus according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings. It should be apparent that the embodiments described are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the related art, an image is usually compared against the official palette support library of the terminal's operating system to obtain its theme color; alternatively, the terminal transmits the image to a web back end (server), which obtains the theme color using an image theme color extraction algorithm. However, these approaches place high requirements on the terminal's operating system: when the system does not provide an official palette, the terminal has to transmit the image to the server, and obtaining the theme color on the server not only increases the server's computation cost but also increases the user's waiting time, harming human-computer interaction efficiency.
The embodiment of the application provides an image processing method, which can reduce the cost of extracting the theme color of an image and improve the user experience. Optionally, application of the image processing method to an image theme color acquisition scenario is taken as an example for description.
After an image is obtained, its theme color sometimes needs to be extracted, for reasons such as color design, image classification, or image recognition, so that the theme color can give the image viewing process a harmonious, consistent appearance. Illustratively, with this image processing method, the spatial distance in the color space between each target pixel point of the obtained target image and each cluster center is determined, so that different target pixel points are classified to their corresponding cluster centers and different cluster centers gather different target pixel points; the cluster centers are then iteratively updated according to the pixel means of the target pixel points they have gathered; and after the update of the cluster centers meets the update condition, the theme color of the target image is determined based on the updated cluster centers. In this way the updated cluster centers cluster the target pixel points of the target image more accurately, so that the target pixel value determined according to the number of pixels gathered by each updated cluster center serves as the theme color of the target image, the cost of extracting the image theme color is effectively reduced, and the extracted theme color improves the image processing effect and the user experience.
It should be noted that the above application scenarios are only illustrative examples, and the image processing method provided in this embodiment may also be applied to other scenarios, which are not limited in this embodiment.
It should be noted that information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals referred to in this application are authorized by the user or sufficiently authorized by various parties, and the collection, use, and processing of the relevant data is required to comply with relevant laws and regulations and standards in relevant countries and regions. For example, the image data and the like referred to in the present application are acquired under sufficient authorization.
Next, the implementation environment related to the embodiment of the present application is described. Referring schematically to fig. 1, the implementation environment involves the terminal 110.
In some embodiments, the terminal 110 has an image acquisition module 111 therein for acquiring the target image. In some embodiments, the theme color extraction process described above is performed independently by the terminal 110. Illustratively, the terminal 110 corresponds to the theme color extraction model 112, after the terminal 110 obtains the target image, the theme color extraction process is performed on the target image by using the theme color extraction model 112 of the terminal, and after the theme color corresponding to the target image is obtained by extraction, the theme color is fed back to the image display module 113 in the terminal 110 and is displayed by the terminal 110.
The theme color extraction model 112 is obtained in the following way: the target pixel points of the target image are classified to their corresponding cluster centers based on the spatial distance, in the color space, between each target pixel point and each cluster center; the cluster centers are iteratively updated according to the pixel mean of the target pixel points clustered to each center; and after the update of the cluster centers meets the update condition, the theme color of the target image is determined based on the updated cluster centers. The above process is one non-exclusive example of how the theme color extraction model 112 is obtained.
It should be noted that the above terminals include but are not limited to mobile terminals such as mobile phones, tablet computers, portable laptop computers, intelligent voice interaction devices, intelligent appliances, and vehicle-mounted terminals, and can also be implemented as desktop computers; the server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, Network service, cloud communication, middleware service, domain name service, security service, Content Delivery Network (CDN), big data, an artificial intelligence platform, and the like.
Cloud technology is a hosting technology that unifies a series of resources such as hardware, application programs, and networks in a wide area network or a local area network to realize the computation, storage, processing, and sharing of data.
In some embodiments, the servers described above may also be implemented as nodes in a blockchain system.
In conjunction with the above introduction of terms and application scenarios, the image processing method provided in the present application is described, taking execution of the method by the terminal as an example; as shown in fig. 2, the method includes the following steps 210 to 250.
Step 210, acquiring a target image.
And the target image comprises target pixel points.
Illustratively, the target image may be of many types, such as landscape images, architectural images, and animal images. Optionally, the target image may be a photograph taken with a camera or mobile phone, an image synthesized by software, and so on.
The target image comprises target pixel points. The pixel is the smallest image unit, optionally, a target image is composed of a plurality of pixel points, and the pixel points have position information and form the target image based on the position relation of different pixel points.
Illustratively, each pixel has a corresponding color channel value. For example: taking three channel (red, green and blue) values corresponding to the pixel point as an example, the first value is used for indicating the red channel value of the pixel point; the second value is used for indicating the green channel value of the pixel point; the third value is used for indicating the blue channel value of the pixel point.
For example: when the image is pure black, the three channel values of every pixel point in the image are all 0; when the image is pure white, the three channel values of every pixel point in the image are all 255; and when the image has a rich color distribution, different pixel points in the image have different three-channel values.
In an optional embodiment, the target pixel points may be all of the pixel points of the target image, at least two of them, and so on. Illustratively, the manner of determining the target pixel points from the pixel points of the target image includes at least one of the following.
1. And taking all pixel points of the target image as target pixel points.
Schematically, after the target image is obtained, all pixel points of the target image are taken as the target pixel points; that is, the image processing procedure uses every pixel point of the target image.
2. And taking pixel points corresponding to a preset area in the middle of the target image as target pixel points.
When the theme color is extracted from the target image, the color of the central area of the image contributes most to its overall impression; schematically, when target pixel points are selected from the pixel points of the target image, the pixel points in the central area are preferred. Optionally, a central preset area is divided in the target image, for example: for a 5 × 5 target image, the central preset area is determined as the 3 × 3 central region around the center point, and the pixel points in that 3 × 3 region are taken as the target pixel points.
3. Selecting target pixel points from the pixel points of the target image by skip sampling.
Illustratively, after the position information of the pixel points in the target image is determined, every other pixel point is selected, scanning left to right and top to bottom, and the selected pixel points are taken as the target pixel points, completing the selection of the target pixel points.
4. And selecting target pixel points from the pixel points of the target image in a pixel division mode.
Optionally, after determining the position information of the pixel points in the target image, the pixel point region is divided. Illustratively, four pixel points adjacent to each other in the vertical and horizontal directions are used as a pixel point group, and each pixel point belongs to one pixel point group, so as to obtain a plurality of pixel point groups corresponding to the target image, and select one pixel point from each pixel point group as a target pixel point, for example: taking the pixel point at the upper left corner in each pixel point group as a target pixel point, thereby completing the process of selecting the target pixel point; or, the pixel point at the lower left corner in each pixel point group is used as a target pixel point, so that the process of selecting the target pixel point is completed, and the like.
It should be noted that the above is only an illustrative example, and the present invention is not limited to this.
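Two of the sampling strategies above (the central preset area and skip sampling) can be sketched in a few lines of Python. This is an illustrative sketch only: the helper name `sample_pixels`, the list-of-rows image representation, and the fixed 3 × 3 central window (matching the 5 × 5 example above) are our assumptions, not from the patent.

```python
def sample_pixels(image, mode="stride", stride=2):
    """Select target pixels from an image given as a list of rows of
    (r, g, b) tuples. Hypothetical helper, not from the patent text.

    mode="stride": keep every `stride`-th pixel, left to right, top to bottom.
    mode="center": keep only the central 3x3 region, as in the 5x5 example.
    """
    h, w = len(image), len(image[0])
    if mode == "center":
        # a 3x3 window centred on the image centre
        top, left = h // 2 - 1, w // 2 - 1
        return [image[y][x] for y in range(top, top + 3)
                            for x in range(left, left + 3)]
    # skip sampling: one pixel kept out of every stride x stride block
    return [image[y][x] for y in range(0, h, stride)
                        for x in range(0, w, stride)]

img = [[(y, x, 0) for x in range(5)] for y in range(5)]
print(len(sample_pixels(img, mode="stride")))  # 9 of the 25 pixels kept
print(len(sample_pixels(img, mode="center")))  # 9 (the central 3x3)
```

Either strategy reduces the number of target pixel points processed in the later clustering steps while preserving the image's overall color distribution.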
Step 220, a color space is obtained.
The color space is a coordinate system in which different colors exist as points. Illustratively, the RGB color space indicates the relationship of the three colors red, green, and blue. Optionally, when transparency (Alpha) is also considered in the RGB color space, as shown in fig. 3, a schematic diagram of the RGB color space 310, each color point has corresponding color coordinates, for example: white is (1, 1, 1, 1); blue is (0, 0, 1, 1); red is (1, 0, 0, 1); and so on, where the transparency component indicates the degree of transparency of the image. Schematically, in the RGB color space a transparency of 1 means the image is opaque, and a transparency of 0 means the image is fully transparent.
Illustratively, color spaces such as HSV color space, HSL color space, etc. may also be selected to describe the color. The HSV color space is constructed according to the visual characteristics of colors, H is used for indicating Hue (Hue), S is used for indicating Saturation (Saturation), and V is used for indicating brightness (Value); the HSL color space is constructed according to basic attributes of colors, H is used to indicate Hue (Hue), S is used to indicate Saturation (Saturation), and L is used to indicate brightness (Lightness).
In an alternative embodiment, the color space is obtained based on the color channel corresponding to the target pixel point.
Illustratively, if the target pixel point corresponds to an RGB three channel, acquiring an RGB color space based on the RGB three channel; or if the target pixel point corresponds to an HSV three-channel, acquiring an HSV color space based on the HSV three-channel; or, if the target pixel point corresponds to a single channel, the gray scale space is acquired based on the single channel. Illustratively, the target pixel points are distributed in a color space and exist in the form of pixel coordinates.
Optionally, taking the RGB color space as an example: when describing the pixel coordinates of different colors in the color space, each pixel point is described by a red value, a green value, a blue value, and a transparency; that is, the transparency of the image is additionally considered on top of the different channel values, allowing the image to be analyzed more finely.
In an optional embodiment, the color space includes at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space.
Illustratively, the vertices of the color space are taken as the cluster centers. As shown in fig. 3, the RGB color space has 8 vertices, and these 8 vertices are used as 8 cluster centers; or at least two of the 8 vertices are selected as cluster centers; and so on. Alternatively, color points arbitrarily selected from the color space may be used as cluster centers.
It should be noted that the above is only an illustrative example, and the present invention is not limited to this.
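The vertex-based initialization can be sketched as follows. This is a hypothetical Python helper: the name `init_cluster_centers` is ours, and it uses the common 0..255 three-channel scale rather than the 0..1 coordinates with transparency shown in fig. 3.

```python
from itertools import product

def init_cluster_centers():
    """Use the 8 vertices of the RGB cube (each channel 0 or 255) as the
    initial cluster centers, per the vertex-based choice described above.
    (Helper name and the 0..255 scale are our assumptions.)
    """
    return [tuple(v) for v in product((0, 255), repeat=3)]

centers = init_cluster_centers()
print(len(centers))             # 8 vertices of the RGB cube
print(centers[0], centers[-1])  # (0, 0, 0) (255, 255, 255): black and white
```

Starting from the cube's vertices spreads the initial centers across the full color range, so every region of the color space has a nearby center before the first assignment step.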
And step 230, clustering the target pixel points by taking the at least two cluster centers as clustering centers based on the spatial distance between the target pixel points and the at least two cluster centers in the color space.
Optionally, after at least two cluster centers are determined in the color space, the spatial distance between a target pixel point corresponding to the target image and the at least two cluster centers is calculated.
Illustratively, take target pixel points L, M, and N in the target image, and cluster centers O and P. The spatial distances between each target pixel point and each cluster center are calculated, for example: the distance L-O between target pixel point L and cluster center O is 0.8, and the distance L-P between L and cluster center P is 0.6; similarly, the distance M-O between target pixel point M and cluster center O is 0.7, and the distance M-P between M and cluster center P is 0.6; and the distance N-O between target pixel point N and cluster center O is 0.56, and the distance N-P between N and cluster center P is 0.77.
Optionally, when calculating the spatial distances between the target pixel points of the target image and the at least two cluster centers, the Euclidean distance is used; that is, the straight-line distance between two color points in the color space is measured. Illustratively, the spatial distance between each target pixel point and each of the at least two cluster centers in the color space is determined in this way.
In an optional embodiment, after the spatial distances between the target pixel points and the at least two cluster centers in the color space are determined, the at least two cluster centers are taken as clustering centers, and each target pixel point is classified to the cluster center at the minimum Euclidean distance.
Illustratively, continuing the example of target pixel points L, M, and N and cluster centers O and P: after the Euclidean distances are determined, target pixel point L is classified to cluster center P, because its distance to P (0.6) is smaller than its distance to O (0.8); in the same way, target pixel point M is classified to cluster center P (0.6 < 0.7), and target pixel point N is classified to cluster center O (0.56 < 0.77).
Optionally, when the spatial distance between one target pixel point and two cluster centers is the same, one cluster center is arbitrarily selected from the two cluster centers as the cluster center of the target pixel point, that is, the target pixel point is classified into one of the cluster centers in an equal probability manner. It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
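The minimum-Euclidean-distance classification can be sketched as a small Python function. The function name is ours, and the deterministic tie-break toward the lower index is our assumption (the patent permits an arbitrary, equal-probability choice on ties).

```python
from math import dist  # Euclidean (straight-line) distance, Python >= 3.8

def assign_to_center(pixel, centers):
    """Return the index of the cluster center nearest to `pixel` by
    Euclidean distance in the color space. min() breaks ties toward the
    lower index; the patent allows any choice on a tie.
    """
    return min(range(len(centers)), key=lambda k: dist(pixel, centers[k]))

centers = [(0, 0, 0), (255, 255, 255)]
print(assign_to_center((10, 10, 10), centers))     # 0: closer to black
print(assign_to_center((250, 250, 250), centers))  # 1: closer to white
```

Running this over every target pixel point yields the clusters whose per-channel means drive the update step below.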
And 240, acquiring a pixel mean value of the target pixel points clustered by the at least two cluster centers, and performing iterative updating on the at least two cluster centers.
Optionally, after the target pixel points are clustered to the corresponding cluster centers, at least two cluster centers respectively correspond to different target pixel points, and the distribution of the target pixel points includes at least one of the following situations:
(1) different target pixel points are classified into the same cluster center of at least two cluster centers;
(2) different target pixel points are respectively classified to the corresponding cluster centers.
In an optional embodiment, after the target pixel points corresponding to the centers of the different clusters are determined, the mean value operation is performed on the target pixel points corresponding to the centers of the different clusters, so as to determine the pixel mean values corresponding to the target pixel points corresponding to the centers of the different clusters.
Illustratively, the target pixel points are represented in a three-channel (RGB) representation: the pixel of the target pixel point M is (211, 156, 78) and the pixel of the target pixel point N is (125, 182, 162). If the clustering center of both the target pixel point M and the target pixel point N is the cluster center P, then when the pixel mean value of the target pixel points clustered by the cluster center P is determined, the mean value operation is performed on the target pixel point M and the target pixel point N corresponding to the cluster center P.
Optionally, when the mean value operation is performed on three-channel target pixel points, the values of the different channels are averaged respectively. For example, the pixel mean value obtained by averaging the target pixel point M and the target pixel point N is (168, 169, 120), where the red channel mean value is 168 = (211 + 125)/2, the green channel mean value is 169 = (156 + 182)/2, and the blue channel mean value is 120 = (78 + 162)/2.
The above description is only exemplary, and the present invention is not limited to the above description.
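The channel-wise averaging in the worked example can be sketched as follows; this is an illustrative Python sketch, with integer division chosen to reproduce the whole-number means (168, 169, 120) shown above.

```python
def channel_mean(pixels):
    """Average a list of (R, G, B) pixels channel by channel.
    Integer division mirrors the whole-number means in the example."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

M, N = (211, 156, 78), (125, 182, 162)
print(channel_mean([M, N]))  # → (168, 169, 120)
```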
In an optional embodiment, the at least two cluster centers are iteratively updated according to the obtained pixel mean. Optionally, the coordinate value corresponding to the obtained pixel mean value in the color space is used as the coordinate position after the cluster center is updated.
Illustratively, for the at least two cluster centers, the target pixel points clustered by each cluster center are different. The pixel mean value of the target pixel points corresponding to each cluster center is computed to obtain the pixel mean value corresponding to that cluster center, and when each cluster center is iteratively updated, the pixel mean value obtained from the current round of clustering is used as the cluster center for the next round of clustering the target pixel points. That is, after the updated cluster centers are obtained, the spatial distances between the target pixel points and the updated cluster centers are analyzed again, the target pixel points are classified to the corresponding updated cluster centers, and the iterative update process continues with the pixel mean values of the target pixel points clustered by the updated cluster centers.
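One round of the assign-then-update cycle described above can be sketched as follows. This is an illustrative Python sketch; the function name and the sample values are hypothetical, and a center that attracts no pixels is left in place, one way of handling the empty-center case noted later in this description.

```python
import math

def kmeans_step(pixels, centers):
    """One assignment + update pass: cluster each pixel to its
    nearest center, then replace each non-empty center with the
    channel-wise mean of its clustered pixels."""
    clusters = {k: [] for k in range(len(centers))}
    for p in pixels:
        k = min(clusters, key=lambda k: math.dist(p, centers[k]))
        clusters[k].append(p)
    new_centers = []
    for k, c in enumerate(centers):
        pts = clusters[k]
        if pts:  # empty centers are kept as-is
            c = tuple(sum(v) / len(pts) for v in zip(*pts))
        new_centers.append(c)
    return new_centers

print(kmeans_step([(0, 0, 0), (0, 0, 2), (10, 10, 10)],
                  [(0, 0, 1), (9, 9, 9)]))
# → [(0.0, 0.0, 1.0), (10.0, 10.0, 10.0)]
```

Iterating this step until the update condition of step 250 holds yields the final cluster centers.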
In an alternative embodiment, the cluster center in the first update process is used as the initial cluster center.
Illustratively, after obtaining the color space, first determining an initial cluster center in a first updating process from the color space, that is, first classifying the target pixel point to the initial cluster center, and then performing the first updating process of the initial cluster center.
Optionally, after obtaining the color space, determining spatial vertices in the color space, and selecting at least two spatial vertices from the spatial vertices as initial cluster centers. Schematically, as shown in fig. 3, the diagram is an RGB color space diagram, and 8 vertexes in the RGB color space are used as 8 initial cluster centers; alternatively, 3 vertices from the RGB color space are selected as 3 initial cluster-like centers, etc.
Schematically, a description is given by taking 8 vertexes in the RGB color space as 8 initial cluster centers, and performing a first update process of the cluster centers on the 8 initial cluster centers as an example. Target pixel points corresponding to the target image are distributed in the RGB color space, the spatial distance between each target pixel point and 8 initial cluster centers is calculated, and each target pixel point is classified to the corresponding initial cluster center. In this case, there may be a case where there is no target pixel point in the cluster center. After the target pixel points are classified to the corresponding initial cluster centers, performing mean value operation on the target pixel points corresponding to each initial cluster center to obtain a pixel mean value corresponding to each initial cluster center, and taking the coordinate position of each pixel mean value in the color space as the updated cluster center of each initial cluster center. For example: and taking the pixel mean value corresponding to the initial cluster center X as X, and taking the coordinate position of the pixel mean value X in the color space as the updated cluster center of the initial cluster center X.
It should be noted that the foregoing process of updating the initial cluster center is only an illustrative example, and the updating method may also be applied to a process of updating a subsequent cluster center, which is not limited in this embodiment of the present application.
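The choice of the 8 vertices of the RGB color space as initial cluster centers can be sketched as follows; this is an illustrative Python sketch with channel values normalized to [0, 1], an assumption matching the later formula where red is (1, 0, 0, 1).

```python
from itertools import product

def initial_centers():
    """The 8 vertices of the RGB cube (channels in [0, 1]),
    used as the 8 initial cluster centers."""
    return [tuple(v) for v in product((0.0, 1.0), repeat=3)]

centers = initial_centers()
print(len(centers))                 # → 8
print((1.0, 0.0, 0.0) in centers)   # pure red is one vertex
```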
And step 250, in response to that the updated at least two cluster centers meet the update condition, determining a target pixel value based on the updated at least two cluster centers as the theme color corresponding to the target image.
Optionally, after the iterative update is performed on the at least two cluster centers, when the updated at least two cluster centers meet the update condition, the target pixel value is determined based on the updated at least two cluster centers.
Wherein the updating condition is used for indicating the condition for stopping the iterative updating process of the cluster center. Illustratively, the update condition includes at least one of the following ways.
(1) The difference between the updated cluster center and the last updated cluster center is within a preset difference threshold range.
Illustratively, after the updated cluster center is obtained, the coordinate position of the updated cluster center in the color space is determined, the spatial distance calculation is performed on the coordinate position and the coordinate position of the cluster center in the color space after the last update, and the calculation result of the spatial distance calculation is used as the difference value between the updated cluster center and the cluster center after the last update.
Optionally, a difference threshold is predetermined, and when the difference value between the updated cluster center and the cluster center that is updated last time is within the preset difference threshold range, the cluster center is considered to be in accordance with the update target, and the update process of performing iterative update on the cluster center is stopped.
Illustratively, the case where the difference value between the updated cluster center and the last-updated cluster center is within the preset difference threshold range includes at least the following: after the at least two cluster centers are updated, at least two updated cluster centers corresponding to them are obtained, and the difference values between the at least two updated cluster centers and the corresponding at least two cluster centers are determined respectively, yielding at least two difference values. When the minimum of the at least two difference values is within the preset difference threshold range, the update target is considered to be met, and the iterative update of the cluster centers is stopped; or, when any one of the at least two difference values is within the preset difference threshold range, the update target is considered to be met, and the iterative update of the cluster centers is stopped; or, when the sum of the difference values accumulated over multiple iterative updates is smaller than the difference threshold, the update target is considered to be met, and the iterative update of the cluster centers is stopped.
(2) And the updating times of the cluster center reach a preset time threshold.
Optionally, the number of times of updating the cluster center is preset, that is, a threshold of the number of times is predetermined, and when the number of times of updating the cluster center reaches the threshold of the preset number of times, the updating process of iteratively updating the cluster center is stopped.
For example, the number of updates of the cluster centers is set in advance to be no more than 5, that is, the times threshold is 5; when the number of updates of the cluster centers reaches 5, the iterative update of the cluster centers is stopped.
In an optional embodiment, the difference value condition and the time threshold condition are comprehensively considered to determine the update target.
Illustratively, when the difference value between the updated cluster center and the cluster center which is updated last time is still larger than the difference threshold value, but the update times of the cluster center reach a preset time threshold value, the update process of the iterative update of the cluster center is stopped; or when the update times of the cluster center do not reach the preset time threshold yet, but the difference value between the updated cluster center and the cluster center updated last time is within the preset difference threshold range, stopping the update process of iteratively updating the cluster center.
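The combined stopping rule described above can be sketched as follows. This is an illustrative Python sketch: the function name and the threshold values `eps` and `max_iters` are hypothetical (only the times threshold of 5 comes from the example above), and the per-center shift is summarized by its maximum, one of the several difference-value variants enumerated earlier.

```python
import math

def should_stop(old_centers, new_centers, iteration,
                eps=1e-3, max_iters=5):
    """Stop when every center moved less than `eps` since the
    last update, or when the update count reaches `max_iters`."""
    shift = max(math.dist(a, b)
                for a, b in zip(old_centers, new_centers))
    return shift < eps or iteration >= max_iters

print(should_stop([(0, 0, 0)], [(0, 0, 0)], 1))  # → True (converged)
print(should_stop([(0, 0, 0)], [(1, 0, 0)], 1))  # → False (still moving)
print(should_stop([(0, 0, 0)], [(1, 0, 0)], 5))  # → True (times threshold)
```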
It should be noted that the above is only an illustrative example, and the present invention is not limited to this.
In an optional embodiment, the target pixel value is determined based on the updated centers of the at least two cluster types as the corresponding theme color of the target image.
Optionally, after the updated at least two cluster centers are obtained, the number of target pixel points corresponding to each of the at least two cluster centers is determined, the target pixel value is determined based on the cluster center with the largest number of target pixel points, and the target pixel value is used as the theme color of the target image.
Illustratively, the target pixel value appears light orange in the RGB color space, and the light orange is taken as the subject color of the target image.
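The selection of the theme color from the most-populated cluster center can be sketched as follows; this is an illustrative Python sketch, and the center values and assignment list are hypothetical.

```python
def theme_color(assignments, centers):
    """Return the updated cluster center with the most clustered
    pixels; its pixel value serves as the theme color.
    `assignments` maps each target pixel to a center index."""
    counts = [0] * len(centers)
    for k in assignments:
        counts[k] += 1
    return centers[counts.index(max(counts))]

centers = [(30, 30, 30), (250, 200, 160)]   # hypothetical RGB values
assignments = [1, 1, 0, 1]                  # pixel → center index
print(theme_color(assignments, centers))    # → (250, 200, 160)
```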
In summary, based on the spatial distance in the color space between the target pixel points corresponding to the target image and the cluster centers, the target pixel points are classified to the corresponding cluster centers; the cluster centers are iteratively updated according to the pixel mean values of the target pixel points they cluster; and after the update condition is met, the theme color of the target image is determined based on the updated cluster centers. By this method, the terminal can complete the acquisition of the image theme color without depending on a background server. Through the iterative update of the cluster centers, the updated cluster centers better cluster the target pixel points corresponding to the target image, so that the target pixel value determined according to the number of pixels clustered by the updated cluster centers serves as the theme color corresponding to the target image. This reduces the cost of extracting the image theme color, and the obtained theme color can provide a harmonious and consistent viewing effect for an interface, improving the user experience.
In an alternative embodiment, the process of acquiring the target image is performed by pre-processing the image. Illustratively, as shown in fig. 4, the embodiment shown in fig. 2 described above can also be implemented as the following steps 410 to 490.
Step 410, a pre-processed image is acquired.
The preprocessed image indicates an image that has not yet been processed. The preprocessed image includes various types of images, such as landscape images, building images, and animal images, and may be a photograph taken by a device such as a camera or a mobile phone, or an image synthesized by software.
Schematically, an image is randomly selected as a preprocessed image, and the process of acquiring the preprocessed image is realized. For example: randomly downloading an image from a network as a preprocessed image; alternatively, a mobile phone is used to arbitrarily capture an image as a preprocessed image, and the like.
And step 420, performing edge clipping operation on the preprocessed image by taking the preset edge range as an edge clipping condition to obtain a clipped image.
Wherein the preset edge range is a predetermined edge range. Illustratively, an edge cropping operation is performed on the preprocessed image with reference to a preset edge range, that is, an image edge of the preprocessed image is cropped, so as to obtain a cropped image.
Optionally, in the process of extracting the theme color from the image, the edge area of the image has little influence on the overall color tone. For this reason, an edge cropping range is preset. For example, a cropping requirement is set in advance to crop 10% from each of the length and the width of the preprocessed image, so that, based on this requirement, the area of the cropped image is 81% (0.9 × 0.9) of the area of the preprocessed image; alternatively, a cropping range is set in advance such that 80% of the central area of the preprocessed image is retained relative to the image center, thereby obtaining the cropped image.
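The 10%-per-dimension edge crop can be sketched as the following rectangle arithmetic; this is an illustrative Python sketch, with the function name and the sample dimensions hypothetical.

```python
def crop_box(width, height, trim=0.10):
    """Crop rectangle (left, top, right, bottom) that trims `trim`
    of the width and height from the edges (half per side), leaving
    the central (1 - trim)^2 = 81% of the area for trim = 0.10."""
    dx, dy = int(width * trim / 2), int(height * trim / 2)
    return (dx, dy, width - dx, height - dy)

print(crop_box(200, 100))  # → (10, 5, 190, 95)
```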
And step 430, in response to the fact that the area of the pixel corresponding to the cut image is larger than a preset area threshold value, carrying out thumbnail processing on the cut image to obtain a target image.
The preset area threshold is a threshold range of pixel areas for defining the pixel area of the target image to be analyzed. Illustratively, after obtaining the cut image, determining a pixel area of the cut image, and comparing the pixel area of the cut image with a preset area threshold.
Optionally, when the pixel area of the cut image is within the preset area threshold range, the cut image is used as a target image, and a theme color determination process is performed by using target pixel points corresponding to the target image.
Alternatively, when the pixel area of the cut image is larger than the preset area threshold, the cut image is subjected to thumbnail processing. Illustratively, the thumbnail processing includes both reducing the cut image to the preset area threshold and reducing the cut image to below the preset area threshold.
In an optional embodiment, in response to that the area of the pixel corresponding to the cut image reaches a preset area threshold, performing equal-scale thumbnail processing on the cut image to obtain a thumbnail image.
Illustratively, after the pixel area corresponding to the cut image is compared with the preset area threshold, when the pixel area corresponding to the cut image is larger than the preset area threshold, the cut image is subjected to equal-proportion thumbnail processing in order not to change the proportions of the image, so that the distribution proportion of the color pixels in the cut image is maintained and the theme color extraction effect is not affected. Illustratively, the pixel area of the cut image is 200 × 200 = 40000 and the preset area threshold is 10000; since the pixel area of the cut image is larger than the preset area threshold, the cut image is subjected to equal-proportion thumbnail processing.
Optionally, at least one target image is obtained in response to the pixel area corresponding to the thumbnail image being within the preset area threshold range. Illustratively, when the cut image is reduced in equal proportion, the average value of every 4 pixel points is taken as a pixel point of the thumbnail image, so that the pixel area of the obtained thumbnail image is 10000; alternatively, the upper-left pixel point of every 4 pixel points is taken as a pixel point of the thumbnail image, so that the pixel area of the obtained thumbnail image is 10000. Since the pixel area of the thumbnail image is 10000, which meets the preset area threshold condition, the thumbnail image is taken as the target image, and the theme color determination process is performed through the target pixel points corresponding to the target image.
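The 4-pixel-mean variant of the equal-proportion reduction can be sketched as a 2×2 block average; this is an illustrative Python sketch, with the function name hypothetical and the image modeled as a list of rows of (R, G, B) tuples with even dimensions assumed.

```python
def halve(image):
    """Downscale by 2x per dimension: each 2x2 block of (R, G, B)
    pixels is replaced by its channel-wise average (integer math),
    matching the 4-pixel-mean variant described above."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = [image[y][x], image[y][x + 1],
                     image[y + 1][x], image[y + 1][x + 1]]
            row.append(tuple(sum(p[c] for p in block) // 4
                             for c in range(3)))
        out.append(row)
    return out

img = [[(0, 0, 0), (4, 4, 4)],
       [(8, 8, 8), (12, 12, 12)]]
print(halve(img))  # → [[(6, 6, 6)]]
```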
Step 440, a color space is obtained.
The color space comprises at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space.
Schematically, the process of obtaining the color space in step 440 has been described in step 220, and therefore is not described herein again.
And 450, clustering the target pixel points by taking the at least two cluster centers as clustering centers based on the spatial distance between the target pixel points and the at least two cluster centers in the color space.
Schematically, the process of classifying the target pixel point to the corresponding cluster center in step 450 has been described in step 230, and therefore is not described herein again.
Step 460, obtaining the pixel mean value of the target pixel point clustered by the at least two cluster centers, and performing iterative update on the at least two cluster centers.
Schematically, the process in step 460 of iteratively updating the cluster centers by using the pixel mean values of the target pixel points clustered by the at least two cluster centers has already been described in step 240, and is therefore not described herein again.
Step 470, determining the number of pixels of the target pixel point corresponding to the updated at least two cluster centers in response to the updated at least two cluster centers meeting the update condition.
Optionally, after obtaining at least two updated cluster centers meeting the update condition based on the iterative update process, spatial distances are respectively calculated between the target pixel point and the at least two updated cluster centers, so that the target pixel point is classified into the corresponding cluster centers, and the updated cluster centers have respective corresponding target pixel points.
Schematically, the number of pixels of the target pixel points corresponding to each updated cluster center is determined. For example, the updated cluster centers include cluster center 1 and cluster center 2; the number of target pixel points corresponding to cluster center 1 is 1383, and the number of target pixel points corresponding to cluster center 2 is 2562.
Step 480, determining a target cluster center from the at least two cluster centers based on the number of pixels.
In an optional embodiment, based on the updated number of pixels of the target pixel points corresponding to the at least two cluster centers, the cluster center whose number of pixels reaches the preset pixel number threshold is used as the candidate cluster center.
Schematically, a pixel quantity threshold is preset, and when the pixel quantity of a target pixel point corresponding to a certain cluster center is smaller than the preset pixel quantity threshold, the cluster center is discarded; or when the pixel number of the target pixel point corresponding to a certain cluster center is not less than (greater than or equal to) the preset pixel number threshold, selecting the cluster center, and taking the selected cluster center as a candidate cluster center.
Schematically, the preset pixel number threshold is a randomly selected constant (such as 200), and the cluster centers whose target pixel point count in the sorting result is not less than 200 are taken as candidate cluster centers. Alternatively, the preset pixel number threshold is a constant determined based on the ratio of target pixel points to cluster centers (for example, with 8 cluster centers and 1000 target pixel points, 125 = 1000/8 is used as the preset pixel number threshold); that is, when the number of target pixel points corresponding to a cluster center is smaller than the average distribution of target pixel points, the cluster center is not considered, and when it is not less than (greater than or equal to) the average, the cluster center is considered and listed as a candidate cluster center.
Optionally, the candidate cluster center corresponding to the largest number of pixels is taken as the target cluster center from among the candidate cluster centers.
Illustratively, after the number of pixels of the target pixel points corresponding to the different updated cluster centers is determined, the at least two updated cluster centers (candidate cluster centers) are sorted based on the number of pixels. For example, the at least two candidate cluster centers are sorted in descending order: candidate cluster centers with more target pixel points (a larger number of pixels) are arranged at the front of the sequence, and candidate cluster centers with fewer target pixel points (a smaller number of pixels) are arranged at the back, thereby obtaining a sorting result determined based on the number of pixels, which indicates the number of target pixel points in the different candidate cluster centers.
Illustratively, after the candidate cluster center is determined, the pixel number of the target pixel point corresponding to the candidate cluster center is compared, and the candidate cluster center corresponding to the largest pixel number is used as the target cluster center. For example: only one candidate cluster center meeting a preset pixel number threshold value is used as a target cluster center; or, if the number of the candidate cluster centers meeting the preset pixel number threshold is three, after the number of the pixels of the target pixel points corresponding to the three candidate cluster centers is determined, the candidate cluster center corresponding to the largest pixel number is used as the target cluster center, and the process of determining the target cluster center is realized.
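The candidate-filtering and target-selection steps above can be sketched as follows. This is an illustrative Python sketch; the function name is hypothetical, the counts reuse the 1383/2562 example values from earlier, and the threshold of 125 follows the 1000-pixels-over-8-centers example (the counts themselves are illustrative and need not sum to 1000).

```python
def target_cluster(counts, threshold):
    """Keep centers whose pixel count reaches `threshold` as
    candidates, then return the index of the candidate with the
    largest count (None if no candidate qualifies)."""
    candidates = [k for k, n in enumerate(counts) if n >= threshold]
    if not candidates:
        return None
    return max(candidates, key=lambda k: counts[k])

counts = [1383, 2562, 40, 15, 0, 0, 0, 0]   # pixels per cluster center
print(target_cluster(counts, 125))  # → 1 (cluster center 2 in the example)
```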
In an optional embodiment, after determining the number of pixels of the target pixel point corresponding to the different updated cluster centers, directly taking the cluster center corresponding to the largest number of pixels as the target cluster center, that is: the selection process of the candidate cluster center described above may not be performed.
It should be noted that the above are only exemplary, and the embodiments of the present application are not limited thereto.
And step 490, taking the target pixel value corresponding to the center of the target cluster as the subject color of the target image.
Optionally, after the center of the target cluster is determined, a target pixel value corresponding to the center of the target cluster is determined, and the target pixel value is used as the theme color of the target image. Schematically, determining a coordinate position of the center of the target cluster in a color space, determining a target pixel value based on the coordinate position, and taking the target pixel value as a subject color of the target image; or, a color corresponding to the coordinate position in the color space is used as a subject color of the target image.
In summary, based on the spatial distance in the color space between the target pixel points corresponding to the target image and the cluster centers, the target pixel points are classified to the corresponding cluster centers; the cluster centers are iteratively updated according to the pixel mean values of the target pixel points they cluster; and after the update condition is met, the theme color of the target image is determined based on the updated cluster centers. By this method, the terminal can complete the acquisition of the image theme color without depending on a background server. Through the iterative update of the cluster centers, the updated cluster centers better cluster the target pixel points corresponding to the target image, so that the target pixel value determined according to the number of pixels clustered by the updated cluster centers serves as the theme color corresponding to the target image, effectively reducing the cost of extracting the image theme color.
In the embodiment of the application, a process of preprocessing a preprocessed image to obtain a target image and a process of determining the theme color of the target image according to the number of pixels of target pixel points corresponding to the centers of at least two clusters are introduced. After the preprocessed image is obtained, based on the image property of the preprocessed image, the preprocessed image is subjected to image preprocessing operations such as edge clipping and abbreviating, and then a target image which is more beneficial to the image processing process is obtained; after the target pixel points corresponding to the target image are classified to the corresponding cluster centers, iterative updating is carried out on the corresponding cluster centers according to the pixel average values of the target pixel points clustered by the different cluster centers, the target cluster centers are determined according to the pixel number of the target pixel points corresponding to the cluster centers which are updated to meet the updating conditions, the target pixel values corresponding to the target cluster centers are used as the theme colors of the target image, and the process of obtaining the theme colors is achieved. 
By this method, based on the number of pixels of the target pixel points clustered by the at least two updated cluster centers, a target cluster center that better matches the overall tone of the target image can be determined from a target image that is more amenable to the image processing process. The theme color determined from the coordinate position of the target cluster center in the color space is closer to the overall tone of the target image and better reflects its color distribution rule, so that when the target image or the preprocessed image is displayed based on the theme color, a harmonious and consistent image viewing effect can be provided for the interface, improving the user experience.
In an alternative embodiment, the above image processing method is applied to the image theme color extraction method in the iOS device. Illustratively, as shown in fig. 5, the image processing method described above is implemented as the following steps 510 to 560.
Step 510, loading the image through the view control.
Illustratively, within the iOS device, the image is loaded by the view control UIImageView, for example: and loading the image in a network loading or local loading mode.
Optionally, after the image is successfully loaded, the UIImage attribute corresponding to the UIImageView, that is, the image itself, can be acquired.
Step 520, an image preprocessing operation is performed on the image.
Illustratively, after the loaded image is obtained, in order to reduce the time consumption of calculation when the image is analyzed, the image is subjected to a preprocessing operation. The image preprocessing is a key step in the field of image processing, images can be better converted into a data form of a machine learning sample through image preprocessing operation, and in addition, the efficiency of an image processing task can be improved through effective image preprocessing.
Optionally, the image preprocessing operation includes various methods of reducing the size of data, pre-extracting features, and the like. The image pre-processing method of image cropping and image scaling is schematically described as an example.
As shown in fig. 6, it is a schematic diagram of an image preprocessing process performed on an image. First, the image 610 is acquired, since in the image subject color extraction, the influence of the edge area of the image on the overall color tone is small. Optionally, an image cropping method is adopted to crop the image edge area with less influence, so as to obtain a cropped image 620.
Illustratively, the process of cropping the image 610 based on the preset cropping range includes: the length and width of the image are cropped by 10% respectively to obtain a cropped image 620 with the remaining 81% area. Through the image cropping process, the time consumption for calculating the cropped 19% area can be effectively reduced.
Alternatively, after the cropped image 620 is obtained, the performance of the analysis device may be considered in consideration of the extraction of the theme color on the device. Optionally, an image scaling method is used to limit the size of the cropped image 620.
After the cropped image 620 is obtained, it is first determined whether the cropped image 620 needs to be scaled. Illustratively, the preset scaling condition is as follows: if the pixel area of the cropped image 620 is greater than 10000, the cropped image 620 is reduced by a scaling method to obtain a thumbnail image 630, and the pixel area of the thumbnail image 630 is less than or equal to 10000, that is, X × Y ≤ 10000, where X is the length of the thumbnail image 630 and Y is the width of the thumbnail image 630. Illustratively, the scaling is performed in equal proportion to maintain the distribution ratio of the color pixels in the cropped image 620, thereby ensuring the final color-extraction (theme color) effect.
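The uniform scale factor that satisfies X × Y ≤ 10000 while preserving the aspect ratio can be sketched as follows; this is an illustrative Python sketch, and the function name and sample dimensions are hypothetical.

```python
import math

def scale_factor(width, height, max_area=10000):
    """Uniform scale s with (s*width) * (s*height) <= max_area,
    preserving the aspect ratio and hence the distribution of
    color pixels; images already within the limit are untouched."""
    if width * height <= max_area:
        return 1.0
    return math.sqrt(max_area / (width * height))

s = scale_factor(200, 200)                # 200x200 → scaled by 0.5
print(s * 200 * s * 200 <= 10000)         # → True
```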
Alternatively, the foregoing preprocessing process is only an illustrative example, and the thumbnail image 630 may be used as a target image for analysis, the cropped image 620 may be used as a target image for analysis, the acquired image 610 may be directly used as a target image for analysis, or an image processed by another preprocessing method may be used as a target image for analysis, which is not limited in this embodiment of the present application.
Optionally, after obtaining the target image (e.g. the thumbnail image 630), obtaining a target pixel point corresponding to the target image from the target image, for example: and acquiring all pixel points corresponding to the target image as target pixel points. Illustratively, taking each target pixel point in the target image as four-dimensional data, i.e. red, green, blue and transparency (RGBA), the ith target pixel point is as follows:
pi = (ri, gi, bi, ai)
where ri represents the red value of the red channel of the ith target pixel point, gi represents the green value of the green channel of the ith target pixel point, bi represents the blue value of the blue channel of the ith target pixel point, and ai represents the transparency corresponding to the ith target pixel point. Optionally, the set of all target pixel points is denoted P.
Step 530, classifying the target pixel points by using an unsupervised learning method.
Optionally, in a Swift environment on an iOS device, unsupervised learning is performed using the k-means clustering algorithm. Illustratively, since the initialization operation strongly affects both the training time and the final training effect, the cluster centers of the k-means clustering algorithm are first initialized to determine the initial cluster centers.
In an alternative embodiment, the unsupervised learning process includes the following process.
(1) An unsupervised learning process is performed using the RGB color space as shown in fig. 3.
Illustratively, in view of the characteristics of the RGB color space, eight initial cluster centers are used, corresponding to the eight vertices of the RGB color space. The eight initial cluster centers are recorded as a set C, whose elements are as follows:

c_k = (r_k, g_k, b_k, a_k)

where c_k represents the kth initial cluster center; r_k represents the red channel value of the kth initial cluster center; g_k represents its green channel value; b_k represents its blue channel value; and a_k represents the transparency corresponding to the kth initial cluster center.
Since the 8 initial cluster centers correspond to the 8 vertices of the RGB color space, they represent 8 colors; for example, red is (1, 0, 0, 1). The transparency corresponding to all 8 initial cluster centers is 1, representing the transparency of the image.
Optionally, setting the initial cluster centers at the 8 vertices of the RGB color space effectively accelerates the convergence of the clustering algorithm.
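A minimal sketch of this initialization (illustrative Python, not the patent's Swift code): the eight vertices of the RGB cube, each with alpha fixed at 1, serve as the initial cluster centers.

```python
from itertools import product

def initial_cluster_centers():
    """Return the 8 vertices of the RGB color cube as RGBA tuples,
    with transparency fixed at 1 (fully opaque)."""
    return [(r, g, b, 1.0) for r, g, b in product((0.0, 1.0), repeat=3)]

C = initial_cluster_centers()
print(len(C))                      # 8 initial cluster centers
print((1.0, 0.0, 0.0, 1.0) in C)  # the red vertex is among them: True
```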
(2) Traverse the target pixel point set P and calculate the Euclidean distance from each target pixel point to each initial cluster center.
Illustratively, the Euclidean distance from a target pixel point to each initial cluster center is calculated with the following formula:

d = √((r_i − r_k)² + (g_i − g_k)² + (b_i − b_k)² + (a_i − a_k)²)
where d represents the Euclidean distance. After the Euclidean distances from a target pixel point to all initial cluster centers are determined, the target pixel point is assigned to the initial cluster center with the minimum Euclidean distance. For example, if the Euclidean distance from target pixel point A to initial cluster center 1 is the minimum, target pixel point A is assigned to initial cluster center 1.
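Step (2) — the distance computation and nearest-center assignment — can be sketched as follows (illustrative Python; `euclidean` and `assign` are hypothetical helper names):

```python
import math

def euclidean(p, c):
    """Euclidean distance between pixel p and center c in 4-D RGBA space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, c)))

def assign(pixels, centers):
    """For each pixel, return the index of its nearest cluster center."""
    return [min(range(len(centers)), key=lambda k: euclidean(p, centers[k]))
            for p in pixels]

centers = [(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)]  # black and white
labels = assign([(0.9, 0.9, 0.9, 1.0), (0.1, 0.0, 0.1, 1.0)], centers)
print(labels)  # [1, 0]: the near-white pixel joins white, the dark one black
```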
(3) Update the initial cluster centers to obtain updated cluster centers.
Illustratively, after the target pixel points are assigned to the corresponding initial cluster centers, each target pixel point in the set P has one corresponding initial cluster center.
Optionally, the set C of cluster centers is traversed, and the average of the target pixel points assigned to each cluster center is taken as its new center value, with the formula:

c_k^(t) = (1 / n_k) · Σ_{p_i ∈ cluster k} p_i

where n_k is the number of target pixel points assigned to the kth cluster center, and t is the number of times the cluster centers have been updated, i.e., the iteration count.
That is, after the initial cluster centers are updated, steps (2) to (3) are repeated to reassign the target pixel points to the updated cluster centers, so that the cluster centers are updated multiple times, i.e., the update process is iterated.
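The center-update rule of step (3) can be sketched as follows (illustrative Python; leaving an empty cluster's center unchanged is an assumption, since the source does not specify that case):

```python
def update_centers(pixels, labels, centers):
    """Replace each cluster center by the mean of its assigned pixels."""
    new_centers = []
    for k, c in enumerate(centers):
        members = [p for p, lbl in zip(pixels, labels) if lbl == k]
        if members:
            n = len(members)
            new_centers.append(tuple(sum(ch) / n for ch in zip(*members)))
        else:
            new_centers.append(c)  # empty cluster: keep the old center
    return new_centers

pixels = [(0.2, 0.0, 0.0, 1.0), (0.4, 0.0, 0.0, 1.0)]
centers = [(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)]
updated = update_centers(pixels, [0, 0], centers)
print(updated[0])  # ≈ (0.3, 0.0, 0.0, 1.0): mean of the two red-ish pixels
```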
(4) Calculate the difference between the current cluster centers and the previously updated cluster centers.
Illustratively, after each update of the cluster centers, a difference operation is performed between the updated cluster centers and the cluster centers from the previous update, and the difference between them is determined with the following formula:

loss = Σ_{k=1}^{8} d(c_k^(t), c_k^(t−1))

where k represents the kth cluster center; c_k^(t) represents the current cluster center; and c_k^(t−1) represents the cluster center from the previous update.
Optionally, the distance between cluster centers is also computed as a Euclidean distance. Illustratively, it is preset that if loss ≤ 50, the unsupervised learning classification is finished; otherwise, step (5) is performed.
(5) To account for computational efficiency, the maximum number of iterations is limited.
Optionally, the maximum number of iterations is set to 5. If the current iteration count t equals 5, the unsupervised learning classification is finished; otherwise, steps (2) to (4) are repeated and the cluster centers continue to be updated.
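Putting steps (2) to (5) together, the iteration with both stopping rules (the loss threshold and the 5-iteration cap) can be sketched as follows (illustrative Python; the threshold of 50 in the source presumably assumes 0–255 channel values, so a smaller value is used here for [0, 1]-normalized channels):

```python
import math

def euclidean(p, c):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, c)))

def kmeans(pixels, centers, loss_threshold, max_iter=5):
    """Alternate assignment and update until the summed movement of the
    centers ('loss') falls to the threshold or max_iter rounds have run."""
    labels = []
    for _ in range(max_iter):
        # Step (2): assign every pixel to its nearest center.
        labels = [min(range(len(centers)),
                      key=lambda k: euclidean(p, centers[k]))
                  for p in pixels]
        # Step (3): move each center to the mean of its members.
        new_centers = []
        for k, c in enumerate(centers):
            members = [p for p, lbl in zip(pixels, labels) if lbl == k]
            new_centers.append(
                tuple(sum(ch) / len(members) for ch in zip(*members))
                if members else c)
        # Step (4): summed distance between old and new centers.
        loss = sum(euclidean(a, b) for a, b in zip(centers, new_centers))
        centers = new_centers
        if loss <= loss_threshold:  # converged
            break
    return centers, labels

pixels = [(0.0, 0.0, 0.0, 1.0)] * 3 + [(1.0, 1.0, 1.0, 1.0)] * 5
start = [(0.1, 0.1, 0.1, 1.0), (0.9, 0.9, 0.9, 1.0)]
centers, labels = kmeans(pixels, start, loss_threshold=0.01)
print(labels.count(1))  # 5 pixels end up clustered to the white center
```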
Schematically, fig. 7 shows the initial distribution of the target pixel points corresponding to the target image. The 8 initial cluster centers (e.g., initial cluster center A 710) are located at the 8 vertices of the RGB color space, and the small data points 720 represent the target pixel points corresponding to the target image.
As shown in fig. 8, after training with the unsupervised learning method, the cluster center A 810 corresponding to the white vertex moves in the RGB color space, yielding cluster center B 820. The target pixel points 830 gather near cluster center B 820, and white dominates the hue, so the returned theme color is white.
Step 540, classifying and screening the cluster centers.
Eight trained cluster centers are obtained through the unsupervised learning classification, and each cluster center has corresponding target pixel points. Optionally, cluster centers whose number of assigned target pixel points is greater than or equal to 1/8 of the total number of elements in the set P are retained as effective cluster centers. Cluster centers whose number of assigned target pixel points is less than 1/8 of the total are discarded, as they are considered color classes with a low proportion of target pixel points. At this point, at most 8 cluster centers are obtained as the color classification after unsupervised learning.
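The screening rule — keep only clusters holding at least 1/8 of all target pixel points — can be sketched as follows (illustrative Python; `effective_centers` is a hypothetical helper name):

```python
from collections import Counter

def effective_centers(labels, centers, num_pixels):
    """Keep (count, center) pairs whose cluster holds >= 1/8 of all pixels."""
    counts = Counter(labels)
    return [(counts[k], c) for k, c in enumerate(centers)
            if counts[k] >= num_pixels / 8]

labels = [0] * 15 + [1]       # 16 pixels: 15 in cluster 0, 1 in cluster 1
centers = ["white", "black"]  # stand-ins for RGBA center tuples
kept = effective_centers(labels, centers, len(labels))
print(kept)  # [(15, 'white')]: cluster 1 falls below the 1/8 cutoff (2 pixels)
```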
Step 550, color sorting.
Illustratively, the effective cluster centers are sorted in descending order by the number of target pixel points assigned to each cluster center, yielding at most 8 cluster centers, as in the formula:
C=[C1,C2,…,C8]
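The descending sort by cluster size in step 550 can be sketched as follows (illustrative Python; the counts are hypothetical):

```python
def sort_by_count(clusters):
    """Sort (pixel_count, center) pairs by count, largest first, so the
    first element is the theme-color candidate C_1."""
    return sorted(clusters, key=lambda pair: pair[0], reverse=True)

clusters = [(3, "blue"), (9, "white"), (5, "red")]  # hypothetical sizes
ranked = sort_by_count(clusters)
print(ranked[0])  # (9, 'white'): the dominant color becomes the theme color
```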
step 560, theme color application.
Illustratively, in the obtained set C, the first element C_1 = (r_1, g_1, b_1, a_1), i.e., the cluster center with the largest number of assigned target pixel points, is fed back to the iOS device for display as the theme color of the image.
Illustratively, fig. 9 shows an image display interface on a terminal. A target image 910 is displayed on the interface, and target pixel points 920 at different positions in the target image 910 are represented as RGBA colors. When the target image 910 is displayed, the terminal performs the above image processing on the target pixel points 920 to obtain the theme color corresponding to the target image 910, and displays it in a theme color area 930 outside the display area of the target image 910 in the interface. Because the theme color is extracted from the target image 910, it is visually consistent with the target image 910, appearing, for example, as a gradient color effect.
The above description is only exemplary, and the present invention is not limited to the above description.
In summary, based on the spatial distance in the color space between the target pixel points corresponding to the target image and the cluster centers, the target pixel points are assigned to the corresponding cluster centers; the cluster centers are iteratively updated according to the pixel average of the target pixel points clustered to each center; and after the update condition is met, the theme color of the target image is determined based on the updated cluster centers. With this method, the terminal can complete the acquisition of the image theme color without depending on a background server. Through the iterative update of the cluster centers, the updated cluster centers better fit the target pixel points of the target image, so the target pixel value determined according to the number of pixels clustered to the updated centers serves as the theme color of the target image. This reduces the cost of extracting the image theme color, and the obtained theme color provides a harmonious and consistent viewing effect for the interface, improving the user experience.
Fig. 10 is a block diagram of an image processing apparatus according to an exemplary embodiment of the present application, and as shown in fig. 10, the apparatus includes the following components:
an image obtaining module 1010, configured to obtain a target image, where the target image includes target pixel points;
a space obtaining module 1020, configured to obtain a color space, where the color space includes at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space;
a clustering module 1030, configured to cluster the target pixel points by using the at least two cluster centers as clustering centers based on spatial distances between the target pixel points and the at least two cluster centers in the color space;
the updating module 1040 is configured to obtain a pixel mean of the target pixel points clustered by the at least two cluster centers, and perform iterative updating on the at least two cluster centers;
a color determining module 1050, configured to determine, in response to the updated at least two cluster centers meeting the update condition, a target pixel value based on the updated at least two cluster centers as the theme color corresponding to the target image.
In an optional embodiment, the clustering module 1030 is further configured to determine a spatial distance between the target pixel point and the centers of the at least two clusters in the color space; and classifying the target pixel point to the cluster center with the minimum spatial distance by taking the at least two cluster centers as clustering centers.
In an optional embodiment, the color determining module 1050 is further configured to determine, in response to a difference between a current updated cluster center and a last updated cluster center being within a preset difference threshold range, a target pixel value based on at least two current updated cluster centers as a theme color corresponding to the target image; or, in response to that the number of times of updating the cluster centers reaches a preset number threshold, determining a target pixel value based on at least two updated cluster centers as a theme color corresponding to the target image.
In an optional embodiment, the color determining module 1050 is further configured to determine the number of pixels of the target pixel point corresponding to the updated centers of the at least two clusters; determining a target cluster center from the at least two cluster centers based on the number of pixels; and taking the target pixel value corresponding to the center of the target class cluster as the theme color of the target image.
In an optional embodiment, the color determining module 1050 is further configured to, based on the updated number of pixels of the target pixel points corresponding to the at least two cluster centers, take the cluster center whose number of pixels reaches a preset pixel number threshold as a candidate cluster center; and taking the candidate cluster center corresponding to the maximum pixel number as the target cluster center from the candidate cluster centers.
In an optional embodiment, the space obtaining module 1020 is further configured to construct a color space based on a color channel corresponding to the target pixel point, where the target pixel point is distributed in the color space.
In an alternative embodiment, the apparatus is further configured to determine a spatial vertex in the color space; and selecting at least two spatial vertexes from the spatial vertexes as initial cluster centers.
In an alternative embodiment, the image acquisition module 1010 is further configured to acquire a pre-processed image; performing edge clipping operation on the preprocessed image by taking a preset edge range as an edge clipping condition to obtain a clipped image; and in response to the fact that the pixel area corresponding to the cut image is larger than a preset area threshold value, carrying out thumbnail processing on the cut image, and taking at least one image with the pixel area within the range of the preset area threshold value after the thumbnail as the target image.
In an optional embodiment, the image obtaining module 1010 is further configured to perform, in response to that the area of the pixel corresponding to the cropped image reaches the preset area threshold, an equal-scale thumbnail processing on the cropped image to obtain a thumbnail image; and obtaining at least one target image in response to the pixel area corresponding to the thumbnail image being within the preset threshold range.
In summary, based on the spatial distance in the color space between the target pixel points corresponding to the target image and the cluster centers, the target pixel points are assigned to the corresponding cluster centers; the cluster centers are iteratively updated according to the pixel average of the target pixel points clustered to each center; and after the update condition is met, the theme color of the target image is determined based on the updated cluster centers. With this apparatus, the terminal completes the acquisition of the image theme color without depending on a background server. Through the iterative update of the cluster centers, the updated cluster centers better fit the target pixel points of the target image, so the target pixel value determined according to the number of target pixel points clustered to the updated centers serves as the theme color of the target image. This reduces the cost of extracting the image theme color, and the extracted theme color provides a harmonious and consistent viewing effect for the interface, improving the user experience.
It should be noted that the image processing apparatus provided in the above embodiment is illustrated only by the division of the above functional modules. In practical applications, the above functions may be distributed among different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the image processing apparatus and the image processing method provided by the above embodiments belong to the same concept; their specific implementation processes are described in detail in the method embodiments and are not repeated here.
In an exemplary embodiment, there is also provided a computer device comprising a processor and a memory, the memory having stored therein a computer program, the computer program being loaded and executed by the processor to implement the above-mentioned image processing method.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein a computer program, which is loaded and executed by a processor to implement the above-described image processing method. Alternatively, the computer-readable storage medium may be a ROM (Read-Only Memory), a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, there is also provided a computer program product including computer instructions stored in a computer-readable storage medium, from which a processor reads and executes the computer instructions to implement the above-described image processing method.
It should be understood that reference herein to "a plurality" means two or more. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. In addition, the step numbers described herein only show an exemplary possible execution sequence among the steps, and in some other embodiments, the steps may also be executed out of the numbering sequence, for example, two steps with different numbers are executed simultaneously, or two steps with different numbers are executed in a reverse order to the illustrated sequence, which is not limited in this application.
The above description is only exemplary of the present application and is not intended to limit the present application, and any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (13)

1. An image processing method, characterized in that the method comprises:
acquiring a target image, wherein the target image comprises target pixel points;
acquiring a color space, wherein the color space comprises at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space;
based on the spatial distance between the target pixel point and the at least two cluster centers in the color space, clustering the target pixel point by taking the at least two cluster centers as clustering centers;
acquiring a pixel mean value of a target pixel point clustered by the at least two cluster centers, and performing iterative updating on the at least two cluster centers;
and determining a target pixel value as a theme color corresponding to the target image based on the updated at least two cluster centers in response to the updated at least two cluster centers meeting the updating condition.
2. The method of claim 1, wherein the clustering the target pixel based on a spatial distance between the target pixel and the at least two cluster centers in the color space with the at least two cluster centers as cluster centers comprises:
determining the space distance between the target pixel point and the centers of the at least two clusters in the color space;
and classifying the target pixel point to the cluster center with the minimum spatial distance by taking the at least two cluster centers as cluster centers.
3. The method according to claim 1, wherein the determining a target pixel value based on the updated at least two cluster-like centers as the subject color corresponding to the target image in response to the updated at least two cluster-like centers meeting the update condition comprises:
determining a target pixel value based on at least two cluster centers after current updating as a theme color corresponding to the target image in response to that a difference value between the cluster center after current updating and the cluster center after last updating is within a preset difference threshold range;
or,
and in response to the fact that the updating times of the cluster-like centers reach a preset time threshold, determining a target pixel value based on at least two updated cluster-like centers as a theme color corresponding to the target image.
4. The method according to claim 3, wherein the determining a target pixel value based on the updated at least two cluster-like centers as the subject color corresponding to the target image comprises:
determining the pixel quantity of target pixel points corresponding to the updated at least two cluster centers;
determining a target cluster center from the at least two cluster centers based on the number of pixels;
and taking the target pixel value corresponding to the center of the target class cluster as the theme color of the target image.
5. The method of claim 4, wherein determining a target cluster center from the at least two cluster centers based on the number of pixels comprises:
based on the updated pixel quantity of the target pixel points corresponding to the at least two cluster centers, taking the cluster center of which the pixel quantity reaches a preset pixel quantity threshold value as a candidate cluster center;
and taking the candidate cluster center corresponding to the maximum pixel number as the target cluster center from the candidate cluster centers.
6. The method of any of claims 1 to 5, wherein said obtaining a color space comprises:
and constructing a color space based on the color channel corresponding to the target pixel point, wherein the target pixel point is distributed in the color space.
7. The method of any of claims 1 to 5, further comprising:
determining a spatial vertex in the color space;
at least two spatial vertices are selected from the spatial vertices as initial cluster centers.
8. The method of any of claims 1 to 5, wherein the acquiring the target image comprises:
acquiring a pre-processing image;
performing edge clipping operation on the preprocessed image by taking a preset edge range as an edge clipping condition to obtain a clipped image;
and in response to the fact that the pixel area corresponding to the cut image is larger than a preset area threshold value, carrying out thumbnail processing on the cut image to obtain the target image.
9. The method according to claim 8, wherein the performing a thumbnail process on the cropped image in response to the pixel area corresponding to the cropped image being larger than a preset area threshold value comprises:
in response to the fact that the pixel area corresponding to the cut image reaches the preset area threshold value, carrying out equal-proportion thumbnail processing on the cut image to obtain a thumbnail image;
and obtaining at least one target image in response to the pixel area corresponding to the thumbnail image being within the preset area threshold range.
10. An image processing apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring a target image, and the target image comprises target pixel points;
a space acquisition module, configured to acquire a color space, wherein the color space comprises at least two cluster centers, and the cluster centers are pixel point positions used for clustering pixel points in the color space;
the clustering module is used for clustering the target pixel points by taking the at least two cluster centers as clustering centers on the basis of the spatial distance between the target pixel points and the at least two cluster centers in the color space;
the updating module is used for acquiring the pixel mean value of the target pixel point clustered by the at least two cluster centers and performing iterative updating on the at least two cluster centers;
and the color determining module is used for responding to the condition that the updated at least two cluster centers meet the updating condition, and determining a target pixel value based on the updated at least two cluster centers as the theme color corresponding to the target image.
11. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the image processing method according to any one of claims 1 to 9.
12. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 9.
13. A computer program product comprising computer instructions which, when executed by a processor, implement the image processing method of any one of claims 1 to 9.
CN202210359872.1A 2022-04-06 2022-04-06 Image processing method, apparatus, device, storage medium, and program product Pending CN114782558A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210359872.1A CN114782558A (en) 2022-04-06 2022-04-06 Image processing method, apparatus, device, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210359872.1A CN114782558A (en) 2022-04-06 2022-04-06 Image processing method, apparatus, device, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN114782558A true CN114782558A (en) 2022-07-22

Family

ID=82427160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210359872.1A Pending CN114782558A (en) 2022-04-06 2022-04-06 Image processing method, apparatus, device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN114782558A (en)

Similar Documents

Publication Publication Date Title
CN110163810B (en) Image processing method, device and terminal
JPH1153525A (en) Facial organ detector and medium
US20100067863A1 (en) Video editing methods and systems
JP2012530319A (en) Method and system for quasi-duplicate image retrieval
CN111985281B (en) Image generation model generation method and device and image generation method and device
CN110390327B (en) Foreground extraction method and device, computer equipment and storage medium
US11347792B2 (en) Video abstract generating method, apparatus, and storage medium
CN108280190A (en) Image classification method, server and storage medium
CN113411550B (en) Video coloring method, device, equipment and storage medium
CN111935479A (en) Target image determination method and device, computer equipment and storage medium
CN112883827B (en) Method and device for identifying specified target in image, electronic equipment and storage medium
JP4967045B2 (en) Background discriminating apparatus, method and program
CN113723410A (en) Digital tube digital identification method and device
CN114782558A (en) Image processing method, apparatus, device, storage medium, and program product
CN114511862B (en) Form identification method and device and electronic equipment
KR102402643B1 (en) 3D color modeling optimization processing system
CN111080748A (en) Automatic picture synthesis system based on Internet
CN113947568B (en) Image processing method and device, electronic equipment and storage medium
CN115147633A (en) Image clustering method, device, equipment and storage medium
CN112884074B (en) Image design method, equipment, storage medium and device based on decision tree
CN115457581A (en) Table extraction method and device and computer equipment
CN116630139A (en) Method, device, equipment and storage medium for generating data
WO2023047162A1 (en) Object sequence recognition method, network training method, apparatuses, device, and medium
CN109242750B (en) Picture signature method, picture matching method, device, equipment and storage medium
CN113762058A (en) Video synthesis method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination