CN113469270A - Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel - Google Patents


Info

Publication number
CN113469270A
CN113469270A (application number CN202110806823.3A)
Authority
CN
China
Prior art keywords
super
pixel
image
superpixel
pixel region
Prior art date
Legal status
Granted
Application number
CN202110806823.3A
Other languages
Chinese (zh)
Other versions
CN113469270B
Inventor
Zhao Feng (赵凤)
Zhang Liyang (张莉阳)
Liu Hanqiang (刘汉强)
Current Assignee
Xi'an University of Posts and Telecommunications
Original Assignee
Xi'an University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Xi'an University of Posts and Telecommunications
Priority claimed from application CN202110806823.3A
Publication of CN113469270A
Application granted
Publication of CN113469270B
Legal status: Active

Classifications

    • G06F 18/2321 — Pattern recognition; non-hierarchical clustering techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/251 — Fusion techniques of input or preprocessed data
    • G06T 7/11 — Image analysis; region-based segmentation
    • G06T 7/13 — Image analysis; edge detection
    • G06T 2207/10024 — Image acquisition modality: color image
    • G06T 2207/20021 — Dividing image into blocks, subimages or windows
    • G06T 2207/20192 — Edge enhancement; edge preservation
    • Y02T 10/40 — Engine management systems


Abstract

The invention discloses a semi-supervised intuitionistic clustering method based on decomposition-based multi-objective differential-evolution superpixels, which mainly addresses the poor image segmentation quality and poor timeliness of prior-art algorithms. The scheme comprises the following steps: input the image to be segmented and set the initial parameters; perform decomposition-based multi-objective differential-evolution superpixel segmentation of the image and define the edges of the superpixel regions as the weak edges of the image; extract the strong edges of the image with the Canny operator and merge the superpixel regions of the image based on the strong and weak edges; extract a representative feature for each merged superpixel region and perform decomposition-based multi-objective evolutionary fuzzy clustering of the image; and correct the class labels of the clustering result to obtain the final image segmentation result. By fusing the region information of the image with partial supervision information, and by optimizing the fitness functions with a decomposition-based evolutionary strategy, the method effectively improves image segmentation performance and relieves the poor timeliness of the multi-objective evolutionary fuzzy clustering algorithm.

Description

Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel
Technical Field
The invention belongs to the technical field of image processing and relates to a semi-supervised intuitionistic clustering method, in particular a semi-supervised intuitionistic clustering method based on decomposition-based multi-objective differential-evolution superpixels, which can be used for the recognition of natural images.
Background
Image segmentation is the process of assigning the same label to similar pixels in an image, and the quality of the result matters greatly for subsequent image analysis. Image segmentation methods fall mainly into five categories: threshold-based methods, clustering-based methods, edge-detection-based methods, region-based methods, and methods combined with specific theories. Clustering-based image segmentation is a common choice because of its simple principle and good segmentation effect. Common clustering methods include the K-means algorithm, fuzzy clustering, spectral clustering, and hierarchical clustering. Objects in the real world often exhibit ambiguity and uncertainty, so fuzzy clustering can analyze them more objectively and has attracted wide attention from scholars at home and abroad. However, the traditional fuzzy clustering algorithm has several defects when applied to image segmentation: (1) it is sensitive to the initial values of the clustering centers and easily falls into local optima; (2) it is sensitive to noise, and an image containing much noise cannot be segmented satisfactorily; (3) only a single objective function is considered during clustering, which cannot meet the different requirements of users. In recent years, many scholars have therefore studied these problems intensively.
In 2016, Liu Hanqiang et al., in "Local search adaptive kernel fuzzy clustering method [J], Computer Engineering and Science, 38(8): 1735-", designed a kernel-based local search method that finds an initial clustering center by locally searching part of the sample data. Although the method can, to a certain extent, relieve the sensitivity of the traditional clustering algorithm to the initial values of the clustering centers, the added kernel-based local search raises the time complexity of the algorithm.
In 2019, the paper "Suppressed non-local spatial intuitionistic fuzzy C-means image segmentation algorithm [J], Journal of Electronics & Information Technology, 41(6): 1472-" proposed an algorithm that improves robustness to noise by calculating the non-local spatial information of the pixels, overcoming the defect that traditional fuzzy clustering algorithms consider only the gray-level feature of a single pixel and improving segmentation precision. However, the algorithm does not consider the region information of the image when clustering and ignores the similarity between adjacent pixels, so the final segmentation effect is poor.
In 2011, Mukhopadhyay et al., in "A multiobjective approach to MR brain image segmentation [J], Applied Soft Computing, 11(1): 872-", proposed a multi-objective approach to MR brain image segmentation. However, when the algorithm is applied to image segmentation it ignores the region information of the image and the segmentation effect is poor; in addition, the multi-objective evolution process requires considerable time, so the timeliness of the algorithm is poor.
Disclosure of Invention
The aim of the invention is to provide a semi-supervised intuitionistic clustering method based on decomposition-based multi-objective differential-evolution superpixels that remedies the defects of the prior art. Image segmentation performance is improved by fusing the region information of the image with partial supervision information, and the speed of the multi-objective evolutionary fuzzy clustering algorithm is raised by optimizing the two fitness functions with a Kriging-assisted, reference-vector-guided decomposition evolutionary strategy. The image segmentation performance is thereby effectively improved, and the poor timeliness of the multi-objective evolutionary fuzzy clustering algorithm is relieved.
The invention realizes the aim as follows:
(1) inputting a color image to be segmented;
(2) setting parameters: the number of superpixels is 500, the superpixel fuzzy index m′ is 25, the self-defined positive integer H is 5, the maximum superpixel iteration number t_max is 10, the neighborhood size T is 10, the differential evolution mutation factor is 0.5, and the differential evolution crossover factor is 0.9; the clustering population size is 50, the maximum clustering iteration number w_max is 100, the number of individuals used to update the Kriging model is 5, the fixed iteration number before updating the Kriging model is 20, the binary crossover probability is 0.9, and the polynomial mutation probability is 0.1;
(3) performing superpixel segmentation of the color image based on decomposition-based multi-objective differential evolution, and defining the edges of the segmented superpixel regions as the weak edges of the image; the specific steps of the decomposition-based multi-objective differential-evolution superpixel segmentation of the color image are as follows:
(3.1) encoding the core-point offset components with the decomposition-based multi-objective differential evolution method to obtain an initial population P:
If the image has N pixel points and is divided into K superpixel regions of uniform size, the side length of each region is about S = √(N/K). Each individual p_i = [p_{i,1}, p_{i,2}, …, p_{i,D}] of the initial population P is generated with the following random strategy:
p_{i,j} = −S/2 + rand × S,
where p_{i,j} denotes the j-th component of an individual in the initial population; the rand function produces a random number in [0,1]; i = 1, 2, …, pop; j = 1, 2, …, D, with D = 2K. The population size pop is determined as pop = C(H+M−1, M−1), where M = 3 is the number of superpixel criterion functions and H is a self-defined positive integer;
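As a rough illustration of step (3.1) — our own sketch, not code from the patent, with function names of our choosing — the offset population and the MOEA/D-style population size can be set up as:

```python
import math
import random

def moead_pop_size(H, M=3):
    # Number of uniformly spread weight vectors in MOEA/D-style
    # decomposition: pop = C(H + M - 1, M - 1), for M objectives
    # and H divisions per objective.
    return math.comb(H + M - 1, M - 1)

def init_offset_population(N, K, H, seed=0):
    # Each individual encodes D = 2K core-point offsets, one (dx, dy)
    # pair per superpixel, drawn uniformly from [-S/2, S/2) with
    # S = sqrt(N / K), the approximate region side length.
    rng = random.Random(seed)
    S = math.sqrt(N / K)
    pop = moead_pop_size(H)
    D = 2 * K
    P = [[-S / 2 + rng.random() * S for _ in range(D)] for _ in range(pop)]
    return P, S
```

With the parameters of step (2) (M = 3, H = 5), this gives pop = C(7, 2) = 21 individuals.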
(3.2) randomly selecting a point inside each uniform superpixel region of the image as a core point, and then obtaining the seed points s_{i,k} of the image superpixels from the core points and the offsets decoded from the individual:
s_{i,k} = c_{i,k} + q · Λ_k · p_i,
where q = 0.1; c_{i,k} denotes the core point of the k-th superpixel of the image corresponding to the i-th individual, k = 1, 2, …, K; and Λ_k belongs to Λ, a set of K × K diagonal matrices, which applies the offset components of p_i belonging to the k-th superpixel to the spatial coordinates of the core point;
(3.3) taking the 3S × 3S neighborhood of each seed point s_{i,k}, and obtaining the superpixel label matrix L_i by judging the distance between the pixels in the neighborhood and the seed points;
(3.4) based on the superpixel label matrix L_i, designing three superpixel criterion functions: the intra-superpixel mean square error f_1(s_i, L_i), the superpixel edge gradient criterion function f_2(L_i), and the region regularization term superpixel criterion function f_3(L_i);
The intra-superpixel mean square error f_1(s_i, L_i) is calculated as
f_1(s_i, L_i) = (1/N) Σ_{n=1}^{N} d(I_n, s_{i,L_i(n)}),
where I_n is the 5-dimensional feature vector of the n-th pixel point, n = 1, 2, …, N; L_i(n) is the label of the n-th pixel point in the superpixel label matrix L_i; and d represents the pixel distance;
The superpixel edge gradient criterion function f_2(L_i) is calculated as
f_2(L_i) = Σ_{n=1}^{N} ΔI(n) · e_i(n),
e_i(n) = δ( Σ_{m∈W_n} δ(L_i(m) ≠ L_i(n)) > 0 ),
where ΔI(n) is the gradient feature of the n-th pixel of the image; L_i(n) is the label of the n-th pixel point in L_i; and δ(·) is a condition judgment function that returns 1 when the condition in parentheses is true and 0 otherwise. If the labels of the n-th pixel and of the pixels in its neighborhood W_n differ, then e_i(n) = 1, indicating that the pixel lies at the junction of two superpixel regions.
The region regularization term superpixel criterion function f_3(L_i) is calculated as
f_3(L_i) = Σ_{k=1}^{K} | |A_{i,k}| − N/K |,
where |A_{i,k}| represents the number of pixels in the k-th superpixel region of the superpixel label matrix L_i of the i-th individual;
(3.5) decomposing the three superpixel criterion functions with the Tchebycheff method of MOEA/D, specifically:
(3.5.1) initialize the weight vector matrix λ = [λ_1, λ_2, …, λ_i, …, λ_pop]; by calculating the Euclidean distance between λ_i and the other weight vectors, obtain the T neighborhood weight vectors λ_{i1}, λ_{i2}, …, λ_{iT} of λ_i;
(3.5.2) decompose the three superpixel criterion functions with the Tchebycheff method according to the calculation formula
g^{te}(p_i | λ′, z*) = max_{1≤e≤M} { λ_e · |f_e − z*_e| },
where λ′ = [λ_1, λ_2, …, λ_M] is a set of weight vectors in λ and M = 3 is the number of superpixel criterion functions; for each e = 1, 2, …, M, λ_e ≥ 0; z* = [z*_1, z*_2, …, z*_M] denotes the reference point, calculated as z*_e = min_{1≤i≤pop} f_{i,e};
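The Tchebycheff aggregation and the reference-point update can be sketched in a few lines (a minimal illustration in our own notation, not the patent's code):

```python
def tchebycheff(f, lam, z_star):
    # g(p | lambda, z*) = max_e lambda_e * |f_e - z*_e|
    return max(l * abs(fe - ze) for fe, l, ze in zip(f, lam, z_star))

def reference_point(F):
    # z*_e = min_i f_{i,e} over all objective vectors evaluated so far.
    return [min(col) for col in zip(*F)]
```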
(3.6) crossing, mutating and selecting individuals, and obtaining a final population and an optimal solution through iterative updating to finally obtain a super-pixel region segmentation result of the image;
(4) extracting the strong edges of the image with the Canny edge detection operator, and merging the superpixel regions of the image based on the strong and weak edges;
(5) extracting the representative feature r_k of the merged k-th superpixel region:
r_k = Σ_{α∈R_k} w(Y_α, Y_β) · Y_α / Σ_{α∈R_k} w(Y_α, Y_β),
where R_k denotes the set of pixels of the k-th superpixel region, Y_α represents the RGB feature value of pixel point α in the superpixel region, Y_β the RGB feature value of the median pixel point β of the superpixel region, and w(Y_α, Y_β) the weight between pixels α and β:
w(Y_α, Y_β) = Q_{αβ} × U_{αβ},
where Q_{αβ} expresses the position weight (the closer pixel points α and β are, the higher the weight) and U_{αβ} expresses the color weight (the closer the color information of pixel points α and β is, the higher the weight). Q_{αβ} and U_{αβ} are calculated respectively as
Q_{αβ} = exp( −[(x_α − x_β)² + (y_α − y_β)²] / num ),
U_{αβ} = exp( −‖Y_α − Y_β‖² / (2σ²) ),
where (x, y) represents the coordinates of a pixel in the superpixel region, num represents the number of pixels in the superpixel region, and σ represents the color feature variance of the superpixel region.
Taking k = 1, 2, …, G, calculate the representative feature set r = {r_1, r_2, …, r_k, …, r_G} of all superpixel regions;
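A toy version of the representative-feature computation might look as follows. This is our sketch: the "median pixel" β is approximated by the pixel nearest the region centroid, and the exponential weight forms are our reading of the formulas that did not survive extraction.

```python
import math

def representative_feature(region, sigma=10.0):
    # region: list of (x, y, (R, G, B)) tuples for one merged superpixel.
    num = len(region)
    # beta: approximation of the "median pixel" -- the pixel nearest
    # the spatial centroid of the region (an assumption).
    cx = sum(p[0] for p in region) / num
    cy = sum(p[1] for p in region) / num
    beta = min(region, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)

    def weight(p):
        # Position weight Q and color weight U, both in (0, 1].
        q = math.exp(-((p[0] - beta[0]) ** 2 + (p[1] - beta[1]) ** 2) / num)
        u = math.exp(-sum((a - b) ** 2 for a, b in zip(p[2], beta[2]))
                     / (2 * sigma ** 2))
        return q * u

    ws = [weight(p) for p in region]
    total = sum(ws)
    # Weighted mean of the RGB values = representative feature r_k.
    return tuple(sum(wi * p[2][ch] for wi, p in zip(ws, region)) / total
                 for ch in range(3))
```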
(6) obtaining partial supervision information Ū = [ū_{ρk}] from the user's labeling of the image;
(7) Initializing a reference vector, randomly initializing a population and coding chromosomes in the population;
(8) constructing a semi-supervised intuitionistic fuzzy compactness function J fusing the superpixel region information:
J = Σ_{ρ=1}^{C} Σ_{k=1}^{G} u_{ρk}^m · d_IFS²(r_k, v_ρ) + κ · Σ_{ρ=1}^{C} Σ_{k=1}^{G} (u_{ρk} − ū_{ρk})^m · d_IFS²(r_k, v_ρ),
where C denotes the number of image clusters, G denotes the number of superpixel regions, m denotes the clustering fuzzy index, κ denotes the weighting index and is set to 2, r_k denotes the representative feature of the k-th superpixel region, and v_ρ denotes the cluster center of the ρ-th class; d_IFS(r_k, v_ρ) denotes the Euclidean distance from r_k to v_ρ on the intuitionistic fuzzy set:
d_IFS(r_k, v_ρ) = √( ½ · [ (μ(r_k) − μ(v_ρ))² + (ν(r_k) − ν(v_ρ))² + (π(r_k) − π(v_ρ))² ] ),
where μ(·), ν(·) and π(·) represent, respectively, the membership degree, the non-membership degree, and the hesitation degree in the intuitionistic fuzzy set:
μ(x) = (x − x_min) / (x_max − x_min),
ν(x) = (1 − μ(x)) / (1 + τ · μ(x)),
π(x) = 1 − μ(x) − ν(x),
where τ is a fixed parameter that generates the non-membership function; ū_{ρk} represents the supervised membership degree of r_k to v_ρ: if the labeled superpixel region k belongs to class ρ, then ū_{ρk} = 1, otherwise ū_{ρk} = 0, and for unlabeled superpixel regions ū_{ρk} = 0; u_{ρk} represents the membership degree of the k-th superpixel region to the ρ-th class center:
u_{ρk} = 1 / Σ_{γ=1}^{C} ( d_IFS(r_k, v_ρ) / d_IFS(r_k, v_γ) )^{2/(m−1)};
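The intuitionistic fuzzification and the membership update above can be sketched as follows. Hedged: the Sugeno-type generator for ν and the standard FCM-style update for u_{ρk} are our assumptions for the formulas that did not survive extraction.

```python
def intuitionistic(mu, tau=2.0):
    # Sugeno-type generator (assumed): nu = (1 - mu) / (1 + tau * mu),
    # hesitation pi = 1 - mu - nu; tau is the fixed generator parameter.
    nu = (1.0 - mu) / (1.0 + tau * mu)
    return mu, nu, 1.0 - mu - nu

def memberships(dists, m=2.0):
    # FCM-style update: u_rho proportional to d_rho^(-2/(m-1)),
    # normalized over the classes. Distances are assumed non-zero
    # (a zero distance would mean full membership to that class).
    inv = [d ** (-2.0 / (m - 1.0)) for d in dists]
    s = sum(inv)
    return [v / s for v in inv]
```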
(9) constructing an intuitionistic fuzzy separation function CS fusing the superpixel region information:
CS = Σ_{ρ=1}^{C} Σ_{γ=1, γ≠ρ}^{C} μ_{γρ}^m · d_IFS²(v_γ, v_ρ),
μ_{γρ} = 1 / Σ_{η=1}^{C} ( d_IFS(v_ρ, v_γ) / d_IFS(v_ρ, v_η) )^{2/(m−1)},
where μ_{γρ} denotes the membership degree of v_ρ with respect to v_γ.
(10) Respectively calculating fitness function values, namely J and 1/CS, of each individual in the initial population according to the function expressions constructed in the steps (8) and (9), training a Kriging model by using the individuals in the initial population and the fitness function values thereof, and setting t to be 0 and w to be 0, wherein t represents the current iteration number of the superpixel, and w represents the current iteration number of the cluster;
(11) generating an offspring population by using binary intersection and polynomial variation, predicting the objective function value of an offspring individual by using a Kriging model, and merging a parent individual and the offspring individual;
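The variation operators of step (11) — simulated binary crossover (SBX, the "binary intersection") and polynomial mutation ("polynomial variation") — can be sketched per gene as follows. These are standard textbook forms, not code from the patent; the distribution indices η are illustrative.

```python
import random

def sbx_pair(x1, x2, eta=15.0, rng=None):
    # Simulated binary crossover on one gene: the two children are
    # symmetric about the parents' mean, with spread factor beta.
    rng = rng or random.Random(0)
    u = rng.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    c2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2

def poly_mutate(x, lb, ub, eta_m=20.0, rng=None):
    # Polynomial mutation on one gene, clipped to the bounds [lb, ub].
    rng = rng or random.Random(0)
    u = rng.random()
    if u < 0.5:
        delta = (2.0 * u) ** (1.0 / (eta_m + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta_m + 1.0))
    return min(ub, max(lb, x + delta * (ub - lb)))
```

A useful property of SBX is that the two children always preserve the parents' mean, which the test below checks.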
(12) selecting a new population by adopting a selection strategy APD based on an angle punishment distance, updating a reference vector, and setting w to be w + 1;
(13) judging whether the Kriging model needs to be updated: if w > w_max, update the Kriging model, set w = 0, and execute step (14); otherwise, return to step (11);
(14) judging whether the maximum superpixel iteration number has been reached: if t > t_max, terminate the iteration, obtain the final-generation non-dominated solution set, and execute step (15); otherwise, set t = t + 1 and return to step (11);
(15) constructing a semi-supervised intuitionistic fuzzy clustering optimal-solution selection index SI fusing the superpixel region information:
SI = ( (1/C) · (E_1 / E_C) · F_C )²,
where E_C represents the intra-class compactness measure, E_1 represents the compactness of all samples grouped together as one class, and F_C represents the maximum inter-class separability measure;
(16) selecting an optimal individual from the last generation of non-dominated solution set by using an optimal solution selection index SI to obtain an optimal clustering center;
(17) according to the optimal clustering center, performing label distribution on each super-pixel region to obtain labels of all pixel points in the image, and obtaining an image clustering result;
(18) and carrying out class label correction on the image clustering result to obtain a final image segmentation result.
Compared with the prior art, the invention has the following beneficial technical effects:
First, the image is superpixel-segmented under several criteria, and representative features of the superpixel regions are then extracted for multi-objective evolutionary fuzzy clustering, so the region information of the image is fully considered during clustering and the final segmentation effect is markedly improved.
Second, preprocessing the image with the superpixel technique before clustering speeds up the multi-objective evolutionary fuzzy clustering algorithm.
Third, the method constructs a semi-supervised intuitionistic fuzzy compactness function and an intuitionistic fuzzy separation function, both fusing the superpixel region information, as the fitness functions to be optimized, and constructs a semi-supervised intuitionistic fuzzy clustering optimal-solution selection index fusing the superpixel region information to select the optimal cluster center, improving the image segmentation performance.
Drawings
FIG. 1 is a flow chart of an implementation of the method of the present invention;
FIG. 2 is a comparison of results of a simulation segmentation of an image numbered 253036 in a Berkeley image database using the present invention and a prior art method;
FIG. 3 is a comparison of results of a simulated segmentation of an image numbered 113334665744 in a Weizmann image database using the present invention and a prior art method;
fig. 4 is a schematic diagram of super-pixel region merging based on image strong and weak edges in the present invention, wherein (a) is a schematic diagram of image strong and weak edges, and (b) is a schematic diagram of region merging results.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Embodiment one, referring to fig. 1, the implementation steps of the present invention are as follows:
step A: inputting a color image to be segmented and setting initial parameter values.
Inputting a color image to be segmented;
setting parameters: the number of superpixels is set to 500, the superpixel fuzzy index m′ to 25, the self-defined positive integer H to 5, the maximum superpixel iteration number to 10, the neighborhood size T to 10, the differential evolution mutation factor to 0.5, and the differential evolution crossover factor to 0.9; the clustering population size is 50, the maximum clustering iteration number is 100, the number of individuals used to update the Kriging model is 5, the fixed iteration number before updating the Kriging model is 20, the binary crossover probability is 0.9, and the polynomial mutation probability is 0.1;
Step B: perform superpixel segmentation of the color image based on decomposition-based multi-objective differential evolution, and define the edges of the superpixel regions as the weak edges of the image.
2.1) Initialization:
2.1.1) Assume an image with N pixel points is divided into K superpixel regions of uniform size; the side length of each region is about S = √(N/K). Select the 5-dimensional features of the center point of each superpixel region to obtain the initialized core points c = {c_{i,k}}, where c_{i,k} is the core point of the k-th superpixel of the image corresponding to the i-th individual, c_{i,k} = [l_{i,k}, a_{i,k}, b_{i,k}, x_{i,k}, y_{i,k}]; l_{i,k}, a_{i,k}, b_{i,k} represent its Lab color features and x_{i,k}, y_{i,k} its spatial features;
2.1.2) Calculate the superpixel population size pop = C(H+M−1, M−1), and initialize the weight vector matrix λ = [λ_1, λ_2, …, λ_pop]; by calculating the Euclidean distance between λ_i and the other weight vectors, obtain the T neighborhood weight vectors λ_{i1}, λ_{i2}, …, λ_{iT} of λ_i, and let B(i) = [i1, i2, …, iT], i = 1, 2, …, pop;
2.1.3) Initialize the population: each individual p_i = [p_{i,1}, p_{i,2}, …, p_{i,D}] is generated by p_{i,j} = −S/2 + rand × S, where the rand function generates a random number in [0,1], i = 1, 2, …, pop, j = 1, 2, …, D, D = 2K;
2.1.4) Use the individual p_i to optimize the core points c, generating a group of superpixel seed points s_i and a superpixel label matrix L_i:
s_{i,k} = c_{i,k} + q · Λ_k · p_i,
where q = 0.1, k = 1, 2, …, K, and s_{i,k} denotes the 5-dimensional feature of the k-th superpixel seed point generated by optimizing the core points c with the i-th individual p_i. Based on the obtained superpixel seed points, take the 3S × 3S neighborhood of each seed point and obtain the superpixel label matrix by judging the distance between the pixels in the neighborhood and the seed point. Given any two pixel points α and β, the distance between them is given by
d_c(α, β) = √( (l_α − l_β)² + (a_α − a_β)² + (b_α − b_β)² ),
d_s(α, β) = √( (x_α − x_β)² + (y_α − y_β)² ),
D(α, β) = √( (d_c(α, β)/m′)² + (d_s(α, β)/S)² ),
where d_c(α, β) represents the color distance, d_s(α, β) represents the spatial distance, and m′ represents the superpixel fuzzy index.
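A small helper for this combined distance might look as follows (our sketch; the SLIC-style normalization by m′ and S is our reading of the formula that did not survive extraction):

```python
import math

def pixel_distance(p, q, S, m_prime):
    # p, q: 5-dimensional features (l, a, b, x, y).
    d_c = math.dist(p[:3], q[:3])   # color (Lab) distance
    d_s = math.dist(p[3:], q[3:])   # spatial distance
    # Combined distance, normalizing color by m' and space by S
    # (an assumption about the exact normalization).
    return math.sqrt((d_c / m_prime) ** 2 + (d_s / S) ** 2)
```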
2.1.5) Calculate the objective function values: the intra-superpixel mean square error f_{i,1}, the superpixel edge gradient criterion f_{i,2}, and the region regularization term f_{i,3}. f_1 represents the mean square error of assigning each pixel of the image to its nearest superpixel seed point; the smaller its value, the more accurate the assignment. The specific calculation formula is
f_1(s_i, L_i) = (1/N) Σ_{n=1}^{N} d(I_n, s_{i,L_i(n)}),
where I_n is the 5-dimensional feature vector of the n-th pixel point, L_i(n) is the label of the n-th pixel point in the superpixel label matrix L_i, and s_{i,l} denotes the 5-dimensional feature vector of the l-th superpixel seed point of the i-th individual. f_2, the superpixel edge gradient criterion, is a basis for judging boundary strength; the larger its value, the better. The specific calculation formula is
f_2(L_i) = Σ_{n=1}^{N} ΔI(n) · e_i(n),
e_i(n) = δ( Σ_{m∈W_n} δ(L_i(m) ≠ L_i(n)) > 0 ),
where ΔI(n) is the gradient feature of the n-th pixel of the image, L_i(n) is the label of the n-th pixel point in L_i, and δ(·) is a condition judgment function that returns 1 when the condition in parentheses is true and 0 otherwise. From the definition it can be seen that if the labels of the n-th pixel and of the pixels in its neighborhood W_n differ, then e_i(n) = 1, indicating that the pixel lies at the junction of two superpixel regions. f_3 indicates how much the size of each superpixel region deviates from the desired size; the smaller its value, the better. Specifically:
f_3(L_i) = Σ_{k=1}^{K} | |A_{i,k}| − N/K |,
where |A_{i,k}| denotes the number of pixels in the k-th superpixel region of the superpixel label matrix L_i of the i-th individual.
2.1.6) Initialize the reference point z* = [z*_1, z*_2, z*_3], calculated as z*_e = min_{1≤i≤pop} f_{i,e}, e = 1, 2, 3;
2.2) Update the neighborhood solutions, specifically: for each individual p_i and each h ∈ B(i), calculate the aggregation function values g(s_i, L_i | λ_h, z*) and g(s_h, L_h | λ_h, z*); if g(s_i, L_i | λ_h, z*) ≤ g(s_h, L_h | λ_h, z*), let p_h = p_i, f_{h,1} = f_{i,1}, f_{h,2} = f_{i,2}, f_{h,3} = f_{i,3}, and O(h) = g(s_i, L_i | λ_h, z*); otherwise let O(h) = g(s_h, L_h | λ_h, z*), where O(h) denotes the aggregation function value corresponding to the h-th group of superpixel points;
2.3) selecting an individual corresponding to the minimum aggregation function value to combine with the core point to generate an initial optimal label matrix;
2.4) Iterative update:
2.4.1) Set gen = 1;
2.4.2) Update the core points c;
2.4.3) Update the individuals p_i and the corresponding objective function values f_{i,1}, f_{i,2}, f_{i,3}, while updating the reference point z*; then perform step 2.2) to update the neighborhood solutions, i = 1, 2, …, pop;
2.4.4) Select the individual corresponding to the minimum aggregation function value and combine it with the core points c to generate the optimal label matrix L, and set gen = gen + 1;
2.4.5) Judge whether the maximum iteration number has been reached: if gen > gen_max, output L to obtain the superpixel segmentation result of the image; otherwise, execute step 2.4.2);
Step C: extract the strong edges of the image with the Canny operator, and merge the superpixel regions of the image based on the strong and weak edges.
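Step C can be illustrated with a toy merging rule. This is our sketch: a true Canny detector is replaced by a precomputed boolean strong-edge mask, and the merge criterion — merge two adjacent regions when less than half of their shared weak edge coincides with strong-edge pixels — is an assumption about the rule the patent leaves unspecified.

```python
def merge_superpixels(labels, strong, thresh=0.5):
    # labels: 2-D list of superpixel ids (borders between different ids
    # are the weak edges); strong: 2-D bool list marking strong-edge
    # (Canny) pixels. Returns the merged label map.
    h, w = len(labels), len(labels[0])
    shared, on_strong = {}, {}
    # Count, for every pair of adjacent regions, how long their shared
    # weak edge is and how much of it lies on strong-edge pixels.
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and labels[y][x] != labels[ny][nx]:
                    key = tuple(sorted((labels[y][x], labels[ny][nx])))
                    shared[key] = shared.get(key, 0) + 1
                    if strong[y][x] or strong[ny][nx]:
                        on_strong[key] = on_strong.get(key, 0) + 1
    # Union-find over region ids; merge weakly separated regions.
    parent = {}
    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for key, n in shared.items():
        if on_strong.get(key, 0) / n < thresh:
            parent[find(key[0])] = find(key[1])
    return [[find(l) for l in row] for row in labels]
```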
Step D: and extracting the representative characteristics of each combined super-pixel region, and performing decomposition multi-target evolution fuzzy clustering on the image.
Step E: and carrying out class label correction on the clustering result to obtain a final image segmentation result.
Embodiment two: this embodiment describes the implementation steps of the invention in further detail:
step 1: inputting a color image to be segmented;
step 2: setting parameters: the number of superpixels is 500, the superpixel fuzzy index m′ is 25, the self-defined positive integer H is 5, the maximum superpixel iteration number t_max is 10, the neighborhood size T is 10, the differential evolution mutation factor is 0.5, and the differential evolution crossover factor is 0.9; the clustering population size is 50, the maximum clustering iteration number w_max is 100, the number of individuals used to update the Kriging model is 5, the fixed iteration number before updating the Kriging model is 20, the binary crossover probability is 0.9, and the polynomial mutation probability is 0.1;
step 3: performing superpixel segmentation of the color image based on decomposition-based multi-objective differential evolution, and defining the edges of the segmented superpixel regions as the weak edges of the image. Superpixel segmentation of an image is, in general, the process of grouping pixels with adjacent positions and similar features into small regions. The method considers superpixel segmentation from three aspects: the intra-superpixel mean square error, the superpixel edge gradient criterion, and the region regularization term. To optimize these three objectives simultaneously, a decomposition-based multi-objective differential evolution method is adopted: first, the core-point offset components are encoded to obtain an initialized population, and different seed points are obtained from the individuals in the population and the core points; then the superpixel criterion functions based on the intra-superpixel mean square error, the superpixel edge gradient criterion and the region regularization term are decomposed with the Tchebycheff method; next, the individuals are updated with crossover, mutation and selection strategies; and through continuous iteration the final population and the optimal solution are obtained, finally yielding the superpixel region segmentation result of the image.
The method comprises the following steps of carrying out super-pixel segmentation on a color image based on decomposition multi-target differential evolution:
(3.1) in order to obtain a super-pixel region of an image, the difference between the image pixel and a region seed point needs to be measured from the aspects of spatial position and characteristics; in order to obtain a proper super-pixel seed point, the core point offset component of each super-pixel needs to be encoded, so that the core point offset component is encoded by adopting a decomposition multi-objective differential evolution method to obtain an initial population P:
if an image has N pixel points and is divided into K superpixel regions of uniform size, the side length of each region is approximately S = √(N/K).
The initial population P, with individuals pi = [pi,1, pi,2, …, pi,D], is generated with the following random strategy:
pi,j=-S/2+rand×S,
wherein pi,j denotes the j-th component of an individual in the initial population; the rand function produces a random number in [0,1]; i = 1,2,…,pop, j = 1,2,…,D, D = 2K; the population size pop is determined by the combinatorial formula pop = C(H+M−1, M−1), wherein M = 3 is the number of superpixel criterion functions and H is a user-defined positive integer;
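The population initialization of step (3.1) can be sketched as follows (illustrative Python; the function name and the MOEA/D population-size formula pop = C(H+M−1, M−1) are assumptions consistent with the text):

```python
import numpy as np
from math import comb

def init_population(N, K, H=5, M=3, seed=0):
    """Illustrative sketch of step (3.1): initialize the DE population of
    core-point offsets. Each individual encodes D = 2K offset components,
    each drawn as p = -S/2 + rand*S with S ~ sqrt(N/K); the population
    size follows the assumed MOEA/D convention pop = C(H+M-1, M-1)."""
    rng = np.random.default_rng(seed)
    S = np.sqrt(N / K)                 # approximate superpixel side length
    D = 2 * K
    pop = comb(H + M - 1, M - 1)       # number of weight vectors / individuals
    P = -S / 2 + rng.random((pop, D)) * S
    return P, S, pop
```

For a 481x321 Berkeley image (N = 154401) with K = 500 and H = 5, this gives pop = 21 individuals of 1000 genes each, all inside [-S/2, S/2).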
(3.2) since the individuals encode the core point offset components, in the initial stage of the decomposition-based multi-target differential evolution method the core points are initialized randomly and uniformly on the image, i.e., one core point is selected inside each uniform superpixel region; the seed points of the image superpixels are then obtained from the core points and the individually decoded offsets.
A point is randomly selected inside each uniform superpixel region of the image as a core point, and the seed point si,k of each superpixel of the image is then obtained from the core point and the individually decoded offset:
Figure BDA0003166914090000103
wherein q = 0.1; ci,k denotes the core point of the k-th superpixel of the image corresponding to the i-th individual, k = 1,2,…,K; Λ denotes a set of K×K diagonal matrices;
(3.3) taking the 3S×3S neighborhood of each seed point si,k, the superpixel label matrix Li is obtained by judging the distance between the pixels in the neighborhood and the seed points; the distance between a pixel and a seed point is calculated as follows:
given any two pixel points α and β, the distance d (α, β) between the two is:
Figure BDA0003166914090000104
wherein m' denotes the superpixel fuzzy index; dc(α, β) denotes the color distance and ds(α, β) the spatial distance;
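A SLIC-style combined distance of the kind described (color distance d_c and spatial distance d_s weighted by the fuzzy index m') can be sketched as follows; the exact combination in the patent's formula image is not reproduced, so the standard SLIC form below is an assumption:

```python
import numpy as np

def pixel_distance(alpha, beta, S, m_prime=25):
    """Assumed SLIC-style distance between two pixels.

    alpha, beta: 5-dimensional (r, g, b, x, y) feature vectors.
    d_c is the color (RGB) distance, d_s the spatial distance; they are
    combined using the region side length S and the fuzzy index m'."""
    a, b = np.asarray(alpha, float), np.asarray(beta, float)
    d_c = np.linalg.norm(a[:3] - b[:3])     # color distance
    d_s = np.linalg.norm(a[3:] - b[3:])     # spatial distance
    return np.sqrt(d_c ** 2 + (d_s / S) ** 2 * m_prime ** 2)
```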
(3.4) based on the superpixel label matrix Li, three superpixel criterion functions are designed: the intra-superpixel mean square error f1(si, Li), the superpixel edge gradient criterion f2(Li), and the region regularization term f3(Li).
The intra-superpixel mean square error f1(si, Li) is calculated as:
Figure BDA0003166914090000111
wherein In is the 5-dimensional feature vector of the n-th pixel point, n = 1,2,…,N; Li(n) is the label of the n-th pixel point in the superpixel label matrix Li; d denotes the pixel distance.
The superpixel edge gradient criterion f2(Li) is calculated as:
Figure BDA0003166914090000112
Figure BDA0003166914090000113
wherein ΔI(n) is the gradient feature of the n-th pixel of the image; Li(n) is the label of the n-th pixel point in Li; δ(·) is a condition judgment function that returns 1 when the condition in brackets is true and 0 otherwise; if the label of the n-th pixel differs from those of the pixels in its neighborhood Wn, then
Figure BDA0003166914090000114
indicating that the pixel lies at the junction of two superpixel regions.
The region regularization term f3(Li) is calculated as:
Figure BDA0003166914090000115
wherein ,
Figure BDA0003166914090000116
denotes the number of pixels in the k-th superpixel region of the superpixel label matrix Li of the i-th individual;
(3.5) in order to optimize the three superpixel criterion functions simultaneously, they are decomposed into a number of scalar subproblems with the Chebyshev method of MOEA/D, which is used to evaluate the quality of the individuals in the population. The decomposition of the three superpixel criterion functions with the Chebyshev method of MOEA/D proceeds as follows:
(3.5.1) initialize the weight vector matrix λ = [λ1, λ2, …, λi, …, λpop]; by computing the Euclidean distance between λi and the other weight vectors, the T neighborhood weight vectors λi1, λi2, …, λiT of λi are obtained;
(3.5.2) the three superpixel criterion functions are decomposed with the Chebyshev method according to the formula:
g^te(pi | λ', z*) = max_{1≤e≤M} λe·|fe − ze*|,
wherein λ' = [λ1, λ2, …, λM] is one weight vector of λ, and M = 3 is the number of superpixel criterion functions; for each e = 1,2,…,M, λe ≥ 0 and Σ_{e=1}^{M} λe = 1; z* = (z1*, z2*, …, zM*) denotes the reference point, whose components are calculated as ze* = min fe, the minimum of the e-th criterion function over the current population;
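The Chebyshev (Tchebycheff) decomposition described above can be sketched as follows (illustrative Python; the scalarized form g = max_e λe·|fe − ze*| is the standard MOEA/D Tchebycheff approach assumed here):

```python
import numpy as np

def tchebycheff(f, lam, z_star):
    """Standard MOEA/D Tchebycheff scalarization (assumed form):
    g(f | lam, z*) = max_e lam_e * |f_e - z*_e|,
    where f holds the M criterion values of one individual, lam is one
    weight vector, and z* is the reference point (per-objective minima)."""
    f, lam, z_star = map(np.asarray, (f, lam, z_star))
    return np.max(lam * np.abs(f - z_star))
```

Each individual is then compared against its T neighborhood subproblems by this scalar value, smaller being better.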
(3.6) applying crossover, mutation, and selection to the individuals, the final population and optimal solution are obtained through iterative updating, which finally yields the superpixel region segmentation result of the image; the crossover, mutation, and selection of the individuals proceed as follows:
(3.6.1) obtain the decomposition results of all individuals under the three criterion functions with the Chebyshev method, select the optimal individual pi, and take its corresponding image superpixel seed points si,k as the new core points;
(3.6.2) generating new individuals by crossover and mutation operations:
Figure BDA0003166914090000124
wherein χ, τ ∈ B(i), B(i) = [i1, i2, …, iT], and χ ≠ τ ≠ i; FR' is the mutation factor, CR' is the crossover factor, and rand is a random number in [0,1];
(3.6.3) elements of an individual that exceed the maximum value or fall below the minimum value are defined as illegal values and are repaired to the adjacent boundary value;
(3.6.4) generating new individuals using a gaussian mutation operator:
Figure BDA0003166914090000125
wherein ,
Figure BDA0003166914090000126
denotes a random number following a normal distribution with mean
Figure BDA0003166914090000127
and standard deviation S/20; pm is the mutation probability, defined as pm = 1/D.
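The neighborhood DE step of (3.6.2)-(3.6.4) can be sketched as follows (illustrative Python; the DE/rand-style trial-vector form and the helper name are assumptions — only the factors FR', CR', the boundary repair, and the Gaussian mutation with std S/20 and pm = 1/D come from the text):

```python
import numpy as np

def de_offspring(P, i, B_i, S, FR=0.5, CR=0.9, rng=None):
    """Sketch of one neighbourhood-based DE update for individual i.

    chi, tau: two distinct neighbours drawn from B(i) (excluding i).
    Genes are perturbed with probability CR (binomial crossover over an
    assumed DE difference term), illegal values are repaired to the
    nearest bound [-S/2, S/2], then Gaussian mutation (std S/20) is
    applied gene-wise with probability pm = 1/D."""
    rng = rng or np.random.default_rng()
    D = P.shape[1]
    chi, tau = rng.choice([j for j in B_i if j != i], size=2, replace=False)
    trial = P[i].copy()
    mask = rng.random(D) < CR                      # crossover mask
    trial[mask] = P[i, mask] + FR * (P[chi, mask] - P[tau, mask])
    trial = np.clip(trial, -S / 2, S / 2)          # repair illegal values
    gmask = rng.random(D) < 1.0 / D                # Gaussian mutation, pm = 1/D
    trial[gmask] += rng.normal(0.0, S / 20, gmask.sum())
    return np.clip(trial, -S / 2, S / 2)
```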
And step 4: extracting the strong edges of the image with the Canny edge detection operator, and merging the superpixel regions of the image based on the strong and weak edges. The superpixel region merging based on strong and weak edges proceeds as follows: obtain the strong edge information Eedge of the image with the Canny edge detection operator; then take the spatial position feature of the center point of each superpixel region and construct the set cen = [cen1, cen2, …, cenK]; compute the spatial distance between any two center points; finally, for each superpixel region, judge whether a strong edge lies on the line connecting its center point with those of its 8-neighborhood superpixels: if so, the two regions are not merged; if not, the two superpixel regions are merged. This realizes superpixel region merging based on the strong and weak edges of the image and yields G merged superpixel regions R = [R1, R2, …, RG]. Referring to fig. 4, u, v, and w in (a) are the center points of the three superpixel regions R1, R2, and R3 in the image, respectively; the dotted lines represent the boundaries between superpixel regions, i.e., the weak edges of the image, and the line L represents a strong edge of the image. There is no strong edge of the image on the line between u and v, so the superpixel regions R1 and R2 are merged, giving region R4 in (b); there are strong edges of the image on the lines between u and w and between v and w, so neither R1 and R3 nor R2 and R3 are merged.
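The center-line test used for merging can be sketched as follows (illustrative Python; `can_merge` is a hypothetical helper that samples the segment between two region centers against the Canny edge map):

```python
import numpy as np

def can_merge(cen_a, cen_b, edge_map):
    """Sketch of step 4: two superpixel regions may be merged only if no
    strong (Canny) edge lies on the segment joining their center points.

    cen_a, cen_b: (row, col) integer center coordinates.
    edge_map: boolean array, True where a strong edge pixel lies."""
    (xa, ya), (xb, yb) = cen_a, cen_b
    n = max(abs(xb - xa), abs(yb - ya)) + 1        # samples along the segment
    xs = np.linspace(xa, xb, n).round().astype(int)
    ys = np.linspace(ya, yb, n).round().astype(int)
    return not edge_map[xs, ys].any()
```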
And step 5: extracting the representative feature rk of the merged k-th superpixel region:
Figure BDA0003166914090000131
wherein Yα denotes the RGB feature value of pixel point α in the superpixel region, Yβ the RGB feature value of pixel point β in the superpixel region, and w(Yα, Yβ) the weight between pixels α and β:
w(Yα,Yβ)=Qαβ×Uαβ
wherein Qαβ denotes the position weight: the closer pixel points α and β are, the higher the weight; Uαβ denotes the color weight: the more similar the color information of pixel points α and β, the higher the weight; Qαβ and Uαβ are calculated as follows:
Figure BDA0003166914090000132
Figure BDA0003166914090000133
wherein (x, y) denotes the coordinates of a pixel in the superpixel region, num denotes the number of pixels in the superpixel region, and σ denotes the color feature variance of the superpixel region;
taking k = 1,2,…,G, the representative feature set of the superpixel regions r = {r1, r2, …, rk, …, rG} is obtained by calculation;
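The representative-feature computation of step 5 can be sketched as follows (illustrative Python; the Gaussian kernels for Q and U are assumptions, since the patent's formula images are not reproduced — only the "closer / more similar implies higher weight" behaviour and the weighted RGB mean come from the text):

```python
import numpy as np

def representative_feature(rgb, xy, sigma=None):
    """Weighted mean RGB of one superpixel region (assumed kernels).

    rgb: (num, 3) RGB values of the region's pixels.
    xy:  (num, 2) pixel coordinates.
    Q weighs nearby pixel pairs higher, U weighs similar-colour pairs
    higher; each pixel's total weight is its summed pairwise weight."""
    rgb = np.asarray(rgb, float)
    xy = np.asarray(xy, float)
    num = len(rgb)
    if sigma is None:
        sigma = rgb.std() + 1e-9          # colour feature spread of the region
    dx = np.linalg.norm(xy[:, None] - xy[None], axis=-1)     # pairwise positions
    dc = np.linalg.norm(rgb[:, None] - rgb[None], axis=-1)   # pairwise colours
    Q = np.exp(-dx / num)                                    # position weight
    U = np.exp(-(dc ** 2) / (2 * sigma ** 2))                # colour weight
    w = (Q * U).sum(axis=1)
    return (w[:, None] * rgb).sum(0) / w.sum()
```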
Step 6: obtaining partial supervision information by using marking information of user on image
Figure BDA0003166914090000134
A semi-supervised strategy is adopted: a line is drawn manually in the image for each class (the marking information), and the RGB feature values of the pixel points on the line are obtained; this feature information is called the partial supervision information.
And step 7: initializing the reference vectors, a technical term involved in the Kriging-model-assisted algorithm; randomly initializing the population and encoding the chromosomes in the population, i.e., encoding the RGB feature values of the cluster centers; if an image is to be clustered into C classes, each individual is a vector of length C×3.
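The chromosome encoding of step 7 can be sketched as follows (illustrative Python; the gene range [0, 255] for RGB values and the function name are assumptions):

```python
import numpy as np

def init_cluster_population(C, pop=50, rng=None):
    """Sketch of step 7: each chromosome encodes C cluster centers as
    RGB triples, i.e. a flat vector of length C*3 with genes in [0, 255]."""
    rng = rng or np.random.default_rng(0)
    return rng.random((pop, C * 3)) * 255.0
```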
And 8: constructing a semi-supervised intuitive fuzzy compactness function J fusing superpixel region information:
Figure BDA0003166914090000135
wherein C denotes the number of image clusters, G the number of superpixel regions, m the clustering fuzzy index, and κ a weighting index, set to κ = 2; rk denotes the representative feature of the k-th superpixel region, vρ the cluster center of the ρ-th class, and
Figure BDA0003166914090000136
denotes the Euclidean distance from rk to vρ on the intuitionistic fuzzy set:
Figure BDA0003166914090000141
wherein ,
Figure BDA0003166914090000142
Figure BDA00031669140900001414
and π(·) denote the membership degree, the non-membership degree, and the hesitation degree on the intuitionistic fuzzy set, respectively:
Figure BDA0003166914090000143
Figure BDA0003166914090000144
Figure BDA0003166914090000145
wherein τ is a fixed parameter that generates the non-membership function;
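The non-membership and hesitation degrees can be sketched with a Sugeno-type generator parameterized by τ (an assumed form — the patent's formula images are not reproduced):

```python
def intuitionistic_degrees(mu, tau=2.0):
    """Assumed Sugeno-type generator for an intuitionistic fuzzy set:
    given membership mu in [0, 1],
      nu = (1 - mu) / (1 + tau * mu)   # non-membership degree
      pi = 1 - mu - nu                 # hesitation degree
    so that mu + nu + pi = 1."""
    nu = (1.0 - mu) / (1.0 + tau * mu)
    pi = 1.0 - mu - nu
    return nu, pi
```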
Figure BDA0003166914090000146
denotes the supervised membership degree of rk to vρ:
Figure BDA0003166914090000147
wherein ,
Figure BDA0003166914090000148
denotes the prior membership of rk to vρ; if the labeled superpixel region belongs to class ρ, then
Figure BDA0003166914090000149
while for unlabeled superpixel regions
Figure BDA00031669140900001410
uρk denotes the membership degree of the k-th superpixel region to the ρ-th class center:
Figure BDA00031669140900001411
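The membership uρk can be sketched with the standard FCM update (an assumed form consistent with the fuzzy index m; the patent's formula image is not reproduced):

```python
import numpy as np

def fcm_membership(dist, m=2.0):
    """Assumed FCM-style membership of one superpixel region to each
    class center, from its distances dist = [d_1, ..., d_C]:
    u_rho = (1/d_rho^2)^(1/(m-1)) / sum_gamma (1/d_gamma^2)^(1/(m-1))."""
    d = np.asarray(dist, float) ** 2
    inv = d ** (-1.0 / (m - 1))
    return inv / inv.sum()
```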
and step 9: constructing an intuitive fuzzy separation function CS fusing superpixel region information:
Figure BDA00031669140900001412
Figure BDA00031669140900001413
wherein μγρ denotes the membership degree of vρ with respect to vγ.
Step 10: and respectively calculating the fitness function value of each individual in the initial population, namely J and 1/CS according to the function expressions constructed in the steps 8 and 9, wherein the fitness function is a general name in the multi-objective evolutionary clustering algorithm and is composed of a plurality of objective functions. The fitness function of the present invention refers to the function J constructed in step 8 and the function CS constructed in step 9. Training a Kriging model by using individuals in the initial population and fitness function values thereof, and setting t as 0 and w as 0, wherein t represents the current iteration times of the superpixels, and w represents the current iteration times of clustering;
Step 11: generate the offspring population with binary crossover and polynomial mutation, predict the objective function values of the offspring individuals with the Kriging model, and merge the parent and offspring individuals;
Step 12: after dividing the original population into several sub-populations, an elite individual is selected from each sub-population for the next generation. The invention adopts a selection strategy based on the angle penalized distance (APD); selecting the individual with the smallest APD better balances diversity and convergence.
Select the new population with the APD-based selection strategy, update the reference vectors, and set w = w + 1. The APD-based selection strategy is specifically: first compute the APD values of the population individuals, then select the individual with the minimum APD value to balance diversity and convergence; the APD value is calculated as follows:
Figure BDA0003166914090000151
wherein ,
Figure BDA0003166914090000152
denotes the Euclidean distance of the objective vector
Figure BDA0003166914090000153
to the origin; θt,i,e denotes the angle between
Figure BDA0003166914090000154
and its associated reference vector vt,e; P(θt,i,e) denotes the penalty function, calculated as:
Figure BDA0003166914090000155
wherein β denotes a parameter controlling the rate of change of the penalty, and
Figure BDA0003166914090000156
denotes the minimum angle between the reference vector vt,e and the other reference vectors in the current generation;
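The APD computation can be sketched as follows (illustrative Python, assuming the RVEA-style penalty P(θ) = M·(t/t_max)^β·θ/γ suggested by the symbols in the text):

```python
import numpy as np

def apd(f_translated, v, t, t_max, gamma, M=2, beta=2.0):
    """Assumed RVEA-style angle-penalized distance of one translated
    objective vector f' to its reference vector v:
      APD = (1 + P(theta)) * ||f'||,
      P(theta) = M * (t / t_max)^beta * theta / gamma,
    where theta is the angle between f' and v and gamma is the minimum
    angle between v and the other reference vectors."""
    f = np.asarray(f_translated, float)
    v = np.asarray(v, float)
    norm = np.linalg.norm(f)
    cos_t = np.clip(f @ v / (norm * np.linalg.norm(v) + 1e-12), -1.0, 1.0)
    theta = np.arccos(cos_t)
    P = M * (t / t_max) ** beta * theta / gamma
    return (1.0 + P) * norm
```

When f' is aligned with its reference vector the penalty vanishes and APD reduces to the plain Euclidean norm.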
Step 13: judge whether the Kriging model needs to be updated: if w > wmax, update the Kriging model, set w = 0, and execute step 14; otherwise, return to step 11;
Step 14: judge whether the maximum superpixel iteration number has been reached: if t > tmax, terminate the iteration to obtain the final-generation non-dominated solution set and execute step 15; otherwise, set t = t + 1 and return to step 11;
step 15: constructing a semi-supervised intuitive fuzzy clustering optimal solution selection index SI fused with super-pixel region information:
Figure BDA0003166914090000157
wherein EC denotes the intra-class compactness measure, E1 the compactness of all samples grouped into a single class, and FC the maximum inter-class separability measure;
step 16: selecting an optimal individual from the last generation of non-dominated solution set by using an optimal solution selection index SI to obtain an optimal clustering center;
And step 17: according to the optimal clustering center, assign a label to each superpixel region to obtain the labels of all pixel points in the image, giving the image clustering result;
step 18: and carrying out class label correction on the image clustering result to obtain a final image segmentation result.
The technical effects of the invention are further explained below in combination with simulation experiments:
1. simulation conditions are as follows:
The simulation experiments were carried out in the software environment of a computer with an Intel(R) Core(TM) i5-6500M 3.20GHz CPU and 8GB memory, running MATLAB R2018a.
2. Simulation content:
simulation 1, selecting an image with the number of 253036 in a Berkeley image database, and segmenting the image by using the method of the invention and the existing FCM method, KFCM method, IFCM method, MOVGA method, K-MOVGA method and RE-MSSFC method respectively, wherein the result is shown in figure 2, wherein:
(a) is the original 253036 image;
(b) is the standard segmentation map of the 253036 image;
(c) is the decomposition-based multi-target differential evolution superpixel segmentation result of the 253036 image;
(d) is a region merge map of the 253036 image;
(e) is a supervised information tag map of the 253036 image;
(f) is the result of the segmentation of 253036 images by the existing FCM method;
(g) is the result of the segmentation of 253036 images by the existing KFCM method;
(h) is the result of the segmentation of 253036 images by the existing IFCM method;
(i) is the result of the segmentation of 253036 images by the existing MOVGA method;
(j) is the result of the segmentation of 253036 images by the existing K-MOVGA method;
(k) is the segmentation result of 253036 image by the existing RE-MSSFC method;
(l) is the segmentation result of the 253036 image by the present invention;
as can be seen from FIG. 2, the present invention can clearly separate the object from the background, so the present invention has better segmentation effect on Berkeley atlas than the existing FCM method, KFCM method, IFCM method, MOVGA method, K-MOVGA method and RE-MSSFC method.
Simulation 2, selecting an image with the number of 113334665744 in a Weizmann image database, and segmenting the image by using the method of the invention and the existing FCM method, KFCM method, IFCM method, MOVGA method, K-MOVGA method and RE-MSSFC method respectively, wherein the result is shown in FIG. 3, wherein:
(a) is the original 113334665744 image;
(b) is the standard segmentation map of the 113334665744 image;
(c) is the decomposition-based multi-target differential evolution superpixel segmentation result of the 113334665744 image;
(d) is a region merge map of the 113334665744 image;
(e) is a supervised information tag map of the 113334665744 image;
(f) is the result of the segmentation of 113334665744 images by the existing FCM method;
(g) is the result of the segmentation of 113334665744 images by the existing KFCM method;
(h) is the result of the segmentation of 113334665744 images by the existing IFCM method;
(i) is the result of the segmentation of 113334665744 images by the existing MOVGA method;
(j) is the result of the segmentation of 113334665744 images by the existing K-MOVGA method;
(k) is the segmentation result of 113334665744 image by the existing RE-MSSFC method;
(l) is the segmentation result of the 113334665744 image by the present invention;
as can be seen from FIG. 3, the present invention can completely segment the target and clearly segment the target from the background, so the segmentation effect of the present invention on the Weizmann gallery is superior to the existing FCM method, KFCM method, IFCM method, MOVGA method, K-MOVGA method and RE-MSSFC method.
The simulation analysis proves the correctness and the effectiveness of the method provided by the invention.
Parts of the invention that belong to the common general knowledge of those skilled in the art have not been described in detail.
While the invention has been particularly shown and described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.

Claims (5)

1. A semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel is characterized by comprising the following steps:
(1) inputting a color image to be segmented;
(2) setting parameters: the number of superpixels is 500, the superpixel fuzzy index m' is 25, the user-defined positive integer H is 5, and the maximum superpixel iteration number tmax is 10; the neighborhood size is 10, the differential evolution mutation factor is 0.5, and the differential evolution crossover factor is 0.9; the clustering population size is 50, the maximum clustering iteration number wmax is 100, the number of individuals used to update the Kriging model is 5, the number of fixed iterations before updating the Kriging model is 20, the binary crossover probability is 0.9, and the polynomial mutation probability is 0.1;
(3) performing super-pixel segmentation on the color image based on decomposition multi-target differential evolution, and defining the edge of a super-pixel region after segmentation as a weak edge of the image; the super-pixel segmentation method based on decomposition multi-target differential evolution is used for carrying out super-pixel segmentation on a color image, and comprises the following specific steps:
(3.1) coding the core point offset component by adopting a decomposition multi-objective differential evolution method to obtain an initial population P:
if an image has N pixel points and is divided into K superpixel regions of uniform size, the side length of each region is approximately S = √(N/K).
The initial population P, with individuals pi = [pi,1, pi,2, …, pi,D], is generated with the following random strategy:
pi,j=-S/2+rand×S,
wherein pi,j denotes the j-th component of an individual in the initial population; the rand function produces a random number in [0,1]; i = 1,2,…,pop, j = 1,2,…,D, D = 2K; the population size pop is determined by the combinatorial formula pop = C(H+M−1, M−1), wherein M = 3 is the number of superpixel criterion functions and H is a user-defined positive integer;
(3.2) randomly selecting a point in each uniform superpixel region of the image as a core point, and then obtaining the seed point si,k of each superpixel of the image from the core point and the individually decoded offset:
Figure FDA0003166914080000013
wherein q = 0.1; ci,k denotes the core point of the k-th superpixel of the image corresponding to the i-th individual, k = 1,2,…,K; Λ denotes a set of K×K diagonal matrices;
(3.3) taking the 3S×3S neighborhood of each seed point si,k, the superpixel label matrix Li is obtained by judging the distance between the pixels in the neighborhood and the seed points;
(3.4) based on the superpixel label matrix Li, three superpixel criterion functions are designed: the intra-superpixel mean square error f1(si, Li), the superpixel edge gradient criterion f2(Li), and the region regularization term f3(Li).
The intra-superpixel mean square error f1(si, Li) is calculated as:
Figure FDA0003166914080000021
wherein In is the 5-dimensional feature vector of the n-th pixel point, n = 1,2,…,N; Li(n) is the label of the n-th pixel point in the superpixel label matrix Li; d denotes the pixel distance.
The superpixel edge gradient criterion f2(Li) is calculated as:
Figure FDA0003166914080000022
Figure FDA0003166914080000023
wherein ΔI(n) is the gradient feature of the n-th pixel of the image; Li(n) is the label of the n-th pixel point in Li; δ(·) is a condition judgment function that returns 1 when the condition in brackets is true and 0 otherwise; if the label of the n-th pixel differs from those of the pixels in its neighborhood Wn, then
Figure FDA0003166914080000024
indicating that the pixel lies at the junction of two superpixel regions.
The region regularization term f3(Li) is calculated as:
Figure FDA0003166914080000025
wherein ,
Figure FDA0003166914080000026
denotes the number of pixels in the k-th superpixel region of the superpixel label matrix Li of the i-th individual;
(3.5) decomposing the three superpixel criterion functions with the Chebyshev method of MOEA/D, specifically:
(3.5.1) initialize the weight vector matrix λ = [λ1, λ2, …, λi, …, λpop]; by computing the Euclidean distance between λi and the other weight vectors, the T neighborhood weight vectors λi1, λi2, …, λiT of λi are obtained;
(3.5.2) the three superpixel criterion functions are decomposed with the Chebyshev method according to the formula:
g^te(pi | λ', z*) = max_{1≤e≤M} λe·|fe − ze*|,
wherein λ' = [λ1, λ2, …, λM] is one weight vector of λ, and M = 3 is the number of superpixel criterion functions; for each e = 1,2,…,M, λe ≥ 0 and Σ_{e=1}^{M} λe = 1; z* = (z1*, z2*, …, zM*) denotes the reference point, whose components are calculated as ze* = min fe, the minimum of the e-th criterion function over the current population;
(3.6) applying crossover, mutation, and selection to the individuals, the final population and optimal solution are obtained through iterative updating, which finally yields the superpixel region segmentation result of the image;
(4) extracting the strong edges of the image with the Canny edge detection operator, and merging the superpixel regions of the image based on the strong and weak edges;
(5) extracting the representative feature rk of the merged k-th superpixel region:
Figure FDA0003166914080000034
wherein Yα denotes the RGB feature value of pixel point α in the superpixel region, Yβ the RGB feature value of pixel point β in the superpixel region, and w(Yα, Yβ) the weight between pixels α and β:
w(Yα,Yβ)=Qαβ×Uαβ
wherein Qαβ denotes the position weight: the closer pixel points α and β are, the higher the weight; Uαβ denotes the color weight: the more similar the color information of pixel points α and β, the higher the weight; Qαβ and Uαβ are calculated as follows:
Figure FDA0003166914080000035
Figure FDA0003166914080000036
wherein (x, y) denotes the coordinates of a pixel in the superpixel region, num denotes the number of pixels in the superpixel region, and σ denotes the color feature variance of the superpixel region;
taking k = 1,2,…,G, the representative feature set of the superpixel regions r = {r1, r2, …, rk, …, rG} is obtained by calculation;
(6) obtaining partial supervision information from the user's marking information on the image
Figure FDA0003166914080000037
(7) initializing the reference vectors, randomly initializing the population, and encoding the chromosomes in the population;
(8) constructing a semi-supervised intuitive fuzzy compactness function J fusing superpixel region information:
Figure FDA0003166914080000041
wherein C denotes the number of image clusters, G the number of superpixel regions, m the clustering fuzzy index, and κ a weighting index, set to κ = 2; rk denotes the representative feature of the k-th superpixel region, vρ the cluster center of the ρ-th class, and
Figure FDA0003166914080000042
denotes the Euclidean distance from rk to vρ on the intuitionistic fuzzy set:
Figure FDA0003166914080000043
wherein ,
Figure FDA0003166914080000044
and π(·) denote the membership degree, the non-membership degree, and the hesitation degree on the intuitionistic fuzzy set, respectively:
Figure FDA0003166914080000045
Figure FDA0003166914080000046
Figure FDA0003166914080000047
wherein τ is a fixed parameter that generates the non-membership function;
Figure FDA0003166914080000048
denotes the supervised membership degree of rk to vρ:
Figure FDA0003166914080000049
wherein ,
Figure FDA00031669140800000410
denotes the prior membership of rk to vρ; if the labeled superpixel region belongs to class ρ, then
Figure FDA00031669140800000411
while for unlabeled superpixel regions
Figure FDA00031669140800000412
uρk denotes the membership degree of the k-th superpixel region to the ρ-th class center:
Figure FDA00031669140800000413
(9) constructing an intuitive fuzzy separation function CS fusing superpixel region information:
Figure FDA00031669140800000414
Figure FDA0003166914080000051
wherein μγρ denotes the membership degree of vρ with respect to vγ.
(10) respectively calculating the fitness function values, i.e., J and 1/CS, of each individual in the initial population according to the function expressions constructed in steps (8) and (9); training the Kriging model with the individuals of the initial population and their fitness function values, and setting t = 0 and w = 0, wherein t denotes the current superpixel iteration number and w the current clustering iteration number;
(11) generating the offspring population with binary crossover and polynomial mutation, predicting the objective function values of the offspring individuals with the Kriging model, and merging the parent and offspring individuals;
(12) selecting a new population by adopting a selection strategy APD based on an angle punishment distance, updating a reference vector, and setting w to be w + 1;
(13) judging whether the Kriging model needs to be updated: if w > wmax, updating the Kriging model, setting w = 0, and executing step (14); otherwise, returning to step (11);
(14) judging whether the maximum superpixel iteration number has been reached: if t > tmax, terminating the iteration to obtain the final-generation non-dominated solution set and executing step (15); otherwise, setting t = t + 1 and returning to step (11);
(15) constructing a semi-supervised intuitive fuzzy clustering optimal solution selection index SI fused with super-pixel region information:
Figure FDA0003166914080000052
wherein EC denotes the intra-class compactness measure, E1 the compactness of all samples grouped into a single class, and FC the maximum inter-class separability measure;
(16) selecting an optimal individual from the last generation of non-dominated solution set by using an optimal solution selection index SI to obtain an optimal clustering center;
(17) according to the optimal clustering center, performing label distribution on each super-pixel region to obtain labels of all pixel points in the image, and obtaining an image clustering result;
(18) and carrying out class label correction on the image clustering result to obtain a final image segmentation result.
2. The method of claim 1, wherein the distance between a pixel and a seed point in step (3.3) is calculated as follows:
given any two pixel points α and β, the distance d (α, β) between the two is:
Figure FDA0003166914080000061
wherein m' denotes the superpixel fuzzy index; dc(α, β) denotes the color distance and ds(α, β) the spatial distance.
3. The method of claim 1, wherein the crossover, mutation, and selection of individuals in step (3.6) proceed as follows:
(3.6.1) obtain the decomposition results of all individuals under the three criterion functions with the Chebyshev method, select the optimal individual pi, and take its corresponding image superpixel seed points si,k as the new core points;
(3.6.2) generating new individuals by crossover and mutation operations:
Figure FDA0003166914080000062
wherein χ, τ ∈ B(i), B(i) = [i1, i2, …, iT], and χ ≠ τ ≠ i; FR' is the mutation factor, CR' is the crossover factor, and rand is a random number in [0,1];
(3.6.3) elements of an individual that exceed the maximum value or fall below the minimum value are defined as illegal values and are repaired to the adjacent boundary value;
(3.6.4) generating new individuals using a gaussian mutation operator:
Figure FDA0003166914080000063
wherein ,
Figure FDA0003166914080000064
denotes a random number following a normal distribution with mean
Figure FDA0003166914080000065
The standard deviation is random number of S/20, pm is mutation probability, and pm is defined as 1/D.
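The update rules of steps (3.6.2)–(3.6.4) are given only as formula images; the sketch below shows a conventional differential-evolution realization of these steps under that assumption (neighborhood-based mutation with factor FR', binomial crossover with factor CR', boundary repair by clamping, and Gaussian mutation with pm = 1/D — all parameter defaults are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def de_offspring(pop, i, neighbors, FR=0.5, CR=0.9, lo=0.0, hi=1.0):
    """DE-style mutation/crossover for individual i over its neighborhood
    B(i) (step 3.6.2), followed by boundary repair (step 3.6.3).
    pop : (N, D) population of encoded cluster centers."""
    chi, tau = rng.choice([n for n in neighbors if n != i], size=2, replace=False)
    mutant = pop[i] + FR * (pop[chi] - pop[tau])   # differential mutation
    trial = pop[i].copy()
    mask = rng.random(pop.shape[1]) < CR           # binomial crossover
    trial[mask] = mutant[mask]
    return np.clip(trial, lo, hi)                  # repair illegal values

def gaussian_mutation(ind, S=20.0, lo=0.0, hi=1.0):
    """Gaussian mutation (step 3.6.4): each gene mutates with probability
    pm = 1/D, perturbed by a normal draw with standard deviation S/20."""
    D = ind.size
    out = ind.copy()
    mask = rng.random(D) < 1.0 / D                 # pm = 1/D
    out[mask] += rng.normal(0.0, S / 20.0, mask.sum())
    return np.clip(out, lo, hi)                    # keep genes in legal range
```

With pm = 1/D, on average one gene per individual is perturbed, which keeps the Gaussian mutation a fine-grained local search rather than a restart.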
4. The method of claim 1, wherein the super-pixel region merging based on strong and weak edges in step (4) specifically comprises: obtaining the strong edge information E_edge of the image by using the Canny edge detection operator; then selecting the spatial position feature of the center point of each super-pixel region and constructing the set cen = [cen_1, cen_2, …, cen_K]; calculating the spatial distance between any two center points; and finally, for each super-pixel region, judging whether a strong edge exists on the line connecting its center point with that of each of its 8-neighborhood super-pixels — if so, the two super-pixel regions are not merged; if not, they are merged. In this way, super-pixel region merging based on the strong and weak edges of the image is achieved, yielding G merged super-pixel regions R = [R_1, R_2, …, R_G].
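The merging procedure of claim 4 can be sketched as follows — an illustrative implementation assuming a boolean Canny edge map, region centers in (row, col) coordinates, and a precomputed list of 8-neighborhood adjacency pairs (all names are hypothetical):

```python
import numpy as np

def edge_on_line(E_edge, c1, c2, n=50):
    """Check whether a strong edge lies on the segment joining two
    super-pixel centers, by sampling n points along the line in E_edge."""
    rows = np.linspace(c1[0], c2[0], n).round().astype(int)
    cols = np.linspace(c1[1], c2[1], n).round().astype(int)
    return bool(E_edge[rows, cols].any())

def merge_adjacent(E_edge, centers, adjacency):
    """Merge adjacent super-pixel regions whose center line crosses no
    strong edge; returns compact region labels via union-find."""
    parent = list(range(len(centers)))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path compression
            a = parent[a]
        return a
    for a, b in adjacency:                  # pairs of 8-neighbouring regions
        if not edge_on_line(E_edge, centers[a], centers[b]):
            parent[find(a)] = find(b)       # no strong edge between -> merge
    roots = [find(a) for a in range(len(centers))]
    # relabel to consecutive indices R = [R_1, ..., R_G]
    remap = {r: g for g, r in enumerate(dict.fromkeys(roots))}
    return [remap[r] for r in roots]
```

Sampling the center-to-center segment is a cheap stand-in for tracing the true boundary: a strong Canny response anywhere on the segment is taken as evidence that the two regions belong to different objects.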
5. The method of claim 1, wherein the selection strategy based on the angle-penalized distance (APD) in step (12) specifically comprises: first calculating the APD value of each individual in the population, and then selecting the individual with the minimum APD value so as to balance diversity and convergence; the APD value is calculated as follows:
[formula image FDA0003166914080000071: APD calculation]
wherein [formula image FDA0003166914080000072] denotes the Euclidean distance from the objective vector [formula image FDA0003166914080000073] to the origin, θ_{t,i,e} denotes the angle between [formula image FDA0003166914080000074] and its associated reference vector v_{t,e}, and P(θ_{t,i,e}) denotes the penalty function, calculated as:
[formula image FDA0003166914080000075: penalty function P(θ_{t,i,e})]
wherein β is a parameter controlling the rate of change of the penalty, and [formula image FDA0003166914080000076] denotes the minimum angle between reference vector v_{t,e} and the other reference vectors in the current generation.
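The APD formulas are given only as images; the sketch below assumes the standard RVEA form, APD = (1 + P(θ)) · ||f̄|| with P(θ) = M · (t/t_max)^β · θ/γ_v, which matches the quantities described in claim 5 (function and parameter names are illustrative):

```python
import numpy as np

def apd_select(F, v, t, t_max, beta=2.0, gamma_min=None):
    """Angle-penalized-distance selection for one reference vector v.

    F         : (N, M) translated objective vectors of candidate individuals
    v         : (M,) unit reference vector
    t, t_max  : current and maximum generation number
    gamma_min : minimum angle between v and the other reference vectors
    Returns the index of the individual with the smallest APD value.
    """
    M = F.shape[1]
    norms = np.linalg.norm(F, axis=1)              # ||f̄_{t,i}||: convergence
    cos = F @ v / np.maximum(norms, 1e-12)
    theta = np.arccos(np.clip(cos, -1.0, 1.0))     # θ_{t,i,e}: diversity
    if gamma_min is None:
        gamma_min = np.pi / 4                      # illustrative default
    P = M * (t / t_max) ** beta * theta / gamma_min  # penalty P(θ_{t,i,e})
    apd = (1.0 + P) * norms
    return int(np.argmin(apd))
```

Because the penalty grows with (t/t_max)^β, early generations favor convergence (small ||f̄||) while later generations increasingly penalize individuals that stray from their reference direction, which is the diversity/convergence balance the claim describes.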
CN202110806823.3A 2021-07-16 2021-07-16 Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel Active CN113469270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110806823.3A CN113469270B (en) 2021-07-16 2021-07-16 Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel


Publications (2)

Publication Number Publication Date
CN113469270A true CN113469270A (en) 2021-10-01
CN113469270B CN113469270B (en) 2023-08-11

Family

ID=77880811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110806823.3A Active CN113469270B (en) 2021-07-16 2021-07-16 Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel

Country Status (1)

Country Link
CN (1) CN113469270B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114862216A (en) * 2022-05-16 2022-08-05 中国银行股份有限公司 Method and device for determining agile project scheduling scheme

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103839261A (en) * 2014-02-18 2014-06-04 西安电子科技大学 SAR image segmentation method based on decomposition evolution multi-objective optimization and FCM
US20160223506A1 (en) * 2015-01-30 2016-08-04 AgriSight, Inc. System and method for crop health monitoring
CN108596244A (en) * 2018-04-20 2018-09-28 湖南理工学院 A kind of high spectrum image label noise detecting method based on spectrum angle density peaks


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHAO Feng; LIN Xiaojuan; LIU Hanqiang: "Hybrid label propagation semi-supervised intuitionistic fuzzy clustering image segmentation incorporating symmetry characteristics", Journal of Signal Processing, no. 09 *
ZHAO Feng; ZHANG Mimi; LIU Hanqiang: "Region information driven multi-objective evolutionary semi-supervised fuzzy clustering image segmentation algorithm", Journal of Electronics & Information Technology, no. 05 *


Also Published As

Publication number Publication date
CN113469270B (en) 2023-08-11

Similar Documents

Publication Publication Date Title
Lai et al. A hierarchical evolutionary algorithm for automatic medical image segmentation
CN111126488B (en) Dual-attention-based image recognition method
Deng et al. Saliency detection via a multiple self-weighted graph-based manifold ranking
CN107392919B (en) Adaptive genetic algorithm-based gray threshold acquisition method and image segmentation method
Zhang et al. Multi-objective evolutionary fuzzy clustering for image segmentation with MOEA/D
Nakane et al. Application of evolutionary and swarm optimization in computer vision: a literature survey
CN110443257B (en) Significance detection method based on active learning
Yu et al. A re-balancing strategy for class-imbalanced classification based on instance difficulty
CN110263804B (en) Medical image segmentation method based on safe semi-supervised clustering
Akgundogdu et al. 3D image analysis and artificial intelligence for bone disease classification
CN112308115A (en) Multi-label image deep learning classification method and equipment
CN111815582B (en) Two-dimensional code region detection method for improving background priori and foreground priori
Qiu et al. Inferring skin lesion segmentation with fully connected CRFs based on multiple deep convolutional neural networks
Yang et al. Color texture segmentation based on image pixel classification
CN107657276B (en) Weak supervision semantic segmentation method based on searching semantic class clusters
Liu et al. Multiobjective fuzzy clustering with multiple spatial information for Noisy color image segmentation
CN108921853B (en) Image segmentation method based on super-pixel and immune sparse spectral clustering
CN113469270B (en) Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel
CN113723449A (en) Preference information-based agent-driven multi-objective evolutionary fuzzy clustering method
Peng et al. Multi-threshold image segmentation of 2D OTSU inland ships based on improved genetic algorithm
CN105678798A (en) Multi-target fuzzy clustering image segmentation method combining local spatial information
CN111259938B (en) Manifold learning and gradient lifting model-based image multi-label classification method
Azarbad et al. Brain tissue segmentation using an unsupervised clustering technique based on PSO algorithm
CN111539966A (en) Colorimetric sensor array image segmentation method based on fuzzy c-means clustering
Bai et al. A unified deep learning model for protein structure prediction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant