CN107274418A - A kind of crop image partition method based on AP clustering algorithms - Google Patents

A kind of crop image partition method based on AP clustering algorithms Download PDF

Info

Publication number
CN107274418A
Authority
CN
China
Prior art keywords
crop
image
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710554933.9A
Other languages
Chinese (zh)
Inventor
许立兵
周振
韩冰
周望
徐爱国
金红伟
孙涵
陈鹏斐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
JIANGSU PROVINCIAL RADIO INST CO Ltd
Original Assignee
JIANGSU PROVINCIAL RADIO INST CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by JIANGSU PROVINCIAL RADIO INST CO Ltd filed Critical JIANGSU PROVINCIAL RADIO INST CO Ltd
Priority to CN201710554933.9A priority Critical patent/CN107274418A/en
Publication of CN107274418A publication Critical patent/CN107274418A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30181 Earth observation
    • G06T 2207/30188 Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a crop image segmentation method based on the AP (affinity propagation) clustering algorithm. The method clusters the original crop image with the AP clustering algorithm to obtain a clustered image; places the original crop image under different illumination to obtain crop images under different illumination and computes the relation between the luminance and the chrominance of the crop images under different illumination; segments the original crop image according to the resulting chrominance-luminance relation chart to obtain a coarse segmentation image; and finally computes the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image. The crop image segmentation method based on the AP clustering algorithm provided by the present invention can quickly and accurately extract crops from complex background images taken outdoors under different illumination.

Description

A crop image segmentation method based on the AP clustering algorithm
Technical field
The present invention relates to the technical field of image processing, and in particular to a crop image segmentation method based on the AP clustering algorithm.
Background art
With the rapid development of computer hardware and software, digital image processing, machine learning and related theories and technologies, computer vision has been applied to more and more areas of agricultural production, such as crop growth monitoring, pest and disease monitoring, disaster monitoring and early warning, and agricultural unmanned aerial vehicles.
To monitor the growth of a crop, the crop must first be extracted automatically, and many methods and strategies for crop image segmentation have been proposed so far. For example, the research on identifying weeds in maize seedling fields with computer image processing techniques (EI, 2001(17):154-156) analysed the difference in the green component between soil and plants and proposed segmenting the image with a bimodal (two-peak) method that searches for the optimal threshold on the grey-level histogram of the G component. In 2003, T. Kataoka, T. Kaneko, H. Okamoto and S. Hata, "Crop growth estimation system using machine vision", in Proc. IEEE/ASME International Conference on Advanced Intelligent Mechatronics, proposed the crop colour extraction operator CIVE to segment crops. The segmentation of crop images based on an improved K-means clustering algorithm (Agricultural Research, 2008(6):57-60) first converts the RGB colour image to the HSI colour space, selects appropriate initial cluster centres and the number of clusters with the mean-variance method and rough set theory, and then performs the clustering computation to achieve fast, automated segmentation of the colour components.
The above methods are generally applicable to crop images with simple backgrounds and little illumination variation. Actually photographed crop images, however, have complex backgrounds and uneven illumination that changes over time, so these methods have clear limitations on real outdoor images and often produce serious mis-segmentation when processing them.
Therefore, how to segment crop images with complex backgrounds quickly and automatically is a technical problem that urgently needs to be solved by those skilled in the art.
Summary of the invention
The present invention aims to solve at least one of the technical problems in the prior art by providing a crop image segmentation method based on the AP clustering algorithm.
As one aspect of the present invention, a crop image segmentation method based on the AP clustering algorithm is provided, wherein the crop image segmentation method based on the AP clustering algorithm includes:
clustering the original crop image with the AP clustering algorithm to obtain a clustered image, wherein clustering the original crop image with the AP clustering algorithm includes:
converting the original crop image from the RGB colour space to the LAB space, and forming a feature vector from the A and B components of each pixel,
computing the similarity matrix S between pixels from the feature vectors of the pixels, and initialising the preference matrix P of each pixel with the mean of the similarity matrix S,
initialising the responsibility (attraction-degree) and availability (membership-degree) matrices of each pixel, and setting the maximum iteration count T and the attenuation coefficient λ,
performing the AP clustering iteration on each pixel of the original image according to the similarity matrix S, the preference matrix P, and the responsibility and availability matrices, while counting the real-time iteration number T0,
terminating the iteration when the real-time iteration number T0 exceeds the maximum iteration count T,
obtaining the clustered image, the clustered image containing multiple classes and every pixel in the clustered image corresponding to a class label;
placing the original crop image under different illumination to obtain crop images under different illumination, and computing the luminance and chrominance relation of the crop images under different illumination, wherein computing the luminance and chrominance relation of the crop images under different illumination includes:
keeping, in each original crop image under different illumination, only the region containing the crop, setting the pixels of the non-crop region to white, and thereby obtaining several training sample images,
converting each training sample image from the RGB colour space to the HSI space and the YUV space, forming a training data matrix from the chrominance values H and luminance values Y of all crop pixel points in each training sample image, and saving the chrominance values H and the luminance values Y in the first and second columns of the training data matrix respectively,
traversing the luminance values Y stored in the second column of the training data matrix, and grouping all crop pixel points into rows according to whether their luminance values Y are equal, thereby forming the chrominance-luminance matrix,
computing the mean μ and variance δ² of each row of crop pixels in the chrominance-luminance matrix,
generating the chrominance-luminance relation chart from the computed means μ and variances δ² of the rows of crop pixels;
segmenting the original crop image according to the chrominance-luminance relation chart to obtain a coarse segmentation image, and computing the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image.
Preferably, segmenting the original crop image according to the chrominance-luminance relation chart to obtain the coarse segmentation image, and computing the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image, includes:
converting the original crop image from the RGB colour space to the HSI space and the YUV space, and obtaining the chrominance value H0 and luminance value Y0 of all crop pixel points of the original crop image;
matching each crop pixel point in the original crop image to the chrominance-luminance relation chart by row number, and finding the mean μ0 and variance δ0² of the crop pixel points in the row of the chrominance-luminance relation chart whose row number equals that of the current crop pixel point in the original crop image;
subtracting the mean μ0 corresponding to the current crop pixel point from the chrominance value H0 of the current crop pixel point to obtain the chrominance difference ΔH;
if the chrominance difference ΔH ≤ k·δ0², adjusting the chrominance of the current crop pixel point to white, and otherwise to black, where k ∈ [1.5, 2.1];
denoising the image obtained after the chrominance adjustment of all crop pixel points in the original crop image to obtain the coarse segmentation image;
computing the area overlap rate between the coarse segmentation image and the clustered image, and obtaining the crop segmentation image according to the area overlap rate.
Preferably, computing the area overlap rate between the coarse segmentation image and the clustered image and obtaining the crop segmentation image according to the area overlap rate includes:
counting the total number of pixels of each class image in the clustered image;
computing the overlap rate of each class image in the clustered image;
judging whether the overlap rate of each class image exceeds a preset threshold;
if the overlap rate of a class image exceeds the preset threshold, keeping all pixels of that class in the clustered image, and otherwise setting the pixels of that class to black;
obtaining the crop segmentation image from this discrimination of the chrominance of the pixels of each class in the clustered image;
wherein the preset threshold ranges from 0.6 to 0.8.
Preferably, the AP clustering algorithm includes:
computing the similarity S(p, q) between two data points p(x_p, y_p) and q(x_q, y_q):
S(p, q) = -\sqrt{(x_p - x_q)^2 + (y_p - y_q)^2};
the responsibility matrix r(p, q) representing the accumulated evidence for how well-suited data point q(x_q, y_q) is to serve as the representative point of data point p(x_p, y_p);
the availability matrix a(p, q) representing the accumulated evidence for how appropriate it is for data point p(x_p, y_p) to choose q(x_q, y_q) as its representative point;
taking the data on the diagonal of the similarity matrix formed by the N data points as the preference P of each data point, and setting the preference P of all data points to the same value;
computing the responsibility matrix r(p, q) and the availability matrix a(p, q) from the similarity S(p, q) and the preference P:
r(p, q) = s(p, q) - \max_{q_1 \neq q} \{ a(p, q_1) + s(p, q_1) \},
a(p, q) = \begin{cases} \min\{0,\ r(q, q) + \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}\}, & p \neq q \\ \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}, & p = q \end{cases},
where p_1 denotes a data point that supports data point q becoming a cluster centre, and q_1 denotes a competing candidate cluster-centre data point;
introducing the attenuation coefficient λ, so that at iteration T0+1 each entry of the responsibility matrix r(p, q) and of the availability matrix a(p, q) is set to (1 - λ) times its newly computed value plus λ times its value at iteration T0, where the attenuation coefficient λ ∈ [0, 1] is a real number; the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) at iteration T0+1 are respectively:
r_{T_0+1}(p, q) = (1 - λ) r_{T_0+1}(p, q) + λ r_{T_0}(p, q),
a_{T_0+1}(p, q) = (1 - λ) a_{T_0+1}(p, q) + λ a_{T_0}(p, q).
Preferably, performing the AP clustering iteration on each pixel of the original image according to the similarity matrix S, the preference matrix, and the responsibility and availability matrices of each pixel, while counting the real-time iteration number T0, includes:
expressing each pixel as pixel[row, column], where row denotes the row number of the pixel and column denotes the column number of the pixel;
computing the responsibility matrix r(p, q) and the availability matrix a(p, q) of the pixels:
r(p, q) = s(p, q) - \max_{q_1 \neq q} \{ a(p, q_1) + s(p, q_1) \},
a(p, q) = \begin{cases} \min\{0,\ r(q, q) + \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}\}, & p \neq q \\ \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}, & p = q \end{cases},
where p = 1, 2, ..., row and q = 1, 2, ..., column;
iterating the responsibility matrix r(p, q) and the availability matrix a(p, q) T0 times each, obtaining the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) of iteration T0+1:
r_{T_0+1}(p, q) = (1 - λ) r_{T_0+1}(p, q) + λ r_{T_0}(p, q),
a_{T_0+1}(p, q) = (1 - λ) a_{T_0+1}(p, q) + λ a_{T_0}(p, q),
where p = 1, 2, ..., row and q = 1, 2, ..., column.
Preferably, computing the mean μ and variance δ² of each row of crop pixels in the chrominance-luminance matrix includes:
expressing the chrominance-luminance matrix as HYData(m, n), where m denotes the row and n denotes the column of the chrominance-luminance matrix; the mean μ_m of the crop pixels in row m of the chrominance-luminance matrix is then:
\mu_m = \frac{\sum_{n=1}^{col} HYData(m, n)}{col},
where HYData(m, n) denotes the chrominance-luminance matrix and col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix;
computing the variance δ_m² of the crop pixels in row m of the chrominance-luminance matrix from the mean μ_m of the crop pixels in row m:
\delta_m^2 = \frac{\sum_{n=1}^{col} (HYData(m, n) - \mu_m)^2}{col},
where HYData(m, n) denotes the chrominance-luminance matrix and col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix.
In the crop image segmentation method based on the AP clustering algorithm provided by the present invention, the original crop image is first clustered with the AP clustering algorithm to obtain a clustered image, while chrominance-luminance statistics are trained from crop samples under different illumination and the original crop image is coarsely segmented with them to obtain a coarse segmentation image; the area overlap rate between the clustered image and the coarse segmentation image is then computed to obtain the final image segmentation result. This crop image segmentation method based on the AP clustering algorithm can not only quickly and accurately extract the crop from complex background images taken outdoors under different illumination, but the extracted image also shows no obvious mis-segmentation.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the specification; together with the following specific embodiments they serve to explain the present invention, but they are not to be construed as limiting the invention. In the drawings:
Fig. 1 is the flow chart of the crop image segmentation method based on the AP clustering algorithm provided by the present invention.
Fig. 2 is the flow chart of an embodiment of step S130 in Fig. 1.
Fig. 3 is a schematic diagram of the data points in the AP clustering algorithm provided by the present invention.
Fig. 4 is the original crop image provided by the present invention.
Fig. 5 is the clustering result obtained in the clustering stage provided by the present invention.
Fig. 6 is the coarse segmentation result obtained after coarse segmentation provided by the present invention.
Fig. 7 is the final segmentation image provided by the present invention.
Fig. 8 is the flow chart of the specific implementation of the training stage provided by the present invention.
Fig. 9 is the flow chart of the specific implementation of the segmentation stage provided by the present invention.
Detailed description of the embodiments
The embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be understood that the embodiments described here are intended only to illustrate and explain the present invention and are not intended to limit it.
As one aspect of the present invention, a crop image segmentation method based on the AP clustering algorithm is provided; as shown in Fig. 1, the crop image segmentation method based on the AP clustering algorithm includes:
S110, clustering the original crop image with the AP clustering algorithm to obtain a clustered image, wherein clustering the original crop image with the AP clustering algorithm includes:
S111, converting the original crop image from the RGB colour space to the LAB space, and forming a feature vector from the A and B components of each pixel,
S112, computing the similarity matrix S between pixels from the feature vectors of the pixels, and initialising the preference matrix P of each pixel with the mean of the similarity matrix S,
S113, initialising the responsibility (attraction-degree) and availability (membership-degree) matrices of each pixel, and setting the maximum iteration count T and the attenuation coefficient λ,
S114, performing the AP clustering iteration on each pixel of the original image according to the similarity matrix S, the preference matrix P, and the responsibility and availability matrices, while counting the real-time iteration number T0,
S115, terminating the iteration when the real-time iteration number T0 exceeds the maximum iteration count T,
S116, obtaining the clustered image, the clustered image containing multiple classes and every pixel in the clustered image corresponding to a class label;
S120, placing the original crop image under different illumination to obtain crop images under different illumination, and computing the luminance and chrominance relation of the crop images under different illumination, wherein computing the luminance and chrominance relation of the crop images under different illumination includes:
S121, keeping, in each original crop image under different illumination, only the region containing the crop, setting the pixels of the non-crop region to white, and thereby obtaining several training sample images,
S122, converting each training sample image from the RGB colour space to the HSI space and the YUV space, forming a training data matrix from the chrominance values H and luminance values Y of all crop pixel points in each training sample image, and saving the chrominance values H and the luminance values Y in the first and second columns of the training data matrix respectively,
S123, traversing the luminance values Y stored in the second column of the training data matrix, and grouping all crop pixel points into rows according to whether their luminance values Y are equal, thereby forming the chrominance-luminance matrix,
S124, computing the mean μ and variance δ² of each row of crop pixels in the chrominance-luminance matrix,
S125, generating the chrominance-luminance relation chart from the computed means μ and variances δ² of the rows of crop pixels;
S130, segmenting the original crop image according to the chrominance-luminance relation chart to obtain a coarse segmentation image, and computing the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image.
In the crop image segmentation method based on the AP clustering algorithm provided by the present invention, the original crop image is first clustered with the AP clustering algorithm to obtain a clustered image, while chrominance-luminance statistics are trained from crop samples under different illumination and the original crop image is coarsely segmented with them to obtain a coarse segmentation image; the area overlap rate between the clustered image and the coarse segmentation image is then computed to obtain the final image segmentation result. This crop image segmentation method based on the AP clustering algorithm can not only quickly and accurately extract the crop from complex background images taken outdoors under different illumination, but the extracted image also shows no obvious mis-segmentation.
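To make the three stages and their data flow concrete, the following Python sketch outlines how they could be wired together. The helper names (ap_cluster, build_hy_table, rough_segment, fuse_by_overlap), the parameter defaults and the calling convention are assumptions for illustration only, not part of the patent; the helpers themselves are sketched in the sections that follow.

def segment_crop(original_rgb, training_rgbs, k=1.8, rate_threshold=0.7):
    """Hypothetical three-stage pipeline mirroring steps S110-S130 above."""
    ap_labels = ap_cluster(original_rgb)                   # stage 1: AP clustering on LAB A/B features
    hy_table = build_hy_table(training_rgbs)               # stage 2: chrominance-luminance statistics
    rough_mask = rough_segment(original_rgb, hy_table, k)  # stage 3a: coarse segmentation + denoising
    return fuse_by_overlap(ap_labels, rough_mask, rate_threshold)  # stage 3b: area-overlap fusion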
As an embodiment of the segmentation processing, specifically, as shown in Fig. 2, segmenting the original crop image according to the chrominance-luminance relation chart to obtain the coarse segmentation image and computing the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image, as described in step S130, includes:
S131, converting the original crop image from the RGB colour space to the HSI space and the YUV space, and obtaining the chrominance value H0 and luminance value Y0 of all crop pixel points of the original crop image;
S132, matching each crop pixel point in the original crop image to the chrominance-luminance relation chart by row number, and finding the mean μ0 and variance δ0² of the crop pixel points in the row of the chrominance-luminance relation chart whose row number equals that of the current crop pixel point in the original crop image;
S133, subtracting the mean μ0 corresponding to the current crop pixel point from the chrominance value H0 of the current crop pixel point to obtain the chrominance difference ΔH;
S134, if the chrominance difference ΔH ≤ k·δ0², adjusting the chrominance of the current crop pixel point to white, and otherwise to black, where k ∈ [1.5, 2.1];
S135, denoising the image obtained after the chrominance adjustment of all crop pixel points in the original crop image to obtain the coarse segmentation image;
S136, computing the area overlap rate between the coarse segmentation image and the clustered image, and obtaining the crop segmentation image according to the area overlap rate.
Further, computing the area overlap rate between the coarse segmentation image and the clustered image and obtaining the crop segmentation image according to the area overlap rate, as described in step S136, may specifically include:
counting the total number of pixels of each class image in the clustered image;
computing the overlap rate of each class image in the clustered image;
judging whether the overlap rate of each class image exceeds a preset threshold;
if the overlap rate of a class image exceeds the preset threshold, keeping all pixels of that class in the clustered image, and otherwise setting the pixels of that class to black;
obtaining the crop segmentation image from this discrimination of the chrominance of the pixels of each class in the clustered image;
wherein the preset threshold ranges from 0.6 to 0.8.
It should be understood that setting a pixel to black or white above refers to an image in the RGB colour space: when the brightness of all three channels R, G and B is set to 0 the pixel is black, and when the brightness of all three channels is set to 255 the pixel is white.
It should be noted that, with reference to Fig. 3, the AP clustering algorithm includes:
computing the similarity S(p, q) between two data points p(x_p, y_p) and q(x_q, y_q):
S(p, q) = -\sqrt{(x_p - x_q)^2 + (y_p - y_q)^2};
the responsibility matrix r(p, q) representing the accumulated evidence for how well-suited data point q(x_q, y_q) is to serve as the representative point of data point p(x_p, y_p);
the availability matrix a(p, q) representing the accumulated evidence for how appropriate it is for data point p(x_p, y_p) to choose q(x_q, y_q) as its representative point;
taking the data on the diagonal of the similarity matrix formed by the N data points as the preference P of each data point, and setting the preference P of all data points to the same value;
computing the responsibility matrix r(p, q) and the availability matrix a(p, q) from the similarity S(p, q) and the preference P:
r(p, q) = s(p, q) - \max_{q_1 \neq q} \{ a(p, q_1) + s(p, q_1) \},
a(p, q) = \begin{cases} \min\{0,\ r(q, q) + \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}\}, & p \neq q \\ \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}, & p = q \end{cases},
where p_1 denotes a data point that supports data point q becoming a cluster centre, and q_1 denotes a competing candidate cluster-centre data point;
introducing the attenuation coefficient λ, so that at iteration T0+1 each entry of the responsibility matrix r(p, q) and of the availability matrix a(p, q) is set to (1 - λ) times its newly computed value plus λ times its value at iteration T0, where the attenuation coefficient λ ∈ [0, 1] is a real number; the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) at iteration T0+1 are respectively:
r_{T_0+1}(p, q) = (1 - λ) r_{T_0+1}(p, q) + λ r_{T_0}(p, q),
a_{T_0+1}(p, q) = (1 - λ) a_{T_0+1}(p, q) + λ a_{T_0}(p, q).
Specifically, the AP clustering algorithm is a relatively new unsupervised clustering algorithm. The goal of the clustering is to minimise the distance between each data point and the representative point of its class, so the Euclidean distance is chosen as the similarity measure; that is, the similarity of any two data points p(x_p, y_p) and q(x_q, y_q) is:
S(p, q) = -\sqrt{(x_p - x_q)^2 + (y_p - y_q)^2}.
As shown in Fig. 3, the AP clustering algorithm passes two kinds of information between data points, the responsibility matrix r(p, q) and the availability matrix a(p, q). r(p, q) is sent from p(x_p, y_p) to the candidate representative point q(x_q, y_q) and reflects the accumulated evidence for how well-suited q(x_q, y_q) is to serve as the representative point of p(x_p, y_p); a(p, q) is sent from the candidate representative point q(x_q, y_q) to p(x_p, y_p) and reflects the accumulated evidence for how appropriate it is for p(x_p, y_p) to choose q(x_q, y_q) as its representative point.
The input of the AP algorithm is the similarity matrix S(p, q) between the N data points. The data on the diagonal of S(p, q) serve as the criterion for a point to become a cluster centre and are called the preference P; at initialisation, the preference of every point is set to the same value (usually the mean of S(p, q)). r(p, q) and a(p, q) are computed according to formulas (1) and (2) respectively:
r(p, q) = s(p, q) - \max_{q_1 \neq q} \{ a(p, q_1) + s(p, q_1) \},   (1)
a(p, q) = \begin{cases} \min\{0,\ r(q, q) + \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}\}, & p \neq q \\ \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}, & p = q \end{cases},   (2)
where p_1 denotes a data point that supports data point q becoming a cluster centre, and q_1 denotes a competing candidate cluster-centre data point.
To avoid oscillation, the attenuation coefficient λ is introduced when the AP algorithm updates its messages: each message is set to (1 - λ) times its newly computed value plus λ times its value from the previous iteration, where the attenuation coefficient λ ∈ [0, 1] is a real number; that is, the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) at iteration T0+1 are computed according to formulas (3) and (4) respectively:
r_{T_0+1}(p, q) = (1 - λ) r_{T_0+1}(p, q) + λ r_{T_0}(p, q),   (3)
a_{T_0+1}(p, q) = (1 - λ) a_{T_0+1}(p, q) + λ a_{T_0}(p, q).   (4)
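A vectorised sketch of one damped message-passing round corresponding to formulas (1)-(4) is given below, in NumPy and following the standard Frey-Dueck formulation; the function name and in-place conventions are illustrative assumptions, not the patent's reference implementation.

import numpy as np

def ap_update(S, R, A, lam=0.9):
    """One damped AP round: responsibility r(p, q), then availability a(p, q)."""
    N = S.shape[0]
    idx = np.arange(N)

    # Formula (1): r(p,q) = s(p,q) - max_{q1 != q} [a(p,q1) + s(p,q1)]
    AS = A + S
    best = AS.argmax(axis=1)
    first = AS[idx, best]
    AS[idx, best] = -np.inf
    second = AS.max(axis=1)
    R_new = S - first[:, None]
    R_new[idx, best] = S[idx, best] - second
    R = lam * R + (1.0 - lam) * R_new          # formula (3): damping with coefficient lambda

    # Formula (2): a(p,q) = min{0, r(q,q) + sum_{p1} max(0, r(p1,q))} off-diagonal,
    #              a(q,q) = sum_{p1 != q} max(0, r(p1,q)) on the diagonal
    Rp = np.maximum(R, 0)
    np.fill_diagonal(Rp, R.diagonal())          # keep r(q,q) un-clipped in the column sums
    A_new = Rp.sum(axis=0)[None, :] - Rp
    diag = A_new.diagonal().copy()
    A_new = np.minimum(A_new, 0)
    np.fill_diagonal(A_new, diag)
    A = lam * A + (1.0 - lam) * A_new           # formula (4)
    return R, A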
Further specifically, performing the AP clustering iteration on each pixel of the original image according to the similarity matrix S, the preference matrix, and the responsibility and availability matrices of each pixel, while counting the real-time iteration number T0, includes:
expressing each pixel as pixel[row, column], where row denotes the row number of the pixel and column denotes the column number of the pixel;
computing the responsibility matrix r(p, q) and the availability matrix a(p, q) of the pixels according to formulas (1) and (2), where p = 1, 2, ..., row and q = 1, 2, ..., column;
iterating the responsibility matrix r(p, q) and the availability matrix a(p, q) T0 times each according to formulas (3) and (4), obtaining the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) of iteration T0+1, where p = 1, 2, ..., row and q = 1, 2, ..., column.
Specifically, computing the mean μ and variance δ² of each row of crop pixels in the chrominance-luminance matrix includes:
expressing the chrominance-luminance matrix as HYData(m, n), where m denotes the row and n denotes the column of the chrominance-luminance matrix; the mean μ_m of the crop pixels in row m of the chrominance-luminance matrix is then:
\mu_m = \frac{\sum_{n=1}^{col} HYData(m, n)}{col},
where HYData(m, n) denotes the chrominance-luminance matrix and col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix;
computing the variance δ_m² of the crop pixels in row m of the chrominance-luminance matrix from the mean μ_m:
\delta_m^2 = \frac{\sum_{n=1}^{col} (HYData(m, n) - \mu_m)^2}{col},
where HYData(m, n) denotes the chrominance-luminance matrix and col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix.
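The per-row statistics above can be computed with a short NumPy routine; this is a sketch under the assumption that HYData stores 0 for unused entries, as the "non-zero terms" wording implies, and the function name is illustrative.

import numpy as np

def row_mean_var(hy_data):
    """Mean and variance of the non-zero entries of each row of HYData(m, n)."""
    mask = hy_data != 0                              # the 'col' non-zero terms of each row
    col = np.maximum(mask.sum(axis=1), 1)            # guard empty rows against division by zero
    mu = (hy_data * mask).sum(axis=1) / col
    var = (((hy_data - mu[:, None]) * mask) ** 2).sum(axis=1) / col
    return mu, var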
It can be understood that the crop image segmentation method based on the AP clustering algorithm provided by the present invention mainly proceeds in three stages: the first stage is the clustering stage, the second stage is the training stage, and the third stage is the segmentation stage.
The detailed process of the three stages is described below with reference to an actual crop image segmentation, where Fig. 4 is the original crop image, Fig. 5 is the clustering result ApResultImage, Fig. 6 is the coarse segmentation result RoughResultImage, and Fig. 7 is the final segmentation image.
The clustering stage performs the clustering processing on the original crop image with the AP clustering algorithm to obtain the clustered image; specifically, the image clustering stage based on the AP algorithm is described as follows (a code sketch is given after these steps):
convert the original crop image from the RGB colour space to the LAB space, and form the feature vector of each pixel from its A and B components;
compute the similarity matrix S between pixels from the feature vectors, and initialise the preference matrix P of each pixel with the mean of S;
initialise the responsibility and availability matrices, and set the maximum iteration count T and the attenuation coefficient λ;
for the pixels pixel[row, column] of the image, where row is the number of rows of the original crop image and column is the number of columns of the original crop image, iterate the computation and count the real-time iteration number T0;
compute the responsibility matrix r(p, q) and the availability matrix a(p, q) of the original crop image according to formulas (1) and (2), and compute the updated responsibility matrix and availability matrix according to formulas (3) and (4);
judge whether the real-time iteration number T0 exceeds the preset maximum iteration count T; if it does, terminate the iterative computation, otherwise return to the iterative computation step;
the clustering result ApResultImage is finally obtained, as shown in Fig. 5; each pixel in this figure corresponds to a class label i, i = 1, 2, ..., n.
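The following Python sketch shows how this clustering stage could be reproduced with scikit-learn's AffinityPropagation; the file name, the downsampling step (plain AP stores an N x N similarity matrix, so the pixel count is assumed to be kept small) and the λ value are illustrative assumptions rather than the patent's settings.

import numpy as np
from skimage import io, color
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import AffinityPropagation

rgb = io.imread("crop.png")[:, :, :3]            # hypothetical original crop image
rgb = rgb[::8, ::8]                              # downsample so the N x N similarity matrix stays small
lab = color.rgb2lab(rgb / 255.0)
feats = lab[:, :, 1:3].reshape(-1, 2)            # feature vector = (A, B) components of each pixel

S = -squareform(pdist(feats))                    # similarity = negative Euclidean distance
ap = AffinityPropagation(
    affinity="precomputed",
    preference=S.mean(),                         # preference P initialised with the mean of S
    damping=0.9,                                 # attenuation coefficient lambda
    max_iter=200,                                # maximum iteration count T
    random_state=0,
)
labels = ap.fit_predict(S)
ap_result = labels.reshape(lab.shape[:2])        # ApResultImage: one class label per pixel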
The second stage, the training stage, is carried out in order to enable the coarse segmentation of the original crop image; specifically, as shown in Fig. 8, the training stage can be described as follows.
Collect crop pictures under different illumination and compute the relation between the luminance and the chrominance of the crop pixels in these historical images; the specific steps are as follows (a code sketch is given after these steps):
select the original crop images under different illumination, keep the region containing the crop, and set the non-crop region pixels entirely to white to obtain the training sample images;
convert the above training sample images from the RGB colour space to the HSI space and the YUV space, and save the H values and Y values of the crop pixel points of each training sample image in the first and second columns of the training data matrix TrainingData respectively;
search the second column of the training data matrix TrainingData; when TrainingData(:, 2) equals the luminance Y0, save the value of TrainingData(:, 1) for the current row into the chrominance-luminance matrix HYData(Y0, :);
it can be understood that this step, when searching the second column of the training data matrix, extracts the rows with the same luminance and stores them in the same row of the chrominance-luminance matrix; since the range of the luminance value Y is 0 to 255, the maximum storage space of the chrominance-luminance matrix is 256 × Numtotal, where Numtotal is the total number of crop pixels of the selected training sample images, i.e. the storage space of the chrominance-luminance matrix is sufficiently large;
compute the mean and variance of each row of the chrominance-luminance matrix HYData, specifically:
the chrominance-luminance matrix is expressed as HYData(m, n), where m denotes the row and n denotes the column of the chrominance-luminance matrix; μ_m is the mean of the crop pixels in row m of the chrominance-luminance matrix and δ_m² is their variance, computed with the formulas for μ_m and δ_m² given above, where col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix;
the chrominance-luminance relation chart HY-Table is finally obtained.
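A sketch of this training stage is given below. HSV hue is used as a stand-in for the HSI hue channel and skimage's RGB-to-YUV Y as the luminance; the white-background convention of the training samples is assumed, and the function and path names are illustrative, not from the patent.

import numpy as np
from skimage import io, color

def build_hy_table(training_paths):
    """Build HY-Table: mean and variance of crop chrominance for each luminance level 0..255."""
    rows = [[] for _ in range(256)]                        # one row of HYData per luminance value
    for path in training_paths:
        rgb = io.imread(path)[:, :, :3] / 255.0
        crop = ~np.all(rgb > 0.999, axis=2)                # skip the pure-white non-crop region
        h = color.rgb2hsv(rgb)[:, :, 0][crop]              # chrominance H of the crop pixels
        y = np.clip((color.rgb2yuv(rgb)[:, :, 0][crop] * 255).astype(int), 0, 255)
        for yi, hi in zip(y, h):
            rows[yi].append(hi)                            # group pixels with equal luminance Y
    mu = np.array([np.mean(r) if r else 0.0 for r in rows])
    var = np.array([np.var(r) if r else 0.0 for r in rows])
    return np.stack([mu, var], axis=1)                     # HY-Table[Y] = (mean, variance)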
The specific steps of the segmentation stage are shown in Fig. 9:
using the chrominance-luminance relation chart HY-Table generated by the training, the crop in the original crop image is extracted automatically; the specific operating steps are as follows (a code sketch is given after these steps):
convert the original crop image from the RGB colour space to the HSI space and the YUV space; that is, the element in row m, column n of the original crop image is R(m, n), G(m, n), B(m, n) in the RGB space, H(m, n), S(m, n), I(m, n) in the HSI space, and Y(m, n), U(m, n), V(m, n) in the YUV space;
search the chrominance-luminance relation chart HY-Table for the row whose row number equals the pixel's luminance Y(m, n), and extract the corresponding mean μ_{Y(m,n)} and variance δ²_{Y(m,n)};
subtract μ_{Y(m,n)} from the chrominance H(m, n) of the current pixel to obtain ΔH(m, n); when ΔH(m, n) ≤ k·δ²_{Y(m,n)}, set the current pixel to white, otherwise set the pixel to black, where the range of k is [1.5, 2.1];
denoise the pre-segmentation result obtained from the segmentation, removing small connected domains to reduce the interference of small impurity points; in this algorithm, 8-neighbourhood connected labelling is used, and the area threshold range of the connected domains is [5, 12].
After the above operations on the original crop image, the coarse segmentation result RoughResultImage is obtained, as shown in Fig. 6.
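A sketch of this coarse-segmentation stage follows. It reuses the hypothetical HY-Table built above, takes the absolute difference |H - μ| as the chrominance test (the text writes ΔH ≤ k·δ² without an absolute value), and uses skimage's remove_small_objects with 8-connectivity for the denoising step; the defaults are assumptions within the stated ranges.

import numpy as np
from skimage import color
from skimage.morphology import remove_small_objects

def rough_segment(rgb, hy_table, k=1.8, min_area=8):
    """Coarse segmentation (RoughResultImage): True = crop (white), False = background (black)."""
    rgb = rgb[:, :, :3] / 255.0
    h = color.rgb2hsv(rgb)[:, :, 0]                                    # chrominance of every pixel
    y = np.clip((color.rgb2yuv(rgb)[:, :, 0] * 255).astype(int), 0, 255)
    mu, var = hy_table[y, 0], hy_table[y, 1]                           # HY-Table row with equal luminance
    mask = np.abs(h - mu) <= k * var                                   # chrominance test, k in [1.5, 2.1]
    return remove_small_objects(mask, min_size=min_area, connectivity=2)  # 8-neighbourhood denoising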
Then compute the area overlap rate between the coarse segmentation result and the clustering result (a code sketch is given after these steps):
if row m, column n of the original crop image satisfies ApResultImage(m, n) == i and RoughResultImage(m, n) == 0, then coincidenceArea_i = coincidenceArea_i + 1;
count the total number of pixels ApArea_i of each class of ApResultImage;
divide coincidenceArea_i by ApArea_i to obtain the overlap rate coincideRate_i of each class, and check whether coincideRate_i exceeds the preset threshold RateThreshold, whose range, determined through many tests, is [0.6, 0.8]; if the condition is met, keep all pixels of that class in ApResultImage, otherwise set the pixels of that class to black; the final segmentation result is thereby obtained, as shown in Fig. 7.
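A sketch of this overlap-rate fusion is given below. It assumes rough_mask is True where the coarse result marks crop (the RoughResultImage(m, n) == 0 convention in the text would simply flip that test), and the threshold default is an assumption inside the stated [0.6, 0.8] range.

import numpy as np

def fuse_by_overlap(ap_labels, rough_mask, rate_threshold=0.7):
    """Keep an AP class only if its overlap rate with the coarse crop mask exceeds the threshold."""
    keep = np.zeros_like(rough_mask, dtype=bool)
    for i in np.unique(ap_labels):
        cls = ap_labels == i
        ap_area = cls.sum()                                     # pixel total of class i (ApArea_i)
        coincidence = np.logical_and(cls, rough_mask).sum()     # coincidenceArea_i
        if ap_area and coincidence / ap_area > rate_threshold:  # coincideRate_i > RateThreshold
            keep |= cls                                         # keep all pixels of the class
    return keep                                                 # final crop segmentation mask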
It can be understood that the above embodiments are merely exemplary embodiments used to illustrate the principle of the present invention, but the present invention is not limited thereto. For those skilled in the art, various variations and improvements can be made without departing from the spirit and essence of the present invention, and such variations and improvements are also considered to fall within the protection scope of the present invention.

Claims (6)

1. A crop image segmentation method based on the AP clustering algorithm, characterised in that the crop image segmentation method based on the AP clustering algorithm comprises:
clustering the original crop image with the AP clustering algorithm to obtain a clustered image, wherein clustering the original crop image with the AP clustering algorithm includes:
converting the original crop image from the RGB colour space to the LAB space, and forming a feature vector from the A and B components of each pixel,
computing the similarity matrix S between pixels from the feature vectors of the pixels, and initialising the preference matrix P of each pixel with the mean of the similarity matrix S,
initialising the responsibility (attraction-degree) and availability (membership-degree) matrices of each pixel, and setting the maximum iteration count T and the attenuation coefficient λ,
performing the AP clustering iteration on each pixel of the original image according to the similarity matrix S, the preference matrix P, and the responsibility and availability matrices, while counting the real-time iteration number T0,
terminating the iteration when the real-time iteration number T0 exceeds the maximum iteration count T,
obtaining the clustered image, the clustered image containing multiple classes and every pixel in the clustered image corresponding to a class label;
placing the original crop image under different illumination to obtain crop images under different illumination, and computing the luminance and chrominance relation of the crop images under different illumination, wherein computing the luminance and chrominance relation of the crop images under different illumination includes:
keeping, in each original crop image under different illumination, only the region containing the crop, setting the pixels of the non-crop region to white, and thereby obtaining several training sample images,
converting each training sample image from the RGB colour space to the HSI space and the YUV space, forming a training data matrix from the chrominance values H and luminance values Y of all crop pixel points in each training sample image, and saving the chrominance values H and the luminance values Y in the first and second columns of the training data matrix respectively,
traversing the luminance values Y stored in the second column of the training data matrix, and grouping all crop pixel points into rows according to whether their luminance values Y are equal, thereby forming the chrominance-luminance matrix,
computing the mean μ and variance δ² of each row of crop pixels in the chrominance-luminance matrix,
generating the chrominance-luminance relation chart from the computed means μ and variances δ² of the rows of crop pixels;
segmenting the original crop image according to the chrominance-luminance relation chart to obtain a coarse segmentation image, and computing the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image.
2. The crop image segmentation method based on the AP clustering algorithm according to claim 1, characterised in that segmenting the original crop image according to the chrominance-luminance relation chart to obtain the coarse segmentation image, and computing the area overlap between the coarse segmentation image and the clustered image to obtain the crop segmentation image, includes:
converting the original crop image from the RGB colour space to the HSI space and the YUV space, and obtaining the chrominance value H0 and luminance value Y0 of all crop pixel points of the original crop image;
matching each crop pixel point in the original crop image to the chrominance-luminance relation chart by row number, and finding the mean μ0 and variance δ0² of the crop pixel points in the row of the chrominance-luminance relation chart whose row number equals that of the current crop pixel point in the original crop image;
subtracting the mean μ0 corresponding to the current crop pixel point from the chrominance value H0 of the current crop pixel point to obtain the chrominance difference ΔH;
if the chrominance difference ΔH ≤ k·δ0², adjusting the chrominance of the current crop pixel point to white, and otherwise to black, where k ∈ [1.5, 2.1];
denoising the image obtained after the chrominance adjustment of all crop pixel points in the original crop image to obtain the coarse segmentation image;
computing the area overlap rate between the coarse segmentation image and the clustered image, and obtaining the crop segmentation image according to the area overlap rate.
3. The crop image segmentation method based on the AP clustering algorithm according to claim 2, characterised in that computing the area overlap rate between the coarse segmentation image and the clustered image and obtaining the crop segmentation image according to the area overlap rate includes:
counting the total number of pixels of each class image in the clustered image;
computing the overlap rate of each class image in the clustered image;
judging whether the overlap rate of each class image exceeds a preset threshold;
if the overlap rate of a class image exceeds the preset threshold, keeping all pixels of that class in the clustered image, and otherwise setting the pixels of that class to black;
obtaining the crop segmentation image from this discrimination of the chrominance of the pixels of each class in the clustered image;
wherein the preset threshold ranges from 0.6 to 0.8.
4. The crop image segmentation method based on the AP clustering algorithm according to any one of claims 1 to 3, characterised in that the AP clustering algorithm includes:
computing the similarity S(p, q) between two data points p(x_p, y_p) and q(x_q, y_q):
S(p, q) = -\sqrt{(x_p - x_q)^2 + (y_p - y_q)^2};
the responsibility matrix r(p, q) representing the accumulated evidence for how well-suited data point q(x_q, y_q) is to serve as the representative point of data point p(x_p, y_p);
the availability matrix a(p, q) representing the accumulated evidence for how appropriate it is for data point p(x_p, y_p) to choose q(x_q, y_q) as its representative point;
taking the data on the diagonal of the similarity matrix formed by the N data points as the preference P of each data point, and setting the preference P of all data points to the same value;
computing the responsibility matrix r(p, q) and the availability matrix a(p, q) from the similarity S(p, q) and the preference P:
r(p, q) = s(p, q) - \max_{q_1 \neq q} \{ a(p, q_1) + s(p, q_1) \},
a(p, q) = \begin{cases} \min\{0,\ r(q, q) + \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}\}, & p \neq q \\ \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}, & p = q \end{cases},
where p_1 denotes a data point that supports data point q becoming a cluster centre, and q_1 denotes a competing candidate cluster-centre data point;
introducing the attenuation coefficient λ, so that at iteration T0+1 each entry of the responsibility matrix r(p, q) and of the availability matrix a(p, q) is set to (1 - λ) times its newly computed value plus λ times its value at iteration T0, where the attenuation coefficient λ ∈ [0, 1] is a real number, and the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) at iteration T0+1 are respectively:
r_{T_0+1}(p, q) = (1 - \lambda)\, r_{T_0+1}(p, q) + \lambda\, r_{T_0}(p, q),
a_{T_0+1}(p, q) = (1 - \lambda)\, a_{T_0+1}(p, q) + \lambda\, a_{T_0}(p, q).
5. The crop image segmentation method based on the AP clustering algorithm according to claim 4, characterised in that performing the AP clustering iteration on each pixel of the original image according to the similarity matrix S, the preference matrix, and the responsibility and availability matrices of each pixel, while counting the real-time iteration number T0, includes:
expressing each pixel as pixel[row, column], where row denotes the row number of the pixel and column denotes the column number of the pixel;
computing the responsibility matrix r(p, q) and the availability matrix a(p, q) of the pixels:
r(p, q) = s(p, q) - \max_{q_1 \neq q} \{ a(p, q_1) + s(p, q_1) \},
a(p, q) = \begin{cases} \min\{0,\ r(q, q) + \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}\}, & p \neq q \\ \sum_{p_1 \neq p} \max\{0, r(p_1, q)\}, & p = q \end{cases},
where p = 1, 2, ..., row and q = 1, 2, ..., column;
iterating the responsibility matrix r(p, q) and the availability matrix a(p, q) T0 times each, obtaining the responsibility matrix r_{T_0+1}(p, q) and the availability matrix a_{T_0+1}(p, q) of iteration T0+1:
r_{T_0+1}(p, q) = (1 - \lambda)\, r_{T_0+1}(p, q) + \lambda\, r_{T_0}(p, q),
a_{T_0+1}(p, q) = (1 - \lambda)\, a_{T_0+1}(p, q) + \lambda\, a_{T_0}(p, q),
where p = 1, 2, ..., row and q = 1, 2, ..., column.
6. The crop image segmentation method based on the AP clustering algorithm according to any one of claims 1 to 3, characterised in that computing the mean μ and variance δ² of each row of crop pixels in the chrominance-luminance matrix includes:
expressing the chrominance-luminance matrix as HYData(m, n), where m denotes the row and n denotes the column of the chrominance-luminance matrix; the mean μ_m of the crop pixels in row m of the chrominance-luminance matrix is then:
\mu_m = \frac{\sum_{n=1}^{col} HYData(m, n)}{col},
where HYData(m, n) denotes the chrominance-luminance matrix and col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix;
computing the variance δ_m² of the crop pixels in row m of the chrominance-luminance matrix from the mean μ_m of the crop pixels in row m:
\delta_m^2 = \frac{\sum_{n=1}^{col} \left( HYData(m, n) - \mu_m \right)^2}{col},
where HYData(m, n) denotes the chrominance-luminance matrix and col denotes the number of all non-zero terms in row m of the chrominance-luminance matrix.
CN201710554933.9A 2017-07-07 2017-07-07 A kind of crop image partition method based on AP clustering algorithms Pending CN107274418A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710554933.9A CN107274418A (en) 2017-07-07 2017-07-07 A kind of crop image partition method based on AP clustering algorithms

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710554933.9A CN107274418A (en) 2017-07-07 2017-07-07 A kind of crop image partition method based on AP clustering algorithms

Publications (1)

Publication Number Publication Date
CN107274418A true CN107274418A (en) 2017-10-20

Family

ID=60072654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710554933.9A Pending CN107274418A (en) 2017-07-07 2017-07-07 A kind of crop image partition method based on AP clustering algorithms

Country Status (1)

Country Link
CN (1) CN107274418A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108548846A (en) * 2018-06-21 2018-09-18 电子科技大学 Bearing device subsurface defect extracting method based on thermal image AP clusters
CN108898603A (en) * 2018-05-29 2018-11-27 北京佳格天地科技有限公司 Plot segmenting system and method on satellite image
CN109783178A (en) * 2019-01-24 2019-05-21 北京字节跳动网络技术有限公司 A kind of color adjustment method of interface assembly, device, equipment and medium
CN109859212A (en) * 2019-01-16 2019-06-07 中国计量大学 A kind of unmanned plane image soybean crops row dividing method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102208099A (en) * 2011-05-30 2011-10-05 华中科技大学 Illumination-change-resistant crop color image segmentation method
CN106780502A (en) * 2016-12-27 2017-05-31 江苏省无线电科学研究所有限公司 Sugarcane seeding stage automatic testing method based on image

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102208099A (en) * 2011-05-30 2011-10-05 华中科技大学 Illumination-change-resistant crop color image segmentation method
CN106780502A (en) * 2016-12-27 2017-05-31 江苏省无线电科学研究所有限公司 Sugarcane seeding stage automatic testing method based on image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BRENDAN J. FREY ET AL.: "Clustering by Passing Messages Between Data Points", 《SCIENCE》 *
ZHENGHONG YU ET AL.: "Automatic image-based detection technology for two critical growth stages of maize: Emergence and three-leaf stage", 《AGRICULTURAL AND FOREST METEOROLOGY》 *
码迷 (Mamicode): "AP Clustering Algorithm (AP聚类算法)", 《HTTP://MAMICODE.COM/INFO-DETAIL-1380807.HTML》 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108898603A (en) * 2018-05-29 2018-11-27 北京佳格天地科技有限公司 Plot segmenting system and method on satellite image
CN108548846A (en) * 2018-06-21 2018-09-18 电子科技大学 Bearing device subsurface defect extracting method based on thermal image AP clusters
CN109859212A (en) * 2019-01-16 2019-06-07 中国计量大学 A kind of unmanned plane image soybean crops row dividing method
CN109783178A (en) * 2019-01-24 2019-05-21 北京字节跳动网络技术有限公司 A kind of color adjustment method of interface assembly, device, equipment and medium
CN109783178B (en) * 2019-01-24 2022-08-23 北京字节跳动网络技术有限公司 Color adjusting method, device, equipment and medium for interface component

Similar Documents

Publication Publication Date Title
CN107274418A (en) A kind of crop image partition method based on AP clustering algorithms
JP4864857B2 (en) Measurement of mitotic activity
CN110060233B (en) Corn ear damage detection method
US20100232705A1 (en) Device and method for detecting shadow in image
CN102208099A (en) Illumination-change-resistant crop color image segmentation method
CN106611421B (en) The SAR image segmentation method constrained based on feature learning and sketch line segment
EP1700269A2 (en) Detection of sky in digital color images
CN108537239A (en) A kind of method of saliency target detection
Schaefer et al. Fuzzy clustering for colour reduction in images
CN106127735B (en) A kind of facilities vegetable edge clear class blade face scab dividing method and device
CN111259925A (en) Method for counting field wheat ears based on K-means clustering and width mutation algorithm
Kumar et al. Pest detection using adaptive thresholding
CN114494739B (en) Toner mixing effect detection method based on artificial intelligence
CN114821268A (en) Weed and crop identification method based on machine learning
CN108171683B (en) Cell counting method adopting software for automatic identification
CN109816629A (en) A kind of coating nature separation method and device based on k-means cluster
CN112489049A (en) Mature tomato fruit segmentation method and system based on superpixels and SVM
CN112329818A (en) Hyperspectral image unsupervised classification method based on graph convolution network embedded representation
CN116416523A (en) Machine learning-based rice growth stage identification system and method
CN106203447A (en) A kind of foreground target extracting method based on pixel heredity
CN116258968B (en) Method and system for managing fruit diseases and insects
CN110807357A (en) Non-supervised field crop classification method and system based on histogram estimation
Pereira et al. Fruit recognition and classification based on SVM method for production prediction of peaches-preliminary study
Dar et al. An adaptive path classification algorithm for a pepper greenhouse sprayer
CN112258522B (en) Martial arts competition area segmentation method based on secondary area growth

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171020