CN104700115B - The detection method of crater during Mars probes soft landing based on sparse lifting integrated classifier - Google Patents


Info

Publication number
CN104700115B
Authority
CN
China
Prior art keywords
crater
Prior art date
Legal status
Active
Application number
CN201510089099.1A
Other languages
Chinese (zh)
Other versions
CN104700115A (en
Inventor
王岩
杨刚
郭雷
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University
Priority to CN201510089099.1A
Publication of CN104700115A
Application granted
Publication of CN104700115B
Legal status: Active

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

A method for detecting craters during the soft landing of a Mars probe, based on a sparse boosting ensemble classifier. The method has four main steps: Step 1 determines candidate craters; Step 2 extracts texture features from the candidate craters; Step 3 performs feature selection on the extracted texture features; Step 4 combines the Boost algorithm with the sparse kernel density estimation algorithm RSDE-WL1 to design a sparse boosting ensemble classifier, enabling fast image-based crater detection. The invention mainly exploits the advantages of the designed sparse boosting ensemble classifier, namely sparse solutions and reduced computational complexity: after feature extraction and selection on image-based crater texture features, craters and non-craters are classified to achieve fast crater detection. The classification accuracy can reach approximately 85% or more, which gives the method practical reference value for crater detection during actual Mars probe soft landings.

Description

Method for detecting craters during Mars probe soft landing based on a sparse boosting ensemble classifier
Technical field
The present invention relates to a method for detecting craters during the soft landing of a Mars probe, based on a sparse boosting ensemble (SparseBoost) classifier. In particular, it concerns the preprocessing of craters in Martian surface images, the extraction and selection of texture features, and the design of the SparseBoost classifier. It belongs to the fields of image processing and pattern recognition.
Background technology
The main purpose of obstacle detection research during Mars probe soft landing is to detect obstacles and thereby determine safe landing locations. Craters are the most common geological form on the surfaces of celestial bodies such as Mars; they are widely distributed, large in area, and visually distinctive in images, which makes them one of the main obstacles that must be detected during a probe's soft landing.
In images collected by a Mars probe, a crater appears as a pair of crescents, one bright and one dark. After candidate craters are determined and their texture features extracted, the feature data have high dimensionality. A suitable image feature selection algorithm and a supervised learning classification algorithm are therefore needed to separate craters from non-craters among the candidates and to determine the exact locations of craters in the image.
The SparseBoost classification algorithm is a sparse supervised learning classification algorithm. Its advantage over the existing AdaBoost and Boost algorithms is twofold. First, in each iteration AdaBoost uses the whole feature set, whereas SparseBoost selects only a single optimal feature. Second, when constructing weak classifiers, AdaBoost and Boost both choose decision trees and use the whole sample set in the computation, whereas SparseBoost builds weak classifiers with the sparse kernel density estimator RSDE-WL1, using only a small number of samples. SparseBoost is therefore sparse and computationally cheaper; applying it to crater detection during an actual Mars probe soft landing can effectively reduce computational complexity and achieve fast crater detection.
Summary of the invention
1. Object of the invention
The object of the present invention is to provide a method for detecting craters during Mars probe soft landing based on a sparse boosting ensemble classifier. The method first preprocesses the collected image to determine candidate craters; it then extracts image texture features from the candidates and performs feature selection; next, it combines an improved sparse kernel density estimation algorithm (RSDE-WL1) with a boosting ensemble learning algorithm (Boost) to build a sparse boosting ensemble classifier (the SparseBoost classifier); finally, the SparseBoost classifier is applied to crater detection, achieving fast detection together with high classification accuracy.
2. Technical scheme
To achieve the above object, the technical scheme of the method is described below following the steps in the overall framework of automatic crater detection (Fig. 1).
The present invention provides a method for detecting craters during Mars probe soft landing based on a sparse boosting ensemble classifier, comprising the following steps:
Step 1. Determine candidate craters
The key to determining candidate craters is to treat a crater in the image as a pair of crescents, one bright and one dark, as shown in Fig. 2. The shape of each pair of crescents can be determined from the image by a shape detection method based on mathematical morphology; crescents that can be matched form a candidate crater. The construction of candidate craters is shown in Fig. 3. The input is a panchromatic image containing many bright and dark feature regions. Bright and dark shapes are processed in parallel: bright shapes are processed using the original image, and dark shapes using its inverse. The goal of this step is to eliminate all noise features that cannot be marked as craters and to retain only the bright and dark features. The remaining bright and dark feature regions are matched with each other, and the matched regions are marked as candidate crater regions.
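The parallel bright/dark processing above can be sketched as follows; the threshold values and the centroid-distance matching rule are illustrative assumptions standing in for the patent's mathematical-morphology shape detection:

```python
import numpy as np

def bright_dark_masks(img, thresh=0.7):
    """Parallel bright/dark processing: bright shapes from the original image,
    dark shapes from the inverted image (threshold is an illustrative choice)."""
    bright = img > thresh
    dark = (1.0 - img) > thresh
    return bright, dark

def centroid(mask):
    """Centroid (row, col) of the pixels set in a boolean mask."""
    rc = np.argwhere(mask)
    return rc.mean(axis=0)

def is_candidate(bright, dark, max_dist=10.0):
    """A bright and a dark region whose centroids lie close together are
    matched as one candidate crater (a stand-in for the patent's
    morphology-based crescent matching)."""
    if not bright.any() or not dark.any():
        return False
    d = np.linalg.norm(centroid(bright) - centroid(dark))
    return bool(d <= max_dist)
```

Regions whose bright and dark halves cannot be paired this way are discarded as noise, mirroring the elimination step described above.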
Step 2. Extract texture features from candidate craters
To represent a single candidate crater in terms of rectangle features, we first extract a square image patch around the candidate. In the experiments, the mask region is twice the size of the candidate crater so that the surrounding crater rim is included. The unknown texture features of each candidate crater are encoded using 9 square masks of different sizes, as shown in Fig. 4. The attributes of the candidate craters contained in an image can thus be described by thousands of texture features. These features are not mutually independent; together they compensate for the limited texture information captured by any single square mask.
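The 2-, 3-, and 4-rectangle masks of Fig. 4 are Haar-like rectangle features; assuming the usual construction, each one can be evaluated in constant time with an integral image. A minimal sketch:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Sum of img[r:r+h, c:c+w] in O(1) using the integral image."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def two_rect_feature(ii, r, c, h, w):
    """A 2-rectangle (left-minus-right) Haar-like response over an h x 2w mask,
    i.e. the bright/dark contrast a crescent pair would produce."""
    return rect_sum(ii, r, c, h, w) - rect_sum(ii, r, c + w, h, w)
```

Sliding such masks at many positions and scales over the patch is what yields the thousands of (non-independent) features per candidate.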
Step 3. Perform feature selection on the extracted texture features
Because the initially extracted candidate crater texture features are high-dimensional, feature selection must be performed before the training and test samples are fed to the classifier. In the present invention, feature selection is carried out with the SparseBoost algorithm designed in Step 4. Its main difference from the AdaBoost algorithm is that the former selects only a single optimal feature in each iteration, whereas the latter typically uses the whole feature set. This greatly reduces the feature dimensionality of the training samples and effectively lowers the computational complexity of classifier training.
Step 4. Combine the Boost algorithm with the sparse kernel density estimation algorithm RSDE-WL1 to design a sparse boosting ensemble classifier, enabling fast image-based crater detection.
Based on the selected candidate crater texture features, and in order to distinguish craters from non-craters among them, the present invention designs a supervised learning classification algorithm, the SparseBoost algorithm. The method combines the Boost algorithm with an improved sparse kernel density estimation algorithm (RSDE-WL1): while selecting feature subsets, it constructs sparse kernel density estimators for the design of the corresponding base classifiers, and finally realizes the ensemble classifier as a weighted combination of base classifiers.
Given n candidate craters (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), where y_i = 0, 1 (i = 1, 2, ..., n) corresponds to non-crater (c_0) and crater (c_1) examples respectively, let n_0 and n_1 be the numbers of non-crater and crater examples, with n_0 + n_1 = n. Each candidate crater can be expressed as a feature vector x = (f_1, f_2, ..., f_m)^T, where each feature f_i (i = 1, ..., m) is produced by a square mask at a specific location on the candidate crater, and m is the total number of extracted features. A series of weak classifiers h_t(x) is built with the SparseBoost algorithm (see Algorithm 1 for the detailed procedure), and the weak classifiers are combined by weighted ensembling into the final strong classifier H(x):

H(x) = 1 if Σ_{t=1}^{T} α_t h_t(x) ≥ (1/2) Σ_{t=1}^{T} α_t, and H(x) = 0 otherwise,   (1)

where T is the number of iterations (T < n) and α_t is the weight of the learned weak classifier h_t(x).
Each iteration requires three core steps: weak classifier learning, optimal feature selection, and updating of the sample weights for the next iteration. In weak classifier learning, a penalty term is added to the reduced-set density estimation (RSDE) algorithm, yielding the improved sparse kernel density estimation algorithm RSDE-WL1. RSDE-WL1 estimates the probability density function for each class attribute; the input samples are then classified according to the Bayes decision rule, which gives the weak classifier.
(1) Weak classifier learning
In the t-th iteration, the weak classifier h_t(x) built for a selected single optimal feature f ∈ {f_1, f_2, ..., f_m} can be realized by constructing a Bayes classifier. Before discussing the two-class problem of crater versus non-crater, the Bayes classifier is first introduced. For a Bayes classification problem, one wishes to estimate the posterior density of the class given an input sample x. To obtain a probabilistic classifier based on density estimation, a probability density estimator p̂(x; β | c) is first trained for each class attribute c, where x is the feature vector representing a single candidate crater, β is the kernel weight vector, and c ∈ {c_0, c_1} is the class attribute of the candidate, with c_0 denoting the non-crater class and c_1 the crater class. Posterior probabilities are then computed with the Bayes rule (2), and the test sample is finally assigned to the class attribute with the maximum posterior probability.
For the two-class problem in the present invention, the two class-conditional probability densities p̂(x; β^0 | c_0) and p̂(x; β^1 | c_1) are estimated first; both are obtained from the subsequent sparse kernel density estimation RSDE-WL1 (computed according to formulas (5) and (6)). The corresponding posterior probabilities p̂(c_0 | x; β^0) and p̂(c_1 | x; β^1) are then computed directly from formula (2). The prior probabilities of the two classes are computed from the sample sizes of each class attribute: p(c_0) = n_0/n, p(c_1) = n_1/n, p(c_0) + p(c_1) = 1. Finally the input sample is classified with the Bayes decision rule (3): assign x to c_1 if p̂(c_1 | x; β^1) ≥ p̂(c_0 | x; β^0), and to c_0 otherwise.
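The posterior computation via Bayes rule (2) and the decision rule (3) for the two-class case can be sketched directly; the function names are illustrative:

```python
def posterior(px_c0, px_c1, n0, n1):
    """Bayes rule (2): class posteriors from the class-conditional densities
    and the priors p(c0)=n0/n, p(c1)=n1/n estimated from class counts."""
    p0, p1 = n0 / (n0 + n1), n1 / (n0 + n1)
    evidence = px_c0 * p0 + px_c1 * p1
    return px_c0 * p0 / evidence, px_c1 * p1 / evidence

def weak_classify(px_c0, px_c1, n0, n1):
    """Bayes decision rule (3): label 1 (crater) iff the crater posterior
    is at least the non-crater posterior."""
    post0, post1 = posterior(px_c0, px_c1, n0, n1)
    return 1 if post1 >= post0 else 0
```

The densities px_c0 and px_c1 would come from the sparse kernel density estimators of the two classes.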
The weak classifier h_t(x) is therefore

h_t(x) = 1 if p̂(c_1 | x; β^1) ≥ p̂(c_0 | x; β^0), and h_t(x) = 0 otherwise.   (4)

The sparse estimates of the two class-conditional probability densities p̂(x; β^0 | c_0) and p̂(x; β^1 | c_1) are, respectively:

p̂(x; β^0 | c_0) = Σ_{k=1}^{m_0} β_k^0 K_{h_0}(x, x_k),   (5)

p̂(x; β^1 | c_1) = Σ_{k=1}^{m_1} β_k^1 K_{h_1}(x, x_k),   (6)

where m_0 and m_1 are the numbers of non-zero kernel weights in the sparse kernel density estimate RSDE-WL1 under the two class attributes, n_0 and n_1 are the numbers of non-crater and crater examples (usually m_0 < n_0 and m_1 < n_1), β^0 and β^1 are the kernel weight vectors with kernel weight coefficients β_k (0 ≤ β_k ≤ 1), h_0 and h_1 are the kernel bandwidths (h_0 > 0, h_1 > 0), and K_{h_0}(·,·) and K_{h_1}(·,·) are the kernel functions.
The improved sparse kernel density estimation RSDE-WL1 is realized briefly as follows:
The reduced-set density estimation (RSDE) algorithm is introduced first. RSDE is based on the empirical integrated squared error (ISE) criterion: starting from the full kernel matrix Φ_N ∈ R^{N×N}, whose element in row i and column k is K_{i,k} = K_h(x_i, x_k) and where R^{N×N} denotes the space of N × N matrices, it drives as many kernel weights as possible to zero, yielding a sparse expression for the density p(x). Specifically, for the RSDE estimate with a Gaussian kernel, the kernel weight vector β can be obtained by minimizing the integrated squared error:

β = argmin_β ∫ (p̂(x; β) − p(x))² dx,   (7)

where the parameter β has the same meaning as in formula (2) and dx denotes the differential. Expanding the square, the term ∫ p(x)² dx can be ignored since it does not depend on β, and E_{p(x)}{·} denotes the expectation with respect to the density p(x). Substituting the kernel density expression p̂(x; β) = Σ_{k=1}^{N} β_k K_h(x, x_k) into (7) and applying a series of transformations yields the constrained non-negative quadratic optimization problem

min_β (1/2) β_N^T B β_N − p̂_Par^T β_N,   (8)

subject to β_k ≥ 0, 1 ≤ k ≤ N, and Σ_{k=1}^{N} β_k = 1, where the elements of the matrix B ∈ R^{N×N} are defined as B_{i,k} = G_{√2h}(x_i, x_k), G_h(·) is the Gaussian kernel function with bandwidth h, p̂_Par = [p̂_Par(x_1), ..., p̂_Par(x_N)]^T is the vector of Parzen window estimates at each sample point, and β_N = [β_1, β_2, ..., β_N]^T.
To reduce the clustering of the weight coefficients in certain regions and improve the sparsity of the density estimate, we introduce the weighted l_1 norm of the weight coefficients, Σ_{k=1}^{N} w_k β_k = W^T β_N, as a penalty term (also called a regularization term), where the weights w_k are defined through a diagonal matrix, W = [w_1, w_2, ..., w_N]^T, and β_N = [β_1, β_2, ..., β_N]^T. This gives the improved sparse kernel density estimation algorithm RSDE-WL1, whose new quadratic optimization problem after adding the penalty term is

min_β (1/2) β_N^T B β_N − p̂_Par^T β_N + λ W^T β_N,   (9)

subject to the same constraints β_k ≥ 0 (1 ≤ k ≤ N) and Σ_{k=1}^{N} β_k = 1.
Note that problem (9) is non-convex; a sparse solution for the weight coefficients can be obtained by solving it with a corresponding iterative algorithm.
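One possible iterative solver for problem (9) is projected gradient descent onto the simplex constraint with adaptively reweighted l_1 weights. The reweighting rule w_k = 1/(β_k + ε) used here is an assumption suggested by the parameter ε = 1/(0.3N) in the embodiment, not a rule the patent states explicitly:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {beta : beta_k >= 0, sum beta_k = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    theta = css[rho] / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def solve_rsde_wl1(B, p_par, lam=1e-3, eps=None, iters=500, lr=0.1):
    """Illustrative solver for problem (9): projected gradient descent with
    reweighted-l1 weights w_k = 1/(beta_k + eps) refreshed each step."""
    n = len(p_par)
    eps = eps if eps is not None else 1.0 / (0.3 * n)
    beta = np.full(n, 1.0 / n)
    for _ in range(iters):
        w = 1.0 / (beta + eps)                 # adaptive l1 weights (assumed)
        grad = B @ beta - p_par + lam * w      # gradient of (9) with w frozen
        beta = project_simplex(beta - lr * grad)
    return beta
```

Because w depends on β, the overall objective is non-convex, matching the remark above; freezing w within each step reduces every iteration to a convex projected-gradient update.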
(2) Optimal feature selection
The weighted classification error of the weak classifiers h_t(x) is computed,

ε_t = min_f Σ_{i=1}^{n} w_i |h(x_i, f) − y_i|,   (10)

and the single optimal feature f_t meeting the minimal error is selected to build the optimal weak classifier of the current iteration:

h_t(x) = h(x, f_t).   (11)
(3) Sample weight update for the next iteration
Like AdaBoost, the SparseBoost algorithm combines the current sample weights with information about the classification results of previously selected features, and this information helps select the current optimal feature. In the implementation, the weights of misclassified samples are increased and the weights of correctly classified samples are decreased, so that when the weighted error sum is computed, misclassified samples are more likely to be selected in the next iteration. The weight update formula is

w_i ← w_i γ_t^{1 − e_i},   (12)

where γ_t = ε_t/(1 − ε_t), and e_i = 0 if sample x_i is classified correctly, e_i = 1 otherwise.
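The update rule w_i ← w_i γ_t^{1−e_i} (with γ_t = ε_t/(1−ε_t), as set in the embodiment) can be sketched as follows; the final renormalization is an assumption carried over from standard Viola-Jones-style boosting rather than stated in the patent:

```python
import numpy as np

def update_weights(w, errors_mask, eps_t):
    """Weight update (12): correctly classified samples (errors_mask == 0)
    are multiplied by gamma_t = eps_t/(1-eps_t) < 1, which decreases their
    weight relative to misclassified ones; weights are then renormalized."""
    gamma = eps_t / (1.0 - eps_t)
    w = w * gamma ** (1 - errors_mask)   # e_i = 1 for misclassified samples
    return w / w.sum()
```

With ε_t < 0.5 we have γ_t < 1, so correct samples shrink and misclassified samples gain relative weight, exactly the behavior described above.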
3. Advantages and effects
Compared with conventional methods, the main advantages of the crater detection method of the present invention are: (1) the time complexity of the classifier algorithm is O(Tm(m_0 + m_1)), versus O(Tmn) for the Boost algorithm; since m_0 + m_1 < n_0 + n_1 = n, the method has a clear advantage in time complexity. (2) When building each weak classifier, only a single optimal feature is selected as the sample vector rather than the whole feature set, which reduces computational complexity and speeds up classification. (3) When classifying image-based craters and non-craters, the classification accuracy can reach approximately 85% or more, giving the method practical reference value for crater detection during actual Mars probe soft landings.
Brief description of the drawings
Fig. 1 Overall framework of automatic crater detection.
Fig. 2 (A) The physical principle of the crescent regions of a crater: a crater is composed of a bright and a dark area shaped like crescents.
Fig. 2 (B) The bright and dark areas of a real crater about 1 km in size.
Fig. 3 Flow chart for constructing candidate craters.
Fig. 4 (A) 2 kinds of 2-rectangle masks.
Fig. 4 (B) 2 kinds of 3-rectangle masks.
Fig. 4 (C) 5 kinds of 4-rectangle masks.
Fig. 4 (D) Example of a 2-rectangle mask applied to a crater.
Fig. 5 (A) Real crater image of the west region.
Fig. 5 (B) Real crater image of the middle region.
Fig. 5 (C) Real crater image of the east region.
Embodiment
The steps of the method are shown in Fig. 1. Its main idea is to fully exploit the advantages of the designed sparse boosting ensemble classifier, namely sparse solutions and reduced computational complexity: after feature extraction and selection on image-based crater texture features, craters and non-craters are classified to achieve fast crater detection. Fig. 2 (A)-(B) shows the physical principle of the bright and dark crescent regions of a crater together with a real example; Fig. 3 is the flow chart for constructing candidate craters.
A portion of the nadir panchromatic image h0905_0000 from the High Resolution Stereo Camera (HRSC), taken by the Mars Express spacecraft, is selected as the test set, as shown in Fig. 5 (A)-(C). The selected image has a resolution of 12.5 m/pixel and a size of 3000 × 4500 pixels (37,500 m × 56,250 m). Domain experts hand-labeled about 3500 craters on this image as ground truth for comparison with the automatic detection results. The image is a considerable challenge for automatic crater detection algorithms because it contains spatially varying landforms and has rather poor contrast. It is divided into three parts, denoted the west region, middle region, and east region. The west and east regions have similar landforms, but the west region contains more craters than the east; the middle region has markedly different surface geographic features from the other two.
Steps 1 and 2: Determine candidate craters and extract their texture features
Using the candidate determination method, 13,075 candidate craters were initially determined from the panchromatic image of Fig. 5. Using the 9 kinds of square masks of Fig. 4 (A)-(D), 1,089 image texture features were extracted from the candidate crater images. In the experiments, the training set consists of 204 true craters and 292 non-crater examples randomly selected from the northern half of the candidate craters in the east region, and the test sets from the west, middle, and east regions contain 2,935, 1,181, and 1,223 candidate craters, respectively.
Step 3: Perform feature selection on the extracted texture features
To keep the number of selected features as small as possible while keeping the classification accuracy as high as possible, the iteration count T of the SparseBoost algorithm is set to 2, 5, 10, 15, 20, 25, 30, 50, 100, 150, and 200 in turn, and the classification results of the corresponding feature subsets are tested on the west, middle, and east regions. Because craters and non-craters are unevenly distributed in the candidate data, successfully detecting true craters matters more than detecting non-craters. The present invention therefore uses accuracy (Accuracy = (TP + TN)/(TP + TN + FP + FN)), recall (Recall = TP/(TP + FN)), precision (Precision = TP/(TP + FP)), and the F-measure (F-measure = 2/(1/Recall + 1/Precision)) as evaluation indices, where TP is the number of correctly classified true craters, TN the number of correctly classified non-craters, FP the number of non-craters misclassified as craters, and FN the number of true craters misclassified as non-craters.
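The four evaluation indices can be computed directly from the confusion counts:

```python
def metrics(tp, tn, fp, fn):
    """Accuracy, recall, precision, and F-measure as defined above."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f_measure = 2.0 / (1.0 / recall + 1.0 / precision)
    return accuracy, recall, precision, f_measure
```

The F-measure here is the harmonic mean of recall and precision, which is why it is used as the balancing index under the class imbalance noted above.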
In setting the iteration count T, 2 features are selected first mainly because a candidate crater is composed of a bright and a dark area, so the two features representing bright and dark are the most important; the maximum of 200 features is chosen for comparison with experimental results in other literature, and the intermediate feature counts are sampled in between. During feature selection, following the idea of the Boost algorithm, the single optimal feature f_t meeting the minimal error is chosen in each of the T iterations, yielding T features (T < n) for constructing the sample set.
Step 4: Combine the Boost algorithm with the sparse kernel density estimation algorithm RSDE-WL1 to design the sparse boosting ensemble classifier, enabling fast image-based crater detection.
The design of the classifier comprises three core steps: weak classifier learning, optimal feature selection, and sample weight update for the next iteration. Initialization comes first: the initial weight w_i of an input sample is w_i = 1/(2n_0) if y_i = 0, and w_i = 1/(2n_1) if y_i = 1, where n_0 and n_1 are the numbers of non-crater and crater examples and n_0 + n_1 = n. For weak classifier learning, the parameters are set to λ = 0.001, l_max = 8, ε = 1/(0.3N); the density estimation expression of the improved sparse kernel density estimator RSDE-WL1 is obtained, and a weak classifier h_t(x) is then derived in each iteration according to the Bayes decision criterion. For optimal feature selection, the classification error ε_t of each weak classifier is computed, the feature with minimal classification error is chosen as the optimal feature, and γ_t = ε_t/(1 − ε_t) is set. Finally, the sample weights are updated as w_i ← w_i γ_t^{1 − e_i} for the next iteration, yielding T weak classifiers. These T weak classifiers are combined by weighting into the final strong classifier, with the weight of each weak classifier set to α_t = ln(1/γ_t).
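The three-step iteration loop can be sketched end to end. For brevity, threshold decision stumps stand in here for the patent's RSDE-WL1 Bayes weak classifiers, while the single-feature selection, γ_t = ε_t/(1−ε_t), the weight update, and α_t = ln(1/γ_t) follow the embodiment:

```python
import numpy as np

def stump_error(f_vals, y, w, thresh):
    """Weighted error of the stump h(x) = 1[f > thresh] on one feature."""
    pred = (f_vals > thresh).astype(int)
    return np.sum(w * (pred != y)), pred

def sparseboost_train(X, y, T):
    """Boosting loop of Step 4 with one feature selected per iteration.
    Decision stumps replace the patent's RSDE-WL1 Bayes weak learners."""
    n, m = X.shape
    n1 = int(y.sum()); n0 = n - n1
    w = np.where(y == 1, 1.0 / (2 * n1), 1.0 / (2 * n0))   # initialization
    learners = []
    for _ in range(T):
        best = None
        for j in range(m):                 # select the single best feature
            for thresh in np.unique(X[:, j]):
                err, pred = stump_error(X[:, j], y, w, thresh)
                if best is None or err < best[0]:
                    best = (err, j, thresh, pred)
        eps_t, j, thresh, pred = best
        gamma = max(eps_t, 1e-10) / (1.0 - eps_t)          # gamma_t
        w = w * gamma ** (pred == y)       # e_i = 0 -> multiply by gamma
        w = w / w.sum()
        learners.append((np.log(1.0 / gamma), j, thresh))  # alpha_t
    return learners

def sparseboost_predict(learners, x):
    """Strong classifier (1): weighted vote of the T weak classifiers."""
    s = sum(a for a, j, t in learners if x[j] > t)
    return int(s >= 0.5 * sum(a for a, _, _ in learners))
```

The floor of 1e-10 on ε_t is an illustrative guard against a zero-error stump; the patent does not specify this edge case.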
Tables 1-3 show, for the west, middle, and east regions respectively, the number of selected features for different iteration counts, together with the classification accuracy, recall, precision, and F-measure. As can be seen from Tables 1-3, on the three regions the classification accuracy peaks when the numbers of selected features are 10, 20, and 20 respectively, reaching 0.790, 0.854, and 0.874, while the F-measure also reaches its maxima of 0.790, 0.796, and 0.818. Therefore, with the training set fixed, the optimal numbers of features chosen when classifying the test data of the three regions are 10, 20, and 20, respectively.
Table 1. West region
Iterations T  Selected features  Accuracy  Recall  Precision  F-measure
2 2 0.783 0.840 0.737 0.785
5 5 0.788 0.814 0.765 0.789
10 10 0.790 0.796 0.767 0.790
15 15 0.788 0.779 0.774 0.776
20 20 0.775 0.724 0.783 0.752
25 25 0.771 0.714 0.781 0.746
30 30 0.763 0.691 0.782 0.734
50 50 0.740 0.625 0.781 0.694
100 100 0.688 0.502 0.755 0.603
150 150 0.658 0.423 0.737 0.541
200 200 0.638 0.378 0.721 0.496
Table 2. Middle region
Iterations T  Selected features  Accuracy  Recall  Precision  F-measure
2 2 0.850 0.761 0.827 0.751
5 5 0.848 0.743 0.836 0.760
10 10 0.843 0.700 0.854 0.769
15 15 0.844 0.689 0.869 0.768
20 20 0.854 0.761 0.834 0.796
25 25 0.841 0.709 0.842 0.770
30 30 0.837 0.707 0.832 0.764
50 50 0.831 0.661 0.854 0.746
100 100 0.813 0.587 0.873 0.702
150 150 0.782 0.497 0.866 0.631
200 200 0.789 0.512 0.873 0.646
Table 3. East region
Iterations T  Selected features  Accuracy  Recall  Precision  F-measure
2 2 0.861 0.755 0.838 0.801
5 5 0.867 0.746 0.858 0.811
10 10 0.865 0.738 0.883 0.804
15 15 0.868 0.725 0.905 0.805
20 20 0.874 0.753 0.894 0.818
25 25 0.868 0.716 0.911 0.802
30 30 0.867 0.716 0.909 0.801
50 50 0.863 0.699 0.914 0.792
100 100 0.860 0.666 0.941 0.780
150 150 0.841 0.611 0.946 0.743
200 200 0.842 0.611 0.950 0.744
Four supervised classification algorithms used for crater detection, namely Boost, AdaBoost, SVM, and J48, are chosen for comparison with the proposed SparseBoost algorithm. The Boost algorithm uses decision trees as base classifiers and fuses boosting ensemble learning with feature selection for classification; the other three algorithms perform no feature selection on the original data set in the experiments. Table 4 lists the classification accuracy (Accuracy), recall (Recall), precision (Precision), and F-measure of the five algorithms on the west, middle, and east regions.
Table 4. Performance comparison between the crater detection algorithm of the present invention and four other crater detection algorithms
As Table 4 shows, on the west and east regions the classification accuracy and F-measure of the algorithms with feature selection (SparseBoost and Boost) are all clearly higher than those of the algorithms without feature selection (AdaBoost, SVM, and J48), and SparseBoost outperforms Boost. In the middle region, the algorithms with feature selection differ little in accuracy and F-measure from those without (e.g. SparseBoost, AdaBoost, and J48), or are even worse (e.g. Boost). This is probably because the training samples were taken from the east region, and the landforms of the middle region differ from those of the east, causing some important feature information to be lost during feature selection; as Fig. 5 (B) shows, the geographic form of the middle region differs markedly from the west and east regions. Overall, the proposed SparseBoost classification algorithm achieves good classification performance in crater detection with the lowest computational complexity.
Finally, it should be noted that the above embodiment merely illustrates, and does not limit, the technical scheme of the present invention. Although the present invention has been described in detail with reference to the above embodiment, those skilled in the art will understand that the invention may still be modified or equivalently substituted, and any modification or partial replacement that does not depart from the spirit and scope of the present invention shall be covered by the scope of the claims of the present invention.

Claims (1)

1. A method for detecting craters during Mars probe soft landing based on a sparse boosting ensemble classifier, characterized in that the method comprises the following steps:
Step 1. Determine candidate craters
The key to determining candidate craters is to regard a crater in the image as a pair of crescents with a bright and a dark area; the shape of each pair of crescents is determined from the image by a shape detection method based on mathematical morphology, and crescents that can be matched form a candidate crater; in constructing candidate craters, a panchromatic image containing bright and dark feature regions is first input, and the bright and dark areas are processed in parallel: bright areas are processed using the original image and dark areas using the inverted image; the goal of this step is to eliminate all noise features that cannot be marked as craters and to retain only the bright and dark features; the remaining bright and dark feature regions are matched with each other, and the matched regions are marked as candidate crater regions;
Step 2. Extract texture features from candidate craters
To express a candidate crater in terms of rectangle features, a square image patch around each candidate crater is extracted first; in the experiments, a mask of twice the candidate crater size is used so as to include the surrounding crater rim, and the unknown texture features of each candidate crater are encoded using 9 square masks of different sizes; the attributes of the candidate craters contained in an image are thus described by texture features, which are not mutually independent;
Step 3. Perform feature selection on the extracted texture features
Because the initially extracted candidate crater texture features are high-dimensional, feature selection must be performed before the training and test samples are fed to the classifier; feature selection is carried out with the SparseBoost algorithm designed in Step 4, whose main difference from the AdaBoost algorithm is that the former selects only a single optimal feature in each iteration whereas the latter uses the whole feature set;
Step 4. Combine the Boost algorithm with the sparse kernel density estimation algorithm RSDE-WL1 to design a sparse boosting ensemble classifier, enabling fast image-based crater detection;
Based on the selected candidate crater texture features, and in order to distinguish craters from non-craters among them, a supervised learning classification algorithm, the SparseBoost algorithm, is designed; the method combines the Boost algorithm with an improved sparse kernel density estimation algorithm, RSDE-WL1: while selecting feature subsets, it constructs a number of sparse kernel density estimators for the design of the corresponding base classifiers, and finally realizes the ensemble classifier as a weighted combination of base classifiers;
Given n candidate craters (x_1, y_1), (x_2, y_2), ..., (x_i, y_i), ..., (x_n, y_n), where y_i = 0, 1 (i = 1, 2, ..., n) corresponds to non-crater and crater examples respectively, let n_0 and n_1 be the numbers of non-crater and crater examples, with n_0 + n_1 = n; each candidate crater is expressed as a feature vector x = (f_1, f_2, ..., f_i, ..., f_m)^T, where each feature f_i (i = 1, ..., m) is produced by a square mask at a specific location on the candidate crater; a series of weak classifiers h_t(x) is produced with the SparseBoost algorithm, and the weak classifiers are combined by the weighted boosting method into one strong ensemble classifier; before iteration starts, the weights of the n candidate craters are first initialized: for the i-th candidate crater, w_i = 1/(2n_0) if y_i = 0, and w_i = 1/(2n_1) if y_i = 1;
In each iteration, three core steps must be carried out: weak classifier learning, optimal feature selection, and updating the sample weights for the next iteration. In the weak classifier learning step, a penalty term is added to the reduced set density estimation (RSDE) algorithm, yielding the improved sparse kernel density estimation algorithm RSDE-WL1; RSDE-WL1 is used to estimate the density function of each category attribute, and the input samples are classified according to the Bayes decision rule, so as to obtain a weak classifier.
(1) Weak classifier learning
In the $t$-th iteration, the weak classifier $h_t(x)$ for a selected single optimal feature $f\in\{f_1,f_2,\dots,f_m\}$ is built by constructing a Bayes classifier. For a Bayes classification problem in general, one wishes to estimate the posterior probability density of each class given the input sample $x$. To obtain a probabilistic classifier based on density estimation, a density estimator $\hat{p}(x;\beta\mid c)$ is first trained for each category attribute $c$; the posterior probability is then computed with the Bayes rule (1), and the test sample is finally assigned to the category attribute with the maximum posterior probability:
$$\hat{p}(c\mid x;\beta)=\frac{\hat{p}(x;\beta\mid c)\,p(c)}{\sum_{c'}\hat{p}(x;\beta\mid c')\,p(c')}\quad (1)$$
For the two-class problem, the two class-conditional probability densities $\hat{p}(x;\beta_{n_0},h_0\mid c_0)$ and $\hat{p}(x;\beta_{n_1},h_1\mid c_1)$ are estimated first; both densities are obtained by the sparse kernel density estimator RSDE-WL1. According to formula (1), the corresponding posterior probabilities $\hat{p}(c_0\mid x)$ and $\hat{p}(c_1\mid x)$ are computed. The prior probabilities $p(c_0)$ and $p(c_1)$, with $p(c_0)+p(c_1)=1$, are computed from the sample sizes of the two category attributes. Finally the input sample is classified using the Bayes decision rule (2):
$$\left.\begin{array}{ll}\text{if }\hat{p}(x;\beta_{n_1},h_1\mid c_1)\,p(c_1)\ge\hat{p}(x;\beta_{n_0},h_0\mid c_0)\,p(c_0), & x\in c_1\\ \text{else}, & x\in c_0\end{array}\right\}\quad (2)$$
Therefore, the expression of the weak classifier $h_t(x)$ is
$$h_t(x)=\begin{cases}1 & \text{if }\hat{p}(x;\beta_{n_1},h_1\mid c_1)\,p(c_1)\ge\hat{p}(x;\beta_{n_0},h_0\mid c_0)\,p(c_0)\\ 0 & \text{otherwise}\end{cases}\quad (3)$$
where the sparse expressions of $\hat{p}(x;\beta_{n_0},h_0\mid c_0)$ and $\hat{p}(x;\beta_{n_1},h_1\mid c_1)$ are, respectively,
$$\hat{p}(x;\beta_{n_0},h_0\mid c_0)=\sum_{k=1}^{m_0}\beta_k K_{h_0}(x,x_k)\quad (4)$$
$$\hat{p}(x;\beta_{n_1},h_1\mid c_1)=\sum_{k=1}^{m_1}\beta_k K_{h_1}(x,x_k)\quad (5)$$
Here $m_0$ and $m_1$ are the numbers of non-zero kernel weights under the two classes obtained by the sparse kernel density estimator RSDE-WL1; usually $m_0\ll n_0$ and $m_1\ll n_1$.
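To make equations (3)-(5) concrete, the following Python sketch evaluates the two sparse Gaussian kernel density estimates and applies the Bayes decision rule. The kernel centres and sparse weight vectors here are placeholders standing in for the non-zero components that RSDE-WL1 would return; the Gaussian kernel choice matches the text's "RSDE with a Gaussian kernel".

```python
import numpy as np

def gauss_kernel(x, centers, h):
    """Gaussian kernel K_h(x, x_k) evaluated at every center x_k."""
    d = x - centers                      # (m, dim) differences
    dim = centers.shape[1]
    norm = (2.0 * np.pi * h * h) ** (dim / 2.0)
    return np.exp(-np.sum(d * d, axis=1) / (2.0 * h * h)) / norm

def sparse_kde(x, beta, centers, h):
    """Sparse density estimate, equations (4)/(5): sum_k beta_k K_h(x, x_k)."""
    return float(beta @ gauss_kernel(x, centers, h))

def weak_classify(x, kde0, kde1, p0, p1):
    """Bayes decision rule, equations (2)/(3): 1 for crater, 0 otherwise.
    kde0/kde1 are (beta, centers, bandwidth) triples for the two classes."""
    beta0, c0, h0 = kde0
    beta1, c1, h1 = kde1
    like1 = sparse_kde(x, beta1, c1, h1) * p1
    like0 = sparse_kde(x, beta0, c0, h0) * p0
    return 1 if like1 >= like0 else 0
```

A sample well inside the crater-class density mass is assigned label 1; one inside the non-crater mass is assigned label 0.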
The simplified implementation of the sparse kernel density estimator RSDE-WL1 proceeds as follows:
First, the reduced set density estimation (RSDE) algorithm is introduced. RSDE is based on the empirical mean integrated squared error (ISE) criterion and works with the full regression matrix, driving as many kernel weights as possible to zero so as to obtain a sparse expression of the density $p(x)$. Specifically, for an RSDE estimate with a Gaussian kernel, the kernel weight vector is obtained by minimizing the integrated squared error, as follows:

$$J(\beta_N)=\int\hat{p}^2(x;\beta_N)\,dx-2E_{p(x)}\{\hat{p}(x;\beta_N)\}+\int p^2(x)\,dx\quad (6)$$
The term $\int p^2(x)\,dx$ is not considered because it is unrelated to the parameter $\beta$; $E_{p(x)}\{\cdot\}$ denotes the expectation with respect to the density $p(x)$. Substituting the kernel density estimation expression $\hat{p}(x;\beta_N)=\sum_{k=1}^{N}\beta_k K_h(x,x_k)$ into formula (6) and applying a series of transformations yields the equivalent constrained non-negative quadratic optimization problem
$$\beta=\arg\min_\beta\left\{\tfrac{1}{2}\beta_N^T C_N\beta_N-\hat{P}_N^T\beta_N\right\}\quad (7)$$
subject to the constraints $\beta_k\ge 0$, $1\le k\le N$, and $\sum_{k=1}^{N}\beta_k=1$, where $C_N$ is the kernel cross-correlation matrix produced by the transformation and $\hat{P}_N$ is the vector of Parzen window estimates at each sample point.
To reduce the concentration of the weight coefficients in certain regions and improve the sparsity of the density estimate, a weighted $l_1$ norm of the weight coefficients, $\sum_k w_k|\beta_k|=\lambda w^T\beta_N$ under the non-negativity constraint (also called a regularization term), is introduced as the penalty term, giving the improved sparse kernel density estimation algorithm RSDE-WL1. Here the penalty weights $w_k$ can be viewed as the diagonal of a weighting matrix, with $w=[w_1,w_2,\dots,w_N]^T$ and $\beta_N=[\beta_1,\beta_2,\dots,\beta_N]^T$; the new quadratic optimization problem after adding the penalty term is
$$\beta=\arg\min_\beta\left\{\tfrac{1}{2}\beta_N^T C_N\beta_N-\hat{P}_N^T\beta_N+\lambda w^T\beta_N\right\}=\arg\min_\beta\left\{\tfrac{1}{2}\beta_N^T C_N\beta_N-\left(\hat{P}_N-\lambda w\right)^T\beta_N\right\}\quad (8)$$
Note that problem (8) is non-convex; an iterative algorithm is used to solve it and obtain a sparse solution for the weight coefficients.
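The patent does not spell out the iterative algorithm, so the following projected-gradient sketch is a hypothetical, simplified stand-in that illustrates the structure of solving problem (8): a gradient step on the quadratic objective, clipping to non-negativity, and renormalization to the sum-to-one constraint (a simple surrogate for exact simplex projection).

```python
import numpy as np

def rsde_wl1(C, P_hat, w, lam=0.1, lr=0.01, iters=2000, tol=1e-8):
    """Projected-gradient sketch for problem (8):
       min 0.5 * b^T C b - (P_hat - lam*w)^T b
       s.t. b >= 0 and sum(b) = 1."""
    N = len(P_hat)
    b = np.full(N, 1.0 / N)              # start at the uniform weight vector
    target = P_hat - lam * w             # linear term of the penalized objective
    for _ in range(iters):
        grad = C @ b - target            # gradient of the quadratic objective
        b_new = np.clip(b - lr * grad, 0.0, None)   # step + non-negativity
        s = b_new.sum()
        if s > 0:
            b_new /= s                   # renormalize onto sum(b) = 1
        if np.max(np.abs(b_new - b)) < tol:
            b = b_new
            break
        b = b_new
    return b
```

Components whose kernels contribute little to the Parzen target (or that carry a large penalty weight $w_k$) are driven toward zero, which is the sparsification effect the text describes.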
(2) optimal feature selection
Compute the weighted error sum of the weak classifier $h_t(x)$ for each feature and select the single optimal feature $f^t$ with minimal error to build the optimal weak classifier of the current iteration:
$$f^t=\arg\min_f\sum_{i=1}^{n}w_i^t\,|h(x_i,f)-y_i|\quad (9)$$
$$h_t(x)=h(x,f^t)\quad (10)$$
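A minimal sketch of the selection loop in equations (9)-(10); `train_weak` is a hypothetical callback (not named in the patent) that fits a one-feature weak classifier on a feature column and returns its predictions:

```python
import numpy as np

def select_best_feature(X, y, weights, train_weak):
    """Optimal feature selection, equations (9)-(10): try a weak classifier
    on every single feature and keep the one with minimal weighted error."""
    best_f, best_err, best_pred = None, np.inf, None
    for f in range(X.shape[1]):
        pred = train_weak(X[:, f], y)               # weak classifier on feature f
        err = np.sum(weights * np.abs(pred - y))    # weighted error sum, eq. (9)
        if err < best_err:
            best_f, best_err, best_pred = f, err, pred
    return best_f, best_err, best_pred
```

Because only one feature is examined per weak classifier, this is the step that distinguishes SparseBoost's per-iteration single-feature selection from AdaBoost's use of the whole feature set.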
(3) Sample weight update for the next iteration
As in the AdaBoost algorithm, the SparseBoost weight update combines the current sample weights with the classification results of the previously selected features. During the update, the weights of misclassified samples are increased and the weights of correctly classified samples are decreased, so that when the weighted error sum is computed, misclassified samples exert more influence on the selection in the next iteration. The weight update expression is
$$w_i^{t+1}=w_i^t\,\gamma_t^{\,1-|h_t(x_i)-y_i|}\quad (11)$$
Set

$$\gamma_t=\frac{\varepsilon_t}{1-\varepsilon_t}\quad (12)$$
where $\varepsilon_t$ is the classification error of the weak classifier $h_t(x)$:
$$\varepsilon_t=\sum_{i=1}^{n}w_i^t\,|h_t(x_i)-y_i|\quad (13)$$
Meanwhile to make weight wt+1Meet probability distribution, it is necessary to be standardized to the weight after renewal, standardization formula is as follows:
After $T$ iterations, the final output of the SparseBoost algorithm is obtained: the strong classifier $h(x)$ formed by the weighted combination of the weak classifiers:
$$h(x)=\begin{cases}1 & \text{if }\sum_{t=1}^{T}\alpha_t h_t(x)\ge\mu\sum_{t=1}^{T}\alpha_t\\ 0 & \text{otherwise}\end{cases}\quad (15)$$
where $\alpha_t=\ln(1/\gamma_t)$ and $\mu$ is a given threshold, set to $\mu=0.5$.
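Finally, the weighted vote of equation (15), with the stated threshold μ = 0.5, can be sketched as:

```python
def strong_classify(x, weak_clfs, alphas, mu=0.5):
    """Strong ensemble decision, equation (15): a weighted vote of the
    weak classifiers against the threshold mu * sum(alpha_t)."""
    votes = sum(a * h(x) for h, a in zip(weak_clfs, alphas))
    return 1 if votes >= mu * sum(alphas) else 0
```

With μ = 0.5 this is a weighted majority vote: the sample is labelled a crater when the alpha-weighted mass of weak classifiers voting 1 is at least half the total.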
CN201510089099.1A 2014-07-30 2015-02-27 The detection method of crater during Mars probes soft landing based on sparse lifting integrated classifier Active CN104700115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510089099.1A CN104700115B (en) 2014-07-30 2015-02-27 The detection method of crater during Mars probes soft landing based on sparse lifting integrated classifier

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201410371237 2014-07-30
CN2014103712370 2014-07-30
CN201510089099.1A CN104700115B (en) 2014-07-30 2015-02-27 The detection method of crater during Mars probes soft landing based on sparse lifting integrated classifier

Publications (2)

Publication Number Publication Date
CN104700115A CN104700115A (en) 2015-06-10
CN104700115B true CN104700115B (en) 2017-12-05

Family

ID=53347212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510089099.1A Active CN104700115B (en) 2014-07-30 2015-02-27 The detection method of crater during Mars probes soft landing based on sparse lifting integrated classifier

Country Status (1)

Country Link
CN (1) CN104700115B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778861A (en) * 2016-12-12 2017-05-31 Qilu University of Technology A screening method for key features
CN107945274B (en) * 2017-12-26 2021-04-20 苏州蜗牛数字科技股份有限公司 Volly noise-based annular mountain terrain generation method and device
CN108734219B (en) * 2018-05-23 2022-02-01 北京航空航天大学 End-to-end collision pit detection and identification method based on full convolution neural network structure
CN110443176B (en) * 2019-07-29 2022-03-04 中国科学院国家空间科学中心 Dark and weak celestial body correlation detection method and system based on statistical feature space

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102129565A (en) * 2011-03-01 2011-07-20 北京航空航天大学 Object detection method based on feature redundancy elimination AdaBoost classifier
CN102855486A (en) * 2012-08-20 2013-01-02 北京理工大学 Generalized image target detection method
CN102944226A (en) * 2012-12-03 2013-02-27 哈尔滨工业大学 Meteor crater detecting method based on bright and dark area pairing
CN103093463A (en) * 2013-01-15 2013-05-08 南京航空航天大学 Meteor crater detecting method based on gray level image


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Crater Detection by a Boosting Approach; R. Martins et al.; IEEE Geoscience and Remote Sensing Letters; 20090131; Vol. 6 (No. 1); pp. 127-131 *
QUANTITATIVE ASSESSMENT OF AUTOMATED CRATER DETECTION ON MARS; Jung Rack Kim et al.; (Proceedings) ISPRS Congress. ISPRS: Istanbul, Turkey; 20040723; pp. 1-6 *
Automatic detection of craters on the Martian surface using digital elevation models; Zhang Tengyu et al.; Journal of Deep Space Exploration; 20140630; Vol. 1 (No. 2); pp. 123-127 *
Crater region detection based on Census transform and Boosting; Ding Meng et al.; Journal of Nanjing University of Aeronautics & Astronautics; 20091031; Vol. 41 (No. 5); pp. 682-687 *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant