CN104463169B - Fast extraction method for ship target regions of interest based on overlap processing - Google Patents

Fast extraction method for ship target regions of interest based on overlap processing

Info

Publication number
CN104463169B
CN104463169B (application CN201410705774.4A)
Authority
CN
China
Prior art keywords
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410705774.4A
Other languages
Chinese (zh)
Other versions
CN104463169A (en)
Inventor
陈瑞
孙文方
张守娟
李晓博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Institute of Space Radio Technology
Original Assignee
Xian Institute of Space Radio Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Institute of Space Radio Technology filed Critical Xian Institute of Space Radio Technology
Priority to CN201410705774.4A priority Critical patent/CN104463169B/en
Publication of CN104463169A publication Critical patent/CN104463169A/en
Application granted granted Critical
Publication of CN104463169B publication Critical patent/CN104463169B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Medicines Containing Antibodies Or Antigens For Use As Internal Diagnostic Agents (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention proposes a fast extraction method for ship target regions of interest (ROI) based on overlap processing. The remote-sensing image is first partitioned into overlapping blocks, and the pixel mean, mean estimate, pixel standard deviation and standard-deviation estimate of each sub-block are computed. The mean estimates are then used for preliminary screening of ship-target ROIs. Finally, the 3σ rule is applied to the standard-deviation estimates of the preliminarily screened regions: if the standard-deviation estimate lies outside the 3σ confidence interval, the region is judged to be a ship-target ROI; otherwise it is a sea area. The method is simple, discriminates with high accuracy, and is reasonably adaptable; it can be applied to maritime traffic monitoring, ship search and rescue, fishery management, maritime situational awareness and related fields.

Description

Fast extraction method for ship target regions of interest based on overlap processing
Technical field
The invention belongs to the field of space remote sensing and relates to a fast ship-target ROI extraction method based on overlap processing.
Background art
Fast extraction of ship-target ROIs (regions of interest) starts from the usability of remote-sensing data: the ship-target regions of interest are detected directly and invalid data are discarded, improving data transmission efficiency. With the development of remote-sensing imaging technology, quickly and accurately screening ship-target ROIs from visible-band remote-sensing images with complex backgrounds, unstable target signatures and huge data volumes is an extremely challenging task.
Existing ship-target ROI extraction methods for remote-sensing images are numerous and fall mainly into four categories: (1) methods based on grey-level statistical features; (2) methods based on edge information; (3) methods based on fractal models and fuzzy theory; and (4) methods based on visual-perception mechanisms. Methods for extracting ship-target ROIs from visible images are usually run on the ground. Compared with on-orbit application on a satellite, ground-based detection equipment is not constrained by the power consumption, weight and volume limits of a space platform, so there is no strict requirement on the complexity of the detection method. In addition, most existing ship-target detection methods for satellite remote-sensing images are designed for a specific satellite; they are highly specialized and lack generality.
Summary of the invention
The technical problem solved by the invention is to overcome the deficiencies of the prior art by providing a fast ship-target ROI extraction method based on overlap processing, thereby solving the problem of fast on-orbit detection of ship-target ROIs in satellite remote-sensing images.
The technical solution of the invention is a fast extraction method for ship target regions of interest based on overlap processing, with the following steps:
1) According to the image resolution, the original image of size (M, N) is divided into overlapping sub-image blocks in four ways, each sub-image block having size (B_M, B_N), where M and N are divisible by B_M and B_N respectively, so that a segmentation contains M/B_M × N/B_N blocks; the starting points of the top-left sub-blocks of the four partitioning schemes are (0, 0), (B_M/2, 0), (0, B_N/2) and (B_M/2, B_N/2), respectively;
2) For each (B_M/2, B_N/2) sub-image block, compute the pixel mean, the mean estimate, the pixel standard deviation and the standard-deviation estimate from the four segmentations; the mean estimate of each sub-image block is the arithmetic mean of the pixel means of the sub-image blocks in which it appears across the four partitioning schemes, and its standard-deviation estimate is the arithmetic mean of the corresponding pixel standard deviations;
3) Compare the mean estimate obtained in step 2) with the preset thresholds T1 and T2; if the decision condition T1 ≤ mean estimate ≤ T2 is satisfied, the region is preliminarily judged to be a ship-target ROI, otherwise it is a sea area;
4) Apply a threshold test to the standard-deviation estimate of each region preliminarily judged in step 3) to be a ship-target ROI; if the decision condition standard-deviation estimate ≥ T3 is satisfied, where T3 is a preset threshold, the region is confirmed as a ship-target ROI, otherwise it is a sea area.
Compared with the prior art, the advantages of the present invention are as follows. The proposed fast ship-target ROI extraction method based on overlap processing is simple and meets the needs of on-orbit application: it uses only simple mean and standard-deviation features, and the computationally intensive threshold-parameter selection is completed on the ground, so the whole method is easy to implement in hardware. Its detection accuracy is high: classification results on satellite remote-sensing images show a detection rate of more than 90% for ship-target ROIs. It is also adaptable: experiments on imagery from multiple satellites show that the method detects ship-target ROIs well and adapts well across sensors.
Brief description of the drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2(a)-2(d) show the four sub-block partitioning schemes;
Fig. 3 shows the neighborhood relationship of the sub-image blocks.
Embodiment
The present invention proposes a low-complexity fast ship-target ROI extraction method, whose specific implementation is as follows:
First, the original image is partitioned with overlap. An image of size (M, N) is divided into sub-image blocks of size (B_M, B_N) as shown in Fig. 2(a), with M and N divisible by B_M and B_N respectively, so the segmentation contains M/B_M × N/B_N blocks and the top-left sub-block starts at (0, 0). The image is partitioned again as shown in Fig. 2(b), with the top-left sub-block starting at (B_M/2, 0); as shown in Fig. 2(c), with the top-left sub-block starting at (0, B_N/2); and as shown in Fig. 2(d), with the top-left sub-block starting at (B_M/2, B_N/2). Together these four partitions form an overlapping segmentation of the image: the four sub-blocks of size (B_M/2, B_N/2) at the four corners have no overlap between partitions, the (B_M/2, B_N/2) sub-blocks along the four edges each appear in two partitions, and the (B_M/2, B_N/2) sub-blocks in the middle appear in all four partitions. Engineering experience shows that images with a resolution of about 2 m to 5 m are well suited to ship-target ROI extraction; in practice the actual resolution is converted to 2 m and the blocking is chosen according to the hull size to be detected.
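To make the overlapping layout concrete, the following Python sketch (NumPy only; the image and block sizes are illustrative assumptions, not values from the patent) enumerates the four segmentation offsets and counts how many segmentations cover each (B_M/2, B_N/2) cell, reproducing the corner/edge/interior coverage pattern of 1, 2 and 4 described above.

```python
import numpy as np

# Illustrative sizes (assumptions): image (M, N), block (BM, BN), with M % BM == N % BN == 0.
M, N = 512, 768
BM, BN = 64, 64

# Starting points of the top-left sub-block for the four segmentations.
offsets = [(0, 0), (BM // 2, 0), (0, BN // 2), (BM // 2, BN // 2)]

# Grid of (BM/2, BN/2) cells covering the whole image.
rows, cols = 2 * M // BM, 2 * N // BN
coverage = np.zeros((rows, cols), dtype=int)

for dr, dc in offsets:
    kr = (M - dr) // BM                        # complete block rows in this segmentation
    kc = (N - dc) // BN                        # complete block columns
    r0, c0 = dr // (BM // 2), dc // (BN // 2)  # first cell covered by this segmentation
    coverage[r0:r0 + 2 * kr, c0:c0 + 2 * kc] += 1   # each block spans 2x2 cells

# Corner cells are covered once, edge cells twice, interior cells four times.
print(coverage[0, 0], coverage[0, cols // 2], coverage[rows // 2, cols // 2])  # -> 1 2 4
```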
Second, the features of each (B_M/2, B_N/2) cell in the four segmentations, namely the mean and the standard deviation, are computed. The neighborhood relationship between a cell and the sub-image blocks that contain it is shown in Fig. 3.
Let Ω_k(i, j) denote the (i, j)-th sub-image block of the k-th segmentation; its pixel mean is

$$E_{\Omega_k(i,j)}=\frac{1}{B_M B_N}\sum_{(m,n)\in\Omega_k(i,j)}P(m,n),$$

where P(m, n) is the pixel value at point (m, n), and its pixel standard deviation is

$$\delta_{\Omega_k(i,j)}=\left(\frac{1}{B_M B_N}\sum_{(m,n)\in\Omega_k(i,j)}P^{2}(m,n)-\left(E_{\Omega_k(i,j)}\right)^{2}\right)^{0.5}.$$
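As a minimal sketch of the per-block statistics, the function below (a hypothetical helper, not from the patent) computes the pixel mean E and standard deviation δ of every complete (B_M, B_N) block of one segmentation of a single-channel image, following the two formulas above.

```python
import numpy as np

def block_mean_std(image: np.ndarray, bm: int, bn: int, dr: int = 0, dc: int = 0):
    """Pixel mean E and standard deviation delta of every complete (bm, bn) block
    of the segmentation whose top-left block starts at (dr, dc)."""
    M, N = image.shape
    kr, kc = (M - dr) // bm, (N - dc) // bn                 # complete blocks only
    crop = image[dr:dr + kr * bm, dc:dc + kc * bn].astype(np.float64)
    blocks = crop.reshape(kr, bm, kc, bn).swapaxes(1, 2)    # shape (kr, kc, bm, bn)
    E = blocks.mean(axis=(2, 3))
    # delta = sqrt(mean(P^2) - E^2); clip tiny negatives caused by rounding.
    delta = np.sqrt(np.maximum((blocks ** 2).mean(axis=(2, 3)) - E ** 2, 0.0))
    return E, delta

# Example: statistics of the unshifted segmentation of a random 10-bit test image.
img = np.random.randint(0, 1024, size=(512, 768)).astype(np.float64)
E1, d1 = block_mean_std(img, 64, 64)        # one (E, delta) pair per (64, 64) block
```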
Let Θ(i, j) denote the (i, j)-th sub-image block region of size (B_M/2, B_N/2), and let Ω_1(i, j), ..., Ω_4(i, j) denote the sub-image blocks of the respective segmentations that contain it. For a sub-block appearing in 4 segmentations, the pixel-mean estimate is

$$\hat{E}_{\Theta(i,j)}=\frac{1}{4}\left(E_{\Omega_1(i,j)}+E_{\Omega_2(i,j)}+E_{\Omega_3(i,j)}+E_{\Omega_4(i,j)}\right).$$

For a sub-block appearing in 2 segmentations, the pixel-mean estimate is

$$\hat{E}_{\Theta(i,j)}=\frac{1}{2}\left(E_{\Omega_1(i,j)}+E_{\Omega_2(i,j)}\right)\quad\text{(top/bottom boundary)},$$

$$\hat{E}_{\Theta(i,j)}=\frac{1}{2}\left(E_{\Omega_1(i,j)}+E_{\Omega_3(i,j)}\right)\quad\text{(left/right boundary)}.$$

For a sub-block appearing in 1 segmentation, the pixel-mean estimate is

$$\hat{E}_{\Theta(i,j)}=E_{\Omega_1(i,j)}\quad\text{(corner)}.$$

For a sub-block appearing in 4 segmentations, the pixel standard-deviation estimate is

$$\hat{\delta}_{\Theta(i,j)}=\frac{1}{4}\left(\delta_{\Omega_1(i,j)}+\delta_{\Omega_2(i,j)}+\delta_{\Omega_3(i,j)}+\delta_{\Omega_4(i,j)}\right).$$

For a sub-block appearing in 2 segmentations, the pixel standard-deviation estimate is

$$\hat{\delta}_{\Theta(i,j)}=\frac{1}{2}\left(\delta_{\Omega_1(i,j)}+\delta_{\Omega_2(i,j)}\right)\quad\text{(top/bottom boundary)},$$

$$\hat{\delta}_{\Theta(i,j)}=\frac{1}{2}\left(\delta_{\Omega_1(i,j)}+\delta_{\Omega_3(i,j)}\right)\quad\text{(left/right boundary)}.$$

For a sub-block appearing in 1 segmentation, the pixel standard-deviation estimate is

$$\hat{\delta}_{\Theta(i,j)}=\delta_{\Omega_1(i,j)}\quad\text{(corner)}.$$
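The case-by-case formulas above amount to averaging, for every (B_M/2, B_N/2) cell, the statistics of all sub-blocks that contain it. The following standalone sketch implements that reading; the function name and sizes are assumptions for illustration, and dividing by the per-cell coverage count (1, 2 or 4) reproduces the arithmetic means above.

```python
import numpy as np

def cell_estimates(image: np.ndarray, bm: int, bn: int):
    """Mean estimate E_hat and standard-deviation estimate d_hat for every
    (bm/2, bn/2) cell, averaged over the segmentations whose blocks contain it."""
    M, N = image.shape
    rows, cols = 2 * M // bm, 2 * N // bn
    e_sum = np.zeros((rows, cols))
    d_sum = np.zeros((rows, cols))
    cnt = np.zeros((rows, cols))
    for dr, dc in [(0, 0), (bm // 2, 0), (0, bn // 2), (bm // 2, bn // 2)]:
        kr, kc = (M - dr) // bm, (N - dc) // bn              # complete blocks only
        crop = image[dr:dr + kr * bm, dc:dc + kc * bn].astype(np.float64)
        blk = crop.reshape(kr, bm, kc, bn).swapaxes(1, 2)
        E = blk.mean(axis=(2, 3))
        D = np.sqrt(np.maximum((blk ** 2).mean(axis=(2, 3)) - E ** 2, 0.0))
        # Each (bm, bn) block covers a 2x2 patch of (bm/2, bn/2) cells.
        r0, c0 = dr // (bm // 2), dc // (bn // 2)
        e_sum[r0:r0 + 2 * kr, c0:c0 + 2 * kc] += np.kron(E, np.ones((2, 2)))
        d_sum[r0:r0 + 2 * kr, c0:c0 + 2 * kc] += np.kron(D, np.ones((2, 2)))
        cnt[r0:r0 + 2 * kr, c0:c0 + 2 * kc] += 1
    return e_sum / cnt, d_sum / cnt          # divide by coverage count: 1, 2 or 4

E_hat, d_hat = cell_estimates(np.random.randint(0, 1024, (512, 768)).astype(float), 64, 64)
```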
Third, a threshold test is applied to the mean estimate $\hat{E}_{\Theta(i,j)}$ obtained in step 2: if $T1\le\hat{E}_{\Theta(i,j)}\le T2$, the region is preliminarily judged to be a ship-target ROI; otherwise it is a sea area. According to engineering experience, the reflectivity of a ship lies between 10% and 50%, so T1 = 10%·R and T2 = 50%·R (the specific percentages can be adjusted), where R is the maximum value corresponding to the pixel bit width of the image.
Fourth, a threshold test is applied to the standard-deviation estimate of each region preliminarily judged in step 3: if $\hat{\delta}_{\Theta(i,j)}\ge T3$, the region is judged to be a ship-target ROI; otherwise it is a sea area. The threshold T3 is determined by the 3σ rule applied to the standard-deviation estimates $\hat{\delta}_{\Theta(i,j)}$ of the regions Θ(i, j) preliminarily screened in step 3: a standard-deviation estimate falling outside the 3σ confidence interval indicates a ship target.
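A hedged sketch of the two-stage screening follows. T1 and T2 come from the 10% to 50% reflectivity range stated above; the patent's exact expression for T3 is not reproduced in this text, so the sketch derives T3 as the mean plus three standard deviations of the δ̂ values of the cells that pass the first screen, which is one plausible reading of the 3σ rule and is an assumption of this illustration.

```python
import numpy as np

def screen_ship_roi(E_hat: np.ndarray, d_hat: np.ndarray, bit_width: int = 10):
    """Two-stage screening of (B_M/2, B_N/2) cells.

    E_hat, d_hat : per-cell mean and standard-deviation estimates (step 2).
    bit_width    : pixel bit width; R is the corresponding maximum pixel value.
    Returns a boolean mask of cells judged to be ship-target ROI.
    """
    R = 2 ** bit_width - 1
    T1, T2 = 0.10 * R, 0.50 * R                 # ship reflectivity roughly 10%-50%
    prelim = (E_hat >= T1) & (E_hat <= T2)      # step 3: preliminary screening on the mean

    d_sel = d_hat[prelim]
    if d_sel.size == 0:
        return prelim                           # nothing passed the first screen
    # Step 4, assumed form of the 3-sigma rule: flag cells whose standard-deviation
    # estimate lies above the 3-sigma bound of the preliminarily screened cells.
    T3 = d_sel.mean() + 3.0 * d_sel.std()
    return prelim & (d_hat >= T3)

# Usage with the estimates from the previous sketch:
# mask = screen_ship_roi(E_hat, d_hat, bit_width=10)
```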
Content not described in detail in this specification belongs to techniques well known to those skilled in the art.

Claims (1)

  1. An extraction method for ship target regions of interest based on overlap processing, characterised in that the steps are as follows:
    1) According to the image resolution, the original image of size (M, N) is divided into overlapping sub-image blocks by four partitioning schemes, each sub-image block having size (B_M, B_N), where M and N are divisible by B_M and B_N respectively, so that a segmentation contains M/B_M × N/B_N sub-image blocks; the starting points of the top-left sub-images of the four partitioning schemes are (0, 0), (B_M/2, 0), (0, B_N/2) and (B_M/2, B_N/2), respectively; the four (B_M/2, B_N/2) sub-images at the four corners have no overlap between segmentations, the (B_M/2, B_N/2) sub-images along the four edges each appear in two segmentations, and the (B_M/2, B_N/2) sub-images in the middle appear in all four segmentations;
    2) For each (B_M/2, B_N/2) sub-image, compute the pixel mean, the mean estimate, the pixel standard deviation and the standard-deviation estimate from the four segmentations; the mean estimate of each (B_M/2, B_N/2) sub-image is the arithmetic mean of the pixel means of the sub-images in which it appears across the four partitioning schemes, and its standard-deviation estimate is the arithmetic mean of the corresponding pixel standard deviations; the specific procedure is:
    Let Ω_k(i, j) denote the (i, j)-th sub-block of the k-th partitioning scheme; its pixel mean is

    $$E_{\Omega_k(i,j)}=\frac{1}{B_M B_N}\sum_{(m,n)\in\Omega_k(i,j)}P(m,n);$$

    where P(m, n) is the pixel value at point (m, n); its pixel standard deviation is

    $$\delta_{\Omega_k(i,j)}=\left(\frac{1}{B_M B_N}\sum_{(m,n)\in\Omega_k(i,j)}P^{2}(m,n)-\left(E_{\Omega_k(i,j)}\right)^{2}\right)^{0.5};$$
    Let Θ(i, j) denote the (i, j)-th sub-image region of size (B_M/2, B_N/2). For a sub-image appearing in 4 segmentations, its pixel-mean estimate is:

    $$\hat{E}_{\Theta(i,j)}=\frac{1}{4}\left(E_{\Omega_1(i,j)}+E_{\Omega_2(i,j)}+E_{\Omega_3(i,j)}+E_{\Omega_4(i,j)}\right);$$

    for a sub-image appearing in 2 segmentations, its pixel-mean estimate is:

    $$\hat{E}_{\Theta(i,j)}=\frac{1}{2}\left(E_{\Omega_1(i,j)}+E_{\Omega_2(i,j)}\right)$$

    $$\hat{E}_{\Theta(i,j)}=\frac{1}{2}\left(E_{\Omega_1(i,j)}+E_{\Omega_3(i,j)}\right)$$

    for a sub-image appearing in 1 segmentation, its pixel-mean estimate is:

    $$\hat{E}_{\Theta(i,j)}=E_{\Omega_1(i,j)}$$

    for a sub-image appearing in 4 segmentations, its pixel standard-deviation estimate is:

    $$\hat{\delta}_{\Theta(i,j)}=\frac{1}{4}\left(\delta_{\Omega_1(i,j)}+\delta_{\Omega_2(i,j)}+\delta_{\Omega_3(i,j)}+\delta_{\Omega_4(i,j)}\right)$$

    for a sub-image appearing in 2 segmentations, its pixel standard-deviation estimate is:

    $$\hat{\delta}_{\Theta(i,j)}=\frac{1}{2}\left(\delta_{\Omega_1(i,j)}+\delta_{\Omega_2(i,j)}\right)$$

    $$\hat{\delta}_{\Theta(i,j)}=\frac{1}{2}\left(\delta_{\Omega_1(i,j)}+\delta_{\Omega_3(i,j)}\right)$$

    for a sub-image appearing in 1 segmentation, its pixel standard-deviation estimate is:

    $$\hat{\delta}_{\Theta(i,j)}=\delta_{\Omega_1(i,j)};$$
    3) Compare the mean estimate obtained in step 2) with the preset thresholds T1 and T2; if the decision condition T1 ≤ mean estimate ≤ T2 is satisfied, the region is preliminarily judged to be a ship-target ROI, otherwise it is a sea area;
    4) Apply a threshold test to the standard-deviation estimate of each region preliminarily judged in step 3) to be a ship-target ROI; if the decision condition standard-deviation estimate ≥ T3 is satisfied, where T3 is a preset threshold, the region is confirmed as a ship-target ROI, otherwise it is a sea area; here T1 = 10%·R and T2 = 50%·R, where R is the maximum value corresponding to the pixel bit width of the image, and T3 is determined by the 3σ rule applied to the standard-deviation estimates of the regions Θ(i, j) preliminarily judged in step 3).
CN201410705774.4A 2014-11-27 2014-11-27 Fast extraction method for ship target regions of interest based on overlap processing Active CN104463169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410705774.4A CN104463169B (en) 2014-11-27 2014-11-27 Fast extraction method for ship target regions of interest based on overlap processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410705774.4A CN104463169B (en) 2014-11-27 2014-11-27 Fast extraction method for ship target regions of interest based on overlap processing

Publications (2)

Publication Number Publication Date
CN104463169A CN104463169A (en) 2015-03-25
CN104463169B true CN104463169B (en) 2017-12-15

Family

ID=52909183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410705774.4A Active CN104463169B (en) 2014-11-27 2014-11-27 Fast extraction method for ship target regions of interest based on overlap processing

Country Status (1)

Country Link
CN (1) CN104463169B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110084181B (en) * 2019-04-24 2021-04-20 哈尔滨工业大学 Remote sensing image ship target detection method based on sparse MobileNet V2 network
CN114299094B (en) * 2022-01-05 2022-10-11 哈尔滨工业大学 Infusion bottle image region-of-interest extraction method based on block selection and expansion

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101520896A (en) * 2009-03-30 2009-09-02 中国电子科技集团公司第十研究所 Method for automatically detecting cloud interfering naval vessel target by optical remote sensing image
US8116522B1 (en) * 2008-08-25 2012-02-14 The United States Of America As Represented By The Secretary Of The Navy Ship detection system and method from overhead images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8116522B1 (en) * 2008-08-25 2012-02-14 The United States Of America As Represented By The Secretary Of The Navy Ship detection system and method from overhead images
CN101520896A (en) * 2009-03-30 2009-09-02 中国电子科技集团公司第十研究所 Method for automatically detecting cloud interfering naval vessel target by optical remote sensing image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ship Detection in SAR Image Based on the Alpha-stable Distribution; Changcheng Wang et al.; Sensors; 2008-08-22; pp. 4948-4960 *
A sea-surface ship detection algorithm for optical remote sensing images; Gao Lining et al.; Journal of Tsinghua University (Science and Technology); 2011-12-31; Vol. 51, No. 1; pp. 105-110 *

Also Published As

Publication number Publication date
CN104463169A (en) 2015-03-25

Similar Documents

Publication Publication Date Title
CN105374033B (en) SAR image segmentation method based on ridge ripple deconvolution network and sparse classification
CN104484667B (en) A kind of contour extraction method based on brightness and integrality of outline
CN108960135B (en) Dense ship target accurate detection method based on high-resolution remote sensing image
CN105354541A (en) SAR (Synthetic Aperture Radar) image target detection method based on visual attention model and constant false alarm rate
CN107680090A (en) Based on the electric transmission line isolator state identification method for improving full convolutional neural networks
CN103745468B (en) Significant object detecting method based on graph structure and boundary apriority
CN106384344A (en) Sea-surface ship object detecting and extracting method of optical remote sensing image
CN111862143B (en) Automatic monitoring method for river dike collapse
CN106651880B (en) Offshore moving target detection method based on multi-feature fusion thermal infrared remote sensing image
CN102830404B (en) Method for identifying laser imaging radar ground target based on range profile
CN103247059A (en) Remote sensing image region of interest detection method based on integer wavelets and visual features
CN105427313B (en) SAR image segmentation method based on deconvolution network and adaptive inference network
CN107967474A (en) A kind of sea-surface target conspicuousness detection method based on convolutional neural networks
WO2018000252A1 (en) Oceanic background modelling and restraining method and system for high-resolution remote sensing oceanic image
CN103208097A (en) Principal component analysis collaborative filtering method for image multi-direction morphological structure grouping
CN103927758A (en) Saliency detection method based on contrast ratio and minimum convex hull of angular point
CN104463169B (en) A kind of Ship Target area-of-interest rapid extracting method of overlap processing
CN101964108A (en) Real-time on-line system-based field leaf image edge extraction method and system
CN110472628A (en) A kind of improvement Faster R-CNN network detection floating material method based on video features
CN103839234A (en) Double-geometry nonlocal average image denoising method based on controlled nuclear
CN110084302A (en) A kind of crack detection method based on remote sensing images
CN105893960A (en) Road traffic sign detecting method based on phase symmetry
CN110097524A (en) SAR image object detection method based on fusion convolutional neural networks
CN101533507B (en) Self-adaptive method for watermarking intensive image texture
CN106296649A (en) A kind of texture image segmenting method based on Level Set Models

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant