CN102073996B - Image-correlation-evaluation-based method for determining image segmentation threshold

Info

Publication number
CN102073996B
CN102073996B
Authority
CN
China
Prior art keywords
image
gray
value
threshold value
background
Prior art date
Legal status
Active
Application number
CN201010623834XA
Other languages
Chinese (zh)
Other versions
CN102073996A (en)
Inventor
郭绍刚
赵春晖
刘鲁
龚德铸
高进
高文文
王艳宝
王京海
张丽华
魏高乐
Current Assignee
Beijing Institute of Control Engineering
Original Assignee
Beijing Institute of Control Engineering
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Control Engineering filed Critical Beijing Institute of Control Engineering
Priority to CN201010623834XA priority Critical patent/CN102073996B/en
Publication of CN102073996A publication Critical patent/CN102073996A/en
Application granted granted Critical
Publication of CN102073996B publication Critical patent/CN102073996B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides an image-correlation-evaluation-based method for determining an image segmentation threshold. Building on imaging theory, signal processing theory, and other classical results, the method considers the correlation between the effective light spots in an image and the background light and stray light: it analyzes the covariance (between-class variance) between the effective-spot portion of the image and the background/stray-light portion. The threshold at which this covariance coefficient is maximal corresponds to the minimum correlation between the effective-spot image and the background/stray-light image, i.e., the best threshold segmentation, so the background light and stray light can be separated from the effective light-source image. The method effectively suppresses the interference of stray light with the effective spots, and the spots are extracted accurately.

Description

Method for determining an image segmentation threshold based on image correlation evaluation
Technical field
The present invention relates to an image processing method for an optical imaging sensor operating under a complex background in rendezvous and docking.
Background technology
With the growing number and variety of space missions, and in particular the development of space stations, spacecraft rendezvous technology has advanced significantly. The CCD optical imaging sensor serves as the short-range optical measurement device with which the tracking spacecraft performs rendezvous and docking with the target spacecraft, and its imaging quality has a large influence on whether the rendezvous and docking succeed.
According to the requirements of the rendezvous and docking mission, a target marker is usually fixed in a predetermined measurement region on the target spacecraft in a prescribed arrangement. The CCD optical imaging sensor images this measurement region, extracts and identifies the light spots in the resulting image, and computes and outputs the relative position and relative attitude angles of the optical sensor coordinate system with respect to the target marker coordinate system. How the effective light spots that constitute the target marker are extracted from the background light of the CCD image therefore directly affects the normal measurement of the sensor; the dividing point between the effective light spots and the background image is the image threshold.
Existing methods mainly segment the image with a fixed threshold (for example, the image background mean plus three times the variance; see M. Sezgin and B. Sankur, "Survey over image thresholding techniques and quantitative performance evaluation," Journal of Electronic Imaging, pp. 146-156, 2003). In an image with a complex background, such methods cannot separate the effective light spots from the background image, and therefore disturb the normal measurement of the CCD optical imaging sensor.
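For contrast, a minimal sketch of such a fixed-threshold baseline is given below (Python/NumPy, purely illustrative and not taken from the cited survey; the function name fixed_threshold and the factor k are placeholders, and k times the standard deviation is used as a common reading of the "three times the variance" rule).

    import numpy as np

    def fixed_threshold(image, k=3.0):
        """Baseline segmentation: threshold at the global mean plus k standard deviations."""
        threshold = image.mean() + k * image.std()
        return image > threshold  # binary mask of candidate light spots

Because the global mean and standard deviation are dominated by the background, a single threshold of this kind tends to merge stray-light regions with the effective spots under complex illumination, which is the shortcoming addressed by the method below.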
Summary of the invention
The technical problem solved by the present invention is to overcome the deficiencies of the prior art and provide a method for determining an image segmentation threshold based on image correlation evaluation, which can effectively suppress the interference of stray light with the effective light spots and extract the spots accurately.
The technical solution of the present invention is a method for determining an image segmentation threshold based on image correlation evaluation, with the following steps:
(1) Build the histogram of the image noise distribution, find the maximum noise value, and take it as the noise threshold;
(2) Using the noise threshold determined in step (1) as a baseline, extract, at a uniform sampling interval, the pixels in the image whose gray value exceeds the noise threshold; count the number of pixels S_i of each gray level i, i ∈ [hmin, hmax], where hmin and hmax are the minimum and maximum gray values; and compute the background mean value background of the image and the proportion of pixels of each gray level in the image;
(3) Traverse each gray value from the minimum gray level hmin to the maximum gray level; for the current gray level n, compute miu1, the mean gray value of the pixels from hmin to n (with pixel count omega1), and miu2, the mean gray value of the pixels greater than n and less than or equal to the maximum gray level hmax (with pixel count omega2),
miu1 = Σ_{i=hmin}^{n} (i × S_i) / omega1, omega1 = Σ_{i=hmin}^{n} S_i,
miu2 = Σ_{i=n+1}^{hmax} (i × S_i) / omega2, omega2 = Σ_{i=n+1}^{hmax} S_i;
(4) Traverse each gray value in turn, taking the current gray value as the dividing line and dividing the image into the part above the dividing line and the part below it; compute the between-class variance xigma of the two parts. If the computed between-class variance is greater than the sum of the between-class variance of the previous gray level and a predetermined offset, take the current gray level as the current theoretical threshold; the between-class variance is computed as:
xigma = omega1 × (miu1 - background)² + omega2 × (miu2 - background)² + predetermined offset;
(5) Use the filter function to take a weighted average of the current theoretical threshold and the threshold of the previous frame image to obtain the current image threshold, computed as:
current image threshold = (1 - filter function) × current theoretical threshold + filter function × previous frame image threshold.
The predetermined offset is obtained by dividing the statistical average background of multiple images by the corresponding statistical average threshold.
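A minimal end-to-end sketch of steps (1)-(5) in Python/NumPy is given below. It is only one reading of the method, not the patented implementation; the function name determine_threshold and the sampling step of 4 are assumptions, and the offset and filter values of 0.1 follow the embodiment described below.

    import numpy as np

    def determine_threshold(image, prev_threshold, sample_step=4,
                            offset=0.1, filter_coeff=0.1):
        """Sketch of the five-step image segmentation threshold determination."""
        # (1) Noise threshold, read here as the most frequent gray level of the
        #     image histogram (one possible reading of the noise-histogram maximum).
        hist, _ = np.histogram(image, bins=256, range=(0, 256))
        noise_gate = int(np.argmax(hist))

        # (2) Sample pixels above the noise threshold at a uniform interval,
        #     count pixels per gray level, and compute the background mean.
        samples = image.ravel()[::sample_step]
        samples = samples[samples > noise_gate]
        counts, _ = np.histogram(samples, bins=256, range=(0, 256))
        background = samples.mean()
        levels = np.nonzero(counts)[0]
        hmin, hmax = int(levels.min()), int(levels.max())

        # (3)-(4) Scan gray levels, split the sampled histogram at each level, and
        #         keep the level with the largest offset-corrected between-class
        #         variance (the claim additionally compares each level against the
        #         previous level's variance plus the offset).
        best_xigma, theoretical = -np.inf, hmin
        for n in range(hmin, hmax + 1):
            lo, hi = np.arange(hmin, n + 1), np.arange(n + 1, hmax + 1)
            omega1, omega2 = counts[lo].sum(), counts[hi].sum()
            if omega1 == 0 or omega2 == 0:
                continue
            miu1 = (lo * counts[lo]).sum() / omega1   # class mean up to level n
            miu2 = (hi * counts[hi]).sum() / omega2   # class mean above level n
            xigma = (omega1 * (miu1 - background) ** 2
                     + omega2 * (miu2 - background) ** 2 + offset)
            if xigma > best_xigma:
                best_xigma, theoretical = xigma, n

        # (5) Weighted average with the previous frame's threshold.
        return (1 - filter_coeff) * theoretical + filter_coeff * prev_threshold

On the first frame, prev_threshold can simply be initialized to the theoretical threshold itself, so the smoothing has no effect; this initialization is an assumption, as the patent does not specify it.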
Compared with the prior art, the advantage of the present invention is as follows. The method takes imaging theory, signal processing theory, and other classical results as its basis and considers the correlation between the effective light spots of the image and the background and stray light, i.e., it analyzes the covariance (between-class variance) between the effective-spot portion of the image and the background/stray-light portion. A maximal covariance coefficient indicates that the correlation between the effective-spot image and the background/stray-light image is minimal, i.e., that the threshold segmentation in this state is best. The background light, stray light, and effective light-source image are thereby separated; the influence of sunlight, earthshine, and light reflected from the surface of the target spacecraft on the normal attitude and position calculation of the CCD optical imaging sensor is reduced; and the ability of the sensor to work under a complex background environment is strengthened. At the same time, the method reduces the dependence of traditional threshold determination methods on the purity of the image background, guarantees normal operation of the CCD optical imaging sensor in a complex optical environment, and saves image processing time while meeting the measurement requirements of the different working phases of rendezvous and docking; it therefore has important engineering value for the development of rendezvous and docking sensors with high robustness and high refresh rate.
Description of drawings
Fig. 1 is a flow block diagram of the method of the invention.
Embodiment
Fig. 1 shows the schematic block diagram of the method of the invention. After a full image from the CCD head is input, the main steps for determining the image threshold are as follows:
1. Statistical evaluation of image gray levels
The gray value of every pixel belonging to a light-spot signal should be greater than a certain threshold. The maximum noise value determined from the histogram of the noise distribution is used as this threshold. First, a number of pixels are sampled at a uniform sampling interval from a chosen region of the image; the number of pixels at each gray level is counted, and the background mean value of the image and the proportion of pixels at each gray level in the image are computed.
Let the number of pixels at each gray level be S_0, S_1, S_2, ..., S_255.
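To illustrate this statistics step, a small sketch follows (the function name sample_statistics, the sampling step of 4, and the 256-level histogram for an 8-bit image are assumptions; noise_gate is the noise threshold from the previous step).

    import numpy as np

    def sample_statistics(image, noise_gate, sample_step=4):
        """Per-gray-level pixel counts S_0..S_255, background mean, and level proportions."""
        samples = image.ravel()[::sample_step]       # uniform sampling interval
        samples = samples[samples > noise_gate]      # keep pixels above the noise threshold
        counts, _ = np.histogram(samples, bins=256, range=(0, 256))
        background = samples.mean()                  # background mean value
        proportion = counts / counts.sum()           # share of each gray level
        return counts, background, proportion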
2. Image correlation analysis
Each gray level from the minimum gray level to the maximum gray level is traversed in turn; for each level, statistics are accumulated for the pixels from the minimum gray level up to this level and for the pixels above this level up to the maximum gray level.
For the pixels from the minimum gray level (hmin) up to the current gray level (denoted n), the class mean miu1 and pixel count omega1 are:
miu1 = Σ_{i=hmin}^{n} (i × S_i) / omega1, omega1 = Σ_{i=hmin}^{n} S_i
For the pixels greater than the current gray level n and less than or equal to the maximum gray level (hmax), the class mean miu2 and pixel count omega2 are:
miu2 = Σ_{i=n+1}^{hmax} (i × S_i) / omega2, omega2 = Σ_{i=n+1}^{hmax} S_i
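These two formulas transcribe directly into a small helper (a sketch; the name class_statistics and the use of a gray-level count array counts, as produced in step 1, are assumptions).

    import numpy as np

    def class_statistics(counts, hmin, hmax, n):
        """Class means and pixel counts for levels hmin..n and n+1..hmax."""
        lo = np.arange(hmin, n + 1)        # gray levels up to and including n
        hi = np.arange(n + 1, hmax + 1)    # gray levels strictly above n
        omega1, omega2 = counts[lo].sum(), counts[hi].sum()
        miu1 = (lo * counts[lo]).sum() / omega1 if omega1 else 0.0
        miu2 = (hi * counts[hi]).sum() / omega2 if omega2 else 0.0
        return miu1, omega1, miu2, omega2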
3. Computation of the image covariance and determination of the optimum variance
Each gray value is traversed in turn; the current gray value is taken as the dividing line, the image is divided into the part above the dividing line and the part below it, and the between-class variance xigma of the two parts is computed:
xigma = omega1 × (miu1 - background)² + omega2 × (miu2 - background)²
xigma = xigma + 0.1
The between-class variance of each gray level is examined in turn; if it is greater than the between-class variance of the previous gray level plus the offset, the current gray level is taken as the threshold. The inter-level offset is introduced to correct the threshold and to prevent the covariance gradient between successive gray levels from changing too violently. After the traversal of all gray levels is complete, the gray value at which the sum of the between-class variance and the predetermined offset reaches its maximum is taken as the current theoretical threshold. The predetermined offset is generally chosen as the ratio of the statistical average background of multiple images to the statistical average threshold; this offset is usually taken as 0.1.
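One reading of this scan is sketched below (illustrative only; the rule shown keeps the last gray level whose offset-corrected variance exceeds the previous level's variance by the offset, while the text can also be read as taking the level at which the corrected variance is maximal).

    import numpy as np

    def select_theoretical_threshold(counts, background, hmin, hmax, offset=0.1):
        """Scan gray levels and pick the current theoretical threshold."""
        prev_xigma, threshold = -np.inf, hmin
        for n in range(hmin, hmax + 1):
            lo, hi = np.arange(hmin, n + 1), np.arange(n + 1, hmax + 1)
            omega1, omega2 = counts[lo].sum(), counts[hi].sum()
            if omega1 == 0 or omega2 == 0:
                prev_xigma = -np.inf
                continue
            miu1 = (lo * counts[lo]).sum() / omega1
            miu2 = (hi * counts[hi]).sum() / omega2
            xigma = (omega1 * (miu1 - background) ** 2
                     + omega2 * (miu2 - background) ** 2 + offset)
            # Accept a new level only when it beats the previous level by more than
            # the offset, which damps level-to-level fluctuation of the variance.
            if xigma > prev_xigma + offset:
                threshold = n
            prev_xigma = xigma
        return threshold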
4. Adaptive threshold determination
The current theoretical threshold and the threshold of the previous frame image are averaged with weights given by the filter function to obtain the best threshold, with which the aiming light spots can be accurately distinguished from the background stray light; the formula is:
current threshold = (1 - filter function) × current theoretical threshold + filter function × previous frame image threshold.
The filter function is determined from the statistical backgrounds and threshold-change gradients of 1000 captured images; at present it is taken directly as 0.1.
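As a worked example of this smoothing step (a sketch; smooth_threshold is an assumed name), with the filter value of 0.1:

    def smooth_threshold(theoretical, previous, filter_coeff=0.1):
        """Weighted average of the current theoretical threshold and the previous frame's threshold."""
        return (1.0 - filter_coeff) * theoretical + filter_coeff * previous

    # Example: theoretical threshold 120, previous frame threshold 100
    # -> 0.9 * 120 + 0.1 * 100 = 118
    current_threshold = smooth_threshold(120, 100)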
Content not described in detail in this specification belongs to the common knowledge of those skilled in the art.

Claims (1)

1. A method for determining an image segmentation threshold based on image correlation evaluation, characterized in that the steps are as follows:
(1) Build the histogram of the image noise distribution, find the maximum noise value, and take it as the noise threshold;
(2) Using the noise threshold determined in step (1) as a baseline, extract, at a uniform sampling interval, the pixels in the image whose gray value exceeds the noise threshold; count the number of pixels S_i of each gray level i, i ∈ [hmin, hmax], where hmin and hmax are the minimum and maximum gray values; and compute the background mean value background of the image and the proportion of pixels of each gray level in the image;
(3) Traverse each gray value from the minimum gray level hmin to the maximum gray level; for the current gray level n, compute miu1, the mean gray value of the pixels from hmin to n (with pixel count omega1), and miu2, the mean gray value of the pixels greater than n and less than or equal to the maximum gray level hmax (with pixel count omega2),
miu1 = Σ_{i=hmin}^{n} (i × S_i) / omega1, omega1 = Σ_{i=hmin}^{n} S_i,
miu2 = Σ_{i=n+1}^{hmax} (i × S_i) / omega2, omega2 = Σ_{i=n+1}^{hmax} S_i;
(4) Traverse each gray value in turn, taking the current gray value as the dividing line and dividing the image into the part above the dividing line and the part below it; compute the between-class variance xigma of the two parts. If the computed between-class variance is greater than the sum of the between-class variance of the previous gray level and a predetermined offset, take the current gray level as the current theoretical threshold; the between-class variance is computed as:
xigma = omega1 × (miu1 - background)² + omega2 × (miu2 - background)² + predetermined offset, wherein the predetermined offset is obtained by dividing the statistical average background of multiple images by the corresponding statistical average threshold;
(5) Use the filter function to take a weighted average of the current theoretical threshold and the threshold of the previous frame image to obtain the current image threshold, computed as:
current image threshold = (1 - filter function) × current theoretical threshold + filter function × previous frame image threshold.
CN201010623834XA 2010-12-31 2010-12-31 Image-correlation-evaluation-based method for determining image segmentation threshold Active CN102073996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010623834XA CN102073996B (en) 2010-12-31 2010-12-31 Image-correlation-evaluation-based method for determining image segmentation threshold

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010623834XA CN102073996B (en) 2010-12-31 2010-12-31 Image-correlation-evaluation-based method for determining image segmentation threshold

Publications (2)

Publication Number Publication Date
CN102073996A CN102073996A (en) 2011-05-25
CN102073996B true CN102073996B (en) 2012-07-18

Family

ID=44032526

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010623834XA Active CN102073996B (en) 2010-12-31 2010-12-31 Image-correlation-evaluation-based method for determining image segmentation threshold

Country Status (1)

Country Link
CN (1) CN102073996B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102761360B (en) 2012-06-29 2015-07-22 飞天诚信科技股份有限公司 Optical signal processing method and device
CN104700096B (en) * 2015-03-30 2018-07-13 北京奇艺世纪科技有限公司 A kind of user action identified areas based on image determines method and device
CN112488980B (en) * 2019-08-20 2024-06-25 隆基绿能科技股份有限公司 Melt state detection method, device and equipment
CN111787310B (en) * 2020-07-14 2023-01-17 广州得尔塔影像技术有限公司 Anti-shake performance testing method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100576246C (en) * 2007-05-23 2009-12-30 华中科技大学 A kind of image partition method based on attribute histogram

Also Published As

Publication number Publication date
CN102073996A (en) 2011-05-25


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant