CN102521593A - Affine invariant feature extraction method based on profile split - Google Patents

Affine invariant feature extraction method based on profile split

Info

Publication number
CN102521593A
CN102521593A CN2011103925227A CN201110392522A
Authority
CN
China
Prior art keywords
point
profile
shape
cut
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011103925227A
Other languages
Chinese (zh)
Inventor
Yang Mingqiang (杨明强)
Zhang Zheng (张征)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN2011103925227A priority Critical patent/CN102521593A/en
Publication of CN102521593A publication Critical patent/CN102521593A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an affine invariant feature extraction method based on contour splitting, belonging to the technical field of computer vision. The method comprises the following steps: extracting the shape contour and sampling it with equal areas; taking a contour point as the segmentation starting point, dividing the contour into K parts with equal numbers of points to obtain K segmentation points; connecting the segmentation starting point to every segmentation point so that the shape is divided into K regions, and computing the area of each region to form an ordered sequence; traversing all contour points in turn until each contour point has a corresponding ordered area sequence, and assembling these sequences into a matrix with K rows and as many columns as there are contour points; normalizing this matrix by the shape area to obtain the shape signature matrix, each row of which is a discrete function; and applying the discrete Fourier transform to each row, taking the magnitudes of the transform coefficients as the result and keeping the first 20 coefficients of each row, so that the final result is a matrix with K rows and 20 columns, i.e. the shape feature matrix.

Description

Affine invariant feature extraction method based on contour splitting
Technical field
The present invention relates to an affine invariant feature extraction method based on contour splitting, and belongs to the technical field of computer vision.
Background technology
The network has become an everyday tool of modern society, but as the amount of information keeps growing, finding the desired information in this vast ocean of network data is a major challenge for modern retrieval technology. The more mature search tools today all use text keywords, but for image search a text query is highly subjective and can hardly describe an image completely, so it is desirable to retrieve directly from the image itself. In practice, content-based image retrieval (CBIR) tools are still immature and their accuracy is low, mainly because computer vision is still far from matching human visual intuition. Among shape feature extraction methods, the target features described by contour-based extraction methods come closest to human intuition and are now widely used. Fourier descriptors perform well among contour-based methods: experiments show that they are invariant to translation, scaling and rotation of the shape, and their computational complexity is low, so feature extraction algorithms based on the Fourier transform have found wide practical use. Affine transformation is another common transformation of shapes, but ordinary Fourier descriptors are not robust to it, which lowers the accuracy of such methods in image retrieval applications.
Summary of the invention
To address the deficiencies of the prior art, the present invention provides an affine invariant feature extraction method based on contour splitting.
An affine invariant feature extraction method based on contour splitting comprises the following steps:
Step 1: segment the target object shape from the image, extract the shape contour, sample the contour with equal areas, and represent the contour with N points.
Step 2: select an arbitrary point on the contour as the starting point and number all contour points in counterclockwise order, this starting point being point 1 and the following points being 2, 3, ..., N.
Step 3: divide the contour into K parts with equal numbers of points. Select point 1 on the contour as the segmentation starting point and record it as segmentation point 1; starting from this point, divide the contour counterclockwise into parts with equal numbers of points, record the segmentation point closest to segmentation point 1 as segmentation point 2, and so on until the last one, segmentation point K. Connect the segmentation starting point to each segmentation point with straight lines to obtain the connecting lines between the starting point and every segmentation point. Compute in turn the area of the region enclosed by the line from the segmentation starting point to segmentation point 2 and the contour between them, the areas of the regions enclosed by every two adjacent connecting lines and the contour between their segmentation points, and the area of the region enclosed by the line from the segmentation starting point to segmentation point K and the contour between those two points. The areas computed in this order form an ordered sequence.
Step 4: select point 2 on the contour as the segmentation starting point and repeat the computation of step 3 to obtain the ordered area sequence corresponding to point 2; carry out this computation for every contour point, obtaining the ordered area sequence corresponding to each contour point.
Step 5: assemble the ordered area sequences of all contour points into a matrix, each column of which is the ordered area sequence of one contour point; normalize this matrix, and call the normalized matrix the shape signature matrix.
Step 6: each row of the shape signature matrix is a discrete function; apply the discrete Fourier transform to each of these functions and assemble the resulting Fourier coefficients into a matrix, the shape feature matrix.
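As a quick orientation before the detailed embodiment, the following minimal Python sketch strings the six steps together. The helper functions (extract_closed_contour, equal_area_resample, area_matrix, shoelace_area, feature_matrix) are hypothetical names whose sketches appear alongside the corresponding steps of the embodiment below; the default values of N and the function signatures are illustrative assumptions, not part of the claims.

```python
def extract_shape_features(gray_image, K=12, N=360, n_coeffs=20):
    """Outline of the six steps (helper sketches follow in the embodiment)."""
    contour = extract_closed_contour(gray_image)   # step 1: binarize + closed contour
    contour = equal_area_resample(contour, N)      # step 1: N equal-area contour samples
    M = area_matrix(contour, K)                    # steps 2-4: K x N ordered-area matrix
    Mn = M / shoelace_area(contour)                # step 5: shape signature matrix
    return feature_matrix(Mn, n_coeffs)            # step 6: K x 20 shape feature matrix
```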
In the present invention the parameter K, i.e. the number of regions into which the shape is divided, is a fairly critical value. Fig. 7 compares the retrieval performance for different values of K, and the table below lists the feature extraction time and retrieval accuracy for different K. K = 12 gives the best balance between speed and retrieval accuracy.
[Table: feature extraction time and retrieval accuracy for different values of K]
The present invention not only has the general properties of Fourier descriptors, i.e. invariance to translation, scaling and rotation, but the extracted shape features are also more robust to affine transformations of the shape, and the recall and precision in retrieval are higher than those of ordinary Fourier descriptors.
Description of drawings
Fig. 1 is an aircraft image.
Fig. 2 is a schematic diagram of shape contour extraction.
Fig. 3 is a schematic diagram of equal-area sampling of the contour points.
Fig. 4a is a schematic diagram of the segmented-region area computation when the line from the segmentation starting point to a segmentation point lies entirely inside the shape.
Fig. 4b is a schematic diagram of the segmented-region area computation when the line from the segmentation starting point to a segmentation point does not lie entirely inside the shape.
Fig. 5a is the original apple image.
Fig. 5b is a schematic diagram of the shape signature matrix of the original image.
Fig. 5c is a schematic diagram of the shape feature matrix of the original image.
Fig. 6a is an affine-transformed apple image.
Fig. 6b is a schematic diagram of the shape signature matrix of the affine-transformed image.
Fig. 6c is a schematic diagram of the shape feature matrix of the affine-transformed image.
Fig. 7 shows the retrieval performance of the feature extraction method of the present invention on the MPEG-7 Shape CE-1 image library for different values of the parameter K.
Fig. 8 compares the retrieval performance of different Fourier descriptors and the method of the present invention on the MPEG-7 Shape CE-1 image library.
Fig. 9 is a schematic diagram of the framework of the affine invariant feature extraction algorithm based on contour splitting of the present invention.
Fig. 10 is the overall flow chart of the affine invariant feature extraction algorithm based on contour splitting of the present invention.
Detailed description of the embodiments
Embodiment:
An affine invariant feature extraction method based on contour splitting comprises the following steps:
Step 1: segment the target object shape from the image, extract the shape contour, sample the contour with equal areas, and represent the contour with N points.
Step 2: select an arbitrary point on the contour as the starting point and number all contour points in counterclockwise order, this starting point being point 1 and the following points being 2, 3, ..., N.
Step 3: divide the contour into K parts with equal numbers of points. Select point 1 on the contour as the segmentation starting point and record it as segmentation point 1; starting from this point, divide the contour counterclockwise into parts with equal numbers of points, record the segmentation point closest to segmentation point 1 as segmentation point 2, and so on until the last one, segmentation point K. Connect the segmentation starting point to each segmentation point with straight lines to obtain the connecting lines between the starting point and every segmentation point. Compute in turn the area of the region enclosed by the line from the segmentation starting point to segmentation point 2 and the contour between them, the areas of the regions enclosed by every two adjacent connecting lines and the contour between their segmentation points, and the area of the region enclosed by the line from the segmentation starting point to segmentation point K and the contour between those two points. The areas computed in this order form an ordered sequence.
Step 4: select point 2 on the contour as the segmentation starting point and repeat the computation of step 3 to obtain the ordered area sequence corresponding to point 2; carry out this computation for every contour point, obtaining the ordered area sequence corresponding to each contour point.
Step 5: assemble the ordered area sequences of all contour points into a matrix, each column of which is the ordered area sequence of one contour point; normalize this matrix, and call the normalized matrix the shape signature matrix.
Step 6: each row of the shape signature matrix is a discrete function; apply the discrete Fourier transform to each of these functions and assemble the resulting Fourier coefficients into a matrix, the shape feature matrix.
Below, the present invention is used to extract the shape features of an image; the implementation steps are as follows:
1) Binarize the image so that the target object region is 1 and the background is 0. Apply the Canny operator to the binary image to extract the closed contour of the target object. Fig. 2 shows the closed contour of the shape.
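A possible realization of this step, as a sketch only: it assumes OpenCV (cv2) is available, uses Otsu thresholding for the binarization and fixed Canny thresholds, and keeps the longest edge chain as the object contour; these parameter choices are illustrative assumptions, not prescribed by the invention.

```python
import cv2

def extract_closed_contour(gray):
    """Binarize a grayscale image (object = 1, background = 0), apply the Canny
    operator to the binary image, and return the longest closed contour of the
    target object as an (M, 2) array of (x, y) = (column, row) pixel coordinates."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(binary, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    longest = max(contours, key=len)        # the outer object contour
    return longest.reshape(-1, 2)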
2) Select an arbitrary point on the contour as the starting point, denoted P_1^o, and number the contour points counterclockwise from this point as P_1^o, P_2^o, ..., P_M^o, where M is the total number of contour points. An arbitrary point P_i^o (i = 1, 2, ..., M) has coordinates (x_i, y_i), where x_i is the column index and y_i the row index of the pixel in the image matrix. The contour can thus be represented by the ordered point set C^o = {P_1^o, P_2^o, ..., P_M^o}.
3) Compute the centre G of the shape and resample the shape contour with equal areas; let the number of samples be N. The resampled ordered point set is C = {P_1, P_2, ..., P_N}. Compute the total area A of the shape. Equal-area sampling means that adjacent contour points P_i, P_{i+1} and the centre G form a triangle P_i G P_{i+1} of area A/N, where i = 1, 2, ..., N; since the contour is closed, P_i = P_{i+N}. Fig. 3 shows a shape contour after equal-area sampling.
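A minimal sketch of the equal-area resampling, assuming a dense counterclockwise contour and taking G as the mean of the contour points; it picks the original point at which the cumulative triangle-fan area about G first passes each multiple of A/N, without interpolating between points, so it is an approximation rather than the exact construction.

```python
import numpy as np

def equal_area_resample(contour, N):
    """Resample a dense CCW contour (M, 2) to N points so that consecutive
    samples P_i, P_{i+1} and the centre G span triangles of area close to A/N."""
    G = contour.mean(axis=0)
    nxt = np.roll(contour, -1, axis=0)
    # area swept about G along each original contour edge (triangle fan)
    swept = 0.5 * np.abs((contour[:, 0] - G[0]) * (nxt[:, 1] - G[1])
                         - (nxt[:, 0] - G[0]) * (contour[:, 1] - G[1]))
    A = swept.sum()                        # total swept area (~ shape area A)
    targets = np.arange(N) * A / N         # cumulative-area threshold for each sample
    idx = np.searchsorted(np.cumsum(swept), targets)
    return contour[np.minimum(idx, len(contour) - 1)]
```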
4) Select the contour starting point P_1 as the segmentation starting point and divide the contour into parts with equal numbers of points, supposing the contour is to be divided into K parts; the contour between adjacent segmentation points then contains N/K points, and besides the segmentation starting point there are K-1 further segmentation points. With the segmentation starting point first, build a counterclockwise ordered set of all segmentation points on the contour, C_s = {X_1, X_2, ..., X_K}, where X_1 is the segmentation starting point P_1 and the remaining points are the other segmentation points. Connect X_1 to each of the remaining segmentation points X_2 to X_K. These connecting lines divide the shape into K regions: the first region is enclosed by the line segment X_1X_2 and the shape contour between X_1 and X_2, and its area is denoted S_1(P_1); the i-th region (1 < i < K, i ∈ N) is enclosed by the line segments X_1X_i and X_1X_{i+1} and the shape contour between X_i and X_{i+1}, and its area is denoted S_i(P_1); the K-th region is enclosed by the line segment X_1X_K and the shape contour between X_1 and X_K, and its area is denoted S_K(P_1). Every region is closed; taking the point X_1 as the starting point, the points of a region form an ordered set in counterclockwise order whose end point is again X_1. Suppose the ordered point set describing region i is SP = {D_1, D_2, ..., D_L}, where L is the number of points in the set; in coordinate form SP = {(x_1, y_1), (x_2, y_2), ..., (x_L, y_L)}, where x_L = x_1 and y_L = y_1. The area S of region i can then be computed with the polygon (shoelace) area formula

$$S = \frac{1}{2}\left|\sum_{j=1}^{L-1}\left(x_j\, y_{j+1} - x_{j+1}\, y_j\right)\right|.$$

Fig. 4a illustrates the computation of a segmented region's area when the line segment X_1X_i lies entirely inside the shape, and Fig. 4b the case when it does not lie entirely inside the shape. With the point P_1 as the segmentation starting point, the areas of the K regions are thus obtained; arranging these K regions in counterclockwise order, the areas of all regions form an ordered sequence A_1 = {S_1(P_1), S_2(P_1), ..., S_K(P_1)}.
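The region areas of step 4) can be computed directly with the shoelace formula above. The sketch below assumes the resampled contour is given as an (N, 2) counterclockwise array with the segmentation starting point first and that K divides N; each region boundary is one contour arc closed back through X_1 by the chords.

```python
import numpy as np

def shoelace_area(points):
    """Polygon area from its ordered boundary points (the formula above)."""
    x, y = points[:, 0], points[:, 1]
    return 0.5 * np.abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def area_sequence(contour, K):
    """Ordered areas S_1 ... S_K of the K regions obtained when the first point
    of `contour` is taken as the segmentation starting point X_1."""
    N = len(contour)
    step = N // K                                # contour points per segment
    X1 = contour[:1]
    areas = []
    for k in range(K):
        # contour arc between segmentation points X_{k+1} and X_{k+2}
        arc = contour[k * step: min((k + 1) * step + 1, N)]
        region = np.vstack([X1, arc])            # boundary closed through X_1 by the chords
        areas.append(shoelace_area(region))
    return np.array(areas)
```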
5) Select the next contour point P_2 as the segmentation starting point and repeat the procedure of step 4) to obtain the ordered area sequence corresponding to P_2: A_2 = {S_1(P_2), S_2(P_2), ..., S_K(P_2)}. Continue selecting the contour points in turn; the ordered area sequence corresponding to P_i (i = 3, 4, ..., N) is A_i = {S_1(P_i), S_2(P_i), ..., S_K(P_i)}. Every point on the shape contour is thus given its corresponding ordered area sequence.
6) Assemble all the ordered area sequences into a matrix in the order of the contour points; the number of columns of this matrix equals the number of contour points and the number of rows equals the number of segmentation regions. This matrix is denoted M:
$$M = \begin{bmatrix} S_1(P_1) & S_1(P_2) & \cdots & S_1(P_N) \\ S_2(P_1) & S_2(P_2) & \cdots & S_2(P_N) \\ \vdots & \vdots & \ddots & \vdots \\ S_K(P_1) & S_K(P_2) & \cdots & S_K(P_N) \end{bmatrix}$$
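Building M then amounts to shifting the segmentation starting point once around the contour; a short sketch assuming the area_sequence helper from the previous snippet:

```python
import numpy as np

def area_matrix(contour, K):
    """K x N matrix M: column i holds the ordered area sequence obtained when
    contour point P_{i+1} is used as the segmentation starting point."""
    return np.column_stack([area_sequence(np.roll(contour, -i, axis=0), K)
                            for i in range(len(contour))])
```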
The matrix M reflects the characteristics of the shape, but this representation is not scale invariant, is sensitive to rotation of the shape, and its data size is large.
7) To give the shape features described by M scale invariance, M is normalized by the total shape area computed in step 3). Since the contour points are discrete, each point is represented by its index as the parameter; the matrix obtained after normalization is M_n:
$$M_n = \frac{1}{A}\begin{bmatrix} S_1(1) & S_1(2) & \cdots & S_1(N) \\ S_2(1) & S_2(2) & \cdots & S_2(N) \\ \vdots & \vdots & \ddots & \vdots \\ S_K(1) & S_K(2) & \cdots & S_K(N) \end{bmatrix} = \begin{bmatrix} S_1^n(1) & S_1^n(2) & \cdots & S_1^n(N) \\ S_2^n(1) & S_2^n(2) & \cdots & S_2^n(N) \\ \vdots & \vdots & \ddots & \vdots \\ S_K^n(1) & S_K^n(2) & \cdots & S_K^n(N) \end{bmatrix}$$
The normalized matrix M_n is called the shape signature matrix of the shape.
8) To make the extracted features rotation invariant and to reduce the data size of the feature matrix, a discrete Fourier transform is applied to each row of the matrix. The t-th row vector of M_n can be regarded as a discrete function f_t(i) = S_t^n(i), where t = 1, 2, ..., K. The discrete Fourier transform of f_t is defined as

$$F_t(n) = \frac{1}{N}\sum_{i=1}^{N} f_t(i)\, e^{-j 2\pi n i / N}, \qquad t = 1, 2, \ldots, K,\ \ n = 0, 1, 2, \ldots, N-1.$$

The Fourier transform coefficients F_t(n) are then used as the shape features. To achieve rotation invariance, the phase information of the coefficients is discarded and only the magnitudes |F_t(n)| are kept. Since f_t is a real-valued function, its Fourier transform is symmetric and only half of the coefficients need to be retained. To ensure that the shape features remain scale invariant after the Fourier transform, all coefficients are normalized by the DC component |F_t(0)|, so that the sequence of Fourier coefficients finally obtained is

$$\Phi_t = \left(\frac{|F_t(1)|}{|F_t(0)|}, \frac{|F_t(2)|}{|F_t(0)|}, \frac{|F_t(3)|}{|F_t(0)|}, \cdots, \frac{|F_t(N/2)|}{|F_t(0)|}\right).$$

The final feature matrix M_f is therefore:
$$M_f = \begin{bmatrix} |F_1(1)|/|F_1(0)| & |F_1(2)|/|F_1(0)| & \cdots & |F_1(N/2)|/|F_1(0)| \\ |F_2(1)|/|F_2(0)| & |F_2(2)|/|F_2(0)| & \cdots & |F_2(N/2)|/|F_2(0)| \\ \vdots & \vdots & \ddots & \vdots \\ |F_K(1)|/|F_K(0)| & |F_K(2)|/|F_K(0)| & \cdots & |F_K(N/2)|/|F_K(0)| \end{bmatrix} = \begin{bmatrix} \Psi_1(1) & \Psi_1(2) & \cdots & \Psi_1(N/2) \\ \Psi_2(1) & \Psi_2(2) & \cdots & \Psi_2(N/2) \\ \vdots & \vdots & \ddots & \vdots \\ \Psi_K(1) & \Psi_K(2) & \cdots & \Psi_K(N/2) \end{bmatrix}$$
In practical applications, not all Fourier transform coefficients are needed to describe the shape well; experiments show that the first 20 Fourier transform coefficients are sufficient for a complete description of the shape features. The feature matrix M_f is therefore usually expressed as:
$$M_f = \begin{bmatrix} \Psi_1(1) & \Psi_1(2) & \cdots & \Psi_1(20) \\ \Psi_2(1) & \Psi_2(2) & \cdots & \Psi_2(20) \\ \vdots & \vdots & \ddots & \vdots \\ \Psi_K(1) & \Psi_K(2) & \cdots & \Psi_K(20) \end{bmatrix}$$
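Steps 7) and 8) can be sketched as follows, assuming M_n is the K × N shape signature matrix from step 7); numpy's FFT omits the 1/N factor of the definition above, but the factor cancels in the ratio to the DC component |F_t(0)|.

```python
import numpy as np

def feature_matrix(Mn, n_coeffs=20):
    """Row-wise DFT of the shape signature matrix M_n: keep the magnitudes of
    the first n_coeffs coefficients of each row, normalized by |F_t(0)|."""
    F = np.abs(np.fft.fft(Mn, axis=1))          # magnitudes discard phase (rotation invariance)
    return F[:, 1:n_coeffs + 1] / F[:, :1]      # K x n_coeffs shape feature matrix M_f
```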

Claims (1)

1. An affine invariant feature extraction method based on contour splitting, characterized in that the extraction method comprises the following steps:
step 1: segmenting the target object shape, extracting the shape contour, sampling the contour with equal areas, and representing the contour with N points;
step 2: selecting an arbitrary point on the contour as the starting point and numbering all contour points in counterclockwise order, this starting point being point 1 and the following points being 2, 3, ..., N;
step 3: dividing the contour into K parts with equal numbers of points: selecting point 1 on the contour as the segmentation starting point and recording it as segmentation point 1; starting from this point, dividing the contour counterclockwise into parts with equal numbers of points, recording the segmentation point closest to segmentation point 1 as segmentation point 2, and so on until the last segmentation point, segmentation point K; connecting the segmentation starting point to each segmentation point with straight lines; computing in turn the area of the region enclosed by the line from the segmentation starting point to segmentation point 2 and the contour between them, the areas of the regions enclosed by every two adjacent connecting lines and the contour between their segmentation points, and the area of the region enclosed by the line from the segmentation starting point to segmentation point K and the contour between those two points; the areas computed in this order forming an ordered sequence;
step 4: selecting point 2 on the contour as the segmentation starting point and repeating the computation of step 3 to obtain the ordered area sequence corresponding to point 2; carrying out this computation for every contour point to obtain the ordered area sequence corresponding to each contour point;
step 5: assembling the ordered area sequences of all contour points into a matrix, each column of which is the ordered area sequence of one contour point, and normalizing this matrix, the normalized matrix being called the shape signature matrix;
step 6: each row of the shape signature matrix being a discrete function, applying the discrete Fourier transform to each of these functions and assembling the resulting Fourier coefficients into a matrix, i.e. the shape feature matrix.
CN2011103925227A 2011-12-01 2011-12-01 Affine invariant feature extraction method based on profile split Pending CN102521593A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011103925227A CN102521593A (en) 2011-12-01 2011-12-01 Affine invariant feature extraction method based on profile split

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011103925227A CN102521593A (en) 2011-12-01 2011-12-01 Affine invariant feature extraction method based on profile split

Publications (1)

Publication Number Publication Date
CN102521593A true CN102521593A (en) 2012-06-27

Family

ID=46292504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011103925227A Pending CN102521593A (en) 2011-12-01 2011-12-01 Affine invariant feature extraction method based on profile split

Country Status (1)

Country Link
CN (1) CN102521593A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103914840A (en) * 2014-04-01 2014-07-09 山东大学 Automatic human body contour extracting method for non-simple background

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101464948A (en) * 2009-01-14 2009-06-24 北京航空航天大学 Object identification method for affine constant moment based on key point

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101464948A (en) * 2009-01-14 2009-06-24 北京航空航天大学 Object identification method for affine constant moment based on key point

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHENG ZHANG, MINGQIANG YANG: "Area Function Fourier Descriptors Based on Contour Split", Communication Technology (ICCT), IEEE 13th International Conference on *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103914840A (en) * 2014-04-01 2014-07-09 山东大学 Automatic human body contour extracting method for non-simple background
CN103914840B (en) * 2014-04-01 2016-08-17 山东大学 A kind of human body contour outline extraction method for non-simple background

Similar Documents

Publication Publication Date Title
CN103838864B (en) Visual saliency and visual phrase combined image retrieval method
CN103383700B (en) Based on the edge direction histogrammic image search method of difference
CN103456013B (en) A kind of method representing similarity between super-pixel and tolerance super-pixel
Chen et al. Using binarization and hashing for efficient SIFT matching
CN104199931A (en) Trademark image consistent semantic extraction method and trademark retrieval method
Dalitz et al. Fourier descriptors for broken shapes
CN107679539B (en) Single convolution neural network local information and global information integration method based on local perception field
CN105335469A (en) Method and device for image matching and retrieving
CN103970883A (en) Motion sequence search method based on alignment clustering analysis
CN104484425A (en) Color image searching method based on multiple features
CN104978582A (en) Contour chord angle feature based identification method for blocked target
CN103399863B (en) Image search method based on the poor characteristic bag of edge direction
Mishchenko et al. Chart image understanding and numerical data extraction
CN104484432A (en) Color image searching method based on quaternion exponential moment
CN104077742A (en) GABOR characteristic based face sketch synthetic method and system
CN101000651B (en) Method for recognizing multiple texture image
Tran et al. Handwritten mathematical expression recognition using convolutional neural network
CN102521593A (en) Affine invariant feature extraction method based on profile split
CN106951501B (en) Three-dimensional model retrieval method based on multi-graph matching
Amoda et al. Efficient image retrieval using region based image retrieval
CN102737254B (en) Identification method of mark image
CN111104922B (en) Feature matching algorithm based on ordered sampling
Kekre et al. Sectorization of walsh and walsh wavelet in CBIR
CN108549739B (en) Sparse matrix-based protection device constant value information modeling method and system
Kekre et al. DCT-DST Plane sectorization of row wise transformed color images in CBIR

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20120627