CN103258202A - Robust texture feature extraction method - Google Patents

Robust texture feature extraction method

Info

Publication number
CN103258202A
Authority
CN
China
Prior art keywords
input image
lbp
label
pixel
feature set
Prior art date
Legal status
Granted
Application number
CN2013101587600A
Other languages
Chinese (zh)
Other versions
CN103258202B (en)
Inventor
Li Hongliang (李宏亮)
Song Tiecheng (宋铁成)
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201310158760.0A priority Critical patent/CN103258202B/en
Publication of CN103258202A publication Critical patent/CN103258202A/en
Application granted granted Critical
Publication of CN103258202B publication Critical patent/CN103258202B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a robust texture feature extraction method, and belongs to the technical field of image processing. The method comprises the following steps: pre-processing an input image and generating a feature set F; binarizing the feature set F against a per-feature threshold, then binary-encoding the result to generate a specific pixel label; applying rotation-invariant uniform local binary pattern (LBP) coding to the input image to generate an LBP label for each pixel; constructing a 2-D co-occurrence histogram from the specific pixel label and the LBP label of each pixel; and vectorizing the co-occurrence histogram as the texture representation. The method reduces the binary quantization loss of existing LBP modes while keeping the extracted features robust to changes in illumination, rotation, scale, and viewing angle.

Description

Robust texture feature extraction method
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a robust texture feature extraction method.
Background
Texture features play an important role in visual recognition, and have been widely researched and applied in the fields of texture classification, retrieval, synthesis, segmentation, and the like. In general, texture images not only exhibit a wide variety of geometric and lighting changes, but are often accompanied by drastic intra-class and inter-class variations. Texture classification is a difficult task when a priori knowledge is not available. Therefore, extracting robust texture features is a core problem to solve these tasks.
Over the past several decades, many methods have been proposed to extract texture features. Early research focused on statistical, model-based, and signal-processing features, such as co-occurrence matrices, Markov random fields, and filter-bank methods. Later, methods based on primitives (textons) and Local Binary Patterns (LBP) were proposed. The former requires a learning process: a primitive dictionary is built by clustering local features of training images, and a histogram representation is then built by counting primitive frequencies for a given texture. The latter needs no training process; it directly encodes local gray-level differences by binary quantization, thereby extracting the local microstructure information of the texture. For primitive-based methods see M. Varma and A. Zisserman, "A statistical approach to material classification using image patch exemplars," IEEE Trans. Pattern Anal. Mach. Intell., vol. 31, no. 11, pp. 2032-2047, Nov. 2009; for LBP see T. Ojala, M. Pietikäinen, and T. Mäenpää, "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns," IEEE Trans. Pattern Anal. Mach. Intell., vol. 24, no. 7, pp. 971-987, Jul. 2002.
LBP describes local texture with a binary sequence: for each pixel of the image, its pixel value is subtracted from the pixel values of the surrounding sampling neighbors; taking the sign of each difference gives a binary 0/1 sequence, which is then converted to a decimal number serving as the texture identifier of that pixel, i.e., its LBP code.
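As an illustration of this coding step, the sketch below computes the plain (non-rotation-invariant) LBP code of one pixel over its 3×3 neighborhood; the neighbor ordering is an arbitrary choice, since the text does not fix one:

```python
import numpy as np

def lbp_code(img, y, x):
    """Plain 3x3 LBP code of pixel (y, x): threshold the 8 neighbours
    against the centre pixel and pack the sign bits into a decimal number.
    (Illustrative sketch only; not the patent's riu2 variant.)"""
    gc = img[y, x]
    # 8 neighbours, clockwise from the top-left corner (arbitrary convention)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for p, (dy, dx) in enumerate(offsets):
        if img[y + dy, x + dx] >= gc:   # s(g_p - g_c) = 1 when the difference >= 0
            code |= 1 << p
    return code
```

For a pixel darker than all of its neighbors the code is 255 (all bits set); for a pixel brighter than all of its neighbors it is 0.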
LBP is well known for its simplicity and efficiency, and has been widely used in texture classification, face recognition, and object detection. In recent years many improved algorithms based on LBP have been proposed, most of which fall into the following categories: choice of local pattern (e.g., ring or disk geometry), sampled features (e.g., higher-order differential features, Gabor features, differential magnitude, block-based gray-level means), quantization approaches (e.g., three-level quantization and adaptive quantization), coding rules (e.g., split ternary coding, statistical sign counting), and extensions to higher dimensions (e.g., volumetric LBP, uniform spherical region descriptors, Gabor volumetric LBP, and color LBP).
LBP's binary quantization buys efficient extraction of local structure information at the cost of quantization loss and poor robustness to noise. To effectively reduce the quantization loss while maintaining the robustness of the extracted features, the current LBP method needs to be improved.
Disclosure of Invention
The invention aims to: a robust texture feature extraction method is provided to reduce binary quantization loss in the existing LBP mode, and meanwhile, robustness of extracted features to illumination, rotation, scale and view angle changes is maintained.
The invention relates to a robust texture feature extraction method, which comprises the following steps:
step 1: generating a specific pixel label L (x) of the input image I, wherein x represents a pixel point of the input image I;
101: generating an n-dimensional feature set F = {f_i(x) | i = 1, 2, ..., n; x ∈ I} for the input image I;
102: binarizing the feature set F to obtain a binary feature set B = {b_i(x) | i = 1, 2, ..., n; x ∈ I}: if f_i(x) is greater than or equal to the threshold thr_i of that feature, then b_i(x) = 1; otherwise b_i(x) = 0;
103: encoding the binary feature set B to generate the specific pixel label L(x) of each pixel:
L(x) = Σ_{i=1}^{n} b_i(x)·2^(i-1);
step 2: performing rotation-invariant uniform LBP coding on the input image I to generate the LBP label Z(x) of each pixel;
step 3: constructing a 2-D co-occurrence histogram from the specific pixel label L(x) and the LBP label Z(x) of each pixel, and vectorizing the co-occurrence histogram as output.
The original LBP method extracts texture features only from the difference information of a small neighborhood of the original image and is sensitive to noise. The invention provides a robust texture extraction method that extracts texture features from higher-order gradient-domain information over a larger support region; compared with the existing LBP modes, the extracted texture features are more robust, expressive, and discriminative. Compared with primitive-based texture feature extraction methods, the invention omits the training and clustering processes that occupy a large amount of system resources, and directly performs binary quantization and coding on the generated feature set of the input image, so it is simpler and more efficient to implement than primitive-based methods.
In order to obtain a relatively stable quantization threshold while keeping the texture feature extraction efficient, in step 102 the threshold thr_i is taken as: the mean of the feature f_i, evaluated at each pixel of the input image I, over the whole image I.
In conclusion, the method is simple to implement, reduces the binary quantization loss of the existing LBP modes, maintains the robustness of the extracted features to changes in illumination, rotation, scale, and viewing angle, and yields highly discriminative texture features.
Drawings
The invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of an embodiment of the present invention.
Detailed Description
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.
In the texture feature extraction process of the present invention, in order to reduce the binary quantization loss and obtain more discriminative features, the invention extracts useful information from a larger local neighborhood support region while maintaining the robustness of the extracted features to illumination, rotation, scale, and viewing-angle changes. Referring to fig. 1, the specific implementation steps are:
step S100: the feature set F of the input image I is generated, which is a preprocessing step of the present invention, and the present invention can be implemented in any existing mature implementation manner, for example, in the present embodiment, the following steps (1) to (4) are adopted to generate the feature set F of the present invention:
(1): normalizing the input image I to remove illumination effects; any existing mature normalization can be used, such as histogram equalization; normalization using the mean and standard deviation of image I is preferred;
(2): obtaining rotation-invariant filter responses at each scale from multi-scale, multi-orientation edge filters (first-order Gaussian partial derivatives) and bar filters (second-order Gaussian partial derivatives): at each scale, the normalized image I is convolved with edge and bar filters in m orientations, and the orientation filter response with the maximum response magnitude at that scale is recorded, so that each pixel x of image I obtains n-1 stable filter responses (n denotes the dimension of the feature set F);
the number of different scales can be set according to actual conditions, preferably, 3 scales are considered, each scale adopts 8-direction filtering, and the sizes of the 3 scales in the horizontal and vertical directions of the image I can be sequentially selected as follows: (1, 3) (2, 6), (3, 12), and other values can be taken according to actual requirements and applications.
With 3 scales, each pixel x obtains 6 stable filter responses (2 filter types × 3 scales).
(3): normalizing the n-1 filter responses obtained for each pixel. Any existing normalization can be used; Weber's-law normalization is preferred, namely
R(x) ← R(x)·[log(1 + M(x)/0.03)] / M(x)
where R(x) denotes a filter response, M(x) = ||R(x)||_2 denotes the magnitude of the filter response, and ||·||_2 denotes the 2-norm.
(4): constructing an n-dimensional (n-D) feature set F from the normalized image I and the n-1 filter responses, expressed as:
F = {f_i(x) | i = 1, 2, ..., n; x ∈ I}
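Steps (1) to (4) can be sketched end to end as below. The Gaussian-derivative kernel formulas, the 15×15 kernel size, reflect padding, and the tiny epsilon guards are illustrative assumptions — the description names only edge (first-derivative) and bar (second-derivative) filters at 3 scales and 8 orientations:

```python
import numpy as np

def conv2_same(img, k):
    """'Same'-size 2-D correlation with reflect padding (numpy only)."""
    r = k.shape[0] // 2
    padded = np.pad(img, r, mode='reflect')
    win = np.lib.stride_tricks.sliding_window_view(padded, k.shape)
    return np.einsum('ijkl,kl->ij', win, k)

def gauss_deriv_kernel(sx, sy, order, theta, size=15):
    """Oriented Gaussian-derivative kernel: order 1 = edge filter,
    order 2 = bar filter. The exact formula is an assumption; the patent
    only names first- and second-order Gaussian partial derivatives."""
    r = size // 2
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1].astype(float)
    u = xs * np.cos(theta) + ys * np.sin(theta)        # rotated coordinates
    v = -xs * np.sin(theta) + ys * np.cos(theta)
    g = np.exp(-0.5 * (u ** 2 / sx ** 2 + v ** 2 / sy ** 2))
    k = (-u / sx ** 2) * g if order == 1 else (u ** 2 / sx ** 4 - 1 / sx ** 2) * g
    return k - k.mean()                                # zero-mean filter

def feature_set(img, scales=((1, 3), (2, 6), (3, 12)), m=8):
    """Build the n-D feature set F: the normalised image plus 6
    Weber-normalised, rotation-invariant responses (max over m orientations)."""
    img = (img - img.mean()) / (img.std() + 1e-12)     # (1) illumination normalisation
    resp = []
    for sx, sy in scales:                              # (2) per scale and filter type
        for order in (1, 2):
            stack = np.stack([conv2_same(img, gauss_deriv_kernel(sx, sy, order,
                                                                 np.pi * j / m))
                              for j in range(m)])
            idx = np.abs(stack).argmax(axis=0)         # strongest orientation
            resp.append(np.take_along_axis(stack, idx[None], axis=0)[0])
    R = np.stack(resp)                                 # shape (6, H, W)
    M = np.maximum(np.sqrt((R ** 2).sum(axis=0)), 1e-12)
    R = R * (np.log(1.0 + M / 0.03) / M)               # (3) Weber's-law normalisation
    return np.concatenate([img[None], R])              # (4) F, shape (7, H, W)
```

With the preferred settings this yields n = 7 feature maps per image (1 normalized image + 2 filter types × 3 scales).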
step S200: carrying out binarization operation on the feature set F to obtain a binary feature set B ═ Bi(x)|i=1,2,...,n;x∈I}:
b_i(x) = 1, if f_i(x) ≥ thr_i; b_i(x) = 0, otherwise
where thr_i denotes the threshold of feature f_i. The threshold thr_i of any feature f_i(x) may be preset based on empirical values; preferably, thr_i is the mean of the feature f_i over the whole image I, i.e.
thr_i = (1/|X|) Σ_{x∈I} f_i(x)
where |X| denotes the number of pixels x in the input image I.
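A minimal sketch of this binarization step, assuming the feature set is stored as an array of shape (n, H, W):

```python
import numpy as np

def binarize_features(F):
    """Binarise an n-D feature set F (shape (n, H, W)) against per-feature
    thresholds thr_i = mean of f_i over the whole image, as in step S200."""
    thr = F.reshape(F.shape[0], -1).mean(axis=1)       # thr_i, one per feature
    return (F >= thr[:, None, None]).astype(np.uint8)  # b_i(x) in {0, 1}
```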
Step S300: encoding the binary feature set B = {b_i(x) | i = 1, 2, ..., n; x ∈ I} to generate the specific pixel label L(x) of each pixel, namely
L(x) = Σ_{i=1}^{n} b_i(x)·2^(i-1)
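This label is a bit-packing of the n binary maps; a sketch, assuming the binary feature set is stored as an array of shape (n, H, W):

```python
import numpy as np

def pixel_labels(B):
    """Pack the n binary feature maps into the specific pixel label
    L(x) = sum_i b_i(x) * 2^(i-1), i.e. b_i is bit i-1 of the label."""
    n = B.shape[0]
    weights = (1 << np.arange(n)).reshape(n, 1, 1)     # 2^(i-1), i = 1..n
    return (B * weights).sum(axis=0)                   # label map, shape (H, W)
```

For n = 7 features the labels range over [0, 127], which gives the 2^7 factor in the final descriptor dimension.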
Step S400: performing rotation-invariant uniform LBP coding on the input image I to generate the neighbor-information coding label Z(x) of each pixel, namely
Z(x) = LBP_{P,R}^{riu2} = Σ_{p=0}^{P-1} s(g_p − g_c), if U(LBP_{P,R}) ≤ 2; Z(x) = P + 1, otherwise
where the superscript riu2 indicates the rotation-invariant uniform pattern; the central pixel value g_c is the pixel value of the current pixel x; g_p is the pixel value of the p-th sampling point on a circle of radius R around the central pixel; U(LBP_{P,R}) is the uniformity measure, i.e., the number of 0/1 transitions in the circular bit string formed by the signs of the differences between the sampling neighbors and the central pixel; and s(t) is the sign function: s(t) = 0 if t < 0, and s(t) = 1 otherwise.
Step S500: histogram representation. A 2-D co-occurrence histogram is constructed from the specific pixel labels L(x) and the LBP labels Z(x) of all pixels, and is then vectorized to obtain the final texture representation of dimension 2^n × (P + 2). The 2-D co-occurrence histogram is computed as follows:
H(l, p) = Σ_x δ((L(x), Z(x)) == (l, p))
where H(l, p) denotes the 2-D histogram entry with index (l, p), l ∈ [0, 2^n − 1], p ∈ [0, P + 1], and
δ(y) = 1 if y is true, 0 otherwise,
which accumulates the number of pixels in the image encoded as (l, p). For example, for a pixel x_0 with L(x_0) = 1 and Z(x_0) = 2, the entry of H at index (l, p) = (1, 2) is incremented by 1.
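Step S500 can be sketched as follows, with the label maps L and Z assumed to be integer arrays of the same shape:

```python
import numpy as np

def cooccurrence_histogram(L, Z, n, P=8):
    """2-D co-occurrence histogram H(l, p) over the specific pixel labels
    L(x) in [0, 2^n - 1] and the riu2 LBP labels Z(x) in [0, P + 1],
    vectorised to the final 2^n * (P + 2)-dimensional texture descriptor."""
    H = np.zeros((2 ** n, P + 2))
    for l, p in zip(L.ravel(), Z.ravel()):
        H[l, p] += 1           # delta((L(x), Z(x)) == (l, p)) accumulated per pixel
    return H.ravel()           # row-major: entry (l, p) lands at index l*(P+2) + p
```

With n = 7 and P = 8 the descriptor has 2^7 × 10 = 1280 entries, matching the dimension reported in the embodiment.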
In the present invention, the processing for generating the specific pixel label l (x) and the LBP label z (x) may be executed in parallel or in series, and is selected according to the actual application requirements.
The texture feature extraction method was used, with R = 1 and P = 8 and with edge and bar filters in 8 orientations at 3 scales (1, 3), (2, 6), (3, 12), to classify three texture databases exhibiting illumination, rotation, viewing-angle, and scale changes: Outex, CUReT, and UIUC. Compared with LBP-based and learning-based (e.g., primitive-based) classification algorithms, the classification performance improved markedly. The resulting feature representation has dimension 1280 (2^7 × 10), lower than that of primitive-based methods (typically 2440).
The invention is not limited to the foregoing embodiments. The invention extends to any novel feature or any novel combination of features disclosed in this specification and any novel method or process steps or any novel combination of features disclosed.

Claims (9)

1. A robust texture feature extraction method is characterized by comprising the following steps:
step 1: generating a specific pixel label L (x) of the input image I, wherein x represents a pixel point of the input image I;
101: generating an n-dimensional feature set F = {f_i(x) | i = 1, 2, ..., n; x ∈ I} for the input image I;
102: binarizing the feature set F to obtain a binary feature set B = {b_i(x) | i = 1, 2, ..., n; x ∈ I}: if f_i(x) is greater than or equal to the threshold thr_i of that feature, then b_i(x) = 1; otherwise b_i(x) = 0;
103: encoding the binary feature set B to generate the specific pixel label L(x) of each pixel:
L(x) = Σ_{i=1}^{n} b_i(x)·2^(i-1);
step 2: performing rotation-invariant uniform LBP coding on the input image I to generate the LBP label Z(x) of each pixel;
step 3: constructing a 2-D co-occurrence histogram from the specific pixel label L(x) and the LBP label Z(x) of each pixel, and vectorizing the co-occurrence histogram as output.
2. The method of claim 1, wherein the threshold thr_i in step 102 is: the mean of the feature f_i, evaluated at each pixel of the input image I, over the whole image I.
3. The method according to claim 1 or 2, wherein in step 101 generating the n-dimensional feature set F specifically comprises:
normalizing the input image I;
performing multi-scale, multi-orientation filtering on the normalized image I: at each scale, filtering in m orientations is applied, and for the edge filter and the bar filter respectively, the orientation filter response with the maximum magnitude at that scale is taken as the rotation-invariant filter response at that scale; the filter responses are then normalized;
forming the n-dimensional feature set F from the normalized image I and the filter responses: F = {f_i(x) | i = 1, 2, ..., n; x ∈ I}.
4. The method according to claim 3, wherein the normalization of the input image I is a normalization that removes illumination changes.
5. The method of claim 4, wherein the normalization removing illumination changes is: normalizing with the pixel-value mean and standard deviation of the input image I.
6. The method of claim 3, wherein normalizing the filter responses is: R(x) ← R(x)·[log(1 + M(x)/0.03)] / M(x), where R(x) denotes a filter response and M(x) = ||R(x)||_2 denotes the magnitude of the filter response.
7. The method of claim 3, wherein m of the m directional filters is 8.
8. The method of claim 3, wherein the multi-scale is specifically 3-scales.
9. The method of claim 8, wherein the sizes of the 3 scales along the horizontal and vertical directions of the image I are, in order: (1, 3), (2, 6), (3, 12).
CN201310158760.0A 2013-05-02 2013-05-02 Robust texture feature extraction method Active CN103258202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310158760.0A CN103258202B (en) 2013-05-02 2013-05-02 Robust texture feature extraction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310158760.0A CN103258202B (en) 2013-05-02 2013-05-02 Robust texture feature extraction method

Publications (2)

Publication Number Publication Date
CN103258202A true CN103258202A (en) 2013-08-21
CN103258202B CN103258202B (en) 2016-06-29

Family

ID=48962106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310158760.0A Active CN103258202B (en) 2013-05-02 2013-05-02 Robust texture feature extraction method

Country Status (1)

Country Link
CN (1) CN103258202B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761507A (en) * 2014-01-03 2014-04-30 东南大学 Local multi-value pattern face recognition method based on Weber law
CN105046262A (en) * 2015-06-29 2015-11-11 中国人民解放军国防科学技术大学 Robust extended local binary pattern textural feature extraction method
CN106788722A (en) * 2016-11-30 2017-05-31 东南大学 A kind of pixel modulates the inter-symbol interference cancellation method of visible light communication system
CN107403451A (en) * 2017-06-16 2017-11-28 西安电子科技大学 Adaptive binary feature monocular vision odometer method and computer, robot
CN108629262A (en) * 2017-03-18 2018-10-09 上海荆虹电子科技有限公司 Iris identification method and related device
CN108876832A (en) * 2018-05-30 2018-11-23 重庆邮电大学 Based on grouping-order modes robust texture features extracting method
CN109271997A (en) * 2018-08-28 2019-01-25 河南科技大学 A kind of image texture classification method based on jump subdivision local mode
CN109410258A (en) * 2018-09-26 2019-03-01 重庆邮电大学 Texture image feature extracting method based on non local binary pattern

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6647132B1 (en) * 1999-08-06 2003-11-11 Cognex Technology And Investment Corporation Methods and apparatuses for identifying regions of similar texture in an image
CN102542571A (en) * 2010-12-17 2012-07-04 *** Communications Group Guangdong Co., Ltd. Moving target detecting method and device
US20120188418A1 (en) * 2011-01-26 2012-07-26 Stmicroelectronics S.R.L. Texture detection in image processing
CN102663436A (en) * 2012-05-03 2012-09-12 武汉大学 Self-adapting characteristic extracting method for optical texture images and synthetic aperture radar (SAR) images
CN103035013A (en) * 2013-01-08 2013-04-10 东北师范大学 Accurate moving shadow detection method based on multi-feature fusion

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6647132B1 (en) * 1999-08-06 2003-11-11 Cognex Technology And Investment Corporation Methods and apparatuses for identifying regions of similar texture in an image
CN102542571A (en) * 2010-12-17 2012-07-04 *** Communications Group Guangdong Co., Ltd. Moving target detecting method and device
US20120188418A1 (en) * 2011-01-26 2012-07-26 Stmicroelectronics S.R.L. Texture detection in image processing
CN102663436A (en) * 2012-05-03 2012-09-12 武汉大学 Self-adapting characteristic extracting method for optical texture images and synthetic aperture radar (SAR) images
CN103035013A (en) * 2013-01-08 2013-04-10 东北师范大学 Accurate moving shadow detection method based on multi-feature fusion

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MANIK VARMA et al.: "A statistical approach to texture classification from single images", 《INT. J. COMPUT. VISION》 *
ZHENHUA GUO et al.: "A Completed Modeling of Local Binary Pattern Operator for Texture Classification", 《IEEE TRANSACTIONS ON IMAGE PROCESSING》 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761507A (en) * 2014-01-03 2014-04-30 东南大学 Local multi-value pattern face recognition method based on Weber law
CN103761507B (en) * 2014-01-03 2017-02-08 东南大学 Local multi-value pattern face recognition method based on Weber law
CN105046262A (en) * 2015-06-29 2015-11-11 中国人民解放军国防科学技术大学 Robust extended local binary pattern textural feature extraction method
CN105046262B (en) * 2015-06-29 2018-08-17 中国人民解放军国防科学技术大学 A kind of robust extension local binary patterns texture characteristic extracting method
CN106788722A (en) * 2016-11-30 2017-05-31 东南大学 A kind of pixel modulates the inter-symbol interference cancellation method of visible light communication system
CN106788722B (en) * 2016-11-30 2019-10-15 东南大学 A kind of inter-symbol interference cancellation method of pixel modulation visible light communication system
CN108629262A (en) * 2017-03-18 2018-10-09 上海荆虹电子科技有限公司 Iris identification method and related device
CN108629262B (en) * 2017-03-18 2021-08-20 上海荆虹电子科技有限公司 Iris identification method and corresponding device
CN107403451A (en) * 2017-06-16 2017-11-28 西安电子科技大学 Adaptive binary feature monocular vision odometer method and computer, robot
CN107403451B (en) * 2017-06-16 2020-11-10 西安电子科技大学 Self-adaptive binary characteristic monocular vision odometer method, computer and robot
CN108876832A (en) * 2018-05-30 2018-11-23 重庆邮电大学 Based on grouping-order modes robust texture features extracting method
CN108876832B (en) * 2018-05-30 2022-04-26 重庆邮电大学 Robust texture feature extraction method based on grouping-order mode
CN109271997A (en) * 2018-08-28 2019-01-25 河南科技大学 A kind of image texture classification method based on jump subdivision local mode
CN109410258A (en) * 2018-09-26 2019-03-01 重庆邮电大学 Texture image feature extracting method based on non local binary pattern
CN109410258B (en) * 2018-09-26 2021-12-10 重庆邮电大学 Texture image feature extraction method based on non-local binary pattern

Also Published As

Publication number Publication date
CN103258202B (en) 2016-06-29

Similar Documents

Publication Publication Date Title
CN103258202B (en) Robust texture feature extraction method
Lukic et al. Leaf recognition algorithm using support vector machine with Hu moments and local binary patterns
Li et al. Scale-and rotation-invariant local binary pattern using scale-adaptive texton and subuniform-based circular shift
Singh et al. Svm-bdt pnn and fourier moment technique for classification of leaf shape
Liu et al. Improved deep belief networks and multi-feature fusion for leaf identification
Bhardwaj et al. Recognition of plants by leaf image using moment invariant and texture analysis
Bai et al. Saliency-SVM: An automatic approach for image segmentation
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
CN105528595A (en) Method for identifying and positioning power transmission line insulators in unmanned aerial vehicle aerial images
CN105321176A (en) Image segmentation method based on hierarchical higher order conditional random field
CN105095880B (en) A kind of multi-modal Feature fusion of finger based on LGBP coding
CN102722699A (en) Face identification method based on multiscale weber local descriptor and kernel group sparse representation
Wen et al. Virus image classification using multi-scale completed local binary pattern features extracted from filtered images by multi-scale principal component analysis
Khmag et al. Recognition system for leaf images based on its leaf contour and centroid
CN103778434A (en) Face recognition method based on multi-resolution multi-threshold local binary pattern
CN103390170A (en) Surface feature type texture classification method based on multispectral remote sensing image texture elements
Chitaliya et al. An efficient method for face feature extraction and recognition based on contourlet transform and principal component analysis using neural network
Muzaffar et al. Gabor contrast patterns: A novel framework to extract features from texture images
Hewitt et al. Shape-only features for plant leaf identification
CN110490210B (en) Color texture classification method based on t sampling difference between compact channels
Iamsiri et al. A new shape descriptor and segmentation algorithm for automated classifying of multiple-morphological filamentous algae
CN111401485A (en) Practical texture classification method
CN110489587B (en) Tire trace image feature extraction method in local gradient direction three-value mode
Park et al. Image retrieval technique using rearranged freeman chain code
Xiong et al. A generic object detection using a single query image without training

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant