CN107563393A - Method and system for extracting and matching local texture features of oracle bone inscription images - Google Patents
- Publication number: CN107563393A
- Application number: CN201710826536.2A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention belongs to the field of oracle bone inscription digitization and discloses a method and system for extracting and matching local texture features of oracle bone inscription images. A gray-level co-occurrence matrix is applied to the loaded oracle bone image to perform gray processing and texture feature extraction; texture feature values are then calculated from the formulas for the contrast, energy, entropy, and correlation feature quantities; the image to be identified is matched against other images for classification using the minimum Euclidean distance criterion, where a smaller distance indicates greater similarity. The invention addresses the problems that oracle bone fragments are numerous, that rejoining them must currently be done manually, and that manual rejoining is inaccurate and difficult: it can quickly filter out fragments with similar texture, without manual matching, providing efficient help to oracle bone experts in rejoining fragments. It also addresses the shortcomings of conventional texture feature extraction methods, which perform poorly at reducing the feature gap within a class, fail to make the feature gap between classes obvious, and therefore classify images poorly.
Description
Technical field
The invention belongs to the field of oracle bone inscription digitization, and in particular relates to a method and system for extracting and matching local texture features of oracle bone inscription images.
Background technology
Oracle bone script is the earliest mature writing system discovered in China to date and occupies an important position in the history of Chinese writing. These inscriptions record many aspects of Shang (Yin) dynasty politics, economy, culture, and customs, and are an irreplaceable firsthand source for the study of ancient history, especially the history of the Shang dynasty. Their discovery has solved many historical mysteries and attracted an increasing number of scholars to oracle bone studies.
However, oracle bone material is brittle and easily broken; after being buried for more than 3,000 years, most pieces were already damaged when unearthed. Even the occasionally found large and complete oracle bones were often broken into further fragments through resale, rubbing, and similar handling. If fragmented oracle bones can be pieced back together, the content of the oracle inscriptions can be understood more completely and their textual conventions and grammatical rules grasped more accurately, which would be a great contribution to oracle bone research. This is precisely the fundamental significance of the present invention.
Manual rejoining relies on the rejoiner's experience with the physical objects or rubbings. Fragments are judged and rejoined according to the fracture marks, curvature, color, calligraphic style, and inscription content of the bones, as well as the character strokes at the fracture, the drilled and chiseled hollows, the burn marks, the direction of the divination cracks, and other script and non-script features. Manual rejoining therefore requires oracle bone experts to do the work personally. Without rich knowledge of oracle bone studies and familiarity with the classification and dating of the bones, one cannot tell from the "grooves" and "shield lines" of a plastron whether a recorded rubbing, tracing, or photograph actually shows an oracle bone, or which part of a plastron, carapace, or bone it comes from, and one cannot engage in rejoining work at all. And if one does not understand the divination practice, textual conventions, and inscription content, and relies only on approximate shape or position to join pieces, a large number of false joins will inevitably result. Careful investigation of every aspect is therefore necessary before work can begin, which is an extremely tedious task that consumes a great deal of many researchers' energy and time.
The development of digital image processing, pattern recognition, and database technology has opened a brand-new research avenue for oracle bone rejoining. As early as 1973, researchers abroad used computers for rejoining work: when digitizing oracle bone fragments to build a database, each fragment was divided into several parts, each fragment to be stored was assigned a position attribute according to its shape, plus attributes such as carving stroke thickness, and during rejoining the computer matched fragments automatically according to these attribute values. This method still required extremely complex preliminary work, such as manually judging fragment position and script style. In 1974, some Chinese scholars began exploring computer-assisted oracle bone rejoining. In 1992, Wang Junfa and Zhang Jianan used computer image processing to design a hierarchical classification system to classify all oracle bone rubbing collections worldwide, find and remove duplicate rubbings, and produce the most complete worldwide collection of oracle bone rubbings. In their method, a digitized fragment image is processed by the algorithm of that work into a one-dimensional time series as the principal feature value, plus some auxiliary secondary feature values, to represent the fragment's contour.
According to recent statistics, more than a hundred thousand oracle bone pieces have been unearthed, and future discoveries cannot be predicted; organizing them entirely by hand would be extremely difficult. But the foreign computer methods can only rejoin complete or largely complete bones, and they still leave room for improvement. The methods explored by Chinese scholars can, beyond adjacent bones, also rejoin fragments larger than a quarter of a bone, but their accuracy likewise needs improvement. These computer-assisted rejoining methods do not yet meet researchers' needs: they provide only shallow, direct assistance, lack systematic research, and in particular require a large amount of manual recording of specimen information, which is both laborious and inaccurate.
After decades of development of computer technology, texture analysis has made significant progress, and many texture feature research methods have been produced. Local texture features mainly include the covariance matrix, the co-occurrence matrix, wavelet energy, entropy, and so on. These methods can be divided into statistical methods, model-based methods, structural methods, etc., of which statistical methods are the most widely used. In concrete research, however, problems such as difficult feature extraction and inaccurate feature vector representation remain.
In summary, the problems with the prior art are as follows.

Existing oracle bone rejoining is mostly done manually, which is slow, not very accurate, and consumes a great deal of experts' effort. Some computer-assisted rejoining methods provide only shallow assistance without effective results and likewise require great effort.

Meanwhile, current image feature extraction methods suffer from difficult feature extraction and inaccurate vector representation: they perform poorly at reducing the feature gap within a class, fail to make the feature gap between classes obvious, and therefore classify images poorly.
Summary of the invention
In view of the problems with the prior art, the invention provides a method and system for extracting and matching local texture features of oracle bone inscription images.

The invention is achieved as follows. A method for extracting and matching local texture features of oracle bone inscription images includes:

applying a gray-level co-occurrence matrix to the loaded oracle bone image to perform gray processing and texture feature extraction;

then calculating texture feature values from the formulas for the contrast, energy, entropy, and correlation feature quantities, and matching the selected image against other images for classification using the minimum Euclidean distance criterion;

in the two-class case, determining the class of the identified image's feature vector using the formula

d(x, y) = \sqrt{\sum_{i=1}^{d} (x_i - y_i)^2}

where d(x, y) is the distance, x and y are the feature vectors of two images, i indexes the i-th element of a feature vector, and d is the number of elements in a feature vector;

in the multi-class case, using the discriminant function

G_i(x) = |x - \mu_i|^2 = (x - \mu_i)^T (x - \mu_i)

and selecting the minimum value, where a smaller value means greater similarity (G is the discriminant function, i is the class, x is the feature vector of the identified image, d is the distance, \mu_i is the feature vector of the i-th class, and T denotes transposition).
Further, the gray-level co-occurrence matrix normalization formula is

P(x, y) = \#\{(a_1, b_1), (a_2, b_2) \in M \times N \mid f(a_1, b_1) = x, f(a_2, b_2) = y\}   (1)

where #(X) denotes the number of elements in the set X, (a, b) is a point in the digital image, M × N is the size of the digital image, the element (x, y) records that one gray value is x and the other is y, and f(a, b) is the gray-value function.
Further, the gray-level co-occurrence matrix includes:

the co-occurrence matrix P(x, y, d, θ), where x and y are gray values, d is the displacement distance, and θ is the displacement angle.

First take a point (a, b) in the digital image M × N and any point (a+i, b+j) around it, and suppose the gray values of this point pair are (x, y). Moving the point (a, b) over the whole image yields different (x, y) values; if the number of gray levels is K, there are K² combinations of (x, y). Record how many times each (x, y) gray pair occurs in the image, arrange these counts in a square matrix, and then normalize them by the total number of (x, y) occurrences into occurrence probabilities P(x, y); the resulting square matrix is the gray-level co-occurrence matrix.

If the distance between (a, b) and (a+i, b+j) is d and the angle between the pair and the horizontal axis is θ, gray-level co-occurrence matrices P(x, y, d, θ) for various distances and angles are obtained:

if i = 1, j = 0, θ = 0°, the pixel pair is horizontal;

if i = 0, j = 1, θ = 90°, the pixel pair is vertical;

if i = 1, j = 1, θ = 45°, the pixel pair lies on the right diagonal;

if i = -1, j = 1, θ = 135°, the pixel pair lies on the left diagonal.
Further, the formulas for the contrast, energy, entropy, and correlation feature quantities are as follows.

Contrast:

Con = \sum_x \sum_y (x - y)^2 P(x, y)

where x and y are gray levels of the gray-level co-occurrence matrix and P(x, y) is the matrix element;

energy:

Asm = \sum_x \sum_y P(x, y)^2

where x and y are gray levels of the gray-level co-occurrence matrix and P(x, y) is the matrix element;

entropy:

Ent = -\sum_x \sum_y P(x, y) \log P(x, y)

where x and y are gray levels of the gray-level co-occurrence matrix and P(x, y) is the matrix element;

correlation:

Corr = \sum_x \sum_y \frac{(x - \mu_x)(y - \mu_y) P(x, y)}{\sigma_x \sigma_y}

where x and y are gray levels of the gray-level co-occurrence matrix, P(x, y) is the matrix element, and

\mu_x = \sum_x x \sum_y P(x, y), \quad \mu_y = \sum_y y \sum_x P(x, y), \quad \sigma_x^2 = \sum_x (x - \mu_x)^2 \sum_y P(x, y), \quad \sigma_y^2 = \sum_y (y - \mu_y)^2 \sum_x P(x, y).

The texture feature values calculated from the contrast, energy, entropy, and correlation formulas are then combined into the vector h = [Asm_1, Con_1, Ent_1, Corr_1, …, Asm_4, Con_4, Ent_4, Corr_4]; the combined vector serves as the image texture feature value.
Further, the minimum Euclidean distance criterion includes:

(1) for two points a(x_1, y_1) and b(x_2, y_2) in the two-dimensional plane, the Euclidean distance is

d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}

where d is the distance and x, y are the coordinates of a point in the plane;

(2) for two points a(x_1, y_1, z_1) and b(x_2, y_2, z_2) in three-dimensional space, the Euclidean distance is

d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2 + (z_1 - z_2)^2}

where d is the distance and x, y, z are the coordinates of a point in space;

(3) for two points a(x_{11}, x_{12}, …, x_{1n}) and b(x_{21}, x_{22}, …, x_{2n}) in n-dimensional space, the Euclidean distance is

d = \sqrt{\sum_{k=1}^{n} (x_{1k} - x_{2k})^2}

where d is the distance, x_{1k} and x_{2k} are the k-th coordinates of points a and b, and k runs from 1 to n;

or, expressed in vector form,

d = \sqrt{(a - b)^T (a - b)}

where d is the distance, a and b are the feature vectors of the two points, and T denotes transposition.
Further, in matching the image against other images for classification by the minimum Euclidean distance criterion, in the two-class case:

given two standard templates A and B with feature vectors μ_A and μ_B, the feature vector of any image to be identified is X;

the class of the identified image's feature vector is determined using

d(x, y) = \sqrt{\sum_{i=1}^{d} (x_i - y_i)^2}

where d(x, y) is the distance, x and y are the feature vectors of two images, i indexes the i-th element of a feature vector, and d is the number of elements in a feature vector;

when d(X, μ_A) < d(X, μ_B), X belongs to A; when d(X, μ_A) > d(X, μ_B), X belongs to B.
Further, in matching the image against other images for classification by the minimum Euclidean distance criterion, in the multi-class case:

given m classes Ω = [ω_1 ω_2 … ω_m], each with a set of vectors, choose from each set the most standard one as the representative, called the prototype of that class's images;

for class ω_i, the feature vector of its prototype is μ_i, and the feature vector of any image to be identified is X;

compute d(X, μ_i) and find the minimum distance; if d(X, μ_i) is the minimum, then X belongs to class ω_i. In the concrete discrimination, |x - y|^2 is used in place of the distance, i.e.

d(x, \mu_i)^2 = (x - \mu_i)^T (x - \mu_i)

where d(x, y) is the distance, x is the feature vector of the image to be identified, μ_i is the prototype feature vector of class ω_i, and T denotes transposition;

in this formula, the discriminant function is

G_i(x) = |x - \mu_i|^2 = (x - \mu_i)^T (x - \mu_i);

if G_i(x) = min, then X belongs to class ω_i.
Another object of the present invention is to provide a system for extracting and matching local texture features of oracle bone inscription images.
Advantages and positive effects of the invention:

The invention automatically extracts oracle bone image texture features by computer and matches them automatically. The feature extraction method used is the gray-level co-occurrence matrix, and matching classification uses the minimum Euclidean distance criterion.

The invention solves the problem that traditional oracle bone rejoining can only be carried out manually by oracle bone experts, with a single broken piece taking months or even years to rejoin: similar oracle bones can be screened automatically and quickly, freeing researchers from tedious rejoining work and improving the precision of manual identification. It also improves on the poor classification performance and inaccurate feature vector representation of conventional image feature extraction methods.
Brief description of the drawings
Fig. 1 is a flowchart of the method for extracting and matching local texture features of oracle bone inscription images provided by an embodiment of the present invention;
Embodiment
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further elaborated below with reference to embodiments. It should be understood that the specific embodiments described here merely illustrate the present invention and are not intended to limit it.
The gray-level co-occurrence matrix is obtained by computing gray-pair co-occurrences over the gray-scaled image; certain feature values of the matrix are then calculated by algorithm, and these feature values represent particular texture characteristics. The gray-level co-occurrence matrix reflects the direction, adjacent interval, and variation amplitude of image gray levels and is the basis for analyzing the local patterns and arrangement rules of an image; a specific direction, offset, and number of gray levels must be specified for the matrix.

Current feature extraction performs poorly at reducing the feature gap within a class, fails to make the feature gap between classes obvious, and classifies images poorly.
The application principle of the present invention is further described below with reference to the drawings and specific embodiments.

As shown in Fig. 1, the method for extracting and matching local texture features of oracle bone inscription images provided by an embodiment of the present invention includes:

S101: opening the oracle bone image library and loading an oracle bone image for image similarity comparison;

S102: performing data analysis in the background, first extracting texture features from each image;

S103: using a gray-level co-occurrence matrix for texture feature extraction, performing gray processing on the oracle bone image, and dividing the result into 16 feature components;

S104: then calculating the texture feature values by the formulas; the selected image is matched against the other images for classification using the minimum Euclidean distance criterion, calculated with the multi-class algorithm; the smallest values are ranked first, and a smaller value means greater similarity, so the first-ranked image is the one most similar to the selected image.
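As one illustration of steps S101–S104, the following is a minimal NumPy sketch (not code from the patent; the names `quantize`, `h_vector`, and `rank_library`, the 8-level quantization, and the natural-log entropy are assumptions made for the example):

```python
import numpy as np

def quantize(img, levels=8):
    """S103: reduce gray values to `levels` bins."""
    img = np.asarray(img, dtype=float)
    span = float(img.max() - img.min())
    scaled = (img - img.min()) / max(span, 1e-12)
    return np.minimum((scaled * levels).astype(int), levels - 1)

def glcm(img, di, dj, levels):
    """Normalized gray-level co-occurrence matrix for offset (di, dj)."""
    P = np.zeros((levels, levels))
    r, c = img.shape
    for a in range(r):
        for b in range(c):
            a2, b2 = a + di, b + dj
            if 0 <= a2 < r and 0 <= b2 < c:
                P[img[a, b], img[a2, b2]] += 1.0
    return P / P.sum()

def features(P):
    """Energy, contrast, entropy, correlation of one co-occurrence matrix."""
    n = P.shape[0]
    X, Y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    asm = np.sum(P ** 2)
    con = np.sum((X - Y) ** 2 * P)
    nz = P[P > 0]
    ent = -np.sum(nz * np.log(nz))
    mx, my = np.sum(X * P), np.sum(Y * P)
    sx = np.sqrt(np.sum((X - mx) ** 2 * P))
    sy = np.sqrt(np.sum((Y - my) ** 2 * P))
    corr = np.sum((X - mx) * (Y - my) * P) / (sx * sy) if sx * sy > 0 else 0.0
    return [asm, con, ent, corr]

def h_vector(img, levels=8):
    """16-element vector h over the four offsets (0, 45, 90, 135 degrees)."""
    q = quantize(img, levels)
    offsets = [(0, 1), (1, 1), (1, 0), (1, -1)]
    return np.concatenate([features(glcm(q, di, dj, levels)) for di, dj in offsets])

def rank_library(query, library, levels=8):
    """S104: sort library images by Euclidean distance of their h-vectors."""
    hq = h_vector(query, levels)
    dists = [np.linalg.norm(hq - h_vector(im, levels)) for im in library]
    return np.argsort(dists)
```

Used this way, `rank_library` returns library indices ordered from most to least texture-similar, mirroring the "smallest distance first" ranking described above.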
The invention is further described below with reference to specific embodiments.
1. The texture feature extraction method provided by the embodiment of the present invention uses the gray-level co-occurrence matrix:

the co-occurrence matrix P(x, y, d, θ), where x and y are gray values, d is the displacement distance, and θ is the displacement angle.

First take a point (a, b) in the digital image M × N and any point (a+i, b+j) around it, and suppose the gray values of this point pair are (x, y). Moving the point (a, b) over the whole image yields different (x, y) values; if the number of gray levels is K, there are K² combinations of (x, y). Record how many times each (x, y) gray pair occurs in the image, arrange these counts in a square matrix, and then normalize them by the total number of (x, y) occurrences into occurrence probabilities P(x, y); the resulting square matrix is the gray-level co-occurrence matrix.

If the distance between (a, b) and (a+i, b+j) is d and the angle between the pair and the horizontal axis is θ, gray-level co-occurrence matrices P(x, y, d, θ) for various distances and angles are obtained: if i = 1, j = 0, θ = 0°, the pixel pair is horizontal; if i = 0, j = 1, θ = 90°, the pixel pair is vertical; if i = 1, j = 1, θ = 45°, the pixel pair lies on the right diagonal; if i = -1, j = 1, θ = 135°, the pixel pair lies on the left diagonal. The matrix records the probability with which the two pixel gray values occur together; this forms the gray-level co-occurrence matrix.
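The counting-and-normalizing construction just described can be sketched as follows (an illustrative NumPy implementation, not code from the patent; whether the first array index is treated as horizontal or vertical only reorients the four angles):

```python
import numpy as np

def glcm(img, i, j, levels):
    """Count each gray pair (x, y) = (img[a, b], img[a+i, b+j]) over the
    image, then normalize the counts into probabilities P(x, y)."""
    P = np.zeros((levels, levels), dtype=float)
    rows, cols = img.shape
    for a in range(rows):
        for b in range(cols):
            a2, b2 = a + i, b + j
            if 0 <= a2 < rows and 0 <= b2 < cols:
                P[img[a, b], img[a2, b2]] += 1.0
    return P / P.sum()

# The four offsets named in the text, keyed by theta in degrees.
OFFSETS = {0: (1, 0), 90: (0, 1), 45: (1, 1), 135: (-1, 1)}
```

For example, on the 2-level image `[[0, 0, 1], [0, 1, 1]]` with offset (0, 1) there are four in-bounds pairs, so every entry of P is a multiple of 1/4 and the entries sum to 1.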
The gray-level co-occurrence matrix normalization formula is

P(x, y) = \#\{(a_1, b_1), (a_2, b_2) \in M \times N \mid f(a_1, b_1) = x, f(a_2, b_2) = y\}   (1)

where #(X) denotes the number of elements in the set X, (a, b) is a point in the digital image, M × N is the size of the digital image, the element (x, y) records that one gray value is x and the other is y, and f(a_1, b_1) is the gray-value function.
The computed matrix itself is not applied directly; rather, texture feature quantities are calculated from it. The feature quantities contrast, energy, entropy, and correlation represent the texture features.
(1) Contrast compares a pixel value with its neighboring pixel values; the contrast between pixels reflects the clarity of the image and the depth of the texture grooves. The more pronounced the texture grooves, the greater the pixel contrast and the clearer the perceived effect; the less pronounced the grooves, the smaller the pixel contrast and the blurrier the visual effect. Contrast formula:

Con = \sum_x \sum_y (x - y)^2 P(x, y)
(2) Energy is the sum of squares of the matrix elements and expresses the uniformity of the image gray distribution and the coarseness of the texture. A larger Asm value indicates a more regular, stable texture. Energy formula:

Asm = \sum_x \sum_y P(x, y)^2
(3) Entropy measures the amount of information contained in the image; texture information is a measure of randomness and can to some extent be represented by entropy. A larger entropy value indicates that the information contained in the image is more complex. When the computed co-occurrence matrix values are all equal, that is, when the image pixels exhibit maximum randomness, the entropy is maximal. Entropy formula:

Ent = -\sum_x \sum_y P(x, y) \log P(x, y)
(4) Correlation reflects the consistency of the image texture and is also called homogeneity; it can be used to measure the similarity of gray levels along rows as well as along columns. A larger correlation value indicates greater local gray-level correlation. Correlation formula:

Corr = \sum_x \sum_y \frac{(x - \mu_x)(y - \mu_y) P(x, y)}{\sigma_x \sigma_y}

where

\mu_x = \sum_x x \sum_y P(x, y), \quad \mu_y = \sum_y y \sum_x P(x, y), \quad \sigma_x^2 = \sum_x (x - \mu_x)^2 \sum_y P(x, y), \quad \sigma_y^2 = \sum_y (y - \mu_y)^2 \sum_x P(x, y).
The final step is to combine the above features into a single vector. The combined vector h = [Asm_1, Con_1, Ent_1, Corr_1, …, Asm_4, Con_4, Ent_4, Corr_4] then describes the image texture features.
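The four feature quantities and the combined vector h can be computed from normalized co-occurrence matrices roughly as follows (an illustrative sketch; the log base and the skipping of zero entries in the entropy are implementation choices not fixed by the text):

```python
import numpy as np

def glcm_features(P):
    """Return (Asm, Con, Ent, Corr) of a normalized co-occurrence matrix P."""
    n = P.shape[0]
    X, Y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    asm = float(np.sum(P ** 2))                 # energy
    con = float(np.sum((X - Y) ** 2 * P))       # contrast
    nz = P[P > 0]
    ent = float(-np.sum(nz * np.log(nz)))       # entropy (natural log)
    mu_x, mu_y = np.sum(X * P), np.sum(Y * P)
    sig_x = np.sqrt(np.sum((X - mu_x) ** 2 * P))
    sig_y = np.sqrt(np.sum((Y - mu_y) ** 2 * P))
    corr = float(np.sum((X - mu_x) * (Y - mu_y) * P) / (sig_x * sig_y))
    return asm, con, ent, corr

def combine(matrices):
    """h = [Asm_1, Con_1, Ent_1, Corr_1, ..., Asm_4, Con_4, Ent_4, Corr_4]."""
    return np.concatenate([glcm_features(P) for P in matrices])
```

On the uniform 2×2 matrix with all entries 0.25, for instance, the energy is 0.25, the contrast 0.5, the entropy ln 4, and the correlation 0.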
2. Matching of texture features:

An image texture matching system consists of two parts: effective texture feature extraction and a high-accuracy feature classifier. There are many classifiers, such as minimum Euclidean distance, Bayesian classification, and K-nearest neighbors.
3. Minimum distance classification:

Minimum distance classification assigns a vector of unknown class according to its distance to the center vector of each known class; it is an image classification method that attributes the unknown vector to the nearest class.

It is a statistical recognition method that classifies patterns by their distance to representative samples of each class; the identified pattern has the minimum distance to the samples of the class to which it belongs. Suppose the feature vectors representing the c classes are R_1, …, R_c, and x is the feature vector of the identified pattern; |x - R_i| is the distance between x and R_i (i = 1, 2, …, c). If |x - R_i| is the minimum, x is assigned to the i-th class. In more complex cases, a set of representative samples for each class can be used rather than a single sample. Minimum distance classification first requires determining the feature vector of the representative pattern of each class, which is key to the classification quality of this method. The representative feature vectors of the classes can be determined from the physical, chemical, or biological mechanism of the analyzed objects; the usual method is to collect samples of each class and use the mean of the sample feature vectors as each class's representative feature vector. Next, a distance metric is chosen to compute the distance between the identified pattern and each class's representative feature vector; common distances include the Euclidean distance and the absolute-value distance.
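The usual construction just mentioned — the mean of each class's sample feature vectors as its representative vector — might look like this (illustrative only):

```python
import numpy as np

def class_prototypes(samples_by_class):
    """For each class, average its sample feature vectors to obtain the
    representative (prototype) feature vector of that class."""
    return [np.mean(np.asarray(s, dtype=float), axis=0) for s in samples_by_class]
```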
4. Matching by the minimum distance criterion:

The matching classification algorithm of the present invention is the minimum distance criterion.

The minimum distance criterion is also known as minimum Euclidean distance; the Euclidean distance is the simplest computation method.

(1) For two points a(x_1, y_1) and b(x_2, y_2) in the two-dimensional plane, the Euclidean distance is

d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}

where d is the distance and x, y are the coordinates of a point in the plane;

(2) for two points a(x_1, y_1, z_1) and b(x_2, y_2, z_2) in three-dimensional space, the Euclidean distance is

d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2 + (z_1 - z_2)^2}

where d is the distance and x, y, z are the coordinates of a point in space;

(3) for two points a(x_{11}, x_{12}, …, x_{1n}) and b(x_{21}, x_{22}, …, x_{2n}) in n-dimensional space, the Euclidean distance is

d = \sqrt{\sum_{k=1}^{n} (x_{1k} - x_{2k})^2}

where d is the distance, x_{1k} and x_{2k} are the k-th coordinates of points a and b, and k runs from 1 to n;

the vector form can also be used:

d = \sqrt{(a - b)^T (a - b)}

where d is the distance, a and b are the feature vectors of the two points, and T denotes transposition.
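The n-dimensional sum form and the vector form coincide; a one-line sketch (illustrative, not code from the patent):

```python
import numpy as np

def euclid(a, b):
    """d = sqrt((a - b)^T (a - b)); identical to the n-dimensional sum form."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return float(np.sqrt(diff @ diff))
```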
Calculation in the two-class case:

Given two standard templates A and B with feature vectors μ_A and μ_B, the feature vector of any image to be identified is X.

X belongs to either μ_A or μ_B: if X is closest to μ_A, the image is of class A; if X is closest to μ_B, the image is of class B. Which feature vector X is closest to is computed with

d(x, y) = \sqrt{\sum_{i=1}^{d} (x_i - y_i)^2}

where d(x, y) is the distance, x and y are the feature vectors of two images, i indexes the i-th element of a feature vector, and d is the number of elements in a feature vector;

when d(X, μ_A) < d(X, μ_B), X belongs to A; when d(X, μ_A) > d(X, μ_B), X belongs to B.
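The two-template decision rule reduces to a single comparison, sketched here for illustration (tie-breaking toward B is an arbitrary choice the text does not specify):

```python
import numpy as np

def classify_two(x, mu_a, mu_b):
    """Assign the feature vector x to template A or B, whichever is nearer
    in Euclidean distance."""
    d_a = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(mu_a, dtype=float))
    d_b = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(mu_b, dtype=float))
    return "A" if d_a < d_b else "B"
```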
Calculation in the multi-class case:
Given m classes, Ω = [ω_1 ω_2 … ω_m], each class has a set of vectors; from each set, the most standard one is chosen as the representative, called the prototype of the image.
For class ω_i, the feature vector of its prototype is μ_i.
For the feature vector X of any image to be recognized:
Compute d(X, μ_i) and find the minimum distance; if d(X, μ_i) is the minimum, X belongs to class ω_i. In the actual discrimination, |x − y|² is used instead of the distance, that is:
d(x, μ_i) = |x − μ_i|² = (x − μ_i)^T (x − μ_i) = x^T x − (x^T μ_i + μ_i^T x − μ_i^T μ_i)
Wherein d(x, y) denotes the distance, x denotes the feature vector of the image to be recognized, μ_i denotes the prototype feature vector of class ω_i, and T denotes transposition;
In this formula, the bracketed term gives the discriminant function G_i(x) = x^T μ_i + μ_i^T x − μ_i^T μ_i; since x^T x is the same for every class, the class ω_i whose G_i(x) is maximal is the class whose distance is minimal, and X belongs to that class ω_i.
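As a sketch, the multi-class nearest-prototype rule might look like this in Python; the class labels and prototype vectors below are invented for illustration:

```python
def nearest_prototype(X, prototypes):
    # prototypes: mapping class label omega_i -> prototype feature vector mu_i.
    # |x - mu|^2 is used instead of the distance, as in the text above;
    # the square root is monotonic, so the minimizing class is unchanged.
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(prototypes, key=lambda label: sq_dist(X, prototypes[label]))

# Hypothetical prototypes for three classes:
prototypes = {"omega1": [0.0, 0.0], "omega2": [5.0, 5.0], "omega3": [0.0, 5.0]}
label = nearest_prototype([0.5, 4.2], prototypes)  # closest to omega3's prototype
```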
The foregoing describes only preferred embodiments of the present invention and is not intended to limit the present invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.
Claims (8)
1. A method for extracting and matching local texture features of oracle bone inscription pictures, characterized in that the method comprises:
performing gray-scale processing on a loaded oracle bone inscription picture and extracting texture features using a gray-level co-occurrence matrix;
then calculating the texture eigenvalues via the algorithmic formulas of the feature quantities contrast, energy, entropy, and correlation; and matching and classifying a selected picture against other pictures by the minimum Euclidean distance criterion;
in the matching calculation for the two-class case, the formula d(x, y) = [Σ_{i=1}^{d} (x_i − y_i)²]^{1/2} determines the class of the feature vector of the identified image, wherein d(x, y) denotes the distance, x and y denote the feature vectors of the two pictures, i indexes the i-th element of the feature vector, and d denotes the number of elements in the feature vector;
in the matching calculation for the multi-class case, the discriminant function
G_i(x) = d(x, μ_i) = |x − μ_i|² = (x − μ_i)^T (x − μ_i) = x^T x − (x^T μ_i + μ_i^T x − μ_i^T μ_i)
is calculated and the minimum value is ranked first; the smaller the value, the greater the similarity; G denotes the discriminant function, i denotes the class, x denotes the feature vector of the identified picture, d denotes the distance, μ_i denotes the feature vector of the i-th class, and T denotes transposition.
2. The method for extracting and matching local texture features of oracle bone inscription pictures according to claim 1, characterized in that the gray-level co-occurrence matrix normalization formula is:
P(x, y) = #(X) = #{((a_1, b_1), (a_2, b_2)) ∈ M×N | f(a_1, b_1) = x, f(a_2, b_2) = y}    (1);
wherein #(X) denotes the number of elements in the set X, (a, b) is a point in the digital image, M×N is the size of the digital image, the value of element (x, y) expresses that one gray level is x and the other gray level is y, and f(a, b) is the gray-value function.
3. The method for extracting and matching local texture features of oracle bone inscription pictures according to claim 2, characterized in that the gray-level co-occurrence matrix comprises:
the co-occurrence matrix P(x, y, d, θ), wherein x, y denote gray values, d denotes the displacement distance, and θ denotes the displacement angle;
first take a point (a, b) in the digital image M×N and any point (a+i, b+j) around it, and suppose the gray values of this point pair are (x, y); moving the point (a, b) over the whole picture yields different values of (x, y); if the number of gray levels is K, there are K² combinations of (x, y); for each (x, y), count the number of times that gray-value pair occurs in the picture, arrange these counts in a square matrix, and then normalize them to occurrence probabilities P(x, y) by the total number of occurrences of (x, y); the resulting square matrix is the gray-level co-occurrence matrix;
if the distance between (a, b) and (a+i, b+j) is d and the angle between them and the horizontal axis is θ, gray-level co-occurrence matrices P(x, y, d, θ) for various distances and angles are obtained;
if i = 1, j = 0, θ = 0°, the pixel pair is horizontal;
if i = 0, j = 1, θ = 90°, the pixel pair is vertical;
if i = 1, j = 1, θ = 45°, the pixel pair is right-diagonal;
if i = −1, j = 1, θ = 135°, the pixel pair is left-diagonal.
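A minimal sketch of the co-occurrence counting and normalization described above; note that the mapping from the claim's (i, j) offsets to array row/column indices is one plausible reading on my part, not stated verbatim in the claim:

```python
import numpy as np

def glcm(img, i, j, levels):
    # Count pixel pairs (a, b) and (a + i, b + j) whose gray values are (x, y),
    # then divide by the total pair count to get probabilities P(x, y) (formula (1)).
    P = np.zeros((levels, levels))
    rows, cols = img.shape
    for a in range(rows):
        for b in range(cols):
            a2, b2 = a + i, b + j
            if 0 <= a2 < rows and 0 <= b2 < cols:
                P[img[a, b], img[a2, b2]] += 1
    return P / P.sum()

# Assumed offset convention with numpy's (row, col) indexing:
# the horizontal neighbor of (a, b) is (a, b + 1), the vertical one (a + 1, b).
img = np.array([[0, 0, 1],
                [1, 1, 0]])
P0 = glcm(img, 0, 1, levels=2)  # horizontal pixel pairs
```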
4. The method for extracting and matching local texture features of oracle bone inscription pictures according to claim 1, characterized in that the algorithmic formulas of the feature quantities contrast, energy, entropy, and correlation specifically comprise:
Contrast algorithm formula:
Con = Σ_{x=1}^{k} Σ_{y=1}^{k} (x − y)² P(x, y)    (2);
Wherein x, y denote the gray levels in the gray-level co-occurrence matrix, and P(x, y) denotes the element of the matrix;
Energy arithmetic formula:
Asm = Σ_{x=1}^{k} Σ_{y=1}^{k} P(x, y)²    (3);
Wherein x, y denote the gray levels in the gray-level co-occurrence matrix, and P(x, y) denotes the element of the matrix;
The algorithmic formula of entropy:
Ent = −Σ_{x=1}^{k} Σ_{y=1}^{k} P(x, y) log P(x, y)    (4);
Wherein x, y denote the gray levels in the gray-level co-occurrence matrix, and P(x, y) denotes the element of the matrix;
Correlation algorithm formula:
Corr = Σ_{x=1}^{k} Σ_{y=1}^{k} (x · y · P(x, y) − μ_x μ_y) / (σ_x σ_y)    (5);
Wherein x, y denote the gray levels in the gray-level co-occurrence matrix, and P(x, y) denotes the element of the matrix;
μ_x = Σ_{x=1}^{k} Σ_{y=1}^{k} x · P(x, y),
μ_y = Σ_{x=1}^{k} Σ_{y=1}^{k} y · P(x, y),
σ_x = √( Σ_{x=1}^{k} Σ_{y=1}^{k} P(x, y)(x − μ_x)² ),
σ_y = √( Σ_{x=1}^{k} Σ_{y=1}^{k} P(x, y)(y − μ_y)² );
The texture eigenvalues are calculated by the algorithmic formulas of the feature quantities contrast, energy, entropy, and correlation; the above features are then combined into the vector h = [Asm_1, Con_1, Ent_1, Corr_1, …, Asm_4, Con_4, Ent_4, Corr_4], and the combined vector is the image texture eigenvalue.
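A sketch of formulas (2)-(5) applied to one normalized co-occurrence matrix; the vector h would then concatenate these four values over the four directional matrices. Function and variable names here are mine, and the toy matrix is arbitrary:

```python
import numpy as np

def texture_features(P):
    # P: normalized gray-level co-occurrence matrix of shape (k, k).
    k = P.shape[0]
    x = np.arange(1, k + 1).reshape(-1, 1)  # gray level index along rows
    y = np.arange(1, k + 1).reshape(1, -1)  # gray level index along columns
    con = np.sum((x - y) ** 2 * P)          # contrast, formula (2)
    asm = np.sum(P ** 2)                    # energy (Asm), formula (3)
    nz = P[P > 0]
    ent = -np.sum(nz * np.log(nz))          # entropy, formula (4); 0*log 0 treated as 0
    mu_x = np.sum(x * P)
    mu_y = np.sum(y * P)
    sigma_x = np.sqrt(np.sum(P * (x - mu_x) ** 2))
    sigma_y = np.sqrt(np.sum(P * (y - mu_y) ** 2))
    corr = (np.sum(x * y * P) - mu_x * mu_y) / (sigma_x * sigma_y)  # formula (5)
    return asm, con, ent, corr

P = np.full((2, 2), 0.25)  # toy uniform 2x2 matrix
asm, con, ent, corr = texture_features(P)
# For one direction the contribution to h is [asm, con, ent, corr];
# four directions give h = [Asm_1, Con_1, Ent_1, Corr_1, ..., Asm_4, Con_4, Ent_4, Corr_4].
```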
5. The method for extracting and matching local texture features of oracle bone inscription pictures according to claim 1, characterized in that the minimum Euclidean distance criterion comprises:
(1) let two points a(x_1, y_1) and b(x_2, y_2) be given in the two-dimensional plane; the Euclidean distance formula is:
d = √((x_1 − x_2)² + (y_1 − y_2)²)    (6);
Wherein d denotes the distance, and x, y denote the coordinates of the points in the two-dimensional plane;
(2) let two points a(x_1, y_1, z_1) and b(x_2, y_2, z_2) be given in three-dimensional space; the Euclidean distance formula is:
d = √((x_1 − x_2)² + (y_1 − y_2)² + (z_1 − z_2)²)    (7);
Wherein d denotes the distance, and x, y, z denote the coordinates of the points in three-dimensional space;
(3) let two points a(x_{11}, x_{12}, …, x_{1n}) and b(x_{21}, x_{22}, …, x_{2n}) be given in n-dimensional space; the Euclidean distance formula is:
d = √( Σ_{k=1}^{n} (x_{1k} − x_{2k})² )    (8);
Wherein d denotes the distance, x_{1k} and x_{2k} denote the coordinates of points a and b in n-dimensional space, and k denotes the dimension index, running from 1 to n;
or, written in vector form:
d = √((a − b)(a − b)^T)    (9);
Wherein d denotes the distance, a and b denote the feature vectors of points a and b, and T denotes transposition.
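The coordinate form (8) and the vector form (9) agree, as a quick numerical check shows; the sample vectors below are arbitrary:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 6.0, 3.0])

# Coordinate form, formula (8): d = sqrt(sum_k (a_k - b_k)^2)
d_coord = np.sqrt(np.sum((a - b) ** 2))

# Vector form, formula (9): d = sqrt((a - b)(a - b)^T)
d_vec = np.sqrt((a - b) @ (a - b))
```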
6. The method for extracting and matching local texture features of oracle bone inscription pictures according to claim 1, characterized in that, in the matching classification of the picture against other pictures by the minimum Euclidean distance criterion, the matching calculation in the two-class case is:
given two standard templates A and B, their feature vectors are:
the feature vector μ_A of template A,
the feature vector μ_B of template B;
the feature vector of any image to be identified is X;
the class of the feature vector of the identified image is determined using the following formula:
d(x, y) = [ Σ_{i=1}^{d} (x_i − y_i)² ]^{1/2}    (10);
Wherein d(x, y) denotes the distance, x and y denote the feature vectors of the two pictures, i indexes the i-th element of the feature vector, and d denotes the number of elements in the feature vector;
when d(X, μ_A) < d(X, μ_B), X belongs to A; when d(X, μ_A) > d(X, μ_B), X belongs to B.
7. The method for extracting and matching local texture features of oracle bone inscription pictures according to claim 1, characterized in that, in the matching classification of the picture against other pictures by the minimum Euclidean distance criterion, the matching calculation in the multi-class case is:
given m classes, Ω = [ω_1 ω_2 … ω_m], each class has a set of vectors; from each set, the most standard one is chosen as the representative, called the prototype of the image;
for class ω_i, the feature vector of its prototype is μ_i;
for the feature vector X of any image to be recognized:
compute d(X, μ_i) and find the minimum distance; if d(X, μ_i) is the minimum, X belongs to class ω_i; in the actual discrimination, |x − y|² is used instead of the distance, that is:
d(x, μ_i) = |x − μ_i|² = (x − μ_i)^T (x − μ_i)
= x^T x − x^T μ_i − μ_i^T x + μ_i^T μ_i
= x^T x − (x^T μ_i + μ_i^T x − μ_i^T μ_i)    (11);
Wherein d(x, y) denotes the distance, x denotes the feature vector of the image to be recognized, μ_i denotes the prototype feature vector of class ω_i, and T denotes transposition;
in the formula, the term x^T x − (x^T μ_i + μ_i^T x − μ_i^T μ_i) yields the discriminant function:
G_i(x) = x^T μ_i + μ_i^T x − μ_i^T μ_i;
since x^T x is the same for every class, the class ω_i whose G_i(x) is maximal is the class whose distance d(x, μ_i) is minimal, and X belongs to that class ω_i.
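The equivalence used here, namely that dropping the class-independent x^T x term leaves a discriminant whose maximum selects the same class as the minimum distance, can be checked numerically; the toy vectors below are of my own choosing:

```python
import numpy as np

def G(x, mu):
    # Discriminant G_i(x) = x^T mu_i + mu_i^T x - mu_i^T mu_i
    return x @ mu + mu @ x - mu @ mu

def sq_dist(x, mu):
    # |x - mu_i|^2 = (x - mu_i)^T (x - mu_i), formula (11)
    return (x - mu) @ (x - mu)

x = np.array([1.0, 2.0])
mus = [np.array([1.0, 2.0]), np.array([0.0, 0.0]), np.array([3.0, -1.0])]

best_by_G = max(range(3), key=lambda i: G(x, mus[i]))
best_by_d = min(range(3), key=lambda i: sq_dist(x, mus[i]))
# sq_dist decomposes as x^T x - G_i(x), so both selections coincide.
```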
8. A system for extracting and matching local texture features of oracle bone inscription pictures, which implements the method for extracting and matching local texture features of oracle bone inscription pictures according to claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710826536.2A CN107563393A (en) | 2017-09-14 | 2017-09-14 | A kind of extraction of inscriptions on bones or tortoise shells picture Local textural feature and matching process and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107563393A true CN107563393A (en) | 2018-01-09 |
Family
ID=60980882
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710826536.2A Pending CN107563393A (en) | 2017-09-14 | 2017-09-14 | A kind of extraction of inscriptions on bones or tortoise shells picture Local textural feature and matching process and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107563393A (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160004930A1 (en) * | 2009-12-28 | 2016-01-07 | Picscout (Israel) Ltd. | Robust and efficient image identification |
CN105701512A (en) * | 2016-01-14 | 2016-06-22 | 西安电子科技大学 | Image classification method based on BBO-MLP and texture characteristic |
Non-Patent Citations (3)
Title |
---|
Zhou Xingzhong et al.: "Computers and Welding Engineering" (《计算机与焊接工程》), Huazhong University of Technology Press, 31 December 1989 *
Yang Fan: "Digital Image Processing and Analysis" (《数字图像处理与分析》), Beihang University Press, 31 October 2007 *
Wang Aimin et al.: "Research on Key Technologies for Intelligent Rejoining of Oracle Bone Fragments" (甲骨碎片智能缀合关键技术研究), Journal of Wuhan University of Technology (《武汉理工大学学报》) *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110598030A (en) * | 2019-09-26 | 2019-12-20 | 西南大学 | Oracle bone rubbing classification method based on local CNN framework |
CN110598030B (en) * | 2019-09-26 | 2022-05-17 | 西南大学 | Oracle bone rubbing classification method based on local CNN framework |
CN111563506A (en) * | 2020-03-18 | 2020-08-21 | 西南大学 | Oracle bone rubbing conjugation method based on curve contour matching |
CN111563506B (en) * | 2020-03-18 | 2022-07-22 | 西南大学 | Oracle bone rubbing conjugation method based on curve contour matching |
CN112232348A (en) * | 2020-09-07 | 2021-01-15 | 华南师范大学 | Oracle identification method and system based on machine vision |
CN112837334A (en) * | 2021-04-02 | 2021-05-25 | 河南大学 | Automatic conjugation method of Chinese character and sketch image |
CN112837334B (en) * | 2021-04-02 | 2022-07-05 | 河南大学 | Automatic conjugation method of Chinese character and sketch image |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Joshi et al. | Script identification from Indian documents | |
CN108122008A (en) | SAR image recognition methods based on rarefaction representation and multiple features decision level fusion | |
Mahmoud | Recognition of writer-independent off-line handwritten Arabic (Indian) numerals using hidden Markov models | |
CN107563393A (en) | A kind of extraction of inscriptions on bones or tortoise shells picture Local textural feature and matching process and system | |
CN101777125B (en) | Method for supervising and classifying complex category of high-resolution remote sensing image | |
CN104850822B (en) | Leaf identification method under simple background based on multi-feature fusion | |
Mehri et al. | A texture-based pixel labeling approach for historical books | |
Joshi et al. | A generalised framework for script identification | |
De Stefano et al. | Layout measures for writer identification in mediaeval documents | |
Sah et al. | Text and non-text recognition using modified HOG descriptor | |
Cilia et al. | What is the minimum training data size to reliably identify writers in medieval manuscripts? | |
CN104899551B (en) | A kind of form image sorting technique | |
Marquis et al. | Quantitative characterization of morphological polymorphism of handwritten characters loops | |
CN105844299A (en) | Image classification method based on bag of words | |
Joshi et al. | Combination of multiple image features along with KNN classifier for classification of Marathi Barakhadi | |
CN115272689A (en) | View-based spatial shape recognition method, device, equipment and storage medium | |
Han et al. | HEp-2 staining pattern recognition using stacked fisher network for encoding weber local descriptor | |
Rani et al. | Performance analysis of feature extractors and classifiers for script recognition of English and Gurmukhi words | |
Mulyana et al. | Gender Classification for Anime Character Face Image Using Random Forest Classifier Method and GLCM Feature Extraction | |
Mehri et al. | Performance evaluation and benchmarking of six texture-based feature sets for segmenting historical documents | |
Le et al. | Logo spotting for document categorization | |
Froech et al. | Reconstructing facade details using MLS point clouds and Bag-of-Words approach | |
Mao et al. | Detection of artificial pornographic pictures based on multiple features and tree mode | |
Bockholt et al. | Document image retrieval with morphology-based segmentation and features combination | |
Chao et al. | The stochastic replica approach to machine learning: Stability and parameter optimization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180109 |