CN101944358A - Ant colony algorithm-based codebook classification method and codebook classification device thereof - Google Patents

Ant colony algorithm-based codebook classification method and codebook classification device thereof

Info

Publication number
CN101944358A
CN101944358A, CN2010102671568A, CN201010267156A
Authority
CN
China
Prior art keywords
codebook
vector
code
classification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2010102671568A
Other languages
Chinese (zh)
Other versions
CN101944358B (en)
Inventor
李凤莲
张雪英
马朝阳
王峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taiyuan University of Technology
Original Assignee
Taiyuan University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taiyuan University of Technology
Priority to CN201010267156.8A
Publication of CN101944358A
Application granted
Publication of CN101944358B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Compression, Expansion, Code Conversion, And Decoders (AREA)

Abstract

The invention discloses an ant colony algorithm-based codebook classification method and a codebook classification device. In the codebook classification process, the method divides a designed codebook into several sub-codebooks, each of which is represented by a sub-codebook feature value. The classification method uses an ant colony clustering algorithm into which new pick-up and put-down probability functions are introduced, and the range of the random probability is matched to the range of the probability function values, which speeds up the convergence of the algorithm. In the codebook rearrangement process, the sub-codebooks are arranged in the same order as their feature values to form the classified codebook. The device consists of a sub-codebook feature value unit, a sub-codebook codeword count unit and a classified codebook unit. When a codebook-classified vector quantizer quantizes an input vector, the sub-codebook feature values and codeword counts restrict the codeword search to a single sub-codebook of the classified codebook, which reduces the codeword search range and the time complexity of the vector quantizer.

Description

Codebook classification method based on the ant colony clustering algorithm and codebook classification device thereof
Technical field
The present invention relates to speech signal processing and swarm intelligence algorithm technology, and specifically to a codebook classification method that uses an optimized ant colony clustering algorithm, and to a codebook classification device thereof.
Background art
Vector quantization is used very widely in practice, in fields such as digital image and speech compression coding, speech recognition, emotion recognition, document retrieval and database retrieval.
A vector quantizer is mainly made up of an encoder that contains one or more codebooks, which may be identical or different. During quantization, the input vector is normally compared, under a distortion measure, with every codeword in the encoder codebook of the vector quantization system in order to find the matching codeword with minimum distortion; this is exhaustive-search vector quantization. The main advantage of the exhaustive-search vector quantizer is that it finds the codeword that matches best under the chosen distortion measure, but its computational complexity during quantization is the highest. Many strategies have therefore been proposed to reduce quantization complexity; some approach the problem from the structure of the vector quantizer, others from the codeword search algorithm. Constrained vector quantizers add various constraints on top of the exhaustive-search vector quantizer with the aim of reducing quantization complexity, and corresponding coding algorithms and codebook design techniques follow from them. One way to reduce quantization complexity is to impose constraints on the codewords in the quantizer codebook so that they are no longer arbitrarily distributed but arranged in a constrained way, which makes the nearest-neighbour search easier.
As a kind of constrained vector quantizer, the classified vector quantizer classifies the input vector according to the characteristics of the quantization parameter and then searches for the nearest codeword in the codebook of the corresponding class. The codebooks of the classes may differ in size, and together they form the total codebook of the quantizer. Because each sub-codebook of this quantizer is relatively small, the time complexity is reduced. A classified vector quantizer produces two indices when quantizing: a codebook index, which determines in which codebook the codeword search for the input vector has to be carried out, and a codeword index, which is the index of the nearest-neighbour codeword found for the input vector in the selected codebook. The codebook of a classified vector quantizer is normally designed by first dividing the training vector set into several subsets with a classifier and then generating a codebook for each subset with a codebook design algorithm; these codebooks together form the final total codebook. The key factor for the performance of a classified vector quantizer is how to choose the sizes of the class codebooks so that, for a given total codebook size, the overall performance of the quantizer is optimal. Two methods are commonly used to determine the class codebook sizes: one is a bit-allocation algorithm; the other, for a fixed total codebook size, makes the size of each class codebook proportional to the size of the corresponding training subset. Because a classified vector quantizer needs both a codebook index and a codeword index, its number of quantization bits depends on the number of classes and on the size of each class codebook.
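For illustration, the difference between exhaustive search and classified search can be sketched in a few lines of Python (the toy data and the simple nearest-centroid class decision are assumptions made only for this sketch, not the specific classifier of any particular quantizer):

import numpy as np

def exhaustive_search(x, codebook):
    """Full search: compare the input vector against every codeword."""
    dists = np.sum((codebook - x) ** 2, axis=1)      # squared Euclidean distortion
    return int(np.argmin(dists))                      # codeword index

def classified_search(x, class_centroids, sub_codebooks):
    """Classified search: pick a class first, then search only that sub-codebook."""
    c = int(np.argmin(np.sum((class_centroids - x) ** 2, axis=1)))   # codebook index
    j = exhaustive_search(x, sub_codebooks[c])                        # codeword index within the class
    return c, j

# Toy example: a 256-codeword, 7-dimensional codebook split into 4 classes of 64 codewords each.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(256, 7))
sub_codebooks = [codebook[i * 64:(i + 1) * 64] for i in range(4)]
class_centroids = np.stack([sc.mean(axis=0) for sc in sub_codebooks])

x = rng.normal(size=7)
print(exhaustive_search(x, codebook))                          # searches all 256 codewords
print(classified_search(x, class_centroids, sub_codebooks))    # searches 4 centroids + 64 codewords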
The ant colony algorithm is a probabilistic search algorithm for finding optimal paths in graphs, proposed in the early 1990s by the Italian scholar Marco Dorigo and colleagues. It is inspired by the path-finding behaviour of ants searching for food, is a heuristic, biologically inspired optimization algorithm, and is mainly used to solve complex combinatorial optimization problems. Up to now the ant colony algorithm has successfully been applied to many practical problems, such as the travelling salesman problem, the quadratic assignment problem, job-shop scheduling and discrete optimization problems. When the ant colony algorithm is used for cluster analysis, the inspiration comes from the way ants pile up their corpses and sort their larvae. Because the movement of a real ant colony is close to a real clustering process, a large number of ant colony clustering algorithms have appeared in recent years.
The clustering algorithm based on the principle of ant-heap formation was first proposed by Deneubourg et al.: according to the similarity between a data object and its surroundings, ants move, pick up or put down data objects at random, so that the data are eventually clustered. This basic model has been applied successfully in fields such as robotics. Lumer et al. were the first to improve the algorithm and proposed the LF algorithm, obtaining good results in cluster analysis with the ant colony algorithm.
The basic mechanism by which an ant colony carries objects is as follows: when an unloaded, randomly moving ant encounters an object, the smaller the similarity between that object and the objects around its position, the larger the probability of "picking up" the object; conversely, for a randomly moving loaded ant, the larger the similarity between the carried object and the objects at its position, the larger the probability of "putting down" the object. This mechanism ensures that large heaps of objects are not destroyed while small heaps are gathered together.
On this basis, researchers proposed the basic idea of the ant colony clustering algorithm. Its main idea is to scatter the data to be clustered randomly on a two-dimensional plane and then generate a number of virtual ants on this plane to perform cluster analysis. The data objects are first projected randomly onto the plane; each ant then selects a data object at random and, according to the probability obtained from the similarity of this object within its local region, decides whether to "pick up", "move" or "put down" the object. After a limited number of iterations, the data objects on the plane are gathered according to their similarity, and the clustering result and the number of clusters are finally obtained.
The main factors affecting the convergence speed of the above ant colony clustering algorithm are the speed with which a loaded ant puts down an object and the speed with which an unloaded ant picks one up, and the key to both speeds is the relation between the put-down and pick-up probability functions of the algorithm and the random probability generated by the system. In existing ant colony clustering algorithms, the pick-up and put-down probability functions do not change very markedly when the similarity value changes, so for long periods the values of these probability functions are not larger than the random probability generated by the system. As a result, a loaded ant that should put an object down cannot do so immediately, an unloaded ant that should pick an object up cannot do so immediately, and the objects on the two-dimensional plane do not quickly form clusters according to their similarity, which directly degrades the convergence speed and the clustering result of the clustering algorithm.
Summary of the invention
The purpose of the present invention is to provide a codebook classification method based on the ant colony clustering algorithm and a codebook classification device thereof, so as to solve the problems that the codewords in the codebook of an exhaustive-search vector quantizer are arranged without order, that the codeword search range is large and that the time complexity is high.
The technical solution adopted by the present invention to solve the above problems is as follows:
A codebook classification method based on the ant colony clustering algorithm, applied to immittance spectral frequency (ISF) parameters, the method comprising:
The codebook classification process:
a codebook produced by a codebook design algorithm is divided into sub-codebooks by the codebook classification method based on the ant colony clustering algorithm, and each sub-codebook is represented by a sub-codebook feature value;
The codebook rearrangement process:
the sub-codebooks obtained by the codebook classification method are combined in the same order as their sub-codebook feature values to form the classified codebook.
The ant colony clustering algorithm in the above technical solution uses the following similarity function:
[Formula (1): image in the original document]
In formula (1): d(o_i, o_j) is the Euclidean distance between data o_i and data o_j; d_MAX(o_i, o_j) is the maximum Euclidean distance between o_i and the data lying within the cluster radius r around o_i, where o_j in d_MAX(o_i, o_j) denotes the data object within radius r of o_i that has the maximum Euclidean distance from o_i; α is a parameter that adjusts the similarity between data objects, with α = 4 in formula (1);
The pick-up probability function and the put-down probability function in the ant colony clustering algorithm are as follows. The pick-up probability function is expressed as:
[Formula (2-4): image in the original document]
where the similarity f in formula (2-4) is determined according to formula (1); b = 0.3 and k_1 = 11.11;
The put-down probability function is expressed as:
p_d = k_2 f^2 for 0 ≤ f ≤ 0.3; 1 for f > 0.3    (3-6)
where the similarity f in formula (3-6) is determined according to formula (1); k_2 = 11.11;
The handling of isolated points and atypical class regions in the ant colony clustering algorithm is as follows:
isolated points are re-classified according to the nearest-neighbour criterion;
atypical class regions are merged with other class regions according to the nearest-neighbour criterion;
The random probability range of the ant colony clustering algorithm is determined from the statistics of the values computed with formulas (2-4) and (3-6).
A codebook classification device for the codebook classification method based on the ant colony clustering algorithm described in the above technical solution comprises a sub-codebook feature value unit, a sub-codebook codeword count unit and a classified codebook unit.
The sub-codebook feature value unit stores the sub-codebook feature values obtained with the codebook classification method and is used to determine the position of the sub-codebook when the input vector to be quantized is classified against the codebook; the sub-codebook feature value unit is located in the encoder unit of the codebook-classified vector quantizer device. The sub-codebook codeword count unit stores the number of codewords contained in each sub-codebook obtained with the codebook classification method and is used, when the input vector to be quantized is classified against the codebook, to determine the position and the extent of the sub-codebook; the sub-codebook codeword count unit is located in the encoder unit of the codebook-classified vector quantizer device. The classified codebook unit stores the codebook obtained by arranging the sub-codebooks in the same order as the contents of the sub-codebook feature value unit, and is located in both the encoder unit and the decoder unit of the codebook-classified vector quantizer device.
The codebook-classified vector quantizer described in the above technical solution is a vector quantizer that contains the codebook classification device; the codebook-classified vector quantizer device consists of an encoder unit and a decoder unit. The encoder unit of the codebook-classified vector quantizer device comprises the codebook classification device and a codebook classification quantization module; the codebook classification quantization module determines, in the codebook classification device, the quantization vector corresponding to the input vector to be quantized and writes the quantization index of the quantization vector into the bitstream. The decoder unit of the codebook-classified vector quantizer device comprises a classified codebook unit and a decoding module; the decoding module receives the quantization index delivered to the decoder unit in the bitstream and looks up, in the classified codebook unit, the reconstructed vector of the input vector corresponding to the quantization index value.
In the codebook classification method and codebook classification device based on the ant colony clustering algorithm according to the present invention, the codebook classification method uses the ant colony clustering algorithm. Compared with existing ant colony clustering algorithms, the present invention proposes new pick-up and put-down probability functions and matches the range of the random probability to the range of the pick-up and put-down probability function values, which speeds up the convergence of the algorithm and improves the clustering performance.
Compared with an existing codebook designed with the LBG algorithm, a codebook classified by the codebook classification method has the same size and the same codewords, but the order of the codewords changes markedly: codewords with the same sub-codebook feature value are grouped into one sub-codebook, and the sub-codebooks, arranged in the same order as their feature values, form the classified codebook stored in the codebook classification device. When quantizing, the complexity of a codebook-classified vector quantizer based on the codebook classification method and codebook classification device is the complexity added by searching the codebook classification information plus the time complexity of searching the quantization vector in the sub-codebook. Because the search range of a sub-codebook plus the search range of the codebook classification information is smaller than the search range of the whole classified codebook, the time complexity of this codebook-classified vector quantizer during quantization is reduced substantially.
For example, when the first stage of the ISF-parameter quantization of the AMR-WB speech coding algorithm is performed with the codebook-classified vector quantizer of the present invention, taking the quantization of the 7-dimensional sub-vector as an example, the codebook size for the 7-dimensional sub-vector is 256. If the number of classes obtained by automatic clustering of this codebook with the optimized ant colony clustering algorithm is 14, the additional storage required by the present invention at the encoder side is 28 floating-point storage units, while the codebook search range needed to obtain the quantization vector in the classified codebook becomes much smaller, so the quantization complexity is reduced considerably. Taking as an example the codeword counts of the sub-codebooks formed in one of the clustering results, the numbers of codewords contained in the individual sub-codebooks are {39, 35, 13, 16, 15, 11, 7, 8, 52, 17, 15, 13, 7, 8}. As can be seen, the sub-codebooks contain unequal numbers of codewords. The number of codewords in each sub-codebook determines the time complexity of the vector quantization: the fewer the codewords, the smaller the codeword search range and the lower the time complexity during quantization. When the codebook-classified vector quantizer quantizes, the input 7-dimensional sub-vector is first compared with the feature value of each sub-codebook using the Euclidean distance measure to determine the sub-codebook and its position, and its quantization value is then determined within that sub-codebook with the exhaustive-search algorithm. If the time complexity of a quantizer is measured by the number of addition (subtraction), multiplication and comparison operations, N denotes the codebook size and k the dimension of the quantization vector, then the time complexity of the exhaustive-search vector quantizer is 3Nk-1 operations; this 3Nk-1 operation count is well known to those skilled in the art. When the exhaustive-search vector quantizer and the codebook-classified vector quantizer are used for quantization, the time complexities compare as follows:
Time complexity of the exhaustive-search vector quantizer:
3Nk-1 = 3 × 256 × 7 - 1 = 5375 (operations per input vector)
Maximum time complexity of the codebook-classified vector quantizer:
(3 × 14 × 7 - 1) + (3 × 52 × 7 - 1) = 3 × 66 × 7 - 2 = 1384 (operations per input vector)
Minimum time complexity of the codebook-classified vector quantizer:
(3 × 14 × 7 - 1) + (3 × 7 × 7 - 1) = 3 × 21 × 7 - 2 = 439 (operations per input vector)
In the worst case, the time complexity of the codebook-classified vector quantizer is therefore reduced to the following percentage of the exhaustive search:
1384/5375 × 100% = 25.75%
In the best case it is reduced to:
439/5375 × 100% = 8.17%.
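These figures can be checked with a few lines of Python using the operation-count model 3Nk-1 quoted above and the codeword counts listed earlier:

# Operation count model from the text: an exhaustive search over an N-codeword,
# k-dimensional codebook costs 3*N*k - 1 add/subtract, multiply and compare operations.
def full_search_ops(N, k):
    return 3 * N * k - 1

k = 7                      # dimension of the sub-vector being quantized
N = 256                    # total codebook size
counts = [39, 35, 13, 16, 15, 11, 7, 8, 52, 17, 15, 13, 7, 8]   # codewords per sub-codebook
n_classes = len(counts)    # 14 sub-codebooks

exhaustive = full_search_ops(N, k)                                        # 5375
worst = full_search_ops(n_classes, k) + full_search_ops(max(counts), k)   # 1384
best = full_search_ops(n_classes, k) + full_search_ops(min(counts), k)    # 439

print(exhaustive, worst, best)                                   # 5375 1384 439
print(f"{worst / exhaustive:.2%}", f"{best / exhaustive:.2%}")   # 25.75% and 8.17%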
Compared with existing classified vector quantizers, a notable difference is that the classifier designed in the present invention only needs to be stored at the encoder side; the decoder side does not need to store it. The sub-codebook feature value information and the sub-codebook codeword count information associated with the codebook classification device do not need to be transmitted to the decoder side either, and therefore occupy no quantization bits. For the same number of quantization bits, a part of the bits can thus be saved and used for other parts of the coding algorithm. In a speech coding algorithm, if the number of bits used by the remaining parts stays the same, the quantization bits of the quantizer are reduced and the overall bit rate of the algorithm is therefore reduced.
In summary, in the codebook classification method and codebook classification device based on the ant colony clustering algorithm according to the present invention, the codebook classification method uses the ant colony clustering algorithm. Compared with the prior art, the present invention proposes new pick-up and put-down probability functions and matches the range of the random probability to the values of these probability functions, which speeds up the convergence of the algorithm and improves the clustering performance. The codebook classified by the codebook classification method consists of sub-codebooks; the order of the sub-codebooks in the classified codebook is the same as the order of the sub-codebook feature values, and each sub-codebook consists of the codewords having the same sub-codebook feature value. When the vector quantizer quantizes, the sub-codebook feature values and the sub-codebook codeword counts restrict the codeword search range of the vector quantizer to a single sub-codebook of the classified codebook, which reduces the codeword search range and the time complexity of the vector quantizer. The classification information does not need to be transmitted over the channel, so the number of quantization bits does not increase, and the quantization performance of the vector quantizer reaches transparent quantization.
Description of drawings
Fig. 1 is a block diagram of the principle of the codebook classification method based on the ant colony clustering algorithm provided by the embodiment of the invention;
Fig. 2 compares the variation of the pick-up probability function curves provided by the embodiment of the invention;
Fig. 3 compares the variation of the put-down probability function curves provided by the embodiment of the invention;
Fig. 4 is a block diagram of the principle of the ant colony clustering algorithm provided by the embodiment of the invention;
Fig. 5 is a schematic structural diagram of the codebook classification device provided by the embodiment of the invention;
Fig. 6 is a schematic structural diagram of the codebook-classified vector quantizer device provided by the embodiment of the invention;
Fig. 7 is a schematic structural diagram of the multi-stage split vector quantizer device containing the codebook classification device provided by the embodiment of the invention.
Embodiments
The codebook classification method based on the ant colony clustering algorithm and the codebook classification device thereof according to the present invention are described in further detail below.
Embodiment 1
The concrete implementation of the codebook classification method based on the ant colony clustering algorithm of the present invention comprises two processes: codebook classification and codebook rearrangement.
Fig. 1 shows the block diagram of the codebook classification method based on the optimized ant colony clustering algorithm provided by the embodiment of the invention. When the codebook is classified with the optimized ant colony clustering algorithm, the data to be clustered that are input to the optimized ant colony clustering algorithm are the codebook designed in advance; the codebook can be designed with the LBG codebook design algorithm, which is well known to those skilled in the art.
The codebook classification process based on the optimized ant colony clustering algorithm comprises the following steps:
Step 1. Initialize the parameters, including the maximum number of clustering cycles N_MAX, the threshold N_1MAX for the maximum number of moves an ant may make while carrying one object, the number of ants, the two-dimensional plane and the initial value of the cluster radius.
In the present invention N_MAX = 20000, N_1MAX = 200, the number of ants is 1, the two-dimensional plane is limited to the region (0-100, 0-100), and the cluster radius r is set to 3.
Step 2. Map the data objects to be clustered onto the two-dimensional plane at random: each data object is given a random coordinate in the initial two-dimensional plane region, and a random probability p_r distributed between 0 and 60% is generated.
Step 3. Place the ant at a random position within the limited region of the two-dimensional plane and set the initial state of the ant to unloaded.
Step 4. Compute the similarity parameter f(o_i) according to formula (1):
[Formula (1): image in the original document]
In formula (1), d(o_i, o_j) is the Euclidean distance between two data objects o_i and o_j mapped onto the two-dimensional plane, d_MAX(o_i, o_j) is the maximum Euclidean distance between o_i and the data within the cluster radius r around o_i, and o_j in d_MAX denotes the data object within radius r that has the maximum Euclidean distance from o_i. α is a parameter that adjusts the similarity between data objects; its value determines the number of clusters and the convergence speed. The larger α is, the larger the degree of similarity between objects, so that objects that are not very alike may be put into one class; the number of clusters is then smaller and convergence is faster. Conversely, the smaller α is, the smaller the degree of similarity between objects, and in the extreme case one large class may be split into many small groups; the number of clusters then increases and convergence slows down. Based on repeated experiments, the present invention finally takes α = 4.
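Because formula (1) itself appears only as an image in the original publication, the sketch below only gathers the quantities it is defined in terms of (the distances d(o_i, o_j) and their maximum d_MAX within radius r) and leaves the actual combination to a caller-supplied function; the toy combination in the usage lines is a purely hypothetical stand-in, not the patent's formula:

import numpy as np

def similarity_inputs(i, data, plane_pos, r=3.0):
    """Quantities entering formula (1): the Euclidean distances d(o_i, o_j) from o_i to every
    object o_j lying within cluster radius r of o_i, and their maximum d_MAX(o_i, o_j).
    (Radius test on the 2-D plane positions, distances on the data objects themselves;
    this split is an assumption, the text does not spell it out.)"""
    near = [j for j in range(len(data)) if j != i
            and np.linalg.norm(plane_pos[j] - plane_pos[i]) <= r]
    d = np.array([np.linalg.norm(data[j] - data[i]) for j in near])
    return d, (float(d.max()) if len(d) else 0.0)

def similarity(i, data, plane_pos, combine, r=3.0, alpha=4.0):
    """Similarity parameter f(o_i).  `combine(d, d_max, alpha)` stands in for formula (1)."""
    d, d_max = similarity_inputs(i, data, plane_pos, r)
    if len(d) == 0 or d_max == 0.0:
        return 0.0
    return float(np.clip(combine(d, d_max, alpha), 0.0, 1.0))

# Usage with a purely hypothetical stand-in for formula (1):
rng = np.random.default_rng(0)
data = rng.normal(size=(20, 9))                 # e.g. 9-dimensional codewords
plane_pos = rng.uniform(0, 100, size=(20, 2))   # random positions on the 0-100 plane
toy_combine = lambda d, d_max, alpha: (1.0 - float(np.mean(d)) / d_max) / alpha
print(similarity(0, data, plane_pos, toy_combine, r=30.0))   # larger radius so the toy example has neighbours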
Step 5. If the current state of the ant is unloaded, compute the pick-up probability with the pick-up probability function and check whether the pick-up probability is greater than the random probability. If it is, the ant picks up the object at its position and starts moving the object; otherwise the ant moves at random to another position and the algorithm jumps back to step 4 to compute the similarity parameter.
For an unloaded, randomly moving ant, the existing pick-up probability functions for picking up an object take the following forms:
(1) LF/Deneubourg basic model:
p_p = (k_1/(k_1 + f))^2    (2-1)
(2) Sigmoid function:
p_p = 1/(1 + e^(k_1 f))    (2-2)
(3) Piecewise function:
p_p = 1 - k_1 f    (2-3)
In the three formulas above, k_1 is a threshold constant whose value is chosen according to the actual application, and the values of k_1 used in the three formulas are generally different.
The present invention defines the pick-up probability function as formula (2-4), in which the parameter f is the similarity determined according to formula (1) and b and k_1 are threshold constants. Their values affect the convergence speed of the algorithm and must satisfy the requirement that the pick-up probability is as large as possible when the similarity is small and becomes smaller and smaller as the similarity gradually increases. Repeated experiments show that the similarity computed according to formula (1) is generally distributed in the range 0 ≤ f ≤ 0.3; the present invention therefore takes b = 0.3 and hence k_1 = 11.11.
[Formula (2-4): image in the original document]
Fig. 2 compares the curves of the four pick-up probability functions of formulas (2-1) to (2-4) as a function of the similarity f. Compared with the first three existing pick-up probability curves, when the similarity lies in the range 0 ≤ f ≤ 0.3, the pick-up probability function given by the present invention decreases gradually from 100% as f increases, tends to 0 as f approaches 0.3, and is 0 for f greater than 0.3. Experiments show that the similarity f computed according to formula (1) practically never exceeds 0.3, so the values of the pick-up probability function match the variation range of the similarity parameter of formula (1), which helps the ant pick up quickly those objects that differ strongly from their surroundings.
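The pick-up probability functions (2-1) to (2-3) can be sketched directly in Python; because formula (2-4) is reproduced only as an image, the "proposed" function below uses one quadratic form that matches the behaviour described above (equal to 1 at f = 0, falling to 0 at f = b = 0.3, and 0 beyond), so it is an assumption rather than the exact expression. A single k_1 is used for all curves purely for comparison, although the text notes that the constants generally differ:

import math

K1 = 11.11   # threshold constant from the text
B = 0.3      # upper end of the observed similarity range

def pickup_deneubourg(f, k1=K1):     # (2-1) LF/Deneubourg basic model
    return (k1 / (k1 + f)) ** 2

def pickup_sigmoid(f, k1=K1):        # (2-2) Sigmoid form
    return 1.0 / (1.0 + math.exp(k1 * f))

def pickup_piecewise(f, k1=K1):      # (2-3) linear form, shown as given (not clamped)
    return 1.0 - k1 * f

def pickup_proposed(f, k1=K1, b=B):  # (2-4) assumed quadratic form matching the described behaviour
    return k1 * (b - f) ** 2 if 0.0 <= f <= b else 0.0

for f in (0.0, 0.1, 0.2, 0.3, 0.4):
    print(f, round(pickup_proposed(f), 3), round(pickup_deneubourg(f), 3), round(pickup_sigmoid(f), 3))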
Step 6. If the current state of the ant is loaded, compute the put-down probability with the put-down probability function below and check whether the put-down probability is greater than the random probability. If it is, the ant puts the object down, the ant state is set to unloaded and the algorithm goes to step 7. Otherwise the loaded ant continues to move the object to a new data position and the counter of moves of the ant with this same object is increased by 1; if this counter exceeds the threshold N_1MAX, the counter is reset to 0, the ant puts the object down, the ant state is set to unloaded and the algorithm goes to step 7; otherwise it goes to step 4.
By introducing the counter of moves of the ant with the same object and checking in this step whether it exceeds the threshold N_1MAX, the present invention prevents the ant from carrying the same object indefinitely without finding a position to put it down, which would put the program into an endless loop: once the number of moves with the same object exceeds the threshold N_1MAX, the object must be put down regardless of whether the put-down probability condition is satisfied.
The existing put-down probability functions with which a randomly moving loaded ant puts down an object are defined in the following forms:
(1) Piecewise function:
p_d = k_2 f for f < 1/k_2; 0 for f ≤ 0; 1 for f = 1    (3-1)
(2) LF basic model:
p_d = 2f for f < k_2; 1 for f ≥ k_2    (3-2)
(3) Sigmoid function:
p_d = 1/(1 + e^(-k_2 f))    (3-3)
(4) Deneubourg basic model:
p_d = (f/(k_2 + f))^2    (3-4)
(5) LF improved model:
p_d = f for f < 1.0; 1 for f ≥ 1    (3-5)
In these formulas the parameter f is the similarity determined according to formula (1) and k_2 is a threshold constant whose value is chosen according to the actual application; the values of k_2 used in the five formulas are generally different. Formulas (3-1), (3-2) and (3-5) all follow a straight-line rule and differ only in slope: the LF basic model has slope 2, the LF improved model has slope 1, and the piecewise function has slope k_2 when f < 1/k_2. In the further comparison below, only the LF improved model is used as the straight-line reference.
The put-down probability function should vary such that the larger the similarity, the larger the put-down probability, and the smaller the similarity, the smaller the put-down probability. For this purpose the present invention uses a quadratically increasing curve and defines the put-down probability function as formula (3-6):
p_d = k_2 f^2 for 0 ≤ f ≤ 0.3; 1 for f > 0.3    (3-6)
The quadratic curve of formula (3-6) meets the requirement that the put-down probability is small when the similarity is small and grows as the similarity increases. The coefficient k_2 in front directly influences how the probability changes: if k_2 is too small, the quadratically increasing curve grows too slowly with f, and if k_2 is too large, it grows too quickly and the probability approaches 100% too soon. Because the similarity of formula (1) varies essentially within the interval 0 ≤ f ≤ 0.3, p_d should equal 100% for f > 0.3, which gives k_2 = 11.11.
Fig. 3 shows the curves of the existing put-down probability functions of formulas (3-3) to (3-5) and of the newly proposed put-down probability function of formula (3-6) as a function of the similarity. For 0 ≤ f ≤ 0.3, the put-down probability of formula (3-6) increases fastest with f, and for f > 0.3 it becomes 100%, whereas the other typical curves still change only slowly for f > 0.3. Experimental results show that the similarity f computed according to formula (1) practically never exceeds 0.3, so if the put-down probability were still too small when f is near 0.3, the ant could not put the object down quickly. The variation of the put-down probability function of formula (3-6) therefore increases the speed with which the ant puts down the object it carries, which improves the efficiency of the algorithm and prevents the ant from carrying an object for a long time without being able to set it down.
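The put-down probability functions can be sketched in the same way; formula (3-6) follows the quadratic form given above, and the LF basic model (3-2) is written in its standard form from the literature. As before, a single k_2 is used for all curves only for the comparison, although the text notes that the constants generally differ:

import math

K2 = 11.11
B = 0.3

def putdown_piecewise(f, k2=K2):     # (3-1)
    if f <= 0.0:
        return 0.0
    return k2 * f if f < 1.0 / k2 else 1.0

def putdown_lf(f, k2=K2):            # (3-2) LF basic model, slope 2 (standard literature form)
    return 2.0 * f if f < k2 else 1.0

def putdown_sigmoid(f, k2=K2):       # (3-3)
    return 1.0 / (1.0 + math.exp(-k2 * f))

def putdown_deneubourg(f, k2=K2):    # (3-4)
    return (f / (k2 + f)) ** 2

def putdown_lf_improved(f):          # (3-5) LF improved model, slope 1
    return f if f < 1.0 else 1.0

def putdown_proposed(f, k2=K2, b=B): # (3-6) quadratic increase, saturating at 1 for f > b
    return k2 * f * f if 0.0 <= f <= b else 1.0

for f in (0.0, 0.1, 0.2, 0.3, 0.4):
    print(f, round(putdown_proposed(f), 3), round(putdown_lf_improved(f), 3), round(putdown_sigmoid(f), 3))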
Step 7. Assign a new data position to the ant, generate a new random probability p_r between 0 and 60%, and increase the cycle counter by 1. If the cycle counter exceeds the maximum number of cycles N_MAX, the clustering loop ends and the algorithm goes to step 8; otherwise it goes to step 4.
The random probability p_r between 0 and 60% is determined according to the random probability range of the ant colony clustering algorithm, and this range is determined from the statistics of the values computed with formulas (2-4) and (3-6).
The statistics are obtained by evaluating formulas (2-4) and (3-6); the results show that the values of p_d and p_p are distributed in the range 0-0.6 and never exceed 0.6. If the random probability p_r were still taken between 0 and 100% as in the existing algorithms, the values of the put-down and pick-up probability functions would for long periods not be larger than the random probability generated by the system; a loaded ant that should put an object down could then not do so immediately, an unloaded ant that should pick an object up could not do so immediately, the objects on the two-dimensional plane would not quickly form clusters according to their similarity, and the convergence speed and clustering result of the algorithm would suffer.
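A compact sketch of the clustering loop of steps 3 to 7 (a single ant) is given below; the similarity and probability functions are passed in as callables because their exact forms are fixed by formulas (1), (2-4) and (3-6), and the defaults shown are placeholders only:

import numpy as np

def ant_cluster(data, n_iter=20000, n1_max=200, plane=100.0, r=3.0,
                similarity=None, p_pick=None, p_drop=None, seed=0):
    """Skeleton of steps 3-7.  The default similarity is a constant placeholder; the default
    probability functions follow the hedged forms sketched above for (2-4) and (3-6)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    pos = rng.uniform(0.0, plane, size=(n, 2))        # step 2: scatter the objects on the plane
    similarity = similarity or (lambda i, d, p: 0.15)                                  # placeholder
    p_pick = p_pick or (lambda f: 11.11 * (0.3 - f) ** 2 if 0.0 <= f <= 0.3 else 0.0)  # assumed (2-4)
    p_drop = p_drop or (lambda f: 11.11 * f * f if 0.0 <= f <= 0.3 else 1.0)           # (3-6)
    carrying, moves = None, 0                         # step 3: the ant starts unloaded
    for _ in range(n_iter):
        i = int(rng.integers(n)) if carrying is None else carrying
        f = similarity(i, data, pos)                  # step 4
        p_r = rng.uniform(0.0, 0.6)                   # random probability in the 0-60% range
        if carrying is None:                          # step 5: unloaded ant
            if p_pick(f) > p_r:
                carrying, moves = i, 0
        else:                                         # step 6: loaded ant
            moves += 1
            if p_drop(f) > p_r or moves > n1_max:
                carrying, moves = None, 0
            else:
                pos[i] = rng.uniform(0.0, plane, size=2)   # keep moving the carried object
        # step 7: the ant itself jumps to a new random position (left implicit in this skeleton)
    return pos                                        # final plane positions, grouped by radius r in step 8

# e.g. clustering a toy 256 x 7 codebook:
final_pos = ant_cluster(np.random.default_rng(1).normal(size=(256, 7)), n_iter=2000)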
Step 8. Divide the clustering result into subspaces according to the cluster radius. If during this division an input vector belongs to several subspaces at the same time, the Euclidean distance criterion is additionally used to decide its final class. Isolated points that remain unclassified after clustering are handled with the nearest-neighbour criterion: the input vector nearest to the isolated point is found according to the Euclidean distance criterion and the isolated point is put into the same class as that nearest input vector; if the nearest input vector is itself an isolated point, a new class containing these two isolated points is created.
An isolated point is an input vector that does not belong to any subspace when the clustering has finished, i.e. an outlier.
As described with reference to Fig. 4, when the optimized ant colony clustering algorithm has finished running, the codebook classification process ends and the codebook rearrangement process is carried out. The codebook rearrangement process comprises the following steps (a sketch of the whole rearrangement follows the step list):
Step (1). Count the input vectors of each non-empty subspace and compute the centroid of each subspace; a non-empty subspace is a subspace that contains at least one input vector.
Step (2). Merge atypical class regions with other class regions according to the nearest-neighbour criterion, i.e. subspaces whose number of input vectors is smaller than a particular value are handled with the nearest-neighbour criterion. This particular value is determined from the number of input vectors; in the present invention the number of input vectors is 256 and the threshold for an atypical class region is 5 input vectors.
An atypical class region is a subspace whose number of input vectors is smaller than this particular value.
Handling with the nearest-neighbour criterion means finding the subspace whose centroid is closest in Euclidean distance and merging the input vectors of the two subspaces into one class. This step prevents subspaces with fewer input vectors than the particular value from producing atypical centroids; therefore such subspaces are merged with other subspaces.
The other class regions are the non-empty subspaces other than the atypical class regions.
Step (3). Recount the input vectors of each non-empty subspace and recompute the centroid of each subspace.
Step (4). Store the input vectors together according to the subspace they belong to, forming one sub-codebook per subspace; the feature value of a sub-codebook is represented by the centroid of the corresponding subspace.
Step (5). Arrange the sub-codebooks in the same order as their feature values to form the classified codebook.
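As announced above, the rearrangement of steps (1) to (5) can be sketched as follows; the greedy radius-r grouping stands in for the subspace division of step 8 and is a simplification made only for this sketch:

import numpy as np

def rearrange(codewords, plane_pos, r=3.0, min_size=5):
    """Steps (1)-(5): group the clustered plane positions into subspaces of radius r, merge
    undersized (atypical) subspaces into their nearest neighbour, then build the sub-codebook
    feature values (centroids), codeword counts and the classified codebook."""
    n = len(codewords)
    label = -np.ones(n, dtype=int)
    for i in range(n):                                # greedy grouping by plane distance <= r
        if label[i] < 0:
            label[i] = label.max() + 1
        for j in range(i + 1, n):
            if label[j] < 0 and np.linalg.norm(plane_pos[j] - plane_pos[i]) <= r:
                label[j] = label[i]

    def centroid(c):                                  # sub-codebook feature value
        return codewords[label == c].mean(axis=0)

    # step (2): merge atypical subspaces (fewer than min_size members) into the subspace
    # whose centroid is nearest (nearest-neighbour criterion)
    changed = True
    while changed:
        changed = False
        classes = list(np.unique(label))
        small = [c for c in classes if np.sum(label == c) < min_size]
        if small and len(classes) > 1:
            c = small[0]
            others = [o for o in classes if o != c]
            tgt = min(others, key=lambda o: np.linalg.norm(centroid(o) - centroid(c)))
            label[label == c] = tgt
            changed = True

    # steps (3)-(5): recompute the centroids, record the codeword counts and stack the
    # sub-codebooks in the same order as their feature values
    classes = list(np.unique(label))
    feature_values = np.stack([centroid(c) for c in classes])
    counts = [int(np.sum(label == c)) for c in classes]
    classified_codebook = np.vstack([codewords[label == c] for c in classes])
    return feature_values, counts, classified_codebook

# toy usage with made-up codewords and plane positions:
cw = np.random.default_rng(2).normal(size=(40, 7))
pp = np.random.default_rng(3).uniform(0, 100, size=(40, 2))
fv, counts, cb = rearrange(cw, pp, r=20.0)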
Steps 1 to 8 together with steps (1) to (5) form the complete codebook classification method based on the ant colony clustering algorithm provided by the embodiment of the invention. Steps 1 to 8 alone can also serve as an embodiment of the codebook classification process based on the ant colony clustering algorithm, and steps (1) to (5) alone can serve as an embodiment of the codebook rearrangement process based on the ant colony clustering algorithm.
Embodiment 2
The codebook classification device for the codebook classification method based on the ant colony clustering algorithm provided by the embodiment of the invention is now described in detail as follows:
Fig. 5 shows the structure of the codebook classification device for the codebook classification method based on the ant colony clustering algorithm provided by the embodiment of the invention. The device comprises a sub-codebook feature value unit, a sub-codebook codeword count unit and a classified codebook unit.
The sub-codebook feature values stored in the sub-codebook feature value unit are used to determine the position of the sub-codebook when the input vector to be quantized is classified against the codebook. The position of this sub-codebook feature value, together with the number of codewords of each sub-codebook stored in the sub-codebook codeword count unit, jointly determines the entry address of the sub-codebook corresponding to the input vector in the classified codebook unit; this entry address, together with the codeword count of the sub-codebook corresponding to the determined feature value position, jointly determines the exit address of that sub-codebook in the classified codebook unit.
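How the feature-value position and the codeword counts determine the entry and exit addresses can be shown with a short sketch (array layouts and variable names are assumptions made for illustration):

import numpy as np

def sub_codebook_bounds(x, feature_values, counts):
    """Return the index of the selected sub-codebook and its entry/exit addresses in the
    classified codebook, given the stored feature values and per-sub-codebook codeword counts;
    the class decision uses the Euclidean distance measure described in the text."""
    c = int(np.argmin(np.sum((np.asarray(feature_values) - x) ** 2, axis=1)))
    entry = int(np.sum(counts[:c]))        # codewords of all preceding sub-codebooks
    exit_ = entry + counts[c]              # one past the last codeword of sub-codebook c
    return c, entry, exit_

# e.g. with the 14 codeword counts quoted earlier, the third sub-codebook (index 2)
# occupies addresses 74..86 of the 256-codeword classified codebook:
counts = [39, 35, 13, 16, 15, 11, 7, 8, 52, 17, 15, 13, 7, 8]
print(sum(counts[:2]), sum(counts[:2]) + counts[2])   # 74 87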
The sub-codebook codeword count unit stores the number of codewords contained in each sub-codebook obtained with the codebook classification method, is located in the encoder unit of the codebook-classified vector quantizer device, and is used to determine the position and the extent of the sub-codebook when the input vector to be quantized is classified against the codebook.
The classified codebook unit stores the codebook obtained by arranging the sub-codebooks in the same order as the contents of the sub-codebook feature value unit, is located in both the encoder unit and the decoder unit of the codebook-classified vector quantizer device, and provides the classified codebook to the codebook classification quantization module of the codebook-classified vector quantizer.
All calculations performed by the components of the device of Fig. 5 can use the formulas and computation methods given for the codebook classification method based on the ant colony clustering algorithm in the embodiment of the invention, and the device can serve as an embodiment of the codebook classification device for the codebook classification method based on the ant colony clustering algorithm provided by the embodiment of the invention.
Embodiment 3
Fig. 6 shows the structure of the codebook-classified vector quantizer device based on the ant colony clustering algorithm provided by the embodiment of the invention. The device comprises an encoder unit and a decoder unit; the encoder unit comprises the codebook classification device and a codebook classification quantization module, and the decoder unit comprises a classified (rearranged) codebook unit and a decoding module.
The encoder unit performs vector quantization on the input vectors to be quantized in turn; during quantization the functions of the individual components are as follows:
The codebook classification device determines the corresponding sub-codebook range during vector quantization; once the sub-codebook range is determined, the sub-codebook search range corresponding to the input vector to be quantized is determined. The sub-codebook search range is the set of codewords between the entry address and the exit address of the sub-codebook corresponding to the input vector.
The codebook classification quantization module first determines, in the codebook classification device, the sub-codebook range corresponding to the input vector to be quantized; once the sub-codebook range is determined, it further determines within that sub-codebook the codeword with minimum distance to the input vector. This codeword is the quantization vector of the input vector in the encoder unit, and the position of this codeword in the classified codebook unit is the quantization index of the input vector; the quantization index is then written into the bitstream produced by the encoder.
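A minimal sketch of this encoder/decoder pair follows, assuming the classified codebook, feature values and codeword counts are stored as plain arrays:

import numpy as np

def classify_and_quantize(x, feature_values, counts, classified_codebook):
    """Encoder side: select the sub-codebook whose feature value is closest to x (Euclidean
    distance measure), then search exhaustively only inside that sub-codebook.  The returned
    index is the codeword's position in the whole classified codebook, so the decoder needs
    no class information."""
    c = int(np.argmin(np.sum((feature_values - x) ** 2, axis=1)))
    entry = int(np.sum(counts[:c]))
    sub = classified_codebook[entry:entry + counts[c]]
    j = int(np.argmin(np.sum((sub - x) ** 2, axis=1)))
    return entry + j                                   # quantization index written to the bitstream

def decode(index, classified_codebook):
    """Decoder side: the reconstruction is a plain table lookup in the classified codebook."""
    return classified_codebook[index]

# toy usage with a made-up 256 x 7 classified codebook and the codeword counts quoted earlier:
rng = np.random.default_rng(0)
counts = [39, 35, 13, 16, 15, 11, 7, 8, 52, 17, 15, 13, 7, 8]
cb = rng.normal(size=(256, 7))
bounds = np.cumsum([0] + counts)
fv = np.stack([cb[bounds[c]:bounds[c + 1]].mean(axis=0) for c in range(len(counts))])
x = rng.normal(size=7)
idx = classify_and_quantize(x, fv, counts, cb)
print(idx, decode(idx, cb)[:3])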
The decoder unit obtains the reconstructed vector of the input vector from the bitstream delivered to the decoder side over the channel. The components of the decoder unit are as follows:
The classified (rearranged) codebook unit stores the classified codebook and provides lookups for the decoder during decoding.
The decoding module parses the quantization index from the bitstream delivered to the decoder side over the channel and looks up the reconstructed vector of the input vector in the classified codebook unit according to the quantization index.
All calculations performed by the components of the device of Fig. 6 can use the formulas and computation methods given for the codebook classification method based on the ant colony clustering algorithm in the embodiment of the invention, and the device can serve as an embodiment of the codebook-classified vector quantizer device based on the ant colony clustering algorithm provided by the embodiment of the invention.
Embodiment 4
The quantization parameters used in this embodiment are the 16-dimensional immittance spectral frequencies (ISF) used by the AMR-WB (adaptive multi-rate wideband) speech coder, and the quantization method is a two-stage split vector quantizer. In this example the first stage of the two-stage split vector quantizer uses the codebook-classified vector quantizer based on the ant colony clustering algorithm, while the second-stage split vector quantization still uses the exhaustive-search vector quantizer; the whole is called a two-stage split vector quantizer containing the codebook classification device. Before this quantizer quantizes, the first and second steps described below must be carried out in advance; the quantization process itself comprises the third and fourth steps described below. They are described in detail as follows:
First step: codebook design process.
The codebooks are designed with the LBG algorithm; the LBG codebook design algorithm is well known to those skilled in the art. For the 16-dimensional quantization parameter ISF of the AMR-WB wideband speech coding algorithm, in coding modes 1 to 8, the codebook sizes of the 9-dimensional sub-vector and of the 7-dimensional sub-vector used in the first-stage split vector quantization are both 256. In the second-stage split vector quantization, the 9-dimensional sub-vector is first split into 3-dimensional sub-vectors and the 7-dimensional sub-vector into a 3-dimensional and a 4-dimensional sub-vector; the codebook sizes of the five sub-vectors are 64, 128, 128, 32 and 32 respectively.
Second step: codebook classification and rearrangement process.
As can be seen, the 9-dimensional and 7-dimensional sub-vector codebooks used in the first-stage split vector quantization are all larger than the five sub-vector codebooks of the second-stage split vector quantization. Therefore the codebook-classified vector quantizer is applied only to the first-stage split vector quantization, while the five sub-vector codebooks of the second stage are still searched exhaustively, so the codebook classification and rearrangement process only has to be carried out for the 9-dimensional and 7-dimensional sub-vector codebooks. It comprises the following steps:
(1) Codebook classification with the ant colony clustering algorithm. The 9-dimensional sub-vector codebook designed in the first step is the input vector set of the ant colony clustering algorithm. When the ant colony clustering algorithm has finished, the codewords of the 9-dimensional sub-vector codebook that have the same sub-codebook centroid are arranged together and form the 9-dimensional sub-codebooks, and the centroid of each 9-dimensional sub-codebook is the feature value of that sub-codebook. Similarly, the 7-dimensional sub-vector codebook is used in turn as the input vector set of the ant colony clustering algorithm; when the algorithm has finished, the codewords of the 7-dimensional sub-vector codebook with the same sub-codebook centroid are arranged together and form the 7-dimensional sub-codebooks, and the centroid of each 7-dimensional sub-codebook is the feature value of that sub-codebook.
(2) Codebook rearrangement. The 9-dimensional sub-codebooks are arranged in the same order as the corresponding 9-dimensional sub-codebook feature values to form the final 9-dimensional classified codebook, and the codeword counts of the 9-dimensional sub-codebooks are arranged in the same order as the corresponding feature values. Similarly, the 7-dimensional sub-codebooks are arranged in the same order as the corresponding 7-dimensional sub-codebook feature values to form the final 7-dimensional classified codebook, and the codeword counts of the 7-dimensional sub-codebooks are arranged in the same order as the corresponding feature values.
(3) Storage of the classification information and of the rearranged codebooks.
The 9-dimensional classified codebook is stored in the 9-dimensional classified codebook unit, the codeword counts of the 9-dimensional sub-codebooks are stored in the 9-dimensional sub-codebook codeword count unit, and the feature values of the 9-dimensional sub-codebooks are stored in the 9-dimensional sub-codebook feature value unit; in a concrete implementation the contents of these three units must correspond one to one.
Similarly, the 7-dimensional classified codebook, the codeword counts of the 7-dimensional sub-codebooks and the 7-dimensional sub-codebook feature values are stored in the 7-dimensional classified codebook unit, the 7-dimensional sub-codebook codeword count unit and the 7-dimensional sub-codebook feature value unit respectively, and the contents of these three units must also correspond one to one.
All of the above storage units are located in the corresponding parts of the two-stage split vector quantization device containing the codebook classification device of the AMR-WB wideband speech coding algorithm.
Third step: encoding of the input vector to be quantized.
During encoding, the trained mean vector is first subtracted from the 16-dimensional input vector to be quantized, giving the input residual vector (the mean vector is obtained by training in advance). The 16-dimensional input residual vector is then split into a 9-dimensional sub-vector and a 7-dimensional sub-vector. For the 9-dimensional sub-vector, codebook classification device 1 provides the codebook classification information to codebook classification quantization module 1, and codebook classification quantization module 1 uses the Euclidean distance measure and the provided information to determine the quantization vector of the 9-dimensional sub-vector in the sub-codebook of classified codebook unit 1, together with the quantization index I_11 of this quantization vector in classified codebook 1.
For the 7-dimensional sub-vector, codebook classification device 2 provides the codebook classification information to codebook classification quantization module 2, and codebook classification quantization module 2 uses the Euclidean distance measure and the provided information to determine the quantization vector of the 7-dimensional sub-vector in the sub-codebook of classified codebook unit 2, together with the quantization index I_12 of this quantization vector in classified codebook 2.
In the second-stage quantization, the quantization vector of the 9-dimensional sub-vector is first subtracted from the 9-dimensional sub-vector to obtain the residual vector of the 9-dimensional sub-vector. Split vector quantization encoding module 1 further splits this residual into three 3-dimensional sub-vectors, and for each of the three 3-dimensional sub-vectors the exhaustive-search algorithm with the Euclidean distance measure determines the quantization vector and the quantization indices I_21, I_22 and I_23 in secondary codebook 1, secondary codebook 2 and secondary codebook 3.
Similarly, for the 7-dimensional sub-vector, the quantization vector of the 7-dimensional sub-vector is first subtracted from the 7-dimensional sub-vector to obtain its residual vector; split vector quantization encoding module 2 splits this residual into one 3-dimensional sub-vector and one 4-dimensional sub-vector, and for these two sub-vectors the exhaustive-search algorithm with the Euclidean distance measure determines the quantization vectors and the quantization indices I_24 and I_25 in secondary codebook 4 and secondary codebook 5.
When encoding is finished, the quantization indices I_11, I_12, I_21, I_22, I_23, I_24 and I_25 are written into the bitstream.
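The whole third step can be summarised in a sketch with toy codebooks (all data and codebooks below are made up for illustration; only the splits, dimensions and the order of operations follow the text):

import numpy as np

def nearest(x, codebook):
    """Index of the codeword with minimum Euclidean distortion."""
    return int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))

def classified_vq(x, feature_values, counts, classified_codebook):
    """First-stage quantization with the codebook classification device."""
    c = nearest(x, feature_values)
    entry = int(np.sum(counts[:c]))
    return entry + nearest(x, classified_codebook[entry:entry + counts[c]])

def encode_isf(isf, mean, cls1, cls2, secondary):
    """Two-stage encoding of a 16-dimensional ISF vector; cls1/cls2 are
    (feature_values, counts, classified_codebook) triples for the 9- and 7-dimensional
    first-stage codebooks, `secondary` the five second-stage codebooks."""
    r = isf - mean                                    # subtract the trained mean vector
    s9, s7 = r[:9], r[9:]                             # split into 9- and 7-dimensional sub-vectors
    i11 = classified_vq(s9, *cls1)                    # first-stage indices I_11, I_12
    i12 = classified_vq(s7, *cls2)
    e9 = s9 - cls1[2][i11]                            # first-stage residuals
    e7 = s7 - cls2[2][i12]
    parts = [e9[0:3], e9[3:6], e9[6:9], e7[0:3], e7[3:7]]        # 3+3+3 and 3+4 split
    i2 = [nearest(p, cb) for p, cb in zip(parts, secondary)]     # I_21..I_25
    return [i11, i12] + i2                            # indices written to the bitstream

# toy setup (codebook contents are random, not trained):
rng = np.random.default_rng(0)
mean = rng.normal(size=16)
def toy_classified(dim, n_classes=4, per_class=8):
    cb = rng.normal(size=(n_classes * per_class, dim))
    fv = np.stack([cb[i * per_class:(i + 1) * per_class].mean(axis=0) for i in range(n_classes)])
    return fv, [per_class] * n_classes, cb
cls1, cls2 = toy_classified(9), toy_classified(7)
secondary = [rng.normal(size=(n, d)) for n, d in [(64, 3), (128, 3), (128, 3), (32, 3), (32, 4)]]
print(encode_isf(rng.normal(size=16), mean, cls1, cls2, secondary))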
Fourth step: decoding of the input vector to be quantized.
During decoding, the split vector quantization decoding module parses all quantization indices from the bitstream delivered to the decoder unit over the channel and looks up the quantization vectors of the corresponding sub-vectors in classified codebook 1, classified codebook 2, secondary codebook 1, secondary codebook 2, secondary codebook 3, secondary codebook 4 and secondary codebook 5. The first-stage quantization vector, the second-stage quantization vector and the mean vector are then added component by component for each of the 16 ISF dimensions, which gives the reconstructed vector of the input vector to be quantized.
Fig. 7 shows the structure of the two-stage split vector quantizer device containing the codebook classification device. The device comprises an encoder unit and a decoder unit. The encoder unit comprises a mean vector unit, three adders, codebook classification device 1, codebook classification device 2, codebook classification quantization module 1, codebook classification quantization module 2, split vector quantization encoding module 1, split vector quantization encoding module 2 and a secondary codebook unit. The decoder unit comprises a mean vector unit, a secondary codebook unit, a classified codebook unit and a split vector quantization decoding module. All calculations performed by the components of this device can use the formulas and computation methods given for the codebook classification method and the quantization process in the embodiment of the invention, and the device serves as an embodiment of the two-stage split vector quantization device containing the codebook classification device according to the embodiment of the invention.
The encoder unit performs vector quantization on the input vectors to be quantized in turn; the components of the encoder unit are described as follows:
The mean vector unit stores the mean of each of the 16 immittance spectral frequency (ISF) dimensions; it has to be trained in advance, which is well known to those skilled in the art.
Adder 1 subtracts the per-dimension means stored in the mean vector unit from the corresponding dimensions of the input vector to be quantized; the 1st to 9th dimensions of the difference form residual sub-vector 1, which is provided to codebook classification device 1, and the 10th to 16th dimensions form residual sub-vector 2, which is provided to codebook classification device 2.
Adder 2 subtracts the quantized value of residual sub-vector 1 obtained by codebook classification quantization module 1 from residual sub-vector 1, giving the quantization residual of residual sub-vector 1, and provides it to split vector quantization encoding module 1.
Adder 3 subtracts the quantized value of residual sub-vector 2 obtained by codebook classification quantization module 2 from residual sub-vector 2, giving the quantization residual of residual sub-vector 2, and provides it to split vector quantization encoding module 2.
Codebook classification device 1 comprises sub-codebook feature value unit 1, sub-codebook codeword count unit 1 and classified codebook unit 1. Their functions are as follows: sub-codebook feature value unit 1 provides the sub-codebook feature values of the 9-dimensional sub-vectors; sub-codebook codeword count unit 1 provides the number of codewords contained in each 9-dimensional sub-codebook; classified codebook unit 1 provides the classified codebook formed by the 9-dimensional sub-codebooks. The contents of these three units jointly determine the sub-codebook search range in classified codebook 1 when residual sub-vector 1 is quantized.
Codebook classification device 2 comprises sub-codebook feature value unit 2, sub-codebook codeword count unit 2 and classified codebook unit 2. Their functions are as follows: sub-codebook feature value unit 2 provides the sub-codebook feature values of the 7-dimensional sub-vectors; sub-codebook codeword count unit 2 provides the number of codewords contained in each 7-dimensional sub-codebook; classified codebook unit 2 provides the classified codebook formed by the 7-dimensional sub-codebooks. The contents of these three units jointly determine the sub-codebook search range in classified codebook 2 when residual sub-vector 2 is quantized.
Codebook classification quantization module 1 determines, within the sub-codebook search range of classified codebook 1 provided by codebook classification device 1, the quantization vector and the quantization index I11 of residual sub-vector 1, writes the quantization index into the bitstream, and supplies the quantization vector to adder 2.
Codebook classification quantization module 2 determines, within the sub-codebook search range of classified codebook 2 provided by codebook classification device 2, the quantization vector and the quantization index I12 of residual sub-vector 2, writes the quantization index into the bitstream, and supplies the quantization vector to adder 3.
Split vector quantization coding module 1 receives the quantization residual of residual sub-vector 1 from adder 2 and, using the split vector quantization method with exhaustive search, determines 3 quantization vectors and the quantization indices I21, I22 and I23 from the first 3 second-stage codebooks of the second-stage codebook unit, and writes the quantization indices into the bitstream.
Split vector quantization coding module 2 receives the quantization residual of residual sub-vector 2 from adder 3 and, using the split vector quantization method with exhaustive search, determines 2 quantization vectors and the quantization indices I24 and I25 from the last 2 second-stage codebooks of the second-stage codebook unit, and writes the quantization indices into the bitstream.
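Both the classification quantization modules and the split vector quantization coding modules reduce to a nearest-codeword search; the only difference is the range of codewords examined. The following minimal sketch assumes the usual squared-error distortion measure, which is stated here as an assumption rather than quoted from the patent.

import numpy as np

def nearest_codeword(x, codebook, start=0, end=None):
    # The classification quantization modules pass the restricted range [start, end)
    # supplied by the codebook classification device; the split vector quantization
    # coding modules search a whole second-stage codebook (start=0, end=None).
    end = len(codebook) if end is None else end
    candidates = np.asarray(codebook)[start:end]
    distortions = np.sum((candidates - np.asarray(x)) ** 2, axis=1)
    j = int(np.argmin(distortions))
    return start + j, candidates[j]   # quantization index and quantization vector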
The split vector quantization method used by split vector quantization coding modules 1 and 2 is well known to those skilled in the art.
The second-stage codebook unit stores five second-stage codebooks, second-stage codebook 1 to second-stage codebook 5, and provides the codebooks used by split vector quantization coding modules 1 and 2 during quantization. The five second-stage codebooks are trained with the LBG codebook design algorithm, which is well known to those skilled in the art.
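For reference, a compact sketch of the LBG design loop mentioned above is given here (binary splitting followed by Lloyd refinement); the perturbation factor, iteration count and empty-cell handling are assumptions of this sketch, not parameters taken from the patent.

import numpy as np

def lbg(train, n_codewords, n_iter=20, eps=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    train = np.asarray(train, dtype=float)
    codebook = train.mean(axis=0, keepdims=True)      # start from the global centroid
    while len(codebook) < n_codewords:
        # Binary split: perturb every codeword up and down.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(n_iter):                       # Lloyd refinement
            dist = ((train[:, None, :] - codebook[None]) ** 2).sum(-1)
            nearest = dist.argmin(axis=1)
            for k in range(len(codebook)):
                members = train[nearest == k]
                if len(members):
                    codebook[k] = members.mean(axis=0)
                else:                                 # re-seed empty cells
                    codebook[k] = train[rng.integers(len(train))]
    return codebook[:n_codewords]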
The decoder unit decodes the bitstream delivered to the decoder over the channel and obtains the reconstructed vector of the input vector to be quantized; each component of the decoder unit is described as follows:
The mean vector unit stores the mean of each component of the 16-dimensional immittance spectral frequency (ISF) vector; its contents are identical to those of the mean vector unit in the encoder unit, and it supplies the means to the split vector quantization decoding module.
The second-stage codebook unit stores five second-stage codebooks, second-stage codebook 1 to second-stage codebook 5, identical to the five second-stage codebooks in the encoder unit, and provides them for lookup by the split vector quantization decoding module.
The classified codebook unit stores classified codebook 1 and classified codebook 2 and provides them for lookup by the split vector quantization decoding module.
The split vector quantization decoding module parses the quantization index values from the bitstream delivered to the decoder over the channel. It looks up the quantized values of the quantization residuals of residual sub-vectors 1 and 2 in the codebooks of the second-stage codebook unit, and the quantized values of residual sub-vectors 1 and 2 in the codebooks of the classified codebook unit. It then adds the quantized value of the quantization residual of residual sub-vector 1 to the quantized value of residual sub-vector 1 and to the means of the first 9 components stored in the mean vector unit, adds the quantized value of the quantization residual of residual sub-vector 2 to the quantized value of residual sub-vector 2 and to the means of the last 7 components, and concatenates the two sums to obtain the reconstructed 16-dimensional input vector.
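A minimal Python sketch of this reconstruction follows. The layout of the index tuple and the assignment of second-stage codebooks 1 to 3 to the 9-dimensional part and 4 to 5 to the 7-dimensional part mirror the encoder description above; the variable names and the codebook data structures are assumptions of this sketch.

import numpy as np

def decode_isf(indices, classified_books, second_stage_books, isf_mean):
    I11, I12, I21, I22, I23, I24, I25 = indices
    q_sub1 = classified_books[0][I11]                      # quantized residual sub-vector 1 (9-dim)
    q_sub2 = classified_books[1][I12]                      # quantized residual sub-vector 2 (7-dim)
    q_res1 = np.concatenate([second_stage_books[0][I21],   # quantization residual of sub-vector 1
                             second_stage_books[1][I22],
                             second_stage_books[2][I23]])
    q_res2 = np.concatenate([second_stage_books[3][I24],   # quantization residual of sub-vector 2
                             second_stage_books[4][I25]])
    front = q_res1 + q_sub1 + np.asarray(isf_mean)[:9]
    back = q_res2 + q_sub2 + np.asarray(isf_mean)[9:]
    return np.concatenate([front, back])                   # reconstructed 16-dim ISF vector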
The two-stage split vector quantization apparatus that includes the codebook classification device was applied to the AMR-WB algorithm; the following experimental results illustrate the effect obtained with the codebook classification method and the codebook classification device provided by the embodiments of the invention.
Spectral distortion is currently a widely used objective criterion for evaluating quantizer performance. In a speech coding algorithm, in order not to introduce any additional audible distortion into the coded speech, the spectral distortion of the vector quantizer is generally required to meet the quality criterion of transparent quantization, namely: the average spectral distortion is about 1 dB; the percentage of frames with spectral distortion above 4 dB tends to 0; the percentage of frames with spectral distortion in the 2 to 4 dB range is about 2%; and no speech frame has a spectral distortion exceeding 4 dB. Table 1 below gives the spectral distortion values obtained when quantizing with the two-stage split vector quantizer that includes the codebook classification device; the codebook classification method uses the optimized ant colony clustering algorithm of the invention, and the number of codebook classes for both the 9-dimensional and the 7-dimensional sub-vector codebooks takes the four values 4, 6, 12 and 14. It can be seen that, with the codebook classification device of the invention, the spectral distortion of the two-stage split vector quantizer reaches the level of transparent quantization.
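The transparent-quantization check can be reproduced with the usual log-spectral distortion definition. The sketch below states that definition as an assumption rather than quoting it from the patent, and expects per-frame power spectra of the original and quantized LPC models.

import numpy as np

def spectral_distortion_stats(P_ref, P_quant, eps=1e-12):
    # Per-frame log-spectral difference in dB (arrays shaped frames x frequency bins).
    d = 10.0 * np.log10(np.asarray(P_ref) + eps) - 10.0 * np.log10(np.asarray(P_quant) + eps)
    sd = np.sqrt(np.mean(d ** 2, axis=1))                  # spectral distortion of each frame, dB
    return {
        "average_sd_dB": float(np.mean(sd)),
        "frames_2_to_4_dB_percent": 100.0 * float(np.mean((sd > 2) & (sd <= 4))),
        "frames_above_4_dB_percent": 100.0 * float(np.mean(sd > 4)),
    }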
Table 2 gives the w-PESQ values of the speech reconstructed by the AMR-WB algorithm. In the table, S-MSVQ denotes the split multi-stage vector quantization method, here with two stages; the data in that column are the w-PESQ values of the reconstructed speech when the AMR-WB algorithm quantizes the ISF parameters with the S-MSVQ method. The columns headed by the number of codebook classes give the w-PESQ values of the reconstructed speech when the AMR-WB algorithm quantizes the ISF parameters with the multi-stage split vector quantizer that includes the codebook classification device; in "4/4" the first 4 means that the 9-dimensional sub-vector codebook is divided into 4 sub-codebooks and the second 4 means that the 7-dimensional sub-vector codebook is also divided into 4 sub-codebooks, and "6/6", "12/12" and "14/14" are read in the same way. The column "difference from S-MSVQ" gives the difference between the w-PESQ value of the speech reconstructed with the multi-stage split vector quantizer that includes the codebook classification device and the w-PESQ value obtained with S-MSVQ.
It can be seen that, compared with the S-MSVQ quantization method, when the number of codebook classes is 4 or 6 the average reconstructed-speech w-PESQ value over the 9 coding modes of the two-stage split vector quantizer with the codebook classification device is slightly higher, and when the number of classes is 12 or 14 it is slightly lower; in both cases the change is small, and in subjective listening the difference from the decoded speech quality of the original algorithm is essentially imperceptible. The largest w-PESQ gain, 0.037, occurs for coding mode 2 with 6 classes; the largest drop, -0.134, occurs for coding mode 0 with 14 classes. It follows that the number of classes used by the codebook classification device should not be made too large, otherwise the reconstructed speech quality degrades noticeably.
The specific embodiments described above further explain the purpose, technical solution and beneficial effects of the invention and are intended to help the reader understand the method of the invention and its underlying idea. They are, however, only specific embodiments of the invention and are not intended to limit its scope; any modification, equivalent replacement or improvement made by a person of ordinary skill in the art according to the idea of the invention shall fall within the protection scope of the invention.
Table 1
[Table 1: spectral distortion values of the two-stage split vector quantizer with the codebook classification device; presented as an image in the original publication and not reproduced here]
Table 2
[Table 2: w-PESQ values of AMR-WB reconstructed speech for S-MSVQ and for the quantizer with the codebook classification device; presented as an image in the original publication and not reproduced here]

Claims (9)

1. A codebook classification method based on an ant colony clustering algorithm, based on immittance spectral frequency (ISF) parameters, the method comprising the steps of:
a codebook classification process:
the codebook designed by a codebook design algorithm is classified into sub-codebooks by the codebook classification method based on the ant colony clustering algorithm, and each sub-codebook is represented by a sub-codebook feature value;
a codebook rearrangement process:
the sub-codebooks obtained by the codebook classification method are combined, in the same order as the ordering of the sub-codebook feature values, to form a classified codebook.
2. The method of claim 1, wherein the ant colony clustering algorithm adopts the following similarity function:
[Formula (1): similarity function; published as an image in the original and not reproduced here]
in formula (1): d(o_i, o_j) is the Euclidean distance between data o_i and data o_j; d_MAX(o_i, o_j) is the maximum Euclidean distance between o_i and the data lying within the cluster radius r around o_i, the o_j in d_MAX(o_i, o_j) being the data point within radius r of o_i that has the maximum Euclidean distance to o_i; α is a parameter regulating the similarity between data objects, with α = 4 in formula (1).
3. The method of claim 1 or 2, wherein the pick-up probability function and the put-down probability function in the ant colony clustering algorithm are expressed as follows:
the pick-up probability function:
[Formula (2-4): pick-up probability function; published as an image in the original and not reproduced here]
where the similarity f in formula (2-4) is determined according to formula (1), b = 0.3 and k1 = 11.11;
the put-down probability function:
[Formula (3-6): put-down probability function; published as an image in the original and not reproduced here]
where the similarity f in formula (3-6) is determined according to formula (1) and k2 = 11.11 (an illustrative sketch of such a pick-up/put-down mechanism is given after the claims).
4. The method of claim 1 or 2, wherein isolated points and atypical class regions in the ant colony clustering algorithm are handled as follows:
isolated points are reassigned according to the nearest-neighbor criterion;
atypical class regions are merged with other class regions according to the nearest-neighbor criterion.
5. The method of claim 1 or 2, wherein the random probability range in the ant colony clustering algorithm is determined from statistics of the values computed with formula (2-4) and formula (3-6).
6. A codebook classification device based on the codebook classification method using the ant colony clustering algorithm, the device comprising a sub-codebook feature value unit, a sub-codebook codeword number unit and a classified codebook unit;
the sub-codebook feature value unit stores the sub-codebook feature values obtained with the codebook classification method and is used to determine the position of the sub-codebook when the input vector to be quantized is classified against the codebook; the sub-codebook feature value unit is located in the encoder unit of the codebook classification vector quantization apparatus;
the sub-codebook codeword number unit stores the number of codewords contained in each sub-codebook obtained with the codebook classification method and is used to determine the position and extent of the sub-codebook when the input vector to be quantized is classified against the codebook; the sub-codebook codeword number unit is located in the encoder unit of the codebook classification vector quantization apparatus;
the classified codebook unit stores the codebook obtained by arranging the sub-codebooks in the same order as the contents of the sub-codebook feature value unit, and is located in the encoder unit and the decoder unit of the codebook classification vector quantization apparatus.
7. The codebook classification device of claim 6, wherein the codebook classification vector quantizer is a vector quantizer that includes the codebook classification device, and the codebook classification vector quantization apparatus consists of an encoder unit and a decoder unit.
8. The codebook classification device of claims 6 and 7, wherein the encoder unit comprises the codebook classification device and a codebook classification quantization module; the codebook classification quantization module determines the quantization vector corresponding to the input vector to be quantized within the codebook classification device and writes the quantization index of the quantization vector into the bitstream.
9. The codebook classification device of claims 6 and 7, wherein the decoder unit comprises the classified codebook unit and a decoding module; the decoding module receives the quantization indices delivered to the decoder unit in the bitstream and searches the classified codebook unit for the reconstructed vector of the input vector to be quantized corresponding to the quantization index values.
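Formulas (1), (2-4) and (3-6) referenced in claims 2 and 3 are published only as images and are not reproduced above. As a purely illustrative sketch of a pick-up/put-down mechanism of this kind, the following Python fragment uses the classical Lumer-Faieta-style probability shapes; the functional forms, the way the constants b = 0.3 and k1 = k2 = 11.11 enter, and the random-probability-range acceleration of claim 5 are therefore stand-ins, not the patented formulas.

import numpy as np

def pick_up_prob(f, k1=11.11):
    # Assumed shape: an ant is likely to pick a codeword up when its
    # neighbourhood similarity f is low.
    return (k1 / (k1 + f)) ** 2

def put_down_prob(f, k2=11.11):
    # Assumed shape: a carried codeword is likely to be put down where
    # the neighbourhood similarity f is high.
    return (f / (k2 + f)) ** 2

rng = np.random.default_rng(0)
f = 5.0                       # hypothetical neighbourhood similarity of the current codeword
carrying = False
if not carrying and rng.random() < pick_up_prob(f):
    carrying = True           # the ant picks the codeword up and keeps moving
elif carrying and rng.random() < put_down_prob(f):
    carrying = False          # the ant drops the codeword into the current cluster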
CN201010267156.8A 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof Expired - Fee Related CN101944358B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010267156.8A CN101944358B (en) 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof


Publications (2)

Publication Number Publication Date
CN101944358A true CN101944358A (en) 2011-01-12
CN101944358B CN101944358B (en) 2014-04-09

Family

ID=43436318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010267156.8A Expired - Fee Related CN101944358B (en) 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof

Country Status (1)

Country Link
CN (1) CN101944358B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070179944A1 (en) * 2005-11-23 2007-08-02 Henry Van Dyke Parunak Hierarchical ant clustering and foraging
CN101266621A (en) * 2008-04-24 2008-09-17 北京学门科技有限公司 High dimension sparse data clustering system and method
CN101414365A (en) * 2008-11-20 2009-04-22 山东大学威海分校 Vector code quantizer based on particle group

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
MARTENS, D.; DE BACKER, M.; et al.: "Classification with Ant Colony Optimization", IEEE Transactions on Evolutionary Computation, 31 October 2007 (2007-10-31), pages 651-665, XP011193173, DOI: 10.1109/TEVC.2006.890229 *
胡宏梅; 董恩清: "Artificial ant colony clustering codebook design algorithm" (人工蚁群聚类码书设计算法), Communications Technology (通信技术), 31 July 2007 (2007-07-31) *
胡宏梅; 董恩清: "Codebook design based on ant colony clustering" (基于蚁群聚类的码书设计), Journal of Soochow University, Engineering Science Edition (苏州大学学报(工科版)), 30 April 2007 (2007-04-30) *
陈雪勤; 赵鹤鸣; 俞一彪: "Tone recognition of whispered speech using ant colony clustering neural networks" (蚁群聚类神经网络的耳语音声调识别), Journal of Applied Sciences (应用科学学报), 31 October 2008 (2008-10-31) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102222098A (en) * 2011-06-20 2011-10-19 北京邮电大学 Method and system for pre-fetching webpage
CN103297766A (en) * 2012-02-23 2013-09-11 中兴通讯股份有限公司 Compression method and device of peak data in three-dimensional image data
US9509973B2 (en) 2012-02-23 2016-11-29 Zte Corporation Method and device for compressing vertex data in three-dimensional image data
CN103297766B (en) * 2012-02-23 2016-12-14 中兴通讯股份有限公司 The compression method of vertex data and device in a kind of 3 d image data
CN104050963A (en) * 2014-06-23 2014-09-17 东南大学 Continuous speech emotion prediction algorithm based on emotion data field
CN104050963B (en) * 2014-06-23 2017-02-15 东南大学 Continuous speech emotion prediction method based on emotion data field
CN104459686A (en) * 2014-12-30 2015-03-25 南京信息工程大学 Target detection and tracking method based on Hough transformation and ant colony similarity
CN106156841A (en) * 2016-06-24 2016-11-23 武汉理工大学 A kind of k means data processing method based on minimax pheromone
CN112435674A (en) * 2020-12-09 2021-03-02 北京百瑞互联技术有限公司 Method, apparatus, medium for optimizing LC3 arithmetic coding search table of spectrum data

Also Published As

Publication number Publication date
CN101944358B (en) 2014-04-09

Similar Documents

Publication Publication Date Title
CN101944358B (en) Ant colony algorithm-based codebook classification method and codebook classification device thereof
CN107622182B (en) Method and system for predicting local structural features of protein
CN103116762B (en) A kind of image classification method based on self-modulation dictionary learning
CN108717439A (en) A kind of Chinese Text Categorization merged based on attention mechanism and characteristic strengthening
CN101145787A (en) A vector quantification method and vector quantifier
CN109977212A (en) Talk with the reply content generation method and terminal device of robot
CN108921343A (en) Based on storehouse self-encoding encoder-support vector regression traffic flow forecasting method
CN101937680B (en) Vector quantization method for sorting and rearranging code book and vector quantizer thereof
CN102915445A (en) Method for classifying hyperspectral remote sensing images of improved neural network
CN102012977A (en) Signal peptide prediction method based on probabilistic neural network ensemble
CN113487855A (en) Traffic flow prediction method based on EMD-GAN neural network structure
CN110263343A (en) The keyword abstraction method and system of phrase-based vector
Palo et al. Classification of emotional speech of children using probabilistic neural network
CN113343640B (en) Method and device for classifying customs commodity HS codes
CN101923650A (en) Random forest classification method and classifiers based on comparison mode
CN111967483A (en) Method and device for determining classifier, determining generator and recognizing command
CN106898357B (en) A kind of vector quantization method based on normal distribution law
CN103310275B (en) Based on the Novel codebook design method of ant colony clustering and genetic algorithm
Zhao et al. Urban short-term traffic flow prediction based on stacked autoencoder
Rashno et al. Highly efficient dimension reduction for text-independent speaker verification based on relieff algorithm and support vector machines
CN113673219B (en) Power failure plan text analysis method
CN115329116A (en) Image retrieval method based on multi-layer feature fusion
Choi et al. Short-utterance embedding enhancement method based on time series forecasting technique for text-independent speaker verification
CN113096378A (en) Highway OD prediction method based on depth set empirical mode decomposition
Kawęcka et al. NA61/SHINE online noise filtering using machine learning methods

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140409

Termination date: 20210827