CN103824086A - Image matching method and device - Google Patents


Info

Publication number: CN103824086A
Application number: CN201410110048.8A
Authority: CN (China)
Prior art keywords: subregion, matched, template, image, histogram
Other languages: Chinese (zh)
Inventors: 郎芬玲, 万定锐
Current Assignee: Netposa Technologies Ltd
Original Assignee: Netposa Technologies Ltd
Application filed by Netposa Technologies Ltd
Priority/filing date: 2014-03-24
Publication date: 2014-05-28
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)

Landscapes

  • Image Analysis (AREA)

Abstract

The application provides an image matching method in which a computer segments an input template image into non-overlapping template subregions of equal size and generates a histogram for each template subregion; segments an image to be matched in an image database into non-overlapping subregions to be matched of equal size and generates a histogram for each subregion to be matched; matches, according to the histogram features of each template subregion, the histogram of every template subregion one by one against the histograms of the subregions to be matched in the corresponding group of subregions to be matched, to obtain a matching result; and, when the matching result satisfies a preset similarity condition, judges the template image and the image to be matched to be similar, and otherwise judges them to be dissimilar. The computer thus obtains the matching result by comparing histogram features and decides whether the two images are similar, so that manual searching is no longer needed when searching for a target image, which saves labour and search time and improves working efficiency.

Description

Image matching method and device
Technical field
The present application relates to the field of image processing, and in particular to an image matching method and device.
Background art
With the large-scale construction of "Safe City" projects, the volume of stored image data is increasing sharply, yet the images in image databases are unordered and unindexed.
At present, finding a target image in a "Safe City" system relies mainly on manual searching. Because the image databases are large and the images are unordered and unindexed, manual searching is time-consuming, labour-intensive and inefficient.
Summary of the invention
To solve the above technical problem, embodiments of the present application provide an image matching method and device, with the aim of saving labour, saving search time and improving working efficiency. The technical solution is as follows:
An image matching method comprises:
a computer segmenting an input template image to obtain non-overlapping template subregions of equal size, and generating a histogram for each template subregion; and
segmenting any image to be matched in an image database to obtain non-overlapping subregions to be matched of equal size, and generating a histogram for each subregion to be matched, the number of the template subregions being equal to the number of the subregions to be matched; and
extracting the histogram features of each template subregion and the histogram features of each subregion to be matched; and
matching, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group of subregions to be matched, to determine a matching result; and
judging whether the matching result satisfies a preset similarity condition; if so, determining that the template image and the image to be matched are similar;
otherwise, determining that the template image and the image to be matched are dissimilar.
Preferably, the process in which the computer segments the input template image to obtain non-overlapping template subregions of equal size and generates a histogram for each template subregion comprises:
the computer judging whether the input template image is a colour image;
if so, converting the template image into a grayscale template image, segmenting the grayscale template image to obtain non-overlapping template subregions of equal size, and generating a histogram for each template subregion;
if not, segmenting the input template image to obtain non-overlapping template subregions of equal size, and generating a histogram for each template subregion;
or, the process of segmenting any image to be matched in the image database to obtain non-overlapping subregions to be matched of equal size and generating a histogram for each subregion to be matched comprises:
the computer judging whether the image to be matched is a colour image;
if so, converting the image to be matched into a grayscale image, segmenting the grayscale image to obtain non-overlapping subregions to be matched of equal size, and generating a histogram for each subregion to be matched;
if not, segmenting the image to be matched to obtain non-overlapping subregions to be matched of equal size, and generating a histogram for each subregion to be matched.
Preferably, the process of extracting the histogram features of any one template subregion comprises:
smoothing the histogram of the template subregion to remove burrs and spikes, and generating a secondary histogram of the template subregion;
recording the distribution of the peaks, valleys and half-peaks of the secondary histogram of the template subregion, and taking the recorded distribution of peaks, valleys and half-peaks as the histogram features of the template subregion;
or, the process of extracting the histogram features of any one subregion to be matched comprises:
smoothing the histogram of the subregion to be matched to remove burrs and spikes, and generating a secondary histogram of the subregion to be matched;
recording the distribution of the peaks, valleys and half-peaks of the secondary histogram of the subregion to be matched, and taking the recorded distribution of peaks, valleys and half-peaks as the histogram features of the subregion to be matched.
Preferably, the process of matching, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group, to determine the matching result, comprises:
calculating, in turn, a matching score between the histogram features of each template subregion and the histogram of each subregion to be matched in the corresponding group;
analysing whether each matching score corresponding to each template subregion is less than a preset threshold, and determining, within the group corresponding to each template subregion, the subregions to be matched whose histogram features match those of the template subregion, as the preferred subregions to be matched of that template subregion;
selecting, from the preferred subregions to be matched of each template subregion, the region whose histogram features differ least from those of the template subregion as the matching region, forming a matching pair;
determining the number of matching pairs.
Preferably, the number of the template subregions and the number of the subregions to be matched are both 9.
Preferably, the process of judging whether the matching result satisfies the preset similarity condition comprises:
judging whether the number of matching pairs is greater than 5.
An image matching device comprises:
a first segmentation module, configured to segment an input template image to obtain non-overlapping template subregions of equal size and to generate a histogram for each template subregion;
a second segmentation module, configured to segment any image to be matched in an image database to obtain non-overlapping subregions to be matched of equal size and to generate a histogram for each subregion to be matched, the number of the template subregions being equal to the number of the subregions to be matched;
an extraction module, configured to extract the histogram features of each template subregion and of each subregion to be matched;
a matching module, configured to match, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group, and to determine a matching result;
a judging module, configured to judge whether the matching result satisfies a preset similarity condition and, if so, to trigger a first determination module and, if not, to trigger a second determination module;
the first determination module, configured to determine that the template image and the image to be matched are similar;
the second determination module, configured to determine that the template image and the image to be matched are dissimilar.
Preferably, the first segmentation module comprises:
a first judging unit, configured to judge whether the input template image is a colour image and, if so, to trigger a first conversion unit and, if not, to trigger a first segmentation unit;
the first conversion unit, configured to convert the template image into a grayscale template image, segment the grayscale template image to obtain non-overlapping template subregions of equal size, and generate a histogram for each template subregion;
the first segmentation unit, configured to segment the input template image to obtain non-overlapping template subregions of equal size and generate a histogram for each template subregion;
or, the second segmentation module comprises:
a second judging unit, configured to judge whether the image to be matched is a colour image and, if so, to trigger a second conversion unit and, if not, to trigger a second segmentation unit;
the second conversion unit, configured to convert the image to be matched into a grayscale image, segment the grayscale image to obtain non-overlapping subregions to be matched of equal size, and generate a histogram for each subregion to be matched;
the second segmentation unit, configured to segment the image to be matched to obtain non-overlapping subregions to be matched of equal size and generate a histogram for each subregion to be matched.
Preferably, the extraction module comprises:
a first generation unit, configured to smooth the histogram of a template subregion, remove burrs and spikes from the histogram, and generate the secondary histogram of the template subregion;
a first recording unit, configured to record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the template subregion and take the recorded distribution as the histogram features of the template subregion;
a second generation unit, configured to smooth the histogram of a subregion to be matched, remove burrs and spikes from the histogram, and generate the secondary histogram of the subregion to be matched;
a second recording unit, configured to record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the subregion to be matched and take the recorded distribution as the histogram features of the subregion to be matched.
Preferably, the matching module comprises:
a calculation unit, configured to calculate, in turn, a matching score between the histogram features of each template subregion and the histogram of each subregion to be matched in the corresponding group;
an analysis unit, configured to analyse whether each matching score corresponding to each template subregion is less than a preset threshold and to determine, within the group corresponding to each template subregion, the subregions to be matched whose histogram features match those of the template subregion, as the preferred subregions to be matched of that template subregion;
a selection unit, configured to select, from the preferred subregions to be matched of each template subregion, the region whose histogram features differ least from those of the template subregion as the matching region, forming a matching pair;
a third determination unit, configured to determine the number of matching pairs.
Compared with the prior art, the beneficial effects of the present application are as follows:
In the present application, the computer segments the input template image and generates a histogram for each template subregion; it segments an image to be matched in the image database and generates a histogram for each subregion to be matched; it matches the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group to obtain a matching result; when the matching result satisfies the preset similarity condition, the template image and the image to be matched are judged to be similar, otherwise dissimilar.
Thus, the computer obtains the matching result by comparing histogram features and thereby decides whether the two images are similar. When searching for a target image, manual searching is no longer needed, which saves labour, saves search time and improves working efficiency.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the image matching method provided by the present application;
Fig. 2 is a sub-flowchart of the image matching method provided by the present application;
Fig. 3 is another sub-flowchart of the image matching method provided by the present application;
Fig. 4 is another sub-flowchart of the image matching method provided by the present application;
Fig. 5 is another sub-flowchart of the image matching method provided by the present application;
Fig. 6 is another sub-flowchart of the image matching method provided by the present application;
Fig. 7 is a structural diagram of the image matching device provided by the present application;
Fig. 8 is a structural diagram of the first segmentation module provided by the present application;
Fig. 9 is a structural diagram of the second segmentation module provided by the present application;
Fig. 10 is a structural diagram of the extraction module provided by the present application;
Fig. 11 is a structural diagram of the matching module provided by the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
An embodiment
Referring to Fig. 1, which shows a flowchart of the image matching method provided by the present application, the method may comprise the following steps:
Step S11: the computer segments the input template image to obtain non-overlapping subregions of equal size and generates a histogram for each template subregion.
In this embodiment, the computer segments the input template image into non-overlapping subregions of equal size; the subregions obtained by segmenting the input template image are called template subregions.
Step S12: the computer segments any image to be matched in the image database to obtain non-overlapping subregions of equal size and generates a histogram for each subregion to be matched.
In this embodiment, the computer segments any image to be matched in the image database into non-overlapping subregions of equal size; the subregions obtained by segmenting the image to be matched are called subregions to be matched.
In this embodiment, the number of template subregions is equal to the number of subregions to be matched.
The number of template subregions and the number of subregions to be matched can be adjusted according to how rich the texture information of the images is.
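A minimal sketch of steps S11 and S12, assuming a 3 × 3 grid and an 8-bit grayscale input; the grid size, the 256-bin histogram and the function names are illustrative choices, not taken from the patent text.

```python
import numpy as np

def split_into_subregions(image: np.ndarray, rows: int = 3, cols: int = 3):
    """Split a grayscale image into non-overlapping, equally sized subregions.

    Pixels that do not fit an exact grid are cropped so every block has the
    same size.
    """
    h, w = image.shape
    bh, bw = h // rows, w // cols
    return [image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            for r in range(rows) for c in range(cols)]

def subregion_histograms(image: np.ndarray, rows: int = 3, cols: int = 3):
    """Return one 256-bin grayscale histogram per subregion."""
    return [np.bincount(block.ravel(), minlength=256)
            for block in split_into_subregions(image, rows, cols)]
```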
Step S13: the computer extracts the histogram features of each template subregion and of each subregion to be matched.
Step S14: the computer matches, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group, and determines the matching result.
In this embodiment, during this matching process, once a subregion to be matched satisfies the matching condition with some template subregion, that subregion no longer participates in subsequent operations. The groups of subregions to be matched corresponding to different template subregions may therefore contain different numbers of subregions. The histogram features of each template subregion only need to be matched one by one against the histograms of the subregions to be matched in its own corresponding group.
For example, suppose the template image is divided into 3 template subregions A1, A2 and A3, and the image to be matched is divided into 3 subregions to be matched B1, B2 and B3. The matching then proceeds as follows: the histogram features of A1 are first matched one by one against the histograms of B1, B2 and B3 in the group corresponding to A1; if B1 matches A1 successfully, B1 no longer participates in subsequent operations. When the histogram features of A2 are matched one by one against the histograms of the subregions in the group corresponding to A2, that group contains only B2 and B3; if neither B2 nor B3 matches A2 successfully, the group corresponding to A3 still contains B2 and B3.
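The following sketch only illustrates how the candidate pool shrinks in step S14, in terms of the A1..A3 / B1..B3 example above. The `matches(a, b)` callback stands in for the histogram-feature comparison detailed in the later embodiments and is not defined by the patent; a fuller version with matching scores is sketched in the Fig. 6 embodiment below.

```python
def match_with_shrinking_pool(template_blocks, candidate_blocks, matches):
    """Greedy pass over the template subregions: a candidate that matches is
    removed from the pool seen by subsequent template subregions."""
    pairs = []
    pool = list(candidate_blocks)          # candidates still available
    for t in template_blocks:              # A1, A2, A3 in turn
        for c in pool:
            if matches(t, c):              # first successful match wins here
                pairs.append((t, c))
                pool.remove(c)             # c no longer participates
                break
    return pairs
```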
Step S15: the computer judges whether the matching result satisfies the preset similarity condition.
In this embodiment, if the computer judges that the matching result satisfies the preset similarity condition, step S16 is executed; otherwise, step S17 is executed.
Step S16: determine that the template image and the image to be matched are similar.
Step S17: determine that the template image and the image to be matched are dissimilar.
Fig. 1 shows the process of matching the template image against any one image to be matched in the image database. Of course, the template image can be matched against all images to be matched in the database, so the database may contain several images similar to the template image. Since the process of matching the template image against every image to be matched is identical to the process shown in Fig. 1, matching the template image against each remaining image in the database is not described again here.
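A short sketch of applying the per-image comparison of Fig. 1 to every image in the database; `is_similar(template, candidate)` stands for the whole pipeline (segmentation, histogram features, matching, similarity decision) and is an assumed helper, not part of the patent.

```python
def search_database(template, database, is_similar):
    """Return every database image judged similar to the template."""
    return [candidate for candidate in database if is_similar(template, candidate)]
```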
In the present application, the computer segments the input template image and generates a histogram for each template subregion; it segments an image to be matched in the image database and generates a histogram for each subregion to be matched; it matches the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group to obtain a matching result; when the matching result satisfies the preset similarity condition, the template image and the image to be matched are judged to be similar, otherwise dissimilar.
Thus, the computer obtains the matching result by comparing histogram features and thereby decides whether the two images are similar. When searching for a target image, manual searching is no longer needed, which saves labour, saves search time and improves working efficiency.
Another embodiment
This embodiment describes the process in which the computer segments the input template image to obtain non-overlapping template subregions of equal size and generates a histogram for each template subregion, and the process in which the computer segments any image to be matched in the image database to obtain non-overlapping subregions to be matched of equal size and generates a histogram for each subregion to be matched.
Referring to Fig. 2, which shows a sub-flowchart of the image matching method provided by the present application, the process may comprise the following steps:
Step S21: the computer judges whether the input template image is a colour image.
If the computer judges that the input template image is a colour image, step S22 is executed; otherwise, step S23 is executed.
Step S22: convert the template image into a grayscale template image, segment the grayscale template image to obtain non-overlapping template subregions of equal size, and generate a histogram for each template subregion.
Step S23: segment the input template image to obtain non-overlapping template subregions of equal size, and generate a histogram for each template subregion.
Referring to Fig. 3, which shows another sub-flowchart of the image matching method provided by the present application, the process may comprise the following steps:
Step S31: the computer judges whether the image to be matched is a colour image.
If the computer judges that the image to be matched is a colour image, step S32 is executed; otherwise, step S33 is executed.
Step S32: convert the image to be matched into a grayscale image, segment the grayscale image to obtain non-overlapping subregions to be matched of equal size, and generate a histogram for each subregion to be matched.
Step S33: segment the image to be matched to obtain non-overlapping subregions to be matched of equal size, and generate a histogram for each subregion to be matched.
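A minimal sketch of the colour check in Figs. 2 and 3: colour input is converted to grayscale before segmentation. The patent only says "convert to a grayscale image"; the ITU-R BT.601 luminance weights used here are one common choice and are an assumption.

```python
import numpy as np

def to_grayscale_if_needed(image: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 colour image to grayscale; pass grayscale through."""
    if image.ndim == 3:
        weights = np.array([0.299, 0.587, 0.114])   # assumed BT.601 weights
        return (image[..., :3] @ weights).astype(np.uint8)
    return image
```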
Another embodiment
This embodiment describes the process of extracting the histogram features of a template subregion and the process of extracting the histogram features of a subregion to be matched.
Since the extraction process of the histogram features is the same for every template subregion, only the extraction process for one template subregion is described in this embodiment.
Referring to Fig. 4, which shows another sub-flowchart of the image matching method provided by the present application, the process may comprise the following steps:
Step S41: smooth the histogram of the template subregion, remove burrs and spikes from the histogram, and generate the secondary histogram of the template subregion.
Step S42: record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the template subregion, and take the recorded distribution of peaks, valleys and half-peaks as the histogram features of the template subregion.
In this embodiment, a peak comprises a start position (rising segment), an end position (falling segment) and a peak point; a half-peak comprises a peak point and either a start position or an end position.
In this embodiment, a segment in which the histogram value increases continuously over more than ten consecutive pixel values may be regarded as a start position, and a segment in which it decreases continuously over more than ten consecutive pixel values is regarded as an end position.
Recording the distribution of peaks, valleys and half-peaks may specifically mean recording the positions at which the peaks, valleys and half-peaks occur, the length and height of each peak, the length and depth of each valley, and the length and height or depth of each half-peak.
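A rough sketch of the feature extraction in steps S41 and S42, under the following assumptions: "smoothing" is a plain moving average, a rising (falling) segment is a run of more than ten bins with continuously increasing (decreasing) counts, and a lone rising or falling run is treated as a half-peak. The patent fixes neither the smoothing kernel nor the exact pairing of runs, so this is only one plausible reading.

```python
import numpy as np

def smooth_histogram(hist: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average smoothing that removes burrs and isolated spikes,
    producing the 'secondary histogram'."""
    kernel = np.ones(window) / window
    return np.convolve(hist.astype(float), kernel, mode="same")

def monotone_runs(hist: np.ndarray, min_len: int = 10):
    """Return (start, end, direction) runs where the histogram rises (+1)
    or falls (-1) continuously over more than `min_len` bins."""
    runs, start, direction = [], 0, 0
    for i in range(1, len(hist)):
        d = 1 if hist[i] > hist[i - 1] else (-1 if hist[i] < hist[i - 1] else 0)
        if d != direction:
            if direction != 0 and i - start > min_len:
                runs.append((start, i - 1, direction))
            start, direction = i - 1, d
    if direction != 0 and len(hist) - start > min_len:
        runs.append((start, len(hist) - 1, direction))
    return runs

def histogram_features(hist: np.ndarray):
    """Record positions, lengths and heights of peaks (rise then fall),
    valleys (fall then rise) and half-peaks (a lone rise or fall)."""
    sec = smooth_histogram(hist)
    runs = monotone_runs(sec)
    peaks, half_peaks, valleys = [], [], []
    i = 0
    while i < len(runs):
        s, e, d = runs[i]
        nxt = runs[i + 1] if i + 1 < len(runs) else None
        if d == +1 and nxt and nxt[2] == -1:        # rise then fall: full peak
            peaks.append({"start": s, "top": e, "end": nxt[1],
                          "height": sec[e] - min(sec[s], sec[nxt[1]])})
            i += 2
        elif d == -1 and nxt and nxt[2] == +1:      # fall then rise: valley
            valleys.append({"start": s, "bottom": e, "end": nxt[1],
                            "depth": max(sec[s], sec[nxt[1]]) - sec[e]})
            i += 2
        else:                                       # lone rise or fall: half-peak
            half_peaks.append({"start": s, "end": e, "direction": d})
            i += 1
    return {"peaks": peaks, "valleys": valleys, "half_peaks": half_peaks}
```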
Since the extraction process of the histogram features is the same for every subregion to be matched, only the extraction process for one subregion to be matched is described in this embodiment.
Referring to Fig. 5, which shows another sub-flowchart of the image matching method provided by the present application, the process may comprise the following steps:
Step S51: smooth the histogram of the subregion to be matched, remove burrs and spikes from the histogram, and generate the secondary histogram of the subregion to be matched.
Step S52: record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the subregion to be matched, and take the recorded distribution of peaks, valleys and half-peaks as the histogram features of the subregion to be matched.
In this embodiment, a peak comprises a start position (rising segment), an end position (falling segment) and a peak point; a half-peak comprises a peak point and either a start position or an end position.
In this embodiment, a segment in which the histogram value increases continuously over more than ten consecutive pixel values may be regarded as a start position, and a segment in which it decreases continuously over more than ten consecutive pixel values is regarded as an end position.
Recording the distribution of peaks, valleys and half-peaks may specifically mean recording the positions at which the peaks, valleys and half-peaks occur, the length and height of each peak, the length and depth of each valley, and the length and height or depth of each half-peak.
Another embodiment
This embodiment describes the process of matching, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group and determining the matching result. Referring to Fig. 6, which shows another sub-flowchart of the image matching method provided by the present application, the process may comprise the following steps:
Step S61: calculate, in turn, a matching score between the histogram features of each template subregion and the histogram of each subregion to be matched in the corresponding group.
Step S62: analyse whether each matching score corresponding to each template subregion is less than the preset threshold, and determine, within the group corresponding to each template subregion, the subregions to be matched whose histogram features match those of the template subregion, as the preferred subregions to be matched of that template subregion.
Step S63: from the preferred subregions to be matched of each template subregion, select the region whose histogram features differ least from those of the template subregion as the matching region, forming a matching pair.
Step S64: determine the number of matching pairs.
The detailed process shown in Fig. 6 is now illustrated by an example in which the number of template subregions and the number of subregions to be matched are both 3. The 3 template subregions are A1, A2 and A3, and the 3 subregions to be matched are B1, B2 and B3. The calculation order is A1, A2 and then A3. For A1, which is processed first, the corresponding group contains B1, B2 and B3. The matching scores between the histogram features of A1 and those of B1, B2 and B3 are calculated first, giving a score of 0.33 for B1, 0.31 for B2 and 0.47 for B3. Assuming the preset threshold is 0.4, B1 and B2 are the preferred subregions to be matched of A1. Since 0.31 is less than 0.33, the preferred subregion with the smallest difference from A1 is B2, so A1 and B2 form a matching pair and B2 no longer participates in subsequent operations. The group corresponding to A2 then contains B1 and B3. The matching regions of A2 and A3 are determined on the same principle as for A1 and are not described again here.
It should be noted that, in this embodiment, the matching score is proportional to the difference between the two regions: the smaller the difference, the lower the matching score, and the larger the difference, the higher the matching score.
Of course, if the scoring rule is instead that the matching score is inversely proportional to the difference between the two regions, so that a smaller difference gives a higher score and a larger difference gives a lower score, the preset threshold and the analysis condition are changed accordingly.
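A sketch of steps S61-S64 that reproduces the worked example above: a lower score means a smaller difference, candidates scoring below the threshold form the preferred set, and the minimum-score candidate becomes the matching pair. The `score(feat_a, feat_b)` function and the threshold value 0.4 are placeholders; the patent specifies neither.

```python
def match_subregions(template_feats, candidate_feats, score, threshold=0.4):
    """Return the matching pairs as (template index, candidate index)."""
    pairs = []
    pool = dict(enumerate(candidate_feats))          # remaining candidates
    for t_idx, t in enumerate(template_feats):
        scores = {c_idx: score(t, c) for c_idx, c in pool.items()}
        preferred = {c_idx: s for c_idx, s in scores.items() if s < threshold}
        if preferred:
            best = min(preferred, key=preferred.get) # smallest difference wins
            pairs.append((t_idx, best))
            del pool[best]                           # removed from later rounds
    return pairs                                     # len(pairs) = number of matching pairs
```

With the scores from the example (A1 against B1, B2 and B3 giving 0.33, 0.31 and 0.47 under a threshold of 0.4), this sketch pairs A1 with B2 and removes B2 from the pool, as described above.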
In the present application, preferably, the number of template subregions and the number of subregions to be matched may each be 9.
When the number of template subregions and the number of subregions to be matched are both 9, with reference to Fig. 6, the process of judging whether the matching result satisfies the preset similarity condition may specifically be: judging whether the number of matching pairs is greater than 5.
If the number of matching pairs is greater than 5, the template image and the image to be matched are determined to be similar.
If the number of matching pairs is not greater than 5, the template image and the image to be matched are determined to be dissimilar.
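An end-to-end sketch under the preferred configuration (a 3 × 3 grid of 9 subregions, similar when more than 5 pairs match); every helper used here is one of the illustrative sketches above, not code from the patent.

```python
def images_similar(template_img, candidate_img, score, rows=3, cols=3):
    """Decide whether two images are similar under the >5-of-9 rule."""
    t_feats = [histogram_features(h) for h in
               subregion_histograms(to_grayscale_if_needed(template_img), rows, cols)]
    c_feats = [histogram_features(h) for h in
               subregion_histograms(to_grayscale_if_needed(candidate_img), rows, cols)]
    pairs = match_subregions(t_feats, c_feats, score)
    return len(pairs) > 5
```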
For the sake of brevity, each of the foregoing method embodiments is described as a series of combined actions. Those skilled in the art should understand, however, that the present application is not limited by the described order of actions, because according to the present application some steps may be performed in other orders or simultaneously. In addition, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present application.
An embodiment
Corresponding to the foregoing method embodiments, the present application provides an image matching device. Referring to Fig. 7, which shows a structural diagram of the image matching device provided by the present application, the image matching device comprises:
a first segmentation module 71, a second segmentation module 72, an extraction module 73, a matching module 74, a judging module 75, a first determination module 76 and a second determination module 77.
The first segmentation module 71 is configured to segment the input template image to obtain non-overlapping template subregions of equal size and to generate a histogram for each template subregion.
In this embodiment, the specific structure of the first segmentation module 71 is shown in Fig. 8, a structural diagram of the first segmentation module provided by the present application; the first segmentation module 71 comprises a first judging unit 81, a first conversion unit 82 and a first segmentation unit 83.
The first judging unit 81 is configured to judge whether the input template image is a colour image and, if so, to trigger the first conversion unit 82 and, if not, to trigger the first segmentation unit 83.
The first conversion unit 82 is configured to convert the template image into a grayscale template image, segment the grayscale template image to obtain non-overlapping template subregions of equal size, and generate a histogram for each template subregion.
The first segmentation unit 83 is configured to segment the input template image to obtain non-overlapping template subregions of equal size and generate a histogram for each template subregion.
The second segmentation module 72 is configured to segment any image to be matched in the image database to obtain non-overlapping subregions to be matched of equal size and to generate a histogram for each subregion to be matched, the number of template subregions being equal to the number of subregions to be matched.
In this embodiment, the specific structure of the second segmentation module 72 is shown in Fig. 9, a structural diagram of the second segmentation module provided by the present application; the second segmentation module 72 comprises a second judging unit 91, a second conversion unit 92 and a second segmentation unit 93.
The second judging unit 91 is configured to judge whether the image to be matched is a colour image and, if so, to trigger the second conversion unit 92 and, if not, to trigger the second segmentation unit 93.
The second conversion unit 92 is configured to convert the image to be matched into a grayscale image, segment the grayscale image to obtain non-overlapping subregions to be matched of equal size, and generate a histogram for each subregion to be matched.
The second segmentation unit 93 is configured to segment the image to be matched to obtain non-overlapping subregions to be matched of equal size and generate a histogram for each subregion to be matched.
The extraction module 73 is configured to extract the histogram features of each template subregion and of each subregion to be matched.
In this embodiment, the specific structure of the extraction module 73 is shown in Fig. 10, a structural diagram of the extraction module provided by the present application; the extraction module 73 comprises a first generation unit 101, a first recording unit 102, a second generation unit 103 and a second recording unit 104.
The first generation unit 101 is configured to smooth the histogram of a template subregion, remove burrs and spikes from the histogram, and generate the secondary histogram of the template subregion.
The first recording unit 102 is configured to record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the template subregion and take the recorded distribution as the histogram features of the template subregion.
The second generation unit 103 is configured to smooth the histogram of a subregion to be matched, remove burrs and spikes from the histogram, and generate the secondary histogram of the subregion to be matched.
The second recording unit 104 is configured to record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the subregion to be matched and take the recorded distribution as the histogram features of the subregion to be matched.
The matching module 74 is configured to match, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group and to determine the matching result.
In this embodiment, the specific structure of the matching module 74 is shown in Fig. 11, a structural diagram of the matching module provided by the present application; the matching module 74 comprises a calculation unit 111, an analysis unit 112, a selection unit 113 and a third determination unit 114.
The calculation unit 111 is configured to calculate, in turn, a matching score between the histogram features of each template subregion and the histogram of each subregion to be matched in the corresponding group.
The analysis unit 112 is configured to analyse whether each matching score corresponding to each template subregion is less than the preset threshold and to determine, within the group corresponding to each template subregion, the subregions to be matched whose histogram features match those of the template subregion, as the preferred subregions to be matched of that template subregion.
The selection unit 113 is configured to select, from the preferred subregions to be matched of each template subregion, the region whose histogram features differ least from those of the template subregion as the matching region, forming a matching pair.
The third determination unit 114 is configured to determine the number of matching pairs.
The judging module 75 is configured to judge whether the matching result satisfies the preset similarity condition and, if so, to trigger the first determination module 76 and, if not, to trigger the second determination module 77.
The first determination module 76 is configured to determine that the template image and the image to be matched are similar.
The second determination module 77 is configured to determine that the template image and the image to be matched are dissimilar.
Specifically, the judging module 75 may be configured to judge whether the number of matching pairs is greater than 5 and, if so, to trigger the first determination module 76 and, if not, to trigger the second determination module 77.
In the present application, the image matching device may be a computer, or it may be integrated into a computer as a module of the computer.
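A structural sketch of how modules 71-77 could be composed in software; the module boundaries follow the description above, but the class name, method names and callable signatures are illustrative assumptions.

```python
class ImageMatchingApparatus:
    """Composition of the modules described above (71-77)."""

    def __init__(self, first_split, second_split, extractor, matcher, judge):
        self.first_split = first_split      # module 71: segment template image, build histograms
        self.second_split = second_split    # module 72: segment image to be matched, build histograms
        self.extractor = extractor          # module 73: histogram feature extraction
        self.matcher = matcher              # module 74: one-by-one matching, returns matching result
        self.judge = judge                  # module 75: preset similarity condition check

    def match(self, template_img, candidate_img) -> bool:
        t_feats = self.extractor(self.first_split(template_img))
        c_feats = self.extractor(self.second_split(candidate_img))
        result = self.matcher(t_feats, c_feats)
        # True corresponds to the first determination module 76 (similar),
        # False to the second determination module 77 (dissimilar).
        return self.judge(result)
```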
It should be noted that the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts of the embodiments may refer to one another. Since the device embodiments are substantially similar to the method embodiments, their description is relatively brief, and the relevant parts may refer to the description of the method embodiments.
Finally, it should also be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" and any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the statement "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or device that comprises that element.
For convenience of description, the device above is described by dividing its functions into various units. Of course, when the present application is implemented, the functions of the units may be realised in one or more pieces of software and/or hardware.
From the description of the above embodiments, those skilled in the art can clearly understand that the present application may be implemented by software plus a necessary general-purpose hardware platform. Based on such an understanding, the part of the technical solution of the present application that contributes to the prior art may essentially be embodied in the form of a software product. The computer software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk or an optical disc, and includes a number of instructions that enable a computer device (which may be a personal computer, a server, a network device or the like) to perform the methods described in the embodiments, or in some parts of the embodiments, of the present application.
The image matching method and device provided by the present application have been described in detail above. Specific examples have been used herein to explain the principles and implementation of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, a person of ordinary skill in the art may make changes to the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An image matching method, characterized by comprising:
a computer segmenting an input template image to obtain non-overlapping template subregions of equal size, and generating a histogram for each template subregion; and
segmenting any image to be matched in an image database to obtain non-overlapping subregions to be matched of equal size, and generating a histogram for each subregion to be matched, the number of the template subregions being equal to the number of the subregions to be matched; and
extracting the histogram features of each template subregion and the histogram features of each subregion to be matched; and
matching, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group of subregions to be matched, to determine a matching result; and
judging whether the matching result satisfies a preset similarity condition; if so, determining that the template image and the image to be matched are similar;
otherwise, determining that the template image and the image to be matched are dissimilar.
2. The method according to claim 1, characterized in that the process in which the computer segments the input template image to obtain non-overlapping template subregions of equal size and generates a histogram for each template subregion comprises:
the computer judging whether the input template image is a colour image;
if so, converting the template image into a grayscale template image, segmenting the grayscale template image to obtain non-overlapping template subregions of equal size, and generating a histogram for each template subregion;
if not, segmenting the input template image to obtain non-overlapping template subregions of equal size, and generating a histogram for each template subregion;
or, the process of segmenting any image to be matched in the image database to obtain non-overlapping subregions to be matched of equal size and generating a histogram for each subregion to be matched comprises:
the computer judging whether the image to be matched is a colour image;
if so, converting the image to be matched into a grayscale image, segmenting the grayscale image to obtain non-overlapping subregions to be matched of equal size, and generating a histogram for each subregion to be matched;
if not, segmenting the image to be matched to obtain non-overlapping subregions to be matched of equal size, and generating a histogram for each subregion to be matched.
3. The method according to claim 1, characterized in that the process of extracting the histogram features of any one template subregion comprises:
smoothing the histogram of the template subregion to remove burrs and spikes, and generating a secondary histogram of the template subregion;
recording the distribution of the peaks, valleys and half-peaks of the secondary histogram of the template subregion, and taking the recorded distribution of peaks, valleys and half-peaks as the histogram features of the template subregion;
or, the process of extracting the histogram features of any one subregion to be matched comprises:
smoothing the histogram of the subregion to be matched to remove burrs and spikes, and generating a secondary histogram of the subregion to be matched;
recording the distribution of the peaks, valleys and half-peaks of the secondary histogram of the subregion to be matched, and taking the recorded distribution of peaks, valleys and half-peaks as the histogram features of the subregion to be matched.
4. The method according to claim 1, characterized in that the process of matching, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group, to determine the matching result, comprises:
calculating, in turn, a matching score between the histogram features of each template subregion and the histogram of each subregion to be matched in the corresponding group;
analysing whether each matching score corresponding to each template subregion is less than a preset threshold, and determining, within the group corresponding to each template subregion, the subregions to be matched whose histogram features match those of the template subregion, as the preferred subregions to be matched of that template subregion;
selecting, from the preferred subregions to be matched of each template subregion, the region whose histogram features differ least from those of the template subregion as the matching region, forming a matching pair;
determining the number of matching pairs.
5. The method according to claim 4, characterized in that the number of the template subregions and the number of the subregions to be matched are both 9.
6. The method according to claim 5, characterized in that the process of judging whether the matching result satisfies the preset similarity condition comprises:
judging whether the number of matching pairs is greater than 5.
7. An image matching device, characterized by comprising:
a first segmentation module, configured to segment an input template image to obtain non-overlapping template subregions of equal size and to generate a histogram for each template subregion;
a second segmentation module, configured to segment any image to be matched in an image database to obtain non-overlapping subregions to be matched of equal size and to generate a histogram for each subregion to be matched, the number of the template subregions being equal to the number of the subregions to be matched;
an extraction module, configured to extract the histogram features of each template subregion and of each subregion to be matched;
a matching module, configured to match, in turn, the histogram features of each template subregion one by one against the histograms of the subregions to be matched in the corresponding group, and to determine a matching result;
a judging module, configured to judge whether the matching result satisfies a preset similarity condition and, if so, to trigger a first determination module and, if not, to trigger a second determination module;
the first determination module, configured to determine that the template image and the image to be matched are similar;
the second determination module, configured to determine that the template image and the image to be matched are dissimilar.
8. The device according to claim 7, characterized in that the first segmentation module comprises:
a first judging unit, configured to judge whether the input template image is a colour image and, if so, to trigger a first conversion unit and, if not, to trigger a first segmentation unit;
the first conversion unit, configured to convert the template image into a grayscale template image, segment the grayscale template image to obtain non-overlapping template subregions of equal size, and generate a histogram for each template subregion;
the first segmentation unit, configured to segment the input template image to obtain non-overlapping template subregions of equal size and generate a histogram for each template subregion;
or, the second segmentation module comprises:
a second judging unit, configured to judge whether the image to be matched is a colour image and, if so, to trigger a second conversion unit and, if not, to trigger a second segmentation unit;
the second conversion unit, configured to convert the image to be matched into a grayscale image, segment the grayscale image to obtain non-overlapping subregions to be matched of equal size, and generate a histogram for each subregion to be matched;
the second segmentation unit, configured to segment the image to be matched to obtain non-overlapping subregions to be matched of equal size and generate a histogram for each subregion to be matched.
9. The device according to claim 7, characterized in that the extraction module comprises:
a first generation unit, configured to smooth the histogram of a template subregion, remove burrs and spikes from the histogram, and generate the secondary histogram of the template subregion;
a first recording unit, configured to record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the template subregion and take the recorded distribution as the histogram features of the template subregion;
a second generation unit, configured to smooth the histogram of a subregion to be matched, remove burrs and spikes from the histogram, and generate the secondary histogram of the subregion to be matched;
a second recording unit, configured to record the distribution of the peaks, valleys and half-peaks of the secondary histogram of the subregion to be matched and take the recorded distribution as the histogram features of the subregion to be matched.
10. The device according to claim 7, characterized in that the matching module comprises:
a calculation unit, configured to calculate, in turn, a matching score between the histogram features of each template subregion and the histogram of each subregion to be matched in the corresponding group;
an analysis unit, configured to analyse whether each matching score corresponding to each template subregion is less than a preset threshold and to determine, within the group corresponding to each template subregion, the subregions to be matched whose histogram features match those of the template subregion, as the preferred subregions to be matched of that template subregion;
a selection unit, configured to select, from the preferred subregions to be matched of each template subregion, the region whose histogram features differ least from those of the template subregion as the matching region, forming a matching pair;
a third determination unit, configured to determine the number of matching pairs.
CN201410110048.8A 2014-03-24 2014-03-24 Image matching method and device Pending CN103824086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410110048.8A CN103824086A (en) 2014-03-24 2014-03-24 Image matching method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410110048.8A CN103824086A (en) 2014-03-24 2014-03-24 Image matching method and device

Publications (1)

Publication Number Publication Date
CN103824086A true CN103824086A (en) 2014-05-28

Family

ID=50759138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410110048.8A Pending CN103824086A (en) 2014-03-24 2014-03-24 Image matching method and device

Country Status (1)

Country Link
CN (1) CN103824086A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1932838A (en) * 2005-09-12 2007-03-21 电子科技大学 Vehicle plate extracting method based on skiagraphy and mathematical morphology
CN101231662A (en) * 2008-01-25 2008-07-30 华中科技大学 Distributed medical image retrieval system base on gridding platform
CN101872475A (en) * 2009-04-22 2010-10-27 中国科学院自动化研究所 Method for automatically registering scanned document images
CN103310453A (en) * 2013-06-17 2013-09-18 北京理工大学 Rapid image registration method based on sub-image corner features
CN103514694A (en) * 2013-09-09 2014-01-15 重庆邮电大学 Intrusion detection monitoring system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张德胜: "基于区域的图像检索方法研究" (Research on region-based image retrieval methods), 《中国优秀硕士学位论文全文数据库》 (China Excellent Master's Theses Full-text Database) *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104361135A (en) * 2014-12-11 2015-02-18 浪潮电子信息产业股份有限公司 Image search method
CN104539864A (en) * 2014-12-23 2015-04-22 小米科技有限责任公司 Method and device for recording images
CN104539864B (en) * 2014-12-23 2018-02-02 小米科技有限责任公司 The method and apparatus for recording image
CN105989371A (en) * 2015-03-03 2016-10-05 香港中文大学深圳研究院 Grayscale normalization method and apparatus for nuclear magnetic resonance image
CN105989371B (en) * 2015-03-03 2019-08-23 香港中文大学深圳研究院 A kind of grey scale method and apparatus of nuclear magnetic resonance image
WO2017050083A1 (en) * 2015-09-23 2017-03-30 广州视源电子科技股份有限公司 Element identification method and device
CN105872556A (en) * 2016-04-11 2016-08-17 华为技术有限公司 Video coding method and device
CN105872556B (en) * 2016-04-11 2020-01-03 华为技术有限公司 Video encoding method and apparatus
CN106778860A (en) * 2016-12-12 2017-05-31 中国矿业大学 Image position method based on Histogram Matching
CN106898017A (en) * 2017-02-27 2017-06-27 网易(杭州)网络有限公司 Method, device and terminal device for recognizing image local area
CN106898017B (en) * 2017-02-27 2019-05-31 网易(杭州)网络有限公司 The method, apparatus and terminal device of image local area for identification
CN107180479A (en) * 2017-05-15 2017-09-19 深圳怡化电脑股份有限公司 A kind of bill discrimination method, device, equipment and storage medium
CN107180479B (en) * 2017-05-15 2020-10-20 深圳怡化电脑股份有限公司 Bill identification method, device, equipment and storage medium
WO2019075601A1 (en) * 2017-10-16 2019-04-25 厦门中控智慧信息技术有限公司 Palm vein recognition method and device
WO2020063523A1 (en) * 2018-09-29 2020-04-02 北京国双科技有限公司 Image detection method and device
CN109739233A (en) * 2018-12-29 2019-05-10 歌尔股份有限公司 AGV trolley localization method, apparatus and system
CN111222571A (en) * 2020-01-06 2020-06-02 腾讯科技(深圳)有限公司 Image special effect processing method and device, electronic equipment and storage medium
CN113838082A (en) * 2021-10-21 2021-12-24 平安普惠企业管理有限公司 Image processing method, device, equipment and storage medium
CN116563357A (en) * 2023-07-10 2023-08-08 深圳思谋信息科技有限公司 Image matching method, device, computer equipment and computer readable storage medium
CN116563357B (en) * 2023-07-10 2023-11-03 深圳思谋信息科技有限公司 Image matching method, device, computer equipment and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN103824086A (en) Image matching method and device
Nandagopalan et al. A universal model for content-based image retrieval
CN102254015B (en) Image retrieval method based on visual phrases
CN102356393B (en) Data processing device
US20180129658A1 (en) Color sketch image searching
US20140105505A1 (en) Near duplicate images
CN106407311A (en) Method and device for obtaining search result
CN105955950A (en) New word discovery method and device
CN102890700A (en) Method for retrieving similar video clips based on sports competition videos
CN104573130A (en) Entity resolution method based on group calculation and entity resolution device based on group calculation
CN105824862A (en) Image classification method based on electronic equipment and electronic equipment
CN103577462A (en) Document classification method and document classification device
CN106228554A (en) Fuzzy coarse central coal dust image partition methods based on many attribute reductions
CN103500158A (en) Method and device for annotating electronic document
CN110990541A (en) Method and device for realizing question answering
CN103927342A (en) Vertical search engine system on basis of big data
CN103177105A (en) Method and device of image search
CN110781275A (en) Question answering distinguishing method based on multiple characteristics and computer storage medium
CN104410867A (en) Improved video shot detection method
CN103425748A (en) Method and device for mining document resource recommended words
US20150201104A1 (en) Three-dimensional image searching based on inputs collected by a mobile device
Le et al. Improving logo spotting and matching for document categorization by a post-filter based on homography
CN108090117A (en) A kind of image search method and device, electronic equipment
CN110472058B (en) Entity searching method, related equipment and computer storage medium
Memon et al. Region based localized matching image retrieval system using color-size features for image retrieval

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20140528