CN103049735B - Method and device for detecting a specific object in an image - Google Patents


Publication number
CN103049735B
CN103049735B (application CN201110310765.1A)
Authority
CN
China
Prior art keywords
boundary
outer boundary
energy
specific object
characteristic parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201110310765.1A
Other languages
Chinese (zh)
Other versions
CN103049735A (en)
Inventor
刘殿超
师忠超
钟诚
刘童
王刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Priority to CN201110310765.1A priority Critical patent/CN103049735B/en
Publication of CN103049735A publication Critical patent/CN103049735A/en
Application granted
Publication of CN103049735B publication Critical patent/CN103049735B/en


Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a method for detecting a specific object in an image, comprising: a region-of-interest estimating step of estimating, in an input image to be processed, a region containing the specific object as a region of interest; a feature determining step of determining characteristic parameters of an object in the region of interest; an object energy determining step of determining the energy of the object from its characteristic parameters; and a specific-object discriminating step of comparing the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminating the object as the specific object. The invention correspondingly provides a device for detecting a specific object in an image.

Description

Method and device for detecting a specific object in an image
Technical field
The present invention relates to a method and a device for detecting a specific object in an image.
Background art
With the development of computer technology, pattern recognition on images and video has also made significant progress; the demand for techniques that detect specific objects in images keeps growing, and certain results have been achieved.
In pattern recognition, the effectiveness of object detection depends on the choice of key features or combinations of key features. In recent years, a large number of practical features have been applied in the field of object detection. However, recognition and classification methods based on a single feature generally achieve low precision and yield recognition results with many false detections. Taking cloud detection as an example, a recognition method based on a pure color feature can recognize most clouds, but at the same time many objects whose color is similar to that of clouds may be falsely detected as clouds.
Patent document 1 (US 7,480,052 B1) proposes a method of detecting clouds in satellite cloud images based on electromagnetic spectrum information. For a region of a satellite cloud image, the reflectance values within the bandwidth ranges of at least three discrete electromagnetic frequency bands are compared, and the ratios between them yield the cloud-detection decision. However, patent document 1 is specialized to satellite cloud images and determines the detection result only from electromagnetic-spectrum reflectance values; it is therefore not a cloud-detection method of broad scope, and its range of application is limited.
Non-patent document 1 ("Classification of satellite cloud imagery based on multi-feature texture analysis and neural networks", Christodoulou, C.I.; Michaelides, S.C.; Pattichis, C.S.; Kyriakou, K.; Dept. of Comput. Sci., Univ. of Cyprus; Proceedings of the 2001 International Conference on Image Processing, vol. 1, pp. 497-500) proposes a method of distinguishing different cloud classes: nine different texture feature sets (comprising 55 features in total) are extracted, and an effective cloud classifier is trained with a neural network. The texture features of non-patent document 1 include boundary and texture features, but these features are separated and fed independently into the neural-network training; their discriminative power is limited and the processing is rather complex.
Summary of the invention
The present invention has been made in view of the above problems in the prior art, and proposes a method and a device for detecting a specific object in an image based on an energy model.
According to one aspect of an embodiment of the present invention, a method for detecting a specific object in an image is proposed, comprising: a region-of-interest estimating step of estimating, in an input image to be processed, a region containing the specific object as a region of interest; a feature determining step of determining characteristic parameters of an object in the region of interest; an object energy determining step of determining the energy of the object from its characteristic parameters; and a specific-object discriminating step of comparing the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminating the object as the specific object.
According to another aspect of an embodiment of the present invention, a device for detecting a specific object in an image is proposed, comprising: a region-of-interest estimating unit that estimates, in an input image to be processed, a region containing the specific object as a region of interest; a feature determining unit that determines characteristic parameters of an object in the region of interest; an object energy determining unit that determines the energy of the object from its characteristic parameters; and a specific-object discriminating unit that compares the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminates the object as the specific object.
The above and other objects, features, advantages, and technical and industrial significance of the present invention will be better understood by reading the following detailed description of preferred embodiments of the present invention in conjunction with the accompanying drawings.
Brief description of the drawings
Fig. 1 is an overall flowchart of a method for detecting a specific object in an image according to an embodiment of the present invention.
Fig. 2 shows an example of an image to be processed.
Fig. 3 schematically shows the region of interest for the specific object (cloud) estimated after region-of-interest estimation is performed on the image shown in Fig. 2.
Fig. 4 shows another example of an image to be processed.
Fig. 5 schematically shows the region of interest for the specific object (cloud) estimated after region-of-interest estimation is performed on the image shown in Fig. 4.
Fig. 6 is a schematic diagram of extracting object outer-boundary characteristic parameters from the region of interest in Fig. 2.
Fig. 7 is a schematic diagram of extracting object inner-boundary characteristic parameters from the image shown in Fig. 4.
Fig. 8 is an overall block diagram of a device for detecting a specific object in an image according to an embodiment of the present invention.
Fig. 9 is an overall block diagram of a system for detecting a specific object in an image according to an embodiment of the present invention.
Description of embodiments
Embodiments of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is an overall flowchart of a method for detecting a specific object in an image according to an embodiment of the present invention. As shown in Fig. 1, the method for detecting a specific object in an image can comprise: a region-of-interest estimating step S100, which can estimate, in an input image to be processed, a region containing the specific object as a region of interest; a feature determining step S200, which can determine the characteristic parameters of an object in the region of interest; an object energy determining step S300, which can determine the energy of the object from its characteristic parameters; and a specific-object discriminating step S400, which can compare the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminate the object as the specific object.
In the region-of-interest estimating step S100, the image to be processed can be divided into multiple regions, the color feature of each region is obtained, a linear classifier is used to judge for each region whether it satisfies the color feature of the specific object, and the regions satisfying the color feature of the specific object are combined to obtain the region of interest.
For the input image to be detected, also called the image to be processed, the region-of-interest estimating step S100 can perform a preliminary detection that excludes images obviously not containing the target to be detected, i.e. the specific object, so as to reduce the burden of the subsequent processing.
The preliminary detection can be based on any single feature such as shape, color or size, or on a combination of several features. A preliminary detection based on a single feature usually has a very high processing speed and can significantly reduce the number of images to be detected, but its detection accuracy is relatively low: a certain number of images that do not in fact contain the specific object pass this preliminary detection and enter the subsequent processing.
Suppose the specific object is cloud. In the region-of-interest estimating step S100, a preliminary cloud detection based on a pure color feature can be performed on the input image to be processed. Clouds can broadly be divided into classes such as white clouds, dark clouds, and dawn or sunset glow clouds; a large number of positive sample images of each class of cloud (cloud images) and negative sample images (non-cloud images) can be collected to train a classifier. The color-based cloud detection can use a linear classifier, which can be obtained by extracting the RGB color features of sufficiently many positive and negative sample images and then training with a support vector machine (SVM).
The preliminary judgment of the region-of-interest estimating step S100 can be carried out pixel by pixel; however, to reduce the complexity of the processing, the image can also be divided into several rectangular blocks of equal size, for example a 1024x768 image into 32x27 rectangular blocks; obviously the number of image pixels and the numbers of rows and columns of blocks that embodiments of the present invention can handle are not limited thereto. After the image to be processed has been handled in units of pixels or blocks, when one or more such units are judged by the classifier to be the specific object (supposing the specific object is cloud and the detection targets include white clouds, dark clouds and glow clouds), the regions of the image containing clouds, as well as regions not actually containing clouds but containing objects of cloud-like color, are detected.
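The block-wise preliminary detection described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the weight vector and bias stand in for a linear classifier that would in practice be trained by SVM on cloud and non-cloud samples, and the mean-RGB feature is an assumed simplification.

```python
import numpy as np

def estimate_roi_blocks(image, weights, bias, block=32):
    """Split an RGB image into equal-size blocks and score each block's
    mean color with a linear classifier; a nonnegative score marks the
    block as part of the candidate region of interest."""
    h, w, _ = image.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            patch = image[by * block:(by + 1) * block,
                          bx * block:(bx + 1) * block]
            feat = patch.reshape(-1, 3).mean(axis=0)  # mean R, G, B
            mask[by, bx] = feat @ weights + bias >= 0.0
    return mask  # True blocks are joined into the region of interest
```

The True blocks of the returned mask correspond to the cross-hatched rectangles of Figs. 3 and 5.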
Fig. 2 shows an example of an image to be processed. By processing the image shown in Fig. 2 with the above region-of-interest estimating step S100, its region of interest for the specific object can be obtained. Fig. 3 schematically shows the region of interest for the specific object (cloud) estimated after region-of-interest estimation is performed on the image shown in Fig. 2: the image to be processed is divided into rectangular blocks of a certain number of rows and columns; the blocks marked with cross-hatching are estimated to form the region of interest, while the unmarked blocks are determined not to involve the specific object.
Fig. 4 shows another example of an image to be processed. By processing the image shown in Fig. 4 with the above region-of-interest estimating step S100, its region of interest for the specific object can be obtained. Fig. 5 schematically shows the region of interest for the specific object (cloud) estimated after region-of-interest estimation is performed on the image shown in Fig. 4, marked in the same manner as Fig. 3.
The object in Fig. 2 is a single object, a cloud; with cloud as the specific object, the regions of interest estimated by the region-of-interest estimating step S100 all indeed contain the object cloud. Fig. 4 contains multiple objects, for example buildings, cars and clouds; although cloud is still the specific object, the regions of interest estimated by step S100 may also cover objects other than clouds. Even though, viewed from the final result, these are misjudgments, they are all retained at this step.
The regions that may contain the specific object detected by the region-of-interest estimating step S100 can be called regions of interest (ROI). Step S100 can record the positions of the regions of interest estimated for the image to be processed as data accompanying the image data of that image; the subsequent processing can then be carried out only for the detected regions of interest, thereby relieving the burden of the subsequent processing.
The regions of interest estimated by the region-of-interest estimating step S100 can enter the feature determining step S200 for processing; alternatively, an exclusion step can first exclude regions of interest that are unlikely to contain the target specific object, further reducing the subsequent processing burden.
The exclusion step can, for example, exclude regions of interest that do not meet the positional characteristics of the specific object, according to their positions in the image to be processed. The exclusion adopted by the exclusion step is not limited to the position of the region of interest; neighborhood features or other features can also be used for feature verification, so that many more false-detection results can be excluded.
Assume the specific object that is ultimately to be detected is cloud, for example white clouds, dark clouds, or glow clouds. The exclusion step can remove false-detection results by verifying some important auxiliary features. For example, position can be a simple but effective feature: when the exclusion step adopts the position feature, if a region of interest lies in the upper half of the image, the object in it may well be a cloud, whereas if it lies in the lower half, the object is unlikely to be a cloud. As another example, when the exclusion step adopts a neighborhood feature: if an object in a region of interest is surrounded by blue or gray areas (the blue area possibly being blue sky, the gray area possibly an overcast sky), the object is likely a cloud, and such a region of interest can be retained; if instead the object in a region of interest is not surrounded by blue or gray areas at all, the object is unlikely to be a cloud (it may be, for example, a board, a car, or skin), and such a region of interest can be excluded.
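The position check in the exclusion step can be sketched as a one-line rule; the function name, interface, and the exact half-image threshold are illustrative assumptions following the example in the text.

```python
def keep_roi_by_position(roi_top, roi_bottom, image_height):
    """Position-based verification for clouds: retain a region of
    interest only if its vertical center lies in the upper half of
    the image, as in the exclusion-step example."""
    center = (roi_top + roi_bottom) / 2.0
    return center < image_height / 2.0
```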
The regions of interest estimated by the region-of-interest estimating step S100, or those not removed by the exclusion step, enter the feature determining step S200 and the subsequent processing.
In the feature determining step S200, the characteristic parameters of the objects in the region of interest are determined; the characteristic parameters can comprise object outer-boundary characteristic parameters and object inner-boundary characteristic parameters.
As for the object outer-boundary characteristic parameters: the object boundaries in the region of interest can be determined and the outer boundaries among them extracted; an outer-boundary point count is determined from the area occupied by the object; that number of equidistant outer-boundary points is placed on the object's outer boundary; the magnitude measure and the direction measure of the outer-boundary-point gradients are determined; and the magnitude measure and direction measure of the outer-boundary-point gradients, together with the outer-boundary point count, serve as the object outer-boundary characteristic parameters.
The image that the feature determining step S200 processes to obtain the object outer-boundary characteristic parameters can still be, for example, the one shown in Fig. 2; it should be understood here that, whether or not the above exclusion step has been applied, the positions of the regions of interest in Fig. 2 and related information are known by now.
Fig. 6 is a schematic diagram of extracting object outer-boundary characteristic parameters from the region of interest in Fig. 2.
First, a boundary-detection method, such as Sobel or Canny edge detection, can extract the boundary of each object in the region of interest, and the outermost boundary is chosen as the outer boundary. Fig. 6 shows the extracted outer boundary of each object; the outer boundaries Le1, Le2 and Le3 of three objects are schematically labeled in order to explain the implementation of the feature determining step S200 of the embodiment of the present invention. Those skilled in the art will understand that the outer boundaries of the other objects in Fig. 2 can also be extracted. For simplicity, those outer boundaries are not labeled in Fig. 6, but they can be processed in the same way as described below for the outer boundary Le1. Below, the extraction of the outer-boundary features of the enclosed object is explained taking Le1 as the example.
The area, area, enclosed by the outer boundary Le1 can be obtained by mature means, and the outer-boundary point count Ns adapted to this area can be determined from an empirical function Ns = f(area) obtained by analyzing a large number of positive sample images. That is, the value of Ns can be decided according to the size of the specific object to be detected. The position of the first outer-boundary point can be determined by any rule, for example taking the topmost, bottommost, leftmost or rightmost point as the starting point; the remaining points can be placed in order either clockwise or counter-clockwise. A suitable value of Ns makes it possible to extract as many key features of the object's outside as possible.
Then, the Ns outer-boundary points are placed equidistantly in order along the outer boundary Le1. Fig. 6 shows the outer-boundary points on Le1; for purposes of illustration, four outer-boundary points Pe1, Pe2, Pe3 and Pe4 are schematically labeled. For simplicity, the other outer-boundary points of this object are not labeled, but they can be processed in the same way as described below for Pe1, Pe2, Pe3 and Pe4.
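Placing Ns equidistant points on a closed boundary can be sketched as arclength resampling of the boundary polygon. This is a minimal illustration under assumed inputs: the function name is not from the patent, and the point list would in practice come from the edge-detection result, starting at the point chosen by the rule above.

```python
import math

def resample_closed_contour(points, n):
    """Place n equidistant points along a closed polygonal contour,
    starting from points[0] and walking in the stored orientation."""
    m = len(points)
    # cumulative arclength, including the closing edge back to points[0]
    cum = [0.0]
    for i in range(m):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % m]
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    out, seg = [], 0
    for k in range(n):
        t = k * total / n          # target arclength of the k-th point
        while cum[seg + 1] < t:
            seg += 1
        x0, y0 = points[seg]
        x1, y1 = points[(seg + 1) % m]
        denom = cum[seg + 1] - cum[seg]
        a = 0.0 if denom == 0.0 else (t - cum[seg]) / denom
        # linear interpolation inside segment seg
        out.append((x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
    return out
```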
Then, the gradient at each outer-boundary point can be computed by mature means; the gradient is a vector whose value comprises a magnitude and a direction. The arrows at the outer-boundary points Pe1, Pe2, Pe3 and Pe4 in Fig. 6 indicate the directions of the gradients at those points.
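One of the mature means mentioned above is the 3x3 Sobel operator; a self-contained sketch for a single interior pixel of a grayscale image follows. This is an assumed illustration (the patent does not prescribe the operator for this step, and a real implementation would likely filter the whole image at once).

```python
import numpy as np

def gradient_at(gray, x, y):
    """3x3 Sobel gradient at interior pixel (x, y) of a grayscale image;
    returns (magnitude, direction in degrees, 0 <= direction < 360)."""
    gx = (gray[y-1, x+1] + 2*gray[y, x+1] + gray[y+1, x+1]
          - gray[y-1, x-1] - 2*gray[y, x-1] - gray[y+1, x-1])
    gy = (gray[y+1, x-1] + 2*gray[y+1, x] + gray[y+1, x+1]
          - gray[y-1, x-1] - 2*gray[y-1, x] - gray[y-1, x+1])
    mag = float(np.hypot(gx, gy))
    ang = float(np.degrees(np.arctan2(gy, gx))) % 360.0
    return mag, ang
```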
The mean of the magnitudes of the outer-boundary-point gradients is computed as the magnitude measure of the outer-boundary-point gradients. That is, the mean msm of the gradient magnitudes of all Ns outer-boundary points serves as the magnitude measure; this gradient magnitude measure msm can be used to gauge the degree of gradual change of the object's outer boundary.
Before or after computing the mean magnitude of the outer-boundary-point gradients, or at the same time, the distribution of the angle differences between the gradients of neighboring outer-boundary points can be computed as the direction measure of the outer-boundary-point gradients. Specifically, the 360-degree angular range can be divided into a predetermined number of angular intervals, the neighboring-point angle differences are assigned to those intervals, and the number of intervals in which neighboring-point angle differences occur serves as the direction measure.
For example, the gradient direction angles of every two neighboring outer-boundary points of the object are differenced. For Fig. 6, suppose the gradient direction angles at the outer-boundary points Pe1, Pe2, Pe3 and Pe4 are differenced in turn in the counter-clockwise direction (the clockwise direction would obviously work as well): the angle of the gradient direction at Pe4 minus that at Pe3, the angle at Pe3 minus that at Pe2, the angle at Pe2 minus that at Pe1, and so on in turn, until the neighboring-point angle differences of all Ns outer-boundary points of the object enclosed by Le1 have been computed, circling the full boundary back to Pe4.
Then, the differences are grouped into preset angle-difference intervals. With each interval covering 10 degrees, there are 36 intervals in total; the distribution of the angle differences over the intervals is tallied, and the number ds of intervals into which the angle differences fall is counted. For example, suppose there are 25 outer-boundary points (Ns = 25), so that there are 25 angle differences; if these 25 angle differences are distributed over 19 angular intervals, then the direction measure of the outer-boundary-point gradients is ds = 19.
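The binning just described can be sketched as follows. Taking the differences cyclically and reducing them modulo 360 degrees is an assumption about how wrap-around is handled, since the text does not spell this out.

```python
def direction_measure(grad_angles_deg):
    """Direction measure ds: the angle differences of consecutive
    boundary-point gradients (cyclic, so Ns points give Ns differences)
    are binned into 36 intervals of 10 degrees each; ds is the number
    of occupied intervals."""
    n = len(grad_angles_deg)
    occupied = set()
    for i in range(n):
        diff = (grad_angles_deg[(i + 1) % n] - grad_angles_deg[i]) % 360.0
        occupied.add(int(diff // 10.0) % 36)
    return len(occupied)
```

A smooth, regular outline concentrates the differences in few intervals (small ds), while an irregular outline spreads them over many (large ds), matching the role of ds described in the text.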
Tallying the gradient-direction differences between neighboring outer-boundary points gauges the irregularity of the object's outside. The object outer-boundary characteristic parameters obtained by the above processing can comprise Ns, msm and ds; these parameters subsequently enter the processing that builds the energy model.
Before or after the above extraction of the object outer-boundary characteristic parameters, or simultaneously with it, the object inner-boundary characteristic parameters can be extracted.
As for the object inner-boundary characteristic parameters: the object boundaries in the region of interest can be determined and the inner boundaries among them extracted; an inner-boundary point count is determined from the total length of the object's inner boundaries; that number of equidistant inner-boundary points is placed on the inner boundaries; the magnitude measure and the direction measure of the inner-boundary-point gradients are determined; and the magnitude measure and direction measure of the inner-boundary-point gradients, together with the inner-boundary point count, serve as the object inner-boundary characteristic parameters.
The image that the feature determining step S200 processes to obtain the object inner-boundary characteristic parameters can still be, for example, the one shown in Fig. 4; it should be understood here that, whether or not the above exclusion step has been applied, the positions of the regions of interest in Fig. 4 and related information are known by now.
Fig. 7 is a schematic diagram of extracting object inner-boundary characteristic parameters from the image shown in Fig. 4. In order to display the extracted inner boundaries clearly, Fig. 7 takes the form of a binary image. Those skilled in the art will appreciate from the following description that the feature determining step S200 of the embodiment of the present invention can also extract inner boundaries from the image shown in Fig. 2, and outer boundaries from the image shown in Fig. 4; different images are used for illustration only to reflect the general applicability of the embodiment of the present invention.
The inner texture features of each object in the region of interest are extracted. For example, a boundary-detection method such as Sobel or Canny edge detection can extract the boundary of each object in the region of interest, and the inner boundaries of each object are obtained therefrom. In fact, the extraction of the outer-boundary characteristic parameters and of the inner-boundary characteristic parameters can share the result of a single boundary detection: the outermost boundary of each object is its outer boundary, and what lies within the outer boundary is inner boundary. Then, for each object, all its inner boundaries are treated as one whole; for example, the inner boundaries of an object's interior are all connected in top-to-bottom, left-to-right order, and the features of this assembled whole inner boundary are extracted to gauge the character of the object's inner texture. Fig. 7 shows the extracted boundaries of each object, including outer and inner boundaries, and schematically labels the connected whole inner boundary Li of one object in order to explain the implementation of the feature determining step S200 of the embodiment of the present invention.
Those skilled in the art will understand that the inner boundaries of the other objects in Fig. 4 can also be extracted. For simplicity, those inner boundaries are not labeled in Fig. 7, but they can be processed in the same way as described below for the inner boundary Li. Below, the extraction of the inner texture features is explained taking Li as the example.
The length, length, of the inner boundary Li can be obtained by mature means, and the inner-boundary point count Nb adapted to this length can be determined from an empirical function Nb = f(length) obtained by analyzing a large number of positive sample images. That is, the value of Nb can be decided according to the inner-boundary length of the specific object to be detected. The position of the first inner-boundary point can be determined by any rule, for example taking the topmost, bottommost, leftmost or rightmost point as the starting point; the remaining points can be placed in order either clockwise or counter-clockwise. A suitable value of Nb makes it possible to extract as many key features of the object's interior as possible.
Then, the Nb inner-boundary points are placed equidistantly in order along the inner boundary Li. Fig. 7 shows the inner-boundary points on Li; for purposes of illustration, four inner-boundary points Pi1, Pi2, Pi3 and Pi4 are schematically labeled. For simplicity, the other inner-boundary points of this object are not labeled, but they can be processed in the same way as described below for Pi1, Pi2, Pi3 and Pi4.
Then, the gradient at each inner-boundary point can be computed by mature means; the gradient is a vector whose value comprises a magnitude and a direction. The arrows at the inner-boundary points Pi1, Pi2, Pi3 and Pi4 in Fig. 7 indicate the directions of the gradients at those points.
The mean of the magnitudes of the inner-boundary-point gradients is computed as the magnitude measure of the inner-boundary-point gradients. That is, the mean mbm of the gradient magnitudes of all Nb inner-boundary points serves as the magnitude measure; this gradient magnitude measure mbm can be used to gauge the degree of gradual change of the object's inner boundaries.
Before or after computing the mean magnitude of the inner-boundary-point gradients, or at the same time, the distribution of the angle differences between the gradients of neighboring inner-boundary points can be computed as the direction measure of the inner-boundary-point gradients. Specifically, the 360-degree angular range can be divided into a predetermined number of angular intervals, the neighboring-point angle differences are assigned to those intervals, and the number of intervals in which neighboring-point angle differences occur serves as the direction measure.
For example, the gradient direction angles of every two neighboring inner-boundary points of the object are differenced. For Fig. 7, suppose the gradient direction angles at the inner-boundary points Pi1, Pi2, Pi3 and Pi4 are differenced in turn from top-left to bottom-right (other orders would obviously work as well): the angle of the gradient direction at Pi1 minus that at Pi2, the angle at Pi2 minus that at Pi3, the angle at Pi3 minus that at Pi4, and so on in turn, until the neighboring-point angle differences of all Nb inner-boundary points on Li have been computed, with the angle of the last point minus that of the starting point closing the circle.
Then, the differences are grouped into preset angle-difference intervals. With each interval covering 10 degrees, there are 36 intervals in total; the distribution of the angle differences over the intervals is tallied, and the number db of intervals into which the angle differences fall is counted. For example, suppose there are 25 inner-boundary points (Nb = 25), so that there are 25 angle differences; if these 25 angle differences are distributed over 4 angular intervals, then the direction measure of the inner-boundary-point gradients is db = 4.
Tallying the gradient-direction differences between neighboring inner-boundary points gauges the irregularity of the object's inner texture. The object inner-boundary characteristic parameters obtained by the above processing can comprise Nb, mbm and db; these parameters subsequently enter the processing that builds the energy model.
Then, at object energy determining step S300, the key feature of certain object determined in the pretreatment process of utilization, namely the parameter of its outer boundary Gradient Features and the parameter of inner boundary textural characteristics thereof, set up energy model.It will be appreciated by those skilled in the art that, although in the above description, the parameter of the outer boundary Gradient Features how determining object and the parameter of inner boundary textural characteristics is introduced respectively for different images, but, in the process setting up energy model, the inevitable outer boundary characteristic parameter according to same object and inner boundary characteristic parameter set up the energy model of this object.
Specifically, in object energy determining step S300, the outer boundary energy of the object can be determined based on the outer boundary characteristic parameters, the inner boundary energy of the object can be determined based on the inner boundary characteristic parameters, and the outer boundary energy and the inner boundary energy can be added with predefined weights to obtain the energy of the object.
Specifically, for a given object, its energy model can be established by the following formula (1):
E_object = E_surface + k * E_body    (1)
where E_object is the total energy of the object, E_surface is the outer boundary energy of the object, and E_body is the inner boundary energy of the object; k is the weight parameter between the external and internal energies, and can be an optimal value obtained by machine learning from a large number of training samples.
When the certain object to be detected is cloud, E_object can be written as E_cloud, and in that case k is obtained by training on a large number of sample images of clouds. When the target is some other certain object, the value of k is obtained by training on a large number of sample images of that other object.
Specifically, the outer boundary energy of the object can be determined from the ratio of the direction metric of the outer boundary point gradients to the number of outer boundary points, together with the magnitude value of the outer boundary point gradients. For example, the outer boundary energy E_surface can be calculated by the following formula (2):
E_surface = a^(ds/Ns) + a^(-msm)    (2)
where Ns is the number of outer boundary points of the object, msm is the gradient magnitude value of these Ns outer boundary points, ds is the gradient direction metric of these Ns outer boundary points, and a can be any value greater than 1; for example, a can be the mathematical constant e, or another constant such as 1.5, 2 or 100.
Specifically, the inner boundary energy of the object can be determined from the ratio of the direction metric of the inner boundary point gradients to the number of inner boundary points, together with the magnitude value of the inner boundary point gradients.
For example, the inner boundary energy E_body can be calculated by the following formula (3):
E_body = a^(db/Nb) + a^(-mbm)    (3)
where Nb is the number of inner boundary points of the object, mbm is the gradient magnitude value of these Nb inner boundary points, db is the gradient direction metric of these Nb inner boundary points, and the meaning and value of a are the same as in formula (2).
Thus the total energy E_object of the object can be calculated by the following formula (4):
E_object = a^(ds/Ns) + a^(-msm) + k * (a^(db/Nb) + a^(-mbm))    (4)
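Formulas (2), (3) and (4) can be combined into one small function. This is a sketch only: the default values of a and k below are placeholders for illustration, whereas in the described method they would be obtained by training.

```python
import math

def object_energy(Ns, ds, msm, Nb, db, mbm, a=math.e, k=1.0):
    """Total energy per formula (4): outer boundary energy plus k times
    the inner boundary energy. The defaults a = e and k = 1 are
    illustrative; in practice a and k come from training."""
    e_surface = a ** (ds / Ns) + a ** (-msm)   # formula (2)
    e_body = a ** (db / Nb) + a ** (-mbm)      # formula (3)
    return e_surface + k * e_body              # formula (1)

# A blurred, irregular object: many occupied direction segments (large
# ds, db) and weak gradients (small msm, mbm) push the energy up.
e_cloud = object_energy(Ns=40, ds=30, msm=0.2, Nb=25, db=20, mbm=0.1)
# A sharp, regular object: few direction segments, strong gradients.
e_sharp = object_energy(Ns=40, ds=3, msm=5.0, Nb=25, db=2, mbm=4.0)
```

With these example parameter values, e_cloud exceeds e_sharp, matching the behavior the text ascribes to the model.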
where each quantity has the same meaning as in the description above.
It can be seen from formula (4) that when the detected object has a blurred, slowly graded and irregular outer boundary, and at the same time a blurred, smooth and irregular inner texture or only a small amount of inner texture, its total energy value tends to become large. In other cases, for example when the object to be detected has a clear or regular outer boundary together with a clear or regular inner texture, its total energy value tends to become small. Taking cloud as an example, the exterior of a cloud is blurred and irregular, and its inner texture is blurred and irregular or sparse; therefore, when the object to be detected is a cloud, a higher total energy value is generated. The energy model established by formula (4) can thus measure surface properties of the object, such as the gradation and irregularity of the outer boundary, while simultaneously measuring interior features, such as the sharpness and distribution of the texture.
After the total energy of the object to be detected has been calculated in object energy determining step S300, certain object discriminating step S400 discriminates whether the object is the target certain object. The predetermined threshold used for this judgment can be obtained by machine learning: a large number of samples of the certain object are trained using the same formula and the same parameter settings as in the detection process described above (for example, the same values of a and k), the energy model is established, and a corresponding optimal value is obtained from the energy values of the samples as the predetermined threshold for judging the certain object. If the energy of the detected object is greater than or equal to this predetermined threshold, the object is discriminated as the certain object; obviously, the criterion may instead require the energy to be strictly greater than the threshold. Otherwise, the object is discriminated as not being the certain object. After all objects in the whole area-of-interest have been examined, the final processing result is obtained.
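As a sketch of how the predetermined threshold might be derived from trained sample energies, the following picks the cut point that best separates positive sample energies (the target object, e.g. clouds) from negatives. The exhaustive-sweep selection rule and all numeric values are assumptions of this sketch; the patent only states that an optimal value is obtained by training.

```python
def threshold_from_samples(pos_energies, neg_energies):
    """Sweep all sample energies as candidate thresholds and return the
    one maximizing classification accuracy, using the 'greater than or
    equal' rule of discriminating step S400."""
    candidates = sorted(pos_energies + neg_energies)
    best_t, best_acc = candidates[0], -1.0
    for t in candidates:
        correct = (sum(e >= t for e in pos_energies)
                   + sum(e < t for e in neg_energies))
        acc = correct / (len(pos_energies) + len(neg_energies))
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Hypothetical trained sample energies: clouds score high, others low.
threshold = threshold_from_samples([6.1, 5.8, 6.5], [2.2, 1.9, 3.0])
is_cloud = 6.0 >= threshold   # discriminate a new object per step S400
```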
At this point, the object determined to be the target certain object can be marked in the image, and output, by any mature means in the art.
The present invention can also be embodied as an apparatus for detecting a certain object in an image, which can be used to implement the foregoing method. Fig. 8 illustrates the general framework of such an apparatus according to an embodiment of the present invention. As shown in Fig. 8, the apparatus comprises: an interesting region estimating device 100, operable to implement the aforementioned interesting region estimating step S100, i.e. to estimate, in the input pending image, the region containing the certain object as the area-of-interest; a feature determining device 200, operable to implement the aforementioned feature determining step S200, i.e. to determine the characteristic parameters of an object in the area-of-interest; an object energy determining device 300, operable to implement the aforementioned object energy determining step S300, i.e. to determine the energy of the object according to its characteristic parameters; and a certain object discriminating device 400, operable to implement the aforementioned certain object discriminating step S400, i.e. to compare the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminate the object as the certain object.
The characteristic parameters determined by the feature determining device 200 may comprise outer boundary characteristic parameters and inner boundary characteristic parameters of the object.
The object energy determining device 300 may determine the outer boundary energy of the object based on the outer boundary characteristic parameters, determine the inner boundary energy based on the inner boundary characteristic parameters, and add the outer boundary energy and the inner boundary energy with predefined weights to obtain the energy of the object.
The interesting region estimating device 100 may divide the pending image into multiple regions, obtain the color feature of each region, use a linear classifier to judge whether each region matches the color feature of the certain object, and combine the regions matching the color feature of the certain object into the area-of-interest.
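A minimal sketch of this block-wise, color-based estimation, assuming a mean-RGB feature and a fixed block size (neither of which is fixed by the patent), with the linear classifier reduced to a weight vector and bias:

```python
import numpy as np

def estimate_roi(image, weights, bias, block=16):
    """Split the image into blocks, score each block's mean color with a
    linear classifier, and mark blocks classified as the target color.
    Returns a boolean mask over blocks."""
    h, w, _ = image.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            region = image[by*block:(by+1)*block, bx*block:(bx+1)*block]
            feature = region.reshape(-1, 3).mean(axis=0)   # mean RGB color
            mask[by, bx] = feature @ weights + bias > 0    # linear decision
    return mask

# Toy image: left half bright (cloud-like), right half dark.
img = np.zeros((32, 64, 3), dtype=float)
img[:, :32] = 220.0
w_vec = np.array([1.0, 1.0, 1.0]) / 3.0    # score = mean brightness
roi_mask = estimate_roi(img, w_vec, bias=-128.0)
```

The marked blocks would then be merged into the area-of-interest; in practice the weights and bias would be trained on labeled color samples rather than hand-set as here.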
The apparatus for detecting a certain object in an image according to an embodiment of the present invention may further comprise an excluding device, operable to implement the aforementioned excluding step, i.e. to exclude, according to the position of each area-of-interest in the pending image, any area-of-interest that does not match the position feature of the certain object.
In the apparatus according to an embodiment of the present invention, the object boundary may be determined in the area-of-interest and the outer boundary of the object extracted therefrom; the number of outer boundary points is determined according to the area occupied by the object, and that number of equidistant outer boundary points is set on the outer boundary; the magnitude value and direction metric of the outer boundary point gradients are then determined, and the magnitude value and direction metric of the outer boundary point gradients, together with the number of outer boundary points, are taken as the outer boundary characteristic parameters of the object.
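Setting equidistant points on a boundary can be sketched as arc-length resampling of the extracted contour. The sqrt(area) rule for choosing the number of points below is a guess for illustration; the patent only says the number is determined from the area the object occupies.

```python
import numpy as np

def sample_equidistant(contour, n_points):
    """Resample a closed contour (sequence of (x, y) vertices) at
    n_points positions equally spaced along its arc length."""
    pts = np.asarray(contour, dtype=float)
    closed = np.vstack([pts, pts[:1]])                     # close the loop
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)  # segment lengths
    cum = np.concatenate([[0.0], np.cumsum(seg)])          # cumulative length
    targets = np.linspace(0.0, cum[-1], n_points, endpoint=False)
    # Linear interpolation of x and y along accumulated arc length.
    x = np.interp(targets, cum, closed[:, 0])
    y = np.interp(targets, cum, closed[:, 1])
    return np.stack([x, y], axis=1)

# A 10x10 square contour; let the point count grow with sqrt(area).
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
ns = max(8, int(np.sqrt(10 * 10)))
points = sample_equidistant(square, ns)
```

The gradient magnitude and direction would then be read off the image at each sampled point to form the boundary characteristic parameters.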
In the apparatus according to an embodiment of the present invention, the object boundary may be determined in the area-of-interest and the inner boundary of the object extracted therefrom; the number of inner boundary points is determined according to the total length of the inner boundary, and that number of equidistant inner boundary points is set on the inner boundary; the magnitude value and direction metric of the inner boundary point gradients are then determined, and the magnitude value and direction metric of the inner boundary point gradients, together with the number of inner boundary points, are taken as the inner boundary characteristic parameters of the object.
In the apparatus according to an embodiment of the present invention, the mean of the magnitudes of the outer boundary point gradients may be calculated as the magnitude value of the outer boundary point gradients; the distribution of the adjacent-point angle differences of the outer boundary point gradients may be calculated as the direction metric of the outer boundary point gradients; and the outer boundary energy of the object may be determined from the ratio of the direction metric of the outer boundary point gradients to the number of outer boundary points, together with the magnitude value of the outer boundary point gradients.
In the apparatus according to an embodiment of the present invention, the mean of the magnitudes of the inner boundary point gradients may be calculated as the magnitude value of the inner boundary point gradients; the distribution of the adjacent-point angle differences of the inner boundary point gradients may be calculated as the direction metric of the inner boundary point gradients; and the inner boundary energy of the object may be determined from the ratio of the direction metric of the inner boundary point gradients to the number of inner boundary points, together with the magnitude value of the inner boundary point gradients.
In the apparatus according to an embodiment of the present invention, the 360-degree angular range may be divided into a predetermined number of angular intervals, the adjacent-point angle differences distributed into those angular intervals, and the number of angular intervals into which the adjacent-point angle differences fall taken as the direction metric.
The present invention can also be implemented as a system for detecting a certain object in an image. Fig. 9 illustrates the general framework of such a system 1000 according to an embodiment of the present invention. As shown in Fig. 9, the system 1000 may comprise: an input device 1100 for inputting the image to be processed from outside, which may comprise, for example, a keyboard, a mouse, a scanner, or a communication network and the remote input devices connected to it; a processing device 1200 for implementing the above method, or embodying the above apparatus, according to an embodiment of the present invention, which may comprise, for example, the central processing unit of a computer or another chip with processing capability such as a DSP; an output device 1300 for outputting the result of the above detection process to the outside, which may comprise, for example, a display, a printer, or a communication network and the remote output devices connected to it; and a storage device 1400 for storing, in a volatile or non-volatile manner, the images, results, commands, intermediate data and so on involved in the above detection process, which may comprise, for example, various volatile or non-volatile memories such as random access memory (RAM), read-only memory (ROM), a hard disk, or semiconductor memory.
The outer boundary gradient feature and the inner texture feature (for example the inner boundary gradient feature) mentioned above are just two key features among the many features applicable to object detection; implementation of the present invention is not limited to them, and many other features can also be used to establish the energy model. The inner and outer energies in the energy model are two sub-energies, but an energy model for object detection with three or more key features can also be established by expanding the model to three or more sub-energies; in that case, corresponding energy terms are added to formula (1).
The method and apparatus for detecting a certain object in an image according to embodiments of the present invention can take cloud as the certain object and perform cloud detection on white clouds, dark clouds, morning and evening glow clouds, and so on; obviously, they can also be applied to detecting other types of objects that have blurred, irregular contours and blurred inner texture. For example, the method and apparatus can be applied, with the same configuration as cloud detection, to detecting certain plush toys.
Those skilled in the art will recognize that the method and apparatus according to embodiments of the present invention can be applied to an arbitrary certain object: by training on sample images of that object, establishing the energy model corresponding to it, and determining the corresponding predetermined threshold, detection of that object in images can be realized. The embodiments of the present invention can therefore be applied to general object detection.
The energy-model-based method and apparatus according to embodiments of the present invention can reduce the false detection rate, eliminating other objects that prior art means would falsely detect. The energy model adopted by the embodiments combines multiple key features, assigns them different weights, and composes them by certain mathematical operations, so that the generated energy value characterizes a quantity distinctive of the certain object. With such an energy model, the recognition means realized by the embodiments can effectively combine the key features and obtain better recognition accuracy.
Compared with the aforementioned patent document 1, the method and apparatus according to embodiments of the present invention establish the energy model by extracting features intrinsic to the image, such as the outer boundary gradient feature and the inner texture feature, and can thus effectively detect clouds in images, with a much wider range of application than satellite cloud images alone.
Compared with the aforementioned non-patent document 1, the method and apparatus according to embodiments of the present invention not only perform preliminary detection and feature verification, but can also build two or more key features (such as the outer boundary gradient and the inner texture feature) into an energy model, combined with weights, thereby obtaining better feature discrimination while reducing processing load with a simplified algorithm.
The energy-model-based method and apparatus according to embodiments of the present invention propose an energy model that optimally combines two or more key features of the object: the external and internal energies are established for the respective features and then combined by the weight parameter k, so as to reach the best classification effect.
The energy-model-based method and apparatus according to embodiments of the present invention also propose an efficient detection structure: first a preliminary detection based on a single feature, then feature verification on important auxiliary features, and finally establishment of the energy model for the final detection of the specific target object, thereby achieving good detection efficiency and performance.
When the specific target object to be detected is cloud, the method and apparatus according to embodiments of the present invention adopt the outer boundary gradient characteristic and the inner texture characteristic of clouds as key features to carry out cloud detection.
In addition, the energy-model-based method and apparatus according to embodiments of the present invention propose a neighboring-direction-difference statistical method to measure the irregularity of a boundary or texture.
Obviously, the concrete formulas, parameters, hardware and numerical values illustrated above are examples; those skilled in the art can, within the spirit of the present invention, arrive at other formulas, parameters, hardware and numerical values to realize the present invention according to the teaching of this specification. The details of embodiments of the present invention have been described above taking a cloud detection model in image recognition as an example, but those skilled in the art will appreciate that the applicable recognition models are not limited to this; the invention can be applied to the detection and recognition of models other than cloud detection.
The sequence of operations illustrated in the specification can be performed by hardware, software, or a combination of hardware and software. When the sequence of operations is performed by software, the computer program can be installed in memory built into a computer with dedicated hardware so that the computer executes the program. Alternatively, the computer program can be installed in a general-purpose computer capable of performing various types of processing so that the computer executes the program.
For example, the computer program can be stored in advance on a recording medium such as a hard disk or ROM (read-only memory). Alternatively, the computer program can be stored temporarily or permanently on a removable recording medium such as a floppy disk, CD-ROM (compact disc read-only memory), MO (magneto-optical) disc, DVD (digital versatile disc), magnetic disk or semiconductor memory. Such removable recording media can be provided as packaged software.
The present invention has been described in detail with reference to specific embodiments. However, it is clear that those skilled in the art can make changes and substitutions to the embodiments without departing from the spirit of the present invention. In other words, the present invention is disclosed in illustrative form and is not to be construed restrictively. The appended claims should be considered in judging the gist of the present invention.

Claims (7)

1. the method for certain objects in detected image, comprising:
Interesting region estimating step, in the pending image of input, estimates the region comprising described certain objects, as area-of-interest;
Feature determining step, determines the characteristic parameter of object in described area-of-interest;
Object energy determining step, according to the energy of the characteristic parameter determination object of object;
This object discrimination by the energy of determined object compared with predetermined threshold, if the energy of this object is more than or equal to this predetermined threshold, is then described certain objects by certain objects discriminating step,
Wherein, in described feature determining step, described characteristic parameter comprises object outer boundary characteristic parameter and interior of articles boundary characteristic parameter; And
In described object energy determining step, based on object outer boundary characteristic parameter determination object outer boundary energy, based on interior of articles boundary characteristic parameter determination interior of articles boundary energy, object outer boundary energy and interior of articles boundary energy are pressed predefined weight to be added, obtain the energy of this object
Wherein, by determining object boundary in area-of-interest, extract object outer boundary wherein, count out according to this object area occupied determination outer boundary, equidistant outer boundary point outer boundary being set and counting out on object outer boundary, determine size value and the direction metric of outer boundary point gradient, the size value of outer boundary point gradient and direction metric and outer boundary are counted out as object outer boundary characteristic parameter, and
By determining object boundary in area-of-interest, extract interior of articles border wherein, count out according to interior of articles border total length determination inner boundary, equidistant inner boundary point inner boundary being set and counting out on interior of articles border, determine size value and the direction metric of inner boundary point gradient, the size value of inner boundary point gradient and direction metric and inner boundary are counted out as interior of articles boundary characteristic parameter.
2. The method of detecting a certain object in an image according to claim 1, wherein,
in the interesting region estimating step, the pending image is divided into multiple regions, the color feature of each region is obtained, a linear classifier is used to judge whether each region matches the color feature of the certain object, and the regions matching the color feature of the certain object are combined into the area-of-interest.
3. The method of detecting a certain object in an image according to claim 1, further comprising:
an excluding step of excluding, according to the position of each area-of-interest in the pending image, any area-of-interest that does not match the position feature of the certain object.
4. The method of detecting a certain object in an image according to claim 1, wherein
the mean of the magnitudes of the outer boundary point gradients is calculated as the magnitude value of the outer boundary point gradients; the distribution of the adjacent-point angle differences of the outer boundary point gradients is calculated as the direction metric of the outer boundary point gradients; and the outer boundary energy of the object is determined from the ratio of the direction metric of the outer boundary point gradients to the number of outer boundary points, together with the magnitude value of the outer boundary point gradients.
5. The method of detecting a certain object in an image according to claim 1, wherein
the mean of the magnitudes of the inner boundary point gradients is calculated as the magnitude value of the inner boundary point gradients; the distribution of the adjacent-point angle differences of the inner boundary point gradients is calculated as the direction metric of the inner boundary point gradients; and the inner boundary energy of the object is determined from the ratio of the direction metric of the inner boundary point gradients to the number of inner boundary points, together with the magnitude value of the inner boundary point gradients.
6. The method of detecting a certain object in an image according to claim 4 or 5, wherein
the 360-degree angular range is divided into a predetermined number of angular intervals, the adjacent-point angle differences are distributed into the angular intervals, and the number of angular intervals into which the adjacent-point angle differences fall is taken as the direction metric.
7. An apparatus for detecting a certain object in an image, comprising:
an interesting region estimating device which estimates, in an input pending image, the region containing the certain object as an area-of-interest;
a feature determining device which determines characteristic parameters of an object in the area-of-interest;
an object energy determining device which determines the energy of the object according to the characteristic parameters of the object; and
a certain object discriminating device which compares the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminates the object as the certain object,
wherein the characteristic parameters comprise outer boundary characteristic parameters and inner boundary characteristic parameters of the object; and
the object energy determining device determines the outer boundary energy of the object based on the outer boundary characteristic parameters, determines the inner boundary energy of the object based on the inner boundary characteristic parameters, and adds the outer boundary energy and the inner boundary energy with predefined weights to obtain the energy of the object,
wherein the object boundary is determined in the area-of-interest and the outer boundary of the object is extracted therefrom, the number of outer boundary points is determined according to the area occupied by the object, that number of equidistant outer boundary points is set on the outer boundary, the magnitude value and direction metric of the outer boundary point gradients are determined, and the magnitude value and direction metric of the outer boundary point gradients together with the number of outer boundary points are taken as the outer boundary characteristic parameters of the object, and
wherein the object boundary is determined in the area-of-interest and the inner boundary of the object is extracted therefrom, the number of inner boundary points is determined according to the total length of the inner boundary, that number of equidistant inner boundary points is set on the inner boundary, the magnitude value and direction metric of the inner boundary point gradients are determined, and the magnitude value and direction metric of the inner boundary point gradients together with the number of inner boundary points are taken as the inner boundary characteristic parameters of the object.
CN201110310765.1A 2011-10-14 2011-10-14 The equipment of certain objects in the method for certain objects and detected image in detected image Expired - Fee Related CN103049735B (en)

Publications (2)

Publication Number Publication Date
CN103049735A CN103049735A (en) 2013-04-17
CN103049735B true CN103049735B (en) 2016-02-03


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6504951B1 (en) * 1999-11-29 2003-01-07 Eastman Kodak Company Method for detecting sky in images
CN101833750A (en) * 2010-04-15 2010-09-15 清华大学 Active contour method based on shape constraint and direction field, and system thereof
CN102122343A (en) * 2010-01-07 2011-07-13 索尼公司 Method and device for determining angle of inclination of body and estimating gesture




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160203
