CN106327499A - Oil stain image recognition based on edge point self-similarity and TEDS system - Google Patents

Oil stain image recognition based on edge point self-similarity and TEDS system

Info

Publication number
CN106327499A
CN106327499A (application CN201610752481.0A)
Authority
CN
China
Prior art keywords
edge point
self-similarity
edge
Prior art date
Legal status
Granted
Application number
CN201610752481.0A
Other languages
Chinese (zh)
Other versions
CN106327499B (en)
Inventor
汪辉
刘晶
Current Assignee
Nanjing Xinhe Electronic Technology Co Ltd
Original Assignee
Nanjing Xinhe Electronic Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Xinhe Electronic Technology Co Ltd
Priority to CN201610752481.0A
Publication of CN106327499A
Application granted
Publication of CN106327499B
Active (current legal status)
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators
    • G06T 5/30 Erosion or dilatation, e.g. thinning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20036 Morphological image processing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides oil stain image recognition based on edge point self-similarity, comprising the steps of: acquiring the edge points of an image to be detected, assigning a reference direction to each edge point, computing its feature vector, and normalizing the feature vectors; computing local and global self-similarity values for each edge point on the edge lines and combining them by weighting to obtain the final self-similarity value of the edge point; acquiring the set of edge points with high self-similarity on the edge lines, and screening and marking irregular edge points with low self-similarity in the set according to a predetermined rule; computing the pixel intensity difference in the local neighborhood of each irregular edge point; and setting a pixel intensity difference threshold and marking the irregular edge points whose local-neighborhood pixel intensity difference exceeds the threshold as oil stain. The invention further provides a TEDS system using this recognition, which quickly and accurately identifies oil stains on motor train units, avoids recognition errors, and reduces the misjudgment rate.

Description

Oil stain image recognition based on edge point self-similarity and TEDS system
Technical field
The present invention relates to the field of computer image detection and recognition, and in particular to oil stain image recognition based on edge point self-similarity and to a TEDS system applying this recognition.
Background art
At present, oil stain images are treated as interference images; the image detection field has devices and methods dedicated to detecting oil contamination on the surfaces of stationary objects, used mainly for cleaning and purifying object surfaces.
Chinese patent specification CN104318556A discloses a method for recognizing oil stain interference regions: the image is first preprocessed, and a morphological detection method based on salient line scanning is then used to detect and locate oil stain defect images, thereby identifying the oil stain interference region. This oil stain recognition method detects many cluttered edges; when it is applied to motor train unit fault detection, the EMU running-fault dynamic image detection system (TEDS system) easily mistakes oil stains for faults, which raises the misjudgment rate.
Summary of the invention
The present invention proposes oil stain image recognition based on edge point self-similarity, which can identify oil stain images and solves the problem of high misjudgment rates in prior-art motor train unit fault detection.
The technical solution of the present invention is achieved as follows:
Oil stain image recognition based on edge point self-similarity comprises the following steps:
Step 1: input the image to be detected into a computer and obtain all edge points of the image with the Canny edge detection algorithm;
Step 2: classify all edge points, where edge points of the same class belong to one initial edge line of an image contour; obtain all initial edge lines of the image to be detected, assign a reference direction to each edge point, extract the feature vector of each edge point, and normalize each feature vector;
Step 3: from the normalized feature vector of each edge point, compute the local self-similarity value and the global self-similarity value of each edge point on every initial edge line, and take the weighted combination of the local and global self-similarity values as the final self-similarity value of the edge point;
Step 4: set a high threshold, obtain the set of edge points on every initial edge line whose self-similarity value is higher than the high threshold, and discard the edge points on every initial edge line whose self-similarity value is below the high threshold;
Step 5: compute the self-similarity value of each edge point in the above set with the edge point nearest to it, set a low threshold, obtain all edge points in the set whose self-similarity value is higher than the low threshold, and mark the edge points in the set below the low threshold as irregular edge points that form the irregular image;
Step 6: classify the edge points above the low threshold from Step 5, where edge points of the same class form a corrected edge line of an image contour; set a length threshold for all corrected edge lines, find the corrected edge lines shorter than this threshold, and mark the points on those corrected edge lines as irregular edge points that form the irregular image;
Step 7: compute the pixel intensity difference in the local neighborhood of every irregular edge point obtained in Steps 5 and 6, set a pixel intensity difference threshold, and mark the irregular edge points whose local-neighborhood pixel intensity difference exceeds the threshold as oil stain.
Preferably, in the oil stain image recognition based on edge point self-similarity, the final self-similarity value of each edge point on every initial edge line is computed in Step 3 from the normalized feature vectors as follows: take any two edge points on an edge line together with their normalized feature vectors; then
the similarity of any two edge points on the edge line is the inner product of their two feature vectors, where the inner product is computed by multiplying corresponding vector elements and summing the products, which gives the similarity of the two edge points;
the local self-similarity value of an edge point is computed from that edge point together with the four edge points adjacent to it on its edge line, located two on each side of it;
the global self-similarity value of an edge point is computed over all n edge points on its edge line;
weights are set for the local and the global self-similarity, and the final self-similarity value of the edge point is the weighted combination of its local and global self-similarity values;
since the elements of the normalized feature vectors lie between 0 and 1, the self-similarity value also lies between 0 and 1 and expresses the degree of similarity: a value of 0 means completely dissimilar and a value of 1 means completely similar.
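The similarity formulas appear as images in the original publication and do not survive in this text. The display equations below are one reconstruction from the verbal description above; the notation (edge points p_i, feature vectors v_i, weights w_1 and w_2), the use of simple averages, and the constraint w_1 + w_2 = 1 are assumptions introduced here for readability rather than the patented formulas.

    s(p_i, p_j) = v_i \cdot v_j = \sum_{k} v_{i,k} \, v_{j,k}

    s_{\mathrm{local}}(p_i) = \frac{1}{4} \sum_{j \in \{i-2,\, i-1,\, i+1,\, i+2\}} s(p_i, p_j)

    s_{\mathrm{global}}(p_i) = \frac{1}{n} \sum_{j=1}^{n} s(p_i, p_j)

    s(p_i) = w_1 \, s_{\mathrm{local}}(p_i) + w_2 \, s_{\mathrm{global}}(p_i), \qquad w_1 + w_2 = 1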
Preferably, in the oil stain image recognition based on edge point self-similarity, a reference direction is assigned to each edge point in Step 2 as follows:
For any edge point, construct a local neighborhood centered on the current edge point and compute the gradient magnitude and direction of every pixel in this neighborhood; accumulate the gradient magnitudes and directions of all pixels in the neighborhood into a histogram whose 9 bins evenly divide the direction range of 0 to 180 degrees, with directions in the range of 180 to 360 degrees folded onto the same 9 bins;
compute the weighting coefficients of each pixel for its two adjacent direction bins, compute from the weighting coefficients and the gradient magnitude the contribution weights of each pixel to the two adjacent bins, and add the contribution weights to the corresponding bins of the histogram of the neighborhood of the edge point; the direction at the histogram peak is the reference direction of that edge point.
Preferably, in the oil stain image recognition based on edge point self-similarity, the feature vector of any edge point is extracted in Step 2 as follows:
Take any edge point with its reference direction and rotate the coordinate axes to the reference direction; in the rotated coordinate system, take points at a predetermined pixel distance from the edge point along four orientations, and construct 5 local neighborhoods centered on the edge point and on these points; compute the gradient magnitude of every pixel and the contribution weights of every pixel to its two adjacent direction bins; accumulate the direction distribution histograms of the 5 local neighborhoods to obtain 5 histograms; the feature vector of the edge point is composed of the 5 histograms; finally, normalize the feature vector of each edge point.
Preferably, in the oil stain image recognition based on edge point self-similarity, the pixel intensity difference in the local neighborhood of every irregular edge point is computed in Step 7 as follows:
Take any irregular edge point and construct a local neighborhood centered on it; the pixel intensity difference is the difference between the maximum pixel intensity and the minimum pixel intensity in this local neighborhood.
A TEDS system comprises the oil stain image recognition based on edge point self-similarity of any of the above items; the marked oil stain image is treated by default as a normal state of the motor train unit and is not marked during motor train unit fault detection.
The beneficial effects of the invention are as follows. Every initial edge line is considered first: the self-similarity of the edge points on each initial edge line is computed from the feature vectors, the set of edge points with larger self-similarity is obtained, and the edge points with small self-similarity are discarded. Within the set of edge points with larger self-similarity, the self-similarity of each edge point is then recomputed in a predetermined way, the set of edge points whose self-similarity exceeds the low threshold is obtained, and the continuity of the edge points is increased. The edge points above the low threshold are re-classified to obtain corrected edge lines; the points on corrected edge lines shorter than the length threshold and the points in the set below the low threshold are marked as irregular edge points, and the irregular edge points constitute the irregular image. A pixel intensity difference threshold is then set, and the irregular edge points whose local-neighborhood pixel intensity difference exceeds the threshold are marked as oil stain. The method first identifies the parts of the image that are irregular relative to the background and then uses the characteristics of oil stains to identify the oil stain image, which improves the accuracy of oil stain recognition; applied in a TEDS system, it can quickly and accurately identify the oil stain images on a train, reduce the misjudgment rate of motor train unit faults, and improve the detection accuracy of the TEDS system.
Brief description of the drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is an acquired image of a certain part of a motor train unit;
Fig. 2 shows the oil stain image identified in Fig. 1.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiment 1: oil stain image recognition based on edge point self-similarity applied in a TEDS system, comprising the following steps:
Step 1: input the motor train unit image to be detected into a computer, as shown in Fig. 1, and obtain all edge points of the image with the Canny edge detection algorithm. The detailed procedure is as follows:
1. convert the motor train unit image to a grayscale image on the computer;
2. apply Gaussian blur to the grayscale image to reduce the interference of image noise;
3. compute the gradient magnitude and direction of every pixel of the denoised image;
4. apply non-maximum suppression to the gradient magnitude of every pixel to obtain a preliminary set of image edge points; 5. connect edges with the double-threshold method, reject false edges and complete broken edges, obtaining a more accurate set of edge points.
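A minimal sketch of this edge-extraction step using OpenCV and NumPy; the blur kernel, sigma, and the Canny thresholds (5x5, 1.4, 50/150) are illustrative values, not values taken from the patent.

    import cv2
    import numpy as np

    def edge_points(image_path):
        # 1. read the image and convert it to grayscale
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        # 2. Gaussian blur to suppress noise (kernel size and sigma are assumed values)
        blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)
        # 3-5. cv2.Canny internally computes gradients, applies non-maximum
        # suppression, and links edges with the double-threshold method
        edges = cv2.Canny(blurred, 50, 150)
        # return the (row, col) coordinates of all edge points
        return np.argwhere(edges > 0)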
Step 2: classify all edge points, where edge points of the same class belong to one initial edge line of an image contour; obtain all initial edge lines of the image to be detected, assign a reference direction to each edge point, extract the feature vector of each edge point, and normalize the feature vector of each edge point;
A reference direction is assigned to each edge point as follows:
1. for any edge point, construct an 8*8 neighborhood centered on the current edge point and compute the gradient magnitude and direction of every pixel in this neighborhood; accumulate the gradient magnitudes and directions of all pixels in the neighborhood into a histogram whose 9 bins, each 20 degrees wide, evenly divide the direction range of 0 to 180 degrees, with directions in the range of 180 to 360 degrees folded onto the same 9 bins;
2. compute the weighting coefficients of each pixel for its two adjacent direction bins, compute from the weighting coefficients and the gradient magnitude the contribution weights of each pixel to the two adjacent bins, and add the contribution weights to the corresponding bins of the histogram of the neighborhood of the edge point; the direction at the histogram peak is the reference direction of that edge point.
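A sketch of the reference-direction assignment for one edge point, under the stated 8x8 neighborhood and 9 bins of 20 degrees. The linear split of each pixel's gradient magnitude between its two adjacent bins is how the "weighting coefficients" and "contribution weights" are interpreted here; this interpretation, and the function names, are assumptions rather than the patented formulas.

    import numpy as np

    def orientation_histogram(gray, y, x, half=4, n_bins=9):
        # 9-bin gradient-direction histogram (20 degrees per bin) of the 8x8
        # neighborhood centered on (y, x); directions in 180~360 degrees are
        # folded onto 0~180, and each pixel's gradient magnitude is split
        # between its two adjacent bins by linear weighting coefficients
        patch = gray[y - half:y + half, x - half:x + half].astype(np.float64)
        gy, gx = np.gradient(patch)
        mag = np.hypot(gx, gy)
        ang = np.degrees(np.arctan2(gy, gx)) % 180.0
        hist = np.zeros(n_bins)
        pos = ang / (180.0 / n_bins)
        lo = np.floor(pos).astype(int) % n_bins
        w_hi = pos - np.floor(pos)                 # weight for the upper adjacent bin
        np.add.at(hist, lo, mag * (1.0 - w_hi))
        np.add.at(hist, (lo + 1) % n_bins, mag * w_hi)
        return hist

    def reference_direction(gray, y, x):
        # the direction at the histogram peak is taken as the reference direction
        hist = orientation_histogram(gray, y, x)
        return (np.argmax(hist) + 0.5) * 20.0      # degrees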
The feature vector of each edge point is extracted as follows:
1. take any edge point with its reference direction and rotate the coordinate axes to the reference direction, transforming the coordinates accordingly; in the rotated coordinate system, take points at a predetermined pixel distance from the edge point along four orientations, and construct 5 8*8 neighborhoods centered on the edge point and on these points; compute the gradient magnitude of every pixel, and compute the contribution weights of every pixel to its two adjacent direction bins;
2. accumulate the direction distribution histograms of the 5 8*8 neighborhoods to obtain 5 histograms; the feature vector of the edge point is composed of the 5 histograms; finally, normalize the feature vector of each edge point.
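A sketch of the feature-vector extraction, reusing the orientation_histogram helper from the previous sketch. The 4-pixel offset of the auxiliary neighborhood centers, the concatenation of the 5 histograms, and the L2 normalization are assumptions: the patent only specifies predetermined pixel positions along four orientations, and its formulas are not reproduced in this text.

    import numpy as np
    # uses orientation_histogram(gray, y, x) from the reference-direction sketch above

    def feature_vector(gray, y, x, theta_deg, offset=4):
        # 4 auxiliary neighborhood centers at a fixed distance along the axes of
        # the coordinate system rotated to the reference direction theta_deg
        # (offset = 4 pixels is an assumed value)
        t = np.radians(theta_deg)
        centers = [(y, x)]
        for a in (t, t + np.pi / 2, t + np.pi, t + 3 * np.pi / 2):
            centers.append((int(round(y + offset * np.sin(a))),
                            int(round(x + offset * np.cos(a)))))
        # concatenate the five 9-bin neighborhood histograms (45 elements)
        vec = np.concatenate([orientation_histogram(gray, cy, cx) for cy, cx in centers])
        # normalize so that all elements lie between 0 and 1
        norm = np.linalg.norm(vec)
        return vec / norm if norm > 0 else vec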
Step 3: from the normalized feature vector of each edge point, compute the local self-similarity value and the global self-similarity value of each edge point on every initial edge line, and take the weighted combination of the local and global self-similarity values as the final self-similarity value of the edge point;
The final self-similarity value of each edge point on every initial edge line is computed as follows: take any two edge points on an edge line together with their normalized feature vectors; then
the similarity of any two edge points on the edge line is the inner product of their two feature vectors, where the inner product is computed by multiplying corresponding vector elements and summing the products, which gives the similarity of the two edge points;
the local self-similarity value of an edge point is computed from that edge point together with the four edge points adjacent to it on its edge line, located two on each side of it;
the global self-similarity value of an edge point is computed over all n edge points on its edge line;
weights are chosen for the local and the global self-similarity, and the final self-similarity value of the edge point is the weighted combination of its local and global self-similarity values;
since the elements of the normalized feature vectors lie between 0 and 1, the self-similarity value also lies between 0 and 1 and expresses the degree of similarity: a value of 0 means completely dissimilar and a value of 1 means completely similar.
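A sketch of the self-similarity computation for the edge points of one initial edge line. Because the formulas are given as images in the original publication, the aggregation below (inner-product similarity, local value averaged over the four neighbors, global value averaged over all points on the line, and example weights w_local = 0.6, w_global = 0.4) is a plausible reading of the description, not the patented formula.

    import numpy as np

    def self_similarity(features, w_local=0.6, w_global=0.4):
        # features: (n, 45) array of normalized feature vectors of the edge points
        # of one initial edge line, in order along the line.
        # w_local / w_global are illustrative weights, not values from the patent.
        sim = features @ features.T                # pairwise inner-product similarities
        n = len(features)
        values = np.zeros(n)
        for i in range(n):
            # local value: the four neighbors of the point on the line (two per side)
            neighbors = [j for j in (i - 2, i - 1, i + 1, i + 2) if 0 <= j < n]
            local = sim[i, neighbors].mean() if neighbors else 0.0
            # global value: all n edge points on the line
            glob = sim[i].mean()
            values[i] = w_local * local + w_global * glob
        return values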
Step 4: the high threshold is set to 0.5; obtain the set of edge points on every initial edge line whose self-similarity value is higher than 0.5, and discard the edge points on every initial edge line whose self-similarity value is below the high threshold.
Step 5: compute the self-similarity of any two edge points in the set obtained in Step 4; the low threshold is set to 0.1; obtain all edge points in the set whose self-similarity value is higher than 0.1, and mark the edge points in the set below 0.1 as irregular edge points that form the irregular image.
Step 6: classify the edge points above the low threshold from Step 5, where edge points of the same class form a corrected edge line of an image contour; set a length threshold of 12 for all corrected edge lines, find the corrected edge lines whose length is less than 12, and mark the points on those corrected edge lines as irregular edge points that form the irregular image.
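A sketch of the screening in Steps 4 to 6, using the thresholds 0.5, 0.1, and 12 of this embodiment and the self_similarity function from the previous sketch. The nearest-point pairing in Step 5 is done here by image-coordinate distance, and Step 6 is simplified to treating a whole surviving line shorter than the length threshold as irregular; both simplifications are assumptions about details the text leaves open.

    import numpy as np

    def screen_edge_points(lines, high=0.5, low=0.1, min_len=12):
        # lines: list of (coords, features) per initial edge line, where coords is
        # an (n, 2) array of edge-point positions and features the matching
        # (n, 45) array of normalized feature vectors
        kept_lines, irregular = [], []
        for coords, feats in lines:
            values = self_similarity(feats)        # Step 3 value (sketch above)
            keep = values > high                   # Step 4: high threshold 0.5
            coords, feats = coords[keep], feats[keep]
            if len(coords) == 0:
                continue
            # Step 5: similarity of each retained point with its nearest retained
            # point (nearest in image coordinates); points below 0.1 are irregular
            d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
            np.fill_diagonal(d, np.inf)
            nearest = d.argmin(axis=1)
            pair_sim = np.einsum('ij,ij->i', feats, feats[nearest])
            irregular.extend(coords[pair_sim <= low])
            kept = coords[pair_sim > low]
            # Step 6 (simplified): a surviving line shorter than the length
            # threshold 12 is treated entirely as irregular
            if len(kept) < min_len:
                irregular.extend(kept)
            else:
                kept_lines.append(kept)
        return kept_lines, np.array(irregular)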
Step 7: compute the pixel intensity difference in the local neighborhood of every irregular edge point obtained in Steps 5 and 6, set a pixel intensity difference threshold, and mark the irregular edge points whose local-neighborhood pixel intensity difference exceeds the threshold as oil stain.
The specific way the pixel intensity difference threshold is used to mark the edge points of the irregular image as oil stain is as follows: oil stains usually appear as darker regions, similar to shadows in the image, so they have smaller gray values that differ considerably from the gray values of the surrounding region; we therefore compute the pixel intensity difference in the local neighborhood of each edge point of the irregular image and set the pixel intensity difference threshold E = 30; when the pixel intensity difference in the local neighborhood of an edge point of the irregular image exceeds E = 30, the edge point of the current irregular image is marked as oil stain, as shown in Fig. 2.
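A sketch of Step 7 with the threshold E = 30 of this embodiment; the 8x8 neighborhood size is an assumed choice (the patent notes that other neighborhood sizes can be used).

    import numpy as np

    def mark_oil_stain(gray, irregular_points, threshold=30, half=4):
        # an irregular edge point is marked as oil stain when the difference
        # between the maximum and minimum pixel intensity in its local
        # neighborhood exceeds the threshold E = 30
        marked = []
        for y, x in irregular_points:
            patch = gray[max(y - half, 0):y + half, max(x - half, 0):x + half]
            if patch.size and int(patch.max()) - int(patch.min()) > threshold:
                marked.append((y, x))
        return marked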
Step 8: extract the reference image of the motor train unit to be detected from the image library, and compare the image to be detected with the reference image in the TEDS system using the standard image method; the oil stain image identified in Step 7 is treated by default as an external interference factor rather than a fault state of the motor train unit and is not marked during motor train unit fault detection, which reduces fault misjudgment and improves the accuracy of the resulting fault detection image.
The neighborhoods mentioned above are chosen according to the specific situation and may also be chosen as other sizes such as 8*16. The above high threshold, low threshold, length threshold and pixel intensity difference threshold can be chosen according to the type of image actually detected.
The above description covers only preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present invention shall be included within the protection scope of the present invention.

Claims (6)

1. Oil stain image recognition based on edge point self-similarity, characterized by comprising the following steps:
Step 1: input the image to be detected into a computer and obtain all edge points of the image with the Canny edge detection algorithm;
Step 2: classify all edge points, where edge points of the same class belong to one initial edge line of an image contour; obtain all initial edge lines of the image to be detected, assign a reference direction to each edge point, extract the feature vector of each edge point, and normalize each feature vector;
Step 3: from the normalized feature vector of each edge point, compute the local self-similarity value and the global self-similarity value of each edge point on every initial edge line, and take the weighted combination of the local and global self-similarity values as the final self-similarity value of the edge point;
Step 4: set a high threshold, obtain the set of edge points on every initial edge line whose self-similarity value is higher than the high threshold, and discard the edge points on every initial edge line whose self-similarity value is below the high threshold;
Step 5: compute the self-similarity value of each edge point in the above set with the edge point nearest to it, set a low threshold, obtain all edge points in the set whose self-similarity value is higher than the low threshold, and mark the edge points in the set below the low threshold as irregular edge points that form the irregular image;
Step 6: classify the edge points above the low threshold from Step 5, where edge points of the same class form a corrected edge line of an image contour; set a length threshold for all corrected edge lines, find the corrected edge lines shorter than this threshold, and mark the points on those corrected edge lines as irregular edge points that form the irregular image;
Step 7: compute the pixel intensity difference in the local neighborhood of every irregular edge point obtained in Steps 5 and 6, set a pixel intensity difference threshold, and mark the irregular edge points whose local-neighborhood pixel intensity difference exceeds the threshold as oil stain.
2. The oil stain image recognition based on edge point self-similarity according to claim 1, characterized in that in Step 3 the final self-similarity value of each edge point on every initial edge line is computed from the normalized feature vectors as follows: take any two edge points on an edge line together with their normalized feature vectors; then
the similarity of any two edge points on the edge line is the inner product of their two feature vectors, where the inner product is computed by multiplying corresponding vector elements and summing the products, which gives the similarity of the two edge points;
the local self-similarity value of an edge point is computed from that edge point together with the four edge points adjacent to it on its edge line, located two on each side of it;
the global self-similarity value of an edge point is computed over all n edge points on its edge line;
weights are set for the local and the global self-similarity, and the final self-similarity value of the edge point is the weighted combination of its local and global self-similarity values;
since the elements of the normalized feature vectors lie between 0 and 1, the self-similarity value also lies between 0 and 1 and expresses the degree of similarity: a value of 0 means completely dissimilar and a value of 1 means completely similar.
3. The oil stain image recognition based on edge point self-similarity according to claim 1, characterized in that in Step 2 a reference direction is assigned to each edge point as follows:
for any edge point, construct a local neighborhood centered on the current edge point and compute the gradient magnitude and direction of every pixel in this neighborhood; accumulate the gradient magnitudes and directions of all pixels in the neighborhood into a histogram whose 9 bins evenly divide the direction range of 0 to 180 degrees, with directions in the range of 180 to 360 degrees folded onto the same 9 bins;
compute the weighting coefficients of each pixel for its two adjacent direction bins, compute from the weighting coefficients and the gradient magnitude the contribution weights of each pixel to the two adjacent bins, and add the contribution weights to the corresponding bins of the histogram of the neighborhood of the edge point; the direction at the histogram peak is the reference direction of that edge point.
4. The oil stain image recognition based on edge point self-similarity according to claim 1, characterized in that in Step 2 the feature vector of any edge point is extracted as follows:
take any edge point with its reference direction and rotate the coordinate axes to the reference direction; in the rotated coordinate system, take points at a predetermined pixel distance from the edge point along four orientations, and construct 5 local neighborhoods centered on the edge point and on these points; compute the gradient magnitude of every pixel and the contribution weights of every pixel to its two adjacent direction bins; accumulate the direction distribution histograms of the 5 local neighborhoods to obtain 5 histograms; the feature vector of the edge point is composed of the 5 histograms; finally, normalize the feature vector of each edge point.
5. The oil stain image recognition based on edge point self-similarity according to claim 1, characterized in that in Step 7 the pixel intensity difference in the local neighborhood of every irregular edge point is computed as follows:
take any irregular edge point and construct a local neighborhood centered on it; the pixel intensity difference is the difference between the maximum pixel intensity and the minimum pixel intensity in this local neighborhood.
6. A TEDS system, characterized by comprising the oil stain image recognition based on edge point self-similarity according to any one of claims 1 to 5, in which the marked oil stain image is treated by default as a normal state of the motor train unit and is not marked during motor train unit fault detection.
CN201610752481.0A 2016-08-30 2016-08-30 Oil stain image recognition based on edge point self-similarity and TEDS system Active CN106327499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610752481.0A CN106327499B (en) 2016-08-30 2016-08-30 Oil stain image recognition based on edge point self-similarity and TEDS system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610752481.0A CN106327499B (en) 2016-08-30 2016-08-30 Oil stain image recognition based on edge point self-similarity and TEDS system

Publications (2)

Publication Number Publication Date
CN106327499A (en) 2017-01-11
CN106327499B (en) 2017-09-26

Family

ID=57789094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610752481.0A Active CN106327499B (en) 2016-08-30 2016-08-30 Oil stain image recognition based on edge point self-similarity and TEDS system

Country Status (1)

Country Link
CN (1) CN106327499B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110428474A (en) * 2019-06-21 2019-11-08 珠海格力电器股份有限公司 Judgment method, terminal and the computer-readable medium whether kitchen ventilator needs to clean
CN113538340A (en) * 2021-06-24 2021-10-22 武汉中科医疗科技工业技术研究院有限公司 Target contour detection method and device, computer equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050180649A1 (en) * 2004-02-12 2005-08-18 Xerox Corporation Systems and methods for connecting regions image data having similar characteristics
CN102194114A (en) * 2011-06-25 2011-09-21 电子科技大学 Method for recognizing iris based on edge gradient direction pyramid histogram
CN102542260A (en) * 2011-12-30 2012-07-04 中南大学 Method for recognizing road traffic sign for unmanned vehicle
CN104573713A (en) * 2014-12-31 2015-04-29 天津弘源慧能科技有限公司 Mutual inductor infrared image recognition method based on image textual features


Also Published As

Publication number Publication date
CN106327499B (en) 2017-09-26

Similar Documents

Publication Publication Date Title
Wu et al. Lane-mark extraction for automobiles under complex conditions
Son et al. Real-time illumination invariant lane detection for lane departure warning system
US9607227B2 (en) Boundary detection apparatus and boundary detection method
Ding et al. An adaptive road ROI determination algorithm for lane detection
CN106682586A (en) Method for real-time lane line detection based on vision under complex lighting conditions
CN104462380A (en) Trademark retrieval method
US10997434B2 (en) Lane marker recognition
Kortli et al. A novel illumination-invariant lane detection system
CN109993099A (en) A kind of lane line drawing recognition methods based on machine vision
CN110765992B (en) Seal identification method, medium, equipment and device
Chang et al. An efficient method for lane-mark extraction in complex conditions
CN104899888A (en) Legemdre moment-based image subpixel edge detection method
CN109635737A (en) Automobile navigation localization method is assisted based on pavement marker line visual identity
CN104933398A (en) vehicle identification system and method
WO2016059643A1 (en) System and method for pedestrian detection
Wang et al. License plate location algorithm based on edge detection and morphology
CN106327499A (en) Oil stain image recognition based on edge point self-similarity and TEDS system
Satzoda et al. Robust extraction of lane markings using gradient angle histograms and directional signed edges
CN106326901A (en) Water stain image recognition based on edge point self-similarity and TEDS system
KR101862994B1 (en) A method of Stop-line Detection for Autonomous Vehicles
Ying et al. An illumination-robust approach for feature-based road detection
KR101205565B1 (en) Method for Dectecting Front and Rear Vehicle by Using Image
WO2015193152A1 (en) Method for detecting a viewing-angle-dependent feature of a document
Zong et al. Traffic light detection based on multi-feature segmentation and online selecting scheme
CN106384358B (en) Irregular image recognition method based on edge point self-similarity

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant