CN116109849A - SURF feature matching-based high-voltage isolating switch positioning and state identification method - Google Patents

SURF feature matching-based high-voltage isolating switch positioning and state identification method

Info

Publication number
CN116109849A
CN116109849A
Authority
CN
China
Prior art keywords
matching
feature
image
point
surf
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211735903.5A
Other languages
Chinese (zh)
Inventor
陈毅恒
李章维
周浩
郑文皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN202211735903.5A priority Critical patent/CN116109849A/en
Publication of CN116109849A publication Critical patent/CN116109849A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757 Matching configurations of points or features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

A method for positioning a high-voltage isolating switch and identifying its state based on SURF feature matching: the position of the switch is first located precisely by combining the SURF feature matching method with an image similarity measure based on the Hamming distance, and the switch state is then identified accurately using a perceptual hash algorithm as the similarity criterion. The DCT is used to extract the low-frequency information, giving higher accuracy than the average hash algorithm and the difference hash algorithm, so the accuracy requirement of high-voltage switch state identification is met. In terms of computation speed, while maintaining robustness and matching accuracy, the SURF algorithm is faster than the SIFT algorithm and the Hamming distance is faster to compute than the Euclidean distance; the method also inherits good adaptability to the influence of illumination, viewing angle and other aspects of the actual environment. It thus overcomes the shortcomings of existing high-voltage switch state identification methods in terms of sample size, computation cost and accuracy, and achieves high accuracy in both positioning and identification of the switch.

Description

SURF feature matching-based high-voltage isolating switch positioning and state identification method
Technical Field
The invention relates to image matching within the fields of image processing and pattern recognition, and in particular to a method for positioning a high-voltage isolating switch and identifying its state based on SURF (Speeded-Up Robust Features) feature matching.
Background
With the rapid development of China's economy, the power grid continues to expand and voltage levels keep rising, placing higher safety and efficiency requirements on grid construction and operation. To meet the requirements of unattended operation and safe production in substations, the on-line detection of the operating state of primary electrical equipment in unattended substations must be improved, real-time fault early warning must be provided, and the problem of inspecting isolating switches without on-site staff must be solved. By using video images captured by cameras and applying machine vision techniques, the operating state of isolating switches and other equipment can be identified on-line at the station side, which improves the reliability of the substation monitoring system and broadens its field of view.
A review of the literature shows that many methods for identifying the state of high-voltage switches have been proposed, but they generally rely on large training data sets and machine learning algorithms, which makes their computation cost high; moreover, because the available training samples are few, the accuracy and reliability of machine-learning-based high-voltage switch state monitoring still need further verification. Among conventional recognition algorithms, SIFT feature matching stands out for its strong robustness, but it still falls short in speed and accuracy.
In summary, existing high-voltage switch state identification methods are still far from practical application requirements in terms of sample size, computation cost and accuracy, and urgently need improvement.
Disclosure of Invention
In order to overcome the shortcomings of existing high-voltage switch state identification methods in terms of sample size, computation cost and accuracy, the invention provides a SURF feature matching-based high-voltage isolating switch positioning and state identification method with low computation cost and high accuracy under low-sample conditions.
The technical scheme adopted for solving the technical problems is as follows:
a SURF feature matching-based high-voltage isolating switch positioning and state identifying method, comprising the steps of:
1) After the camera is installed in position, initialize the isolating switch so that it is placed in the closed state and the fully open state in turn;
2) Capture images within the camera's field of view, crop the region containing the isolating switch contacts, and use the cropped images as matching templates;
3) Operate the switch normally; the camera captures and stores images in real time, and SURF feature matching is performed on each stored image using the closed-state matching template and the fully-open-state matching template respectively;
4) Apply a perceptual hash algorithm to the best-matching template and the matched image region, and compute their Hamming distance;
5) If the computed Hamming distance is below the set threshold, report that the switch is in the state of the corresponding matching template;
6) If the computed Hamming distances are all above the set threshold, report that the switch is in a half-open state;
7) After the detection result is displayed, return to step 3) to start a new round of detection.
Further, the procedure of the step 3) is as follows:
3.1 Respectively carrying out SURF feature matching by using the images obtained after the operations of the step 1) and the step 2) as matching templates;
3.2 Saving the image obtained after the operation in the step 3.1), intercepting the matching parts, and respectively participating in subsequent calculation together with the corresponding matching templates.
Still further, in the step 3.1), the SURF feature matching algorithm is as follows:
3.1.1 ) Establish the scale space, and determine and extract the feature points;
3.1.2 ) Determine the orientation of the feature points and generate the feature vectors, i.e. assign a dominant orientation to each feature point so as to ensure the rotation invariance of the feature vectors;
3.1.3 ) Feature point matching and its optimization.
In the step 3.1.1), the process of establishing the scale space is as follows:
3.1.1.1 ) Establish the scale space: for any point p = (x, y) in the image I, Gaussian filtering at scale σ gives the Hessian matrix H(p, σ):

$$H(p,\sigma)=\begin{bmatrix}L_{xx}(p,\sigma) & L_{xy}(p,\sigma)\\ L_{xy}(p,\sigma) & L_{yy}(p,\sigma)\end{bmatrix}$$

where $L_{xx}(p,\sigma)$, $L_{xy}(p,\sigma)$, $L_{yy}(p,\sigma)$ are the second-order partial derivatives of the Gaussian-smoothed image at point p, i.e. the convolutions of the image with the second derivatives of the Gaussian, e.g.

$$L_{xx}(p,\sigma)=I(p)*\frac{\partial^{2}}{\partial x^{2}}g(\sigma)$$

The Gaussian function is defined as:

$$g(\sigma)=\frac{1}{2\pi\sigma^{2}}e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}$$

The determinant of the Hessian matrix is expressed as:

$$\det(H)=L_{xx}L_{yy}-L_{xy}^{2}$$
To increase computation speed, the SURF algorithm replaces the Gaussian filter with a box filter;
3.1.1.2 ) The SURF algorithm starts from a 9×9 box filter, which is the discretized and cropped filter template obtained from the Gaussian second-order derivative at σ = 1.2, and then enlarges the filter size. In SURF the image is kept unchanged and only the size of the filter window is varied to obtain responses at different scales, which together form the scale space. The scale space is divided into 4 octaves of 4 layers each. The initial filter size is 9×9, corresponding to the Gaussian scale σ = 1.2; within the first octave the size increases by 6 from layer to layer, giving 9×9, 15×15, 21×21 and 27×27. In the second octave the size increases by 12 per layer, and the first layer of each octave equals the second layer of the previous octave, so the second octave uses 15×15, 27×27, 39×39 and 51×51, and so on. Convolving the image with the box filter templates yields the responses D_xx, D_xy and D_yy;
3.1.1.3 ) The determinant of the approximated Hessian matrix is then:

$$\det(H_{approx})=D_{xx}D_{yy}-(\omega D_{xy})^{2}$$

where ω is a weight coefficient that balances the box-filter approximation of the Gaussian second-order derivatives:

$$\omega=\frac{\left|L_{xy}(1.2)\right|_{F}\left|D_{yy}(9)\right|_{F}}{\left|L_{yy}(1.2)\right|_{F}\left|D_{xy}(9)\right|_{F}}\approx 0.9$$
3.1.1.4 ) After the above computation, a response map of the original image is obtained at every point of the scale space. The SURF algorithm compares each response value with its 26 neighbours in the 3×3×3 scale-space neighborhood; if it is the maximum or minimum, the point is provisionally taken as a feature point, and candidate points whose response is below a given threshold are then removed.
In the step 3.1.2), the process of generating the feature vector is as follows:
3.1.2.1 ) To achieve rotation invariance, the horizontal and vertical Haar wavelet responses of the sample points are computed within a circular neighborhood of radius 6σ around each feature point, where σ is the scale value of the feature point. A sector window with an opening angle of π/3, centered at the interest point, is then rotated around the full circle in fixed angular steps; for each position the Haar wavelet responses of the image points inside the sector are summed into a vector, and the direction of the longest such vector is selected as the dominant orientation of the feature point;
3.1.2.2 ) Once the orientation of the feature points has been determined, the feature vectors are generated. The feature-point description vector reflects the gradient variation of the gray values around the feature point: a more distinctive feature point corresponds to larger gradient variation. Gray-level gradients vary strongly near image edges, whereas in smoother regions the variation is not obvious; therefore, to improve the accuracy of the subsequent matching, the points with the most distinctive information are the ones extracted as feature points.
In the step 3.1.3), the characteristic point matching process is as follows:
3.1.3.1 ) Step 3.1.1.3) already yields the trace (sign of the Laplacian) of the Hessian matrix; after the values for the series of feature points have been obtained, those with positive sign form one group and those with negative sign another. Only when two feature points belong to the same group is the Euclidean distance between their description vectors computed; if the signs differ, the two points cannot form a matching pair and the Euclidean distance is not computed, which reduces the workload and improves matching efficiency;
the Euclidean distance between the two description vectors is

$$D=\sqrt{\sum_{k=1}^{n}\left(X_{ik}-X_{jk}\right)^{2}}$$
3.1.3.2 ) Here n is the dimension of the descriptor, X_ik is the k-th element of the description vector of the i-th feature point in the target image, and X_jk is the k-th element of the description vector of the j-th feature point in the image to be matched. For a point A of the template image, the Euclidean distances between A and the feature points of the image to be matched are computed; let the smallest and second-smallest distances found be ab (to point B) and ac (to point C). If ab ≤ η·ac, with η = 0.8, then A and B are judged to be a pair of matching points.
In the step 3.1.2.2), generating the SURF feature vector includes the steps of:
3.1.2.2.1 ) Taking the interest point as the center, construct a square region of size 20σ×20σ and rotate the horizontal x-axis of the region to the dominant orientation of the interest point, where σ is the scale of the feature point;
3.1.2.2.2 ) Divide the square region built in the previous step into 4×4 sub-regions and, in each sub-region, compute the Haar wavelet responses on a 5×5 grid of sample points; the features are the sums of the horizontal and vertical Haar wavelet responses and the sums of their absolute values;
3.1.2.2.3 ) Compute the four-dimensional feature of each sub-region; concatenating the features of the 16 sub-regions gives the final 64-dimensional feature vector descriptor.
In the step 4), the process of the perceptual hash algorithm is as follows:
4.1 ) Scale the image uniformly to 32×32, giving 1024 pixels in total;
4.2 ) Convert the image to grayscale so that the input of the next step is standardized, turning any multi-channel image into a single-channel gray image;
4.3 ) Compute the Discrete Cosine Transform (DCT), i.e. transform the 32×32 data matrix into the corresponding 32×32 matrix of DCT coefficients, where the two-dimensional DCT is

$$F(u,v)=c(u)c(v)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}f(x,y)\cos\frac{(2x+1)u\pi}{2N}\cos\frac{(2y+1)v\pi}{2N},\qquad c(u)=\begin{cases}\sqrt{1/N}, & u=0\\ \sqrt{2/N}, & u\neq 0\end{cases}$$

with N = 32;
4.4 ) Reduce the DCT, i.e. keep only the 8×8 block in the upper-left corner of the 32×32 coefficient matrix obtained in step 4.3);
4.5 ) Compute the mean: step 4.4) yields an 8×8 matrix G; compute the mean of all elements of this matrix and denote it a;
4.6 ) Further reduce the DCT: each element greater than the mean a is recorded as 1, otherwise as 0;
4.7 ) Obtain the image fingerprint by combining the 64 information bits; the bit order may be chosen arbitrarily as long as it is kept consistent;
4.8 ) Once the hash values of the images have been obtained, compare the Hamming distance between the hash values of the two images.
In the step 4), the Hamming distance is computed as

$$d(x,y)=\sum_{i=0}^{n-1}x[i]\oplus y[i]$$

where x and y are n-bit codes and ⊕ denotes exclusive-or.
The technical conception of the invention is as follows: a SURF feature matching-based high-voltage isolating switch positioning and state identification method with low computation cost and high accuracy under low-sample conditions. The method first locates the switch precisely by combining the SURF feature matching method with an image similarity measure based on the Hamming distance, and then identifies the switch state accurately using a perceptual hash algorithm (pHash) as the similarity criterion. The DCT (discrete cosine transform) is used to extract the low-frequency information, which yields higher accuracy than the average hash algorithm (aHash) and the difference hash algorithm (dHash) and thus meets the accuracy requirement of high-voltage switch state identification. In terms of computation speed, while maintaining robustness and matching accuracy, the SURF algorithm is faster than the SIFT algorithm and the Hamming distance is faster to compute than the Euclidean distance; the method also inherits good adaptability to the influence of illumination, viewing angle and other aspects of the actual environment. It therefore overcomes the shortcomings of existing high-voltage switch state identification methods with respect to sample size, computation cost and accuracy, and achieves high accuracy in both positioning and identification of the switch.
The beneficial effects of the invention are as follows: the method addresses the problems of existing high-voltage switch state identification methods with respect to sample size, computation cost and accuracy. Existing methods generally rely on large training data sets and machine learning algorithms, which makes them computationally expensive, while the small number of available training samples means that the accuracy and reliability of machine-learning-based state monitoring still need further verification; even the SIFT feature matching algorithm commonly used in conventional approaches falls short in speed and precision. The invention therefore improves on conventional image recognition algorithms by combining the SURF feature matching algorithm with a hash algorithm: SURF improves the real-time recognition speed over SIFT, and the perceptual hash algorithm compensates for SURF's shortcomings in precision, thereby solving the problems of sample size, computation cost and accuracy and achieving real-time recognition of the high-voltage switch state.
Drawings
Fig. 1 is a flow chart of a high voltage isolation switch positioning and status recognition method based on SURF feature matching.
Fig. 2 is a feature matching flow chart based on the SURF algorithm.
Fig. 3 is a flow chart based on a perceptual hash algorithm.
Fig. 4 is a schematic diagram of a box filter and a gaussian filter.
Fig. 5 is a schematic diagram of the box filter size distribution in the scale space.
Fig. 6 is a feature point positioning process diagram.
Fig. 7 is a process diagram for determining a principal direction of a feature point.
Fig. 8 is a schematic diagram of a Haar wavelet filter.
Fig. 9 is a feature descriptor generation process diagram.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Referring to fig. 1 to 9, a method for positioning and identifying a state of a high-voltage isolating switch based on SURF feature matching includes the following steps:
1) After the camera is installed in position, initialize the isolating switch so that it is placed in the closed state and the fully open state in turn;
2) Capture images within the camera's field of view, crop the region containing the isolating switch contacts, and use the cropped images as matching templates;
3) Operate the switch normally; the camera captures and stores images in real time, and SURF feature matching is performed on each stored image using the closed-state matching template and the fully-open-state matching template respectively;
the process of the step 3) is as follows:
3.1 ) SURF feature matching is performed using the images obtained after the operations of step 1) and step 2) as matching templates; as shown in FIG. 2, SURF feature matching includes the following steps:
3.1.1 A scale space is established, and characteristic points are determined and extracted;
Establish the scale space: for any point p = (x, y) in the image I, Gaussian filtering at scale σ gives the Hessian matrix H(p, σ):

$$H(p,\sigma)=\begin{bmatrix}L_{xx}(p,\sigma) & L_{xy}(p,\sigma)\\ L_{xy}(p,\sigma) & L_{yy}(p,\sigma)\end{bmatrix}$$

where $L_{xx}(p,\sigma)$, $L_{xy}(p,\sigma)$, $L_{yy}(p,\sigma)$ are the second-order partial derivatives of the Gaussian-smoothed image at point p, i.e. the convolutions of the image with the second derivatives of the Gaussian, e.g.

$$L_{xx}(p,\sigma)=I(p)*\frac{\partial^{2}}{\partial x^{2}}g(\sigma)$$

The Gaussian function is defined as:

$$g(\sigma)=\frac{1}{2\pi\sigma^{2}}e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}$$

The determinant of the Hessian matrix can be expressed as:

$$\det(H)=L_{xx}L_{yy}-L_{xy}^{2}$$
To increase computation speed, the SURF algorithm uses a box filter instead of a Gaussian filter, as shown in FIG. 4.
The SURF algorithm begins with a 9×9 box filter, which is the discretized and cropped filter template obtained from the Gaussian second-order derivative at σ = 1.2, and then enlarges the filter size. In SURF the image is kept unchanged and only the size of the filter window is varied to obtain responses at different scales, i.e. to construct the scale space, as shown in FIG. 5.
The scale space is divided into 4 octaves of 4 layers each. The initial filter size is 9×9, corresponding to the Gaussian scale σ = 1.2, and within the first octave the size increases by 6 from layer to layer, giving 9×9, 15×15, 21×21 and 27×27. In the second octave the size increases by 12 per layer, and the first layer of each octave equals the second layer of the previous octave; the filter sizes of the second octave are therefore 15×15, 27×27, 39×39 and 51×51, and so on, as in the sketch below. Convolving the image with the box filter templates yields the responses D_xx, D_xy and D_yy.
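As an illustration only (not part of the original patent text), the following Python sketch reproduces the box-filter size schedule described above; the helper name and the use of plain lists are assumptions made for clarity.

```python
def surf_filter_sizes(octaves=4, layers=4, base=9, first_step=6):
    # Sizes per octave as described above: the layer-to-layer step doubles in
    # each octave (6, 12, 24, 48) and the first layer of an octave equals the
    # second layer of the previous octave.
    sizes = []
    start, step = base, first_step
    for _ in range(octaves):
        octave = [start + i * step for i in range(layers)]
        sizes.append(octave)
        start, step = octave[1], step * 2
    return sizes

# surf_filter_sizes() -> [[9, 15, 21, 27], [15, 27, 39, 51],
#                         [27, 51, 75, 99], [51, 99, 147, 195]]
```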
The determinant of the approximated Hessian matrix is then:

$$\det(H_{approx})=D_{xx}D_{yy}-(\omega D_{xy})^{2}$$

where ω is a weight coefficient that balances the box-filter approximation of the Gaussian second-order derivatives:

$$\omega=\frac{\left|L_{xy}(1.2)\right|_{F}\left|D_{yy}(9)\right|_{F}}{\left|L_{yy}(1.2)\right|_{F}\left|D_{xy}(9)\right|_{F}}\approx 0.9$$
after the above calculation, a matrix image of the original image can be obtained at a certain point in the spatial scale. The SURF algorithm compares each processed pixel value with the surrounding 26 points and if it is the maximum or minimum value, it is primarily determined to be a feature point. Points where the value is less than a certain threshold are then removed as shown in fig. 6.
3.1.2 ) Determine the orientation of the feature points and generate the feature vectors, i.e. assign a dominant orientation to each feature point so as to ensure the rotation invariance of the feature vectors;
in order to realize rotation invariance, in the neighborhood of a certain characteristic point, the horizontal Haar characteristic and the vertical Haar characteristic of each point set are counted in a region with the radius of 6sigma (sigma is the scale value of the characteristic point). Then scanning the whole circular area for 1 week with a sector area with an angle pi/3 at certain intervals by taking the interest point as the center, finally obtaining that the sector area is positioned in each angle area, including the vector sum of the image point Haar wavelet responses, and selecting the direction of the longest vector as the main direction of the feature point, as shown in fig. 7.
After the orientation of the feature points has been determined, the feature vectors are generated. The feature-point description vector reflects the gradient variation of the gray values around the feature point: a more distinctive feature point corresponds to larger gradient variation. Gray-level gradients generally vary strongly near image edges, whereas in smoother regions the variation is not obvious; therefore, to improve the accuracy of the subsequent matching, the points with the most distinctive information are the ones extracted as feature points.
Wherein generating the SURF feature vector includes the following 3 steps:
taking the interest point as a center, constructing a quadrilateral region with the length of 20σx20σ, and then rotating the horizontal x-axis of the region to the main direction of the interest point, wherein σ is the scale of the feature point;
dividing the square area established in the last step into 4×4 subareas, and then calculating wavelet effect in the range of 5×5 in each subarea, wherein the characteristic comprises the response and the absolute value of the sum of the response of Haar wavelet to the horizontal and vertical directions, and the Haar wavelet filter is shown in fig. 8;
four-dimensional features of each sub-region are calculated and added, and the divided 16 sub-regions are accumulated to obtain a final 64-dimensional feature vector descriptor, as shown in fig. 9.
3.1.3 Feature point matching and optimization;
step 3.1.1) has obtained the trace of the Hessian matrix, after obtaining the values of the series of characteristic points, the positive numbers are listed as a group and the negative numbers as a group. If the 2 feature points are in the same group, the Euclidean distance of the 2 description vectors is calculated, if the values are equal, the 2 points are a pair of matching points, if the values are not equal, the 2 points are not a matching pair, and the Euclidean distance is not calculated, so that the workload is reduced, and the matching efficiency is improved.
The Euclidean distance between the two description vectors is

$$D=\sqrt{\sum_{k=1}^{n}\left(X_{ik}-X_{jk}\right)^{2}}$$
In the n-dimensional descriptor, X_ik is the k-th element of the description vector of the i-th feature point in the target image, and X_jk is the k-th element of the description vector of the j-th feature point in the image to be matched. Specifically, for a point A of the template image, the Euclidean distances between A and the feature points of the image to be matched are computed; let the smallest and second-smallest distances found be ab (to point B) and ac (to point C). If ab ≤ η·ac, with η = 0.8, it can be determined that A and B are a pair of matching points, as in the sketch below.
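A minimal end-to-end sketch of the SURF matching stage using OpenCV. This assumes an opencv-contrib-python build with the nonfree xfeatures2d module enabled; the Hessian threshold value is an arbitrary illustrative choice, while η = 0.8 follows the ratio test above.

```python
import cv2

def surf_match(template_gray, scene_gray, eta=0.8, hessian_threshold=400):
    # Detect SURF keypoints and 64-D descriptors in both grayscale images.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    kp_t, des_t = surf.detectAndCompute(template_gray, None)
    kp_s, des_s = surf.detectAndCompute(scene_gray, None)

    # Nearest and second-nearest neighbours by Euclidean distance, followed
    # by the ratio test ab <= eta * ac.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des_t, des_s, k=2):
        if len(pair) == 2 and pair[0].distance <= eta * pair[1].distance:
            good.append(pair[0])
    return kp_t, kp_s, good
```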
3.2 ) Save the image obtained after the operation of step 3.1), crop the matched regions, and use each of them together with the corresponding matching template in the subsequent computation.
4) Apply a perceptual hash algorithm to the best-matching template and the matched image region and compute the Hamming distance, as shown in FIG. 3; the perceptual hash algorithm comprises the following steps:
4.1 ) Scale the image uniformly to 32×32, giving 1024 pixels in total;
4.2 ) Convert the image to grayscale so that the input of the next step is standardized, turning any multi-channel image into a single-channel gray image;
4.3 ) Compute the Discrete Cosine Transform (DCT), i.e. transform the 32×32 data matrix into the corresponding 32×32 matrix of DCT coefficients, where the two-dimensional DCT is

$$F(u,v)=c(u)c(v)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}f(x,y)\cos\frac{(2x+1)u\pi}{2N}\cos\frac{(2y+1)v\pi}{2N},\qquad c(u)=\begin{cases}\sqrt{1/N}, & u=0\\ \sqrt{2/N}, & u\neq 0\end{cases}$$

with N = 32;
4.4 ) Reduce the DCT, i.e. keep only the 8×8 block in the upper-left corner of the 32×32 coefficient matrix obtained in step 4.3);
4.5 ) Compute the mean: step 4.4) yields an 8×8 matrix G; compute the mean of all elements of this matrix and denote it a;
4.6 ) Further reduce the DCT: each element greater than the mean a is recorded as 1, otherwise as 0;
4.7 ) Obtain the image fingerprint by combining the 64 information bits; the bit order may be chosen arbitrarily as long as it is kept consistent;
4.8 ) Once the hash values of the images have been obtained, compare the Hamming distance between the two hashes; a pair of images whose Hamming distance is smaller than 10 is generally considered similar.
The Hamming distance is computed as

$$d(x,y)=\sum_{i=0}^{n-1}x[i]\oplus y[i]$$

where x and y are n-bit codes and ⊕ denotes exclusive-or.
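A compact sketch of the perceptual hash and Hamming distance described above, using OpenCV and numpy; the function names are assumptions, and, following step 4.5), the mean is taken over all 64 low-frequency coefficients.

```python
import cv2
import numpy as np

def phash_bits(image_bgr):
    # Steps 4.1)-4.7): 32x32 grayscale, DCT, keep the top-left 8x8 block,
    # threshold against the block mean to obtain 64 information bits.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (32, 32), interpolation=cv2.INTER_AREA)
    dct = cv2.dct(np.float32(small))
    low = dct[:8, :8]
    return (low > low.mean()).flatten()

def hamming_distance(bits_a, bits_b):
    # Step 4.8): number of differing bits between the two 64-bit hashes.
    return int(np.count_nonzero(bits_a != bits_b))
```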
5) If the computed Hamming distance is below the set threshold of 10, report that the switch is in the state of the corresponding matching template;
6) If the computed Hamming distances for both templates are above the set threshold of 10, report that the switch is in a half-open state;
7) After the detection result is displayed, return to step 3) to start a new round of detection; a compact decision sketch follows.
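An illustrative decision sketch tying steps 4) to 7) together, reusing the phash_bits and hamming_distance helpers sketched above. The variable names and the tie-break between the two distances are assumptions; the threshold of 10 follows the description.

```python
def classify_switch(closed_template, open_template,
                    matched_closed, matched_open, threshold=10):
    # matched_closed / matched_open: regions of the live frame cropped by the
    # SURF matching step against the two templates (step 3.2)).
    d_closed = hamming_distance(phash_bits(closed_template),
                                phash_bits(matched_closed))
    d_open = hamming_distance(phash_bits(open_template),
                              phash_bits(matched_open))
    if d_closed < threshold and d_closed <= d_open:
        return "closed"
    if d_open < threshold:
        return "fully open"
    return "half open"  # both distances above the threshold (step 6))
```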
The foregoing detailed description is provided to illustrate the present invention and not to limit the invention, and any modifications and changes made to the present invention within the spirit of the present invention and the scope of the appended claims fall within the scope of the present invention.

Claims (9)

1. The high-voltage isolating switch positioning and state identifying method based on SURF feature matching is characterized by comprising the following steps:
1) After the camera is installed in position, initialize the isolating switch so that it is placed in the closed state and the fully open state in turn;
2) Capture images within the camera's field of view, crop the region containing the isolating switch contacts, and use the cropped images as matching templates;
3) Operate the switch normally; the camera captures and stores images in real time, and SURF feature matching is performed on each stored image using the closed-state matching template and the fully-open-state matching template respectively;
4) Apply a perceptual hash algorithm to the best-matching template and the matched image region, and compute their Hamming distance;
5) If the computed Hamming distance is below the set threshold, report that the switch is in the state of the corresponding matching template;
6) If the computed Hamming distances are all above the set threshold, report that the switch is in a half-open state;
7) After the detection result is displayed, return to step 3) to start a new round of detection.
2. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 1, wherein the method comprises the following steps: the process of the step 3) is as follows:
3.1 Respectively carrying out SURF feature matching by using the images obtained after the operations of the step 1) and the step 2) as matching templates;
3.2 Saving the image obtained after the operation in the step 3.1), intercepting the matching parts, and respectively participating in subsequent calculation together with the corresponding matching templates.
3. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 2, wherein: in the step 3.1), the procedure of the SURF feature matching algorithm is as follows:
3.1.1 ) Establish the scale space, and determine and extract the feature points;
3.1.2 ) Determine the orientation of the feature points and generate the feature vectors, i.e. assign a dominant orientation to each feature point so as to ensure the rotation invariance of the feature vectors;
3.1.3 ) Feature point matching and its optimization.
4. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 3, wherein the method comprises the following steps: in the step 3.1.1), the process of establishing the scale space is as follows:
3.1.1.1 ) Establish the scale space: for any point p = (x, y) in the image I, Gaussian filtering at scale σ gives the Hessian matrix H(p, σ):

$$H(p,\sigma)=\begin{bmatrix}L_{xx}(p,\sigma) & L_{xy}(p,\sigma)\\ L_{xy}(p,\sigma) & L_{yy}(p,\sigma)\end{bmatrix}$$

where $L_{xx}(p,\sigma)$, $L_{xy}(p,\sigma)$, $L_{yy}(p,\sigma)$ are the second-order partial derivatives of the Gaussian-smoothed image at point p, i.e. the convolutions of the image with the second derivatives of the Gaussian, e.g.

$$L_{xx}(p,\sigma)=I(p)*\frac{\partial^{2}}{\partial x^{2}}g(\sigma)$$

The Gaussian function is defined as:

$$g(\sigma)=\frac{1}{2\pi\sigma^{2}}e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}$$

The determinant of the Hessian matrix is expressed as:

$$\det(H)=L_{xx}L_{yy}-L_{xy}^{2}$$
To increase computation speed, the SURF algorithm replaces the Gaussian filter with a box filter;
3.1.1.2 ) The SURF algorithm starts from a 9×9 box filter, which is the discretized and cropped filter template obtained from the Gaussian second-order derivative at σ = 1.2, and then enlarges the filter size. In SURF the image is kept unchanged and only the size of the filter window is varied to obtain responses at different scales, which together form the scale space. The scale space is divided into 4 octaves of 4 layers each. The initial filter size is 9×9, corresponding to the Gaussian scale σ = 1.2; within the first octave the size increases by 6 from layer to layer, giving 9×9, 15×15, 21×21 and 27×27. In the second octave the size increases by 12 per layer, and the first layer of each octave equals the second layer of the previous octave, so the second octave uses 15×15, 27×27, 39×39 and 51×51, and so on. Convolving the image with the box filter templates yields the responses D_xx, D_xy and D_yy;
3.1.1.3 ) The determinant of the approximated Hessian matrix is then:

$$\det(H_{approx})=D_{xx}D_{yy}-(\omega D_{xy})^{2}$$

where ω is a weight coefficient that balances the box-filter approximation of the Gaussian second-order derivatives:

$$\omega=\frac{\left|L_{xy}(1.2)\right|_{F}\left|D_{yy}(9)\right|_{F}}{\left|L_{yy}(1.2)\right|_{F}\left|D_{xy}(9)\right|_{F}}\approx 0.9$$
3.1.1.4 ) After the above computation, a response map of the original image is obtained at every point of the scale space. The SURF algorithm compares each response value with its 26 neighbours in the 3×3×3 scale-space neighborhood; if it is the maximum or minimum, the point is provisionally taken as a feature point, and candidate points whose response is below a given threshold are then removed.
5. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 3, wherein the method comprises the following steps: in the step 3.1.2), the process of generating the feature vector is as follows:
3.1.2.1 ) To achieve rotation invariance, the horizontal and vertical Haar wavelet responses of the sample points are computed within a circular neighborhood of radius 6σ around each feature point, where σ is the scale value of the feature point. A sector window with an opening angle of π/3, centered at the interest point, is then rotated around the full circle in fixed angular steps; for each position the Haar wavelet responses of the image points inside the sector are summed into a vector, and the direction of the longest such vector is selected as the dominant orientation of the feature point;
3.1.2.2 ) Once the orientation of the feature points has been determined, the feature vectors are generated. The feature-point description vector reflects the gradient variation of the gray values around the feature point: a more distinctive feature point corresponds to larger gradient variation. Gray-level gradients vary strongly near image edges, whereas in smoother regions the variation is not obvious; therefore, to improve the accuracy of the subsequent matching, the points with the most distinctive information are the ones extracted as feature points.
6. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as set forth in claim 4, wherein the method comprises the following steps: in the step 3.1.3), the characteristic point matching process is as follows:
3.1.3.1 ) Step 3.1.1.3) already yields the trace (sign of the Laplacian) of the Hessian matrix; after the values for the series of feature points have been obtained, those with positive sign form one group and those with negative sign another. Only when two feature points belong to the same group is the Euclidean distance between their description vectors computed; if the signs differ, the two points cannot form a matching pair and the Euclidean distance is not computed, which reduces the workload and improves matching efficiency;
the Euclidean distance between the two description vectors is

$$D=\sqrt{\sum_{k=1}^{n}\left(X_{ik}-X_{jk}\right)^{2}}$$
3.1.3.2 ) Here n is the dimension of the descriptor, X_ik is the k-th element of the description vector of the i-th feature point in the target image, and X_jk is the k-th element of the description vector of the j-th feature point in the image to be matched. For a point A of the template image, the Euclidean distances between A and the feature points of the image to be matched are computed; let the smallest and second-smallest distances found be ab (to point B) and ac (to point C). If ab ≤ η·ac, with η = 0.8, then A and B are judged to be a pair of matching points.
7. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 5, wherein the method comprises the following steps: in the step 3.1.2.2), generating the SURF feature vector includes the steps of:
3.1.2.2.1 ) Taking the interest point as the center, construct a square region of size 20σ×20σ and rotate the horizontal x-axis of the region to the dominant orientation of the interest point, where σ is the scale of the feature point;
3.1.2.2.2 ) Divide the square region built in the previous step into 4×4 sub-regions and, in each sub-region, compute the Haar wavelet responses on a 5×5 grid of sample points; the features are the sums of the horizontal and vertical Haar wavelet responses and the sums of their absolute values;
3.1.2.2.3 ) Compute the four-dimensional feature of each sub-region; concatenating the features of the 16 sub-regions gives the final 64-dimensional feature vector descriptor.
8. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 1, wherein the method comprises the following steps: in the step 4), the process of the perceptual hash algorithm is as follows:
4.1 ) Scale the image uniformly to 32×32, giving 1024 pixels in total;
4.2 ) Convert the image to grayscale so that the input of the next step is standardized, turning any multi-channel image into a single-channel gray image;
4.3 ) Compute the Discrete Cosine Transform (DCT), i.e. transform the 32×32 data matrix into the corresponding 32×32 matrix of DCT coefficients, where the two-dimensional DCT is

$$F(u,v)=c(u)c(v)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}f(x,y)\cos\frac{(2x+1)u\pi}{2N}\cos\frac{(2y+1)v\pi}{2N},\qquad c(u)=\begin{cases}\sqrt{1/N}, & u=0\\ \sqrt{2/N}, & u\neq 0\end{cases}$$

with N = 32;
4.4 ) Reduce the DCT, i.e. keep only the 8×8 block in the upper-left corner of the 32×32 coefficient matrix obtained in step 4.3);
4.5 ) Compute the mean: step 4.4) yields an 8×8 matrix G; compute the mean of all elements of this matrix and denote it a;
4.6 ) Further reduce the DCT: each element greater than the mean a is recorded as 1, otherwise as 0;
4.7 ) Obtain the image fingerprint by combining the 64 information bits; the bit order may be chosen arbitrarily as long as it is kept consistent;
4.8 ) Once the hash values of the images have been obtained, compare the Hamming distance between the hash values of the two images.
9. The SURF feature matching-based high-voltage isolating switch positioning and state identifying method as claimed in claim 1, wherein the method comprises the following steps: in the step 4), the hamming distance calculation formula is:
$$d(x,y)=\sum_{i=0}^{n-1}x[i]\oplus y[i]$$

where x and y are n-bit codes and ⊕ denotes exclusive-or.
CN202211735903.5A 2022-12-30 2022-12-30 SURF feature matching-based high-voltage isolating switch positioning and state identification method Pending CN116109849A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211735903.5A CN116109849A (en) 2022-12-30 2022-12-30 SURF feature matching-based high-voltage isolating switch positioning and state identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211735903.5A CN116109849A (en) 2022-12-30 2022-12-30 SURF feature matching-based high-voltage isolating switch positioning and state identification method

Publications (1)

Publication Number Publication Date
CN116109849A true CN116109849A (en) 2023-05-12

Family

ID=86259163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211735903.5A Pending CN116109849A (en) 2022-12-30 2022-12-30 SURF feature matching-based high-voltage isolating switch positioning and state identification method

Country Status (1)

Country Link
CN (1) CN116109849A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116907349A (en) * 2023-09-12 2023-10-20 北京宝隆泓瑞科技有限公司 Universal switch state identification method based on image processing
CN116907349B (en) * 2023-09-12 2023-12-08 北京宝隆泓瑞科技有限公司 Universal switch state identification method based on image processing


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination