CN109359643B - Dial plate pointer identification method using computer vision - Google Patents

Dial plate pointer identification method using computer vision Download PDF

Info

Publication number
CN109359643B
CN109359643B (application CN201710813475.6A)
Authority
CN
China
Prior art keywords
dial
image
area
pointer
straight line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710813475.6A
Other languages
Chinese (zh)
Other versions
CN109359643A (en)
Inventor
张海剑
樊路之
杨天韵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University (WHU)
Priority to CN201710813475.6A
Publication of CN109359643A
Application granted
Publication of CN109359643B
Expired - Fee Related (current legal status)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/24 Aligning, centring, orientation detection or correction of the image
    • G06V 10/243 Aligning, centring, orientation detection or correction of the image by compensating for image skew or non-uniform image deformations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components, by matching or filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a dial pointer identification method using computer vision, which comprises the steps of first locating the area where the dial is located, then correcting the angular offset of the dial, then detecting the pointer, and finally taking the reading. The invention applies the KCF tracking algorithm to the positioning of the dial area, which greatly improves the detection rate. In addition, existing pointer identification algorithms require the dial to be placed horizontally when the reading is taken, so that the reading can be obtained by the angle method; in practice, however, the dial is not necessarily horizontal in the image captured by the camera. The present invention addresses this problem by using SURF features to correct the angular offset of the dial.

Description

Dial plate pointer identification method using computer vision
Technical Field
The invention belongs to the technical field of computer vision and image processing, and particularly relates to a method for tracking a jittering dial plate in real time through computer vision, correcting angle deviation caused by jitter, detecting a pointer and displaying a reading.
Background
In many plants, the meters used to monitor the operating status of equipment are still mostly pointer-type instruments. Safe operation currently relies on dedicated personnel making regular inspection rounds and taking emergency action when an abnormal reading is found. This makes the inspectors' work tedious and inefficient, and because a human inspector does not discover an abnormality the moment it occurs, action cannot be taken immediately. Computer vision uses a machine in place of the human eye to measure and judge; applied to instrument monitoring, it can raise both the efficiency and the degree of automation of inspection: dial images are captured by a camera and processed by a computer in the background, the dial reading is detected in real time, and emergency measures are taken as soon as an abnormality appears.

In a computer-vision dial-pointer identification algorithm, the general approach is to determine the area of the dial in the whole image, then find the straight line on which the pointer lies within that area, and finally compute the reading. For locating the dial area, current methods include the image-subtraction method, the Hough circle transform, and a rotation-centre positioning algorithm based on the Canny operator. The subtraction method requires two dial images, taken with the pointer at different readings, to be strictly aligned; otherwise the difference image contains many interference regions. The Hough circle transform and the Canny-based positioning method are computationally too expensive, so processing is too slow (0.4 s-0.5 s per frame) to run in real time, and real-time processing is particularly important in practical applications. In addition, existing pointer identification algorithms require the dial to be horizontal at the final reading step, so that the reading can be computed by the angle method. In practice, however, the dial shakes along with the machine, so it is not horizontal in the picture captured by the camera.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a dial pointer detection method using computer vision, which improves detection efficiency and the degree of automation of instrument inspection.
The technical scheme adopted by the invention is as follows: a method for identifying a dial pointer using computer vision, comprising the following steps:
step 1: positioning the area of the dial plate;
step 2: correcting dial angle offset;
step 3: detecting a pointer;
step 4: reading.
The invention applies the Kernelized Correlation Filter (KCF) tracking algorithm to positioning the dial area, which greatly improves the detection rate: at an average of 0.05 s per frame, real-time detection is achievable. SURF feature detection is applied to correcting the dial offset, so the dial is no longer required to be strictly horizontal when the reading is taken, which meets practical requirements.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention;
FIG. 2 is a flow chart of positioning a dial region in accordance with an embodiment of the present invention;
FIG. 3 is an exemplary diagram of a dial area positioning result according to an embodiment of the present invention;
FIG. 4 is an exemplary diagram of correcting dial angle offset according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating an example of pointer positioning according to an embodiment of the present invention.
Detailed Description
In order to facilitate the understanding and implementation of the present invention for those of ordinary skill in the art, the present invention is further described in detail with reference to the accompanying drawings and examples, it is to be understood that the embodiments described herein are merely illustrative and explanatory of the present invention and are not restrictive thereof.
Referring to Fig. 1, the method for identifying a dial pointer using computer vision according to the present invention includes the following steps:
step 1: positioning the area of the dial plate;
Due to machine vibration, the meter captured in the video also shakes. In the first frame of the video, the region where the dial is located is framed, and the position of the dial is then tracked with the Kernelized Correlation Filter (KCF) tracking algorithm.
The Kernelized Correlation Filter (KCF) tracking algorithm was proposed by Henriques et al. Its principles and the associated definitions are described below.
The KCF algorithm uses cyclic shifts to construct the training samples of a classifier, which greatly enlarges the training data, and it updates the target position by using a kernel function to select the candidate region most similar to the target, thereby achieving efficient tracking.
(1) Circulant matrix;
For an n × 1 vector x = [x_1, \ldots, x_n]^T, P is the n × n permutation matrix that performs a cyclic shift:

P = \begin{bmatrix} 0 & 0 & \cdots & 0 & 1 \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{bmatrix}

Px = [x_n, x_1, x_2, \ldots, x_{n-1}]^T denotes x cyclically shifted by one position, and \{P^u x \mid u = 0, \ldots, n-1\} denotes x cyclically shifted by u positions.

The circulant matrix X can be expressed as

X = C(x) = \begin{bmatrix} x_1 & x_2 & x_3 & \cdots & x_n \\ x_n & x_1 & x_2 & \cdots & x_{n-1} \\ x_{n-1} & x_n & x_1 & \cdots & x_{n-2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ x_2 & x_3 & x_4 & \cdots & x_1 \end{bmatrix} \quad (2)

Here x is the 1st row of X, i.e. the base vector, and rows 2 to n of X are cyclic shifts of row 1. From the property that a circulant matrix is diagonalized by the Discrete Fourier Transform (DFT),

X = F \,\mathrm{diag}(\hat{x})\, F^{H} \quad (3)

where \hat{\ } denotes the discrete Fourier transform and F is the constant DFT matrix, independent of x, such that \hat{z} = \sqrt{n}\, F z for any vector z. F^{H} F = F F^{H} = I, where I is the identity matrix, H denotes the conjugate transpose, and X^{H} = (X^{*})^{T}.
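As a quick sanity check of this property (not part of the patent; the vector size is arbitrary), the following NumPy sketch builds C(x) from cyclic shifts and verifies the diagonalization of equation (3):

```python
import numpy as np

n = 8
x = np.random.default_rng(0).standard_normal(n)

# Row u of the circulant matrix is P^u x, i.e. x cyclically shifted by u positions;
# row 0 is the base vector x itself, as in equation (2).
X = np.stack([np.roll(x, u) for u in range(n)])

F = np.fft.fft(np.eye(n)) / np.sqrt(n)      # unitary DFT matrix, F F^H = I
x_hat = np.fft.fft(x)                       # DFT of the base vector

print(np.allclose(F @ F.conj().T, np.eye(n)))            # True
print(np.allclose(X, F @ np.diag(x_hat) @ F.conj().T))   # True: X = F diag(x_hat) F^H
```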
(2) Gaussian kernel function;
A Radial Basis Function (RBF) is a radially symmetric scalar function, generally defined as a monotonic function of the Euclidean distance from any point x in space to a centre x', written \kappa(x, x'). The most commonly used radial basis function is the Gaussian kernel, of the form

\kappa(x, x') = \exp\!\left(-\frac{\lVert x - x' \rVert^{2}}{\delta^{2}}\right)

where x' is the centre of the kernel and \delta is the width parameter, which controls the radial extent of the function. With the kernel correlation defined as k^{xx'}_{i} = \kappa(x', P^{\,i-1} x), the Gaussian kernel correlation k^{xx'} is

k^{xx'} = \exp\!\left(-\frac{1}{\delta^{2}}\Big(\lVert x \rVert^{2} + \lVert x' \rVert^{2} - 2\,\mathcal{F}^{-1}\big(\hat{x}^{*} \odot \hat{x}'\big)\Big)\right) \quad (4)

where \odot denotes the element-wise product and \mathcal{F}^{-1} is the inverse DFT.
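The practical value of equation (4) is that the kernel between x' and every cyclic shift of x is obtained with a single FFT instead of n explicit comparisons. A small NumPy check of this equivalence (illustrative only, not taken from the patent):

```python
import numpy as np

def kernel(a, b, sigma):
    """Plain Gaussian kernel: kappa(a, b) = exp(-||a - b||^2 / sigma^2)."""
    return np.exp(-np.sum((a - b) ** 2) / sigma ** 2)

def gaussian_correlation(x, xp, sigma):
    """Equation (4): kappa evaluated against every cyclic shift of x using one FFT."""
    cross = np.real(np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(xp)))
    d2 = np.dot(x, x) + np.dot(xp, xp) - 2.0 * cross     # ||P^i x - x'||^2 for every shift i
    return np.exp(-np.maximum(d2, 0.0) / sigma ** 2)

rng = np.random.default_rng(1)
x, xp, sigma = rng.standard_normal(32), rng.standard_normal(32), 5.0
direct = np.array([kernel(np.roll(x, i), xp, sigma) for i in range(32)])
print(np.allclose(direct, gaussian_correlation(x, xp, sigma)))   # True
```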
(3) Classifier training with a kernel function;
Training the classifier amounts to finding, from the training samples x_i and their regression labels y_i, the function f(z) = w^{T} z that minimizes the squared error, i.e.

\min_{w} \sum_{i} \big(f(x_i) - y_i\big)^{2} + \lambda \lVert w \rVert^{2} \quad (5)

where w contains the parameters of the classifier, z is the image region to be detected, and \lambda is the regularization parameter that controls over-fitting.

Define a kernel function \kappa(x, x') = \langle \varphi(x), \varphi(x') \rangle, where \varphi(\cdot) maps the input of the linear problem into a non-linear feature space. The solution of equation (5) can then be written as a linear combination of the mapped inputs, i.e.

w = \sum_{i} \alpha_i\, \varphi(x_i) \quad (6)

where

\alpha = (K + \lambda I)^{-1} y \quad (7)

Here \alpha_i is the coefficient of training sample x_i, and K is the n × n kernel matrix formed by the dot products of all sample pairs, i.e.

K_{ij} = \kappa(x_i, x_j) \quad (8)

Then

f(z) = \sum_{i=1}^{n} \alpha_i\, \kappa(z, x_i) \quad (9)
Since the kernel matrix K is a circulant matrix (the proof is given in Derivation 1 below), the circulant-matrix property of equation (3) gives

K = C(k^{xx}) = F \,\mathrm{diag}(\hat{k}^{xx})\, F^{H} \quad (10)

where k^{xx} is the first row of the circulant matrix K. Substituting equation (10) into equation (7) yields

\alpha = \big(F \,\mathrm{diag}(\hat{k}^{xx} + \lambda)\, F^{H}\big)^{-1} y = F \,\mathrm{diag}(\hat{k}^{xx} + \lambda)^{-1} F^{H} y \quad (11)

Multiplying both sides on the left by \sqrt{n}\,F^{H} gives

\sqrt{n}\,F^{H} \alpha = \mathrm{diag}(\hat{k}^{xx} + \lambda)^{-1}\, \sqrt{n}\,F^{H} y \quad (12)

Because \hat{z} = \sqrt{n}\,F z, F F^{H} = I and X^{H} = (X^{*})^{T} (so that \sqrt{n}\,F^{H} v = \hat{v}^{*} for a real vector v), the above can be written element-wise as

\hat{\alpha}^{*} = \frac{\hat{y}^{*}}{\hat{k}^{xx} + \lambda} \quad (13)

Removing the conjugation on both sides (the autocorrelation k^{xx} is symmetric, so \hat{k}^{xx} is real) gives

\hat{\alpha} = \frac{\hat{y}}{\hat{k}^{xx} + \lambda} \quad (14)
the solution of the problem is transformed to the frequency domain based on the characteristics of the circulant matrix, thereby avoiding the process of matrix inversion. The classifier is trained to find the function f (z) ═ wTz, namely finding the optimal w, and converting the finding of w into the finding of the optimal alpha through the formula (14), so that the algorithm complexity is greatly reduced, and the operation efficiency is improved. Derivation 1: proving that the kernel matrix K is a cyclic matrix;
theorem: if K (x, x ') -K (Mx, Mx ') is K (Mx, Mx ') for any permutation matrix M, K is a circulant matrix.
And (3) proving that:
p is an n × n permutation matrix for cyclic shift:
Figure BDA0001404533870000055
then K isij=κ(xi,xj)=κ(Pix,Pjx)=κ(P-iPix,P-iPjx)=κ(x,Pj-ix)。
Because of Pn=P0Therefore, the above formula can be written as Kij=κ(x,P(j-i)modnx), mod represents the remainder. X is satisfied due to the circulant matrix X ═ C (X) formed as in formula (2)ij=x((j-i)modn)+1That is, if the element depends only on the value of (j-i) mod n, then the matrix is a circulant matrix. Thus K is the circulant matrix. The gaussian kernel matrix satisfies the condition κ (x, x ') ═ κ (Mx, Mx'), so the gaussian kernel matrix is a circulant matrix.
(4) Fast detection;
For a new input image region z, equations (9) and (8) give

f(z) = (K^{z})^{T} \alpha \quad (15)

where K^{z} is the kernel matrix between the training samples and the candidate image region. By the cyclic-shift and kernel-function theorems,

K^{z} = C(k^{xz}) \quad (16)

where k^{xz} is the kernel correlation of x and z and is the first row of the matrix K^{z}. Substituting equation (16) into equation (15) gives

f(z) = \big(C(k^{xz})\big)^{T} \alpha \quad (17)

Diagonalizing C(k^{xz}) as in equation (3) and multiplying both sides of equation (17) by \sqrt{n}\,F gives

\sqrt{n}\,F f(z) = \mathrm{diag}(\hat{k}^{xz})\, \sqrt{n}\,F \alpha \quad (18)

Because \hat{z} = \sqrt{n}\,F z and F F^{H} = I, equation (18) can be written as

\hat{f}(z) = \hat{k}^{xz} \odot \hat{\alpha} \quad (19)

The coordinate at which the response f(z) attains its maximum value represents the change in position of the target region; adding this coordinate to the coordinates of the target region in the previous frame gives the position of the target region in the current frame.
(5) Model update;
During target tracking, the KCF algorithm updates the target appearance model in real time with the following strategy:

\alpha = (1 - \beta)\,\alpha_{t-1} + \beta\,\alpha_{t} \quad (20)

x = (1 - \beta)\,x_{t-1} + \beta\,x_{t} \quad (21)

where \beta is the learning rate of the model, \alpha_{t-1} is the coefficient vector trained from the previous frame's sample x_{t-1}, \alpha_{t} is the coefficient vector trained from the newly input sample x_{t}, \alpha is the coefficient vector used for the next frame, and x is the training sample used for the next frame.
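To make equations (4), (14) and (19)-(21) concrete before turning to the positioning procedure, here is a minimal 1-D NumPy sketch (not from the patent; the toy template, signal length and 3-sample shift are illustrative, and the parameters match those quoted in step 1):

```python
import numpy as np

def gaussian_correlation(x, xp, sigma):
    """Gaussian kernel correlation of xp with every cyclic shift of x (equation (4))."""
    cross = np.real(np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(xp)))
    d2 = np.dot(x, x) + np.dot(xp, xp) - 2.0 * cross
    return np.exp(-np.maximum(d2, 0.0) / sigma ** 2)

def train(x, y, sigma, lam):
    """Equation (14): alpha_hat = y_hat / (k_hat^{xx} + lambda)."""
    kxx = gaussian_correlation(x, x, sigma)              # first row of the kernel matrix K
    return np.fft.fft(y) / (np.fft.fft(kxx) + lam)

def detect(alphaf, x, z, sigma):
    """Equation (19): response = F^{-1}(k_hat^{xz} * alpha_hat); its peak gives the shift."""
    kxz = gaussian_correlation(x, z, sigma)
    return np.real(np.fft.ifft(np.fft.fft(kxz) * alphaf))

def update(alphaf_old, x_old, alphaf_new, x_new, beta=0.02):
    """Equations (20)-(21): running average of the coefficients and of the template."""
    return ((1 - beta) * alphaf_old + beta * alphaf_new,
            (1 - beta) * x_old + beta * x_new)

# Toy run with the parameters quoted in step 1 (lambda = 1e-4, delta = 0.5, beta = 0.02).
n, sigma, lam = 64, 0.5, 1e-4
t = np.arange(n)
x = np.exp(-0.5 * ((t - 20) / 2.0) ** 2)                 # frame-1 target template
y = np.exp(-0.5 * (np.minimum(t, n - t) / 2.0) ** 2)     # Gaussian regression label, peak at shift 0
alphaf = train(x, y, sigma, lam)
z = np.roll(x, 3)                                        # next frame: the target moved by 3 samples
print(np.argmax(detect(alphaf, x, z, sigma)))            # prints 3, the estimated displacement
```

In the 2-D image case the same formulas apply with 2-D FFTs, and the peak of the response map gives the displacement of the dial between frames.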
The position of the dial in the video stream is obtained through a KCF tracking algorithm, and the specific positioning method is shown in figure 2.
First, the area where the dial is located is set in the first frame image; the length and width of this area and the coordinates of its top-left vertex are read, and the coordinates of the dial centre are computed. The search area is set to 2.5 times the dial area, so that it supplies environmental context and negative samples for tracking. A Gaussian regression label y is created, and the parameters are set to \lambda = 10^{-4}, \delta = 0.5 and \beta = 0.02. From the target region (the dial area) x_t, the Gaussian kernel autocorrelation is computed with equation (4), and \hat{\alpha}_t is obtained from equation (14). Let x = x_t and \hat{\alpha} = \hat{\alpha}_t.

A new image of the sequence is then read, and the centre of the dial in this frame is initialized from the centre coordinate found in the previous frame. The Gaussian kernel cross-correlation k^{xz} between the template x and the new search region z is computed with equation (4), and the response \hat{f}(z) is obtained from equation (19). The coordinate at which f(z) is maximal represents the change in position of the target region relative to the previous frame; adding it to the previous frame's dial-centre coordinate gives the exact centre coordinate of the dial area in the current frame. The target region x_t is updated from this centre coordinate, the Gaussian kernel autocorrelation and \hat{\alpha}_t are recomputed, and, according to equations (20) and (21), \hat{\alpha} and x for the next frame are updated. This loop is repeated until the whole image sequence has been processed. An example of the dial positioning result is shown in Fig. 3.
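In practice the whole of step 1 can also be driven by the KCF tracker bundled with OpenCV. The sketch below is only an illustration of that option (it assumes the opencv-contrib-python package; the video path and the interactively selected ROI are placeholders, not values from the patent):

```python
import cv2

cap = cv2.VideoCapture("dial.mp4")                     # placeholder video path
ok, frame = cap.read()
roi = cv2.selectROI("frame the dial", frame, showCrosshair=False)   # or a hard-coded box

tracker = cv2.TrackerKCF_create()                      # cv2.legacy.TrackerKCF_create() on some builds
tracker.init(frame, roi)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, (x, y, w, h) = tracker.update(frame)
    if found:
        cx, cy = x + w / 2.0, y + h / 2.0              # dial centre used by steps 2-4
        cv2.rectangle(frame, (int(x), int(y)), (int(x + w), int(y + h)), (0, 255, 0), 2)
    cv2.imshow("dial tracking", frame)
    if cv2.waitKey(1) == 27:                           # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```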
Step 2: correcting dial angle offset;
due to the shaking of the machine, the dial plate in the picture captured by the camera is not horizontally placed. And (3) correcting the angle offset of the dial plate by utilizing Speeded Up Robust Features (SURF for short) in the dial plate area framed in the step (1) to level the dial plate.
The Speeded Up Robust Features (SURF) algorithm was proposed by Herbert Bay. First, a Hessian matrix is constructed to generate all interest points used for feature extraction, and a scale space is built. Each pixel processed by the Hessian matrix is compared with its 26 neighbours in the 3-D scale-space neighbourhood; if it is larger (or smaller) than all 26, it is retained as a preliminary feature point. Feature points with weak responses or poor localization are filtered out, leaving the final stable feature points. Haar wavelet responses are then accumulated in the neighbourhood of each feature point to determine its dominant orientation, and a feature-point descriptor is generated. Finally, the degree of matching is determined by computing the Euclidean distance between two feature descriptors: the shorter the distance, the better the match.
Step 1 provides the centre coordinate of the area where the dial is located in each image of the sequence, i.e. the dial centre; the smaller of the length and width of that area is taken as the dial radius, which determines the dial region (the area tracked in step 1 is a rectangle; this operation accounts for the dial actually being a circle). First, the dial region of the 1st frame is selected and the pixels of all other areas are set to zero; this image is treated as the reference ("original"), which eliminates the interference of the surrounding environment. Similarly, the dial regions of frames 2 to n are selected, the pixels of the other areas are set to zero, and each such frame is treated as the current image ("image"). SURF features are detected and extracted from both the original and the image; the features are then matched, i.e. the position in the image of each feature point of the original is found; the offset angle between the two images is estimated from the matching result; and finally the angular offset is compensated. An example of correcting the dial angle offset is shown in Fig. 4.
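As an illustration of this step (not the patent's exact implementation), the sketch below masks everything outside the dial circle, matches local features between the reference frame and the current frame with OpenCV, and compensates the estimated rotation. cv2.xfeatures2d.SURF_create needs an OpenCV build with the non-free modules, so ORB is used as a patent-free fallback; the Hessian threshold of 400 and the 0.7 ratio-test value are assumptions. reference and current are grayscale frames, and center/radius come from step 1:

```python
import cv2
import numpy as np

def correct_rotation(reference, current, center, radius):
    """Estimate the dial's angular offset between two grayscale frames from matched
    local features, then rotate the current frame back about the dial centre."""
    mask = np.zeros(reference.shape[:2], np.uint8)
    cv2.circle(mask, center, radius, 255, -1)                    # keep only the dial region

    try:
        detector, norm = cv2.xfeatures2d.SURF_create(400), cv2.NORM_L2
    except (AttributeError, cv2.error):
        detector, norm = cv2.ORB_create(1000), cv2.NORM_HAMMING  # fallback without non-free modules

    k1, d1 = detector.detectAndCompute(reference, mask)
    k2, d2 = detector.detectAndCompute(current, mask)

    matches = cv2.BFMatcher(norm).knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]   # Lowe ratio test

    src = np.float32([k1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)

    angle = np.degrees(np.arctan2(M[1, 0], M[0, 0]))             # rotation component, in degrees
    R = cv2.getRotationMatrix2D(center, angle, 1.0)              # rotate back by the estimated offset
    return cv2.warpAffine(current, R, current.shape[1::-1]), angle
```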
Step 3: detecting a pointer;
and (3) preprocessing the image corrected in the step (2), detecting a straight line by using a Hough straight line detection method, and finally selecting the best straight line to represent the pointer.
The idea of detecting straight lines with the Hough transform is as follows: a line is represented by (r, theta), where r is the distance from the line to the origin and theta is the angle between the perpendicular to the line and the x-axis. A straight line in the x-y coordinate system is therefore a single point in the r-theta coordinate system, while the n lines passing through one point in the x-y coordinate system map to n points in the r-theta coordinate system. To determine whether m points in the x-y coordinate system lie on one straight line, the m × n candidate lines are enumerated, giving m × n (r, theta) coordinates; if the m points all yield the same pair (r_i, theta_i), they lie on one straight line. In practical line detection, if more than a certain number of points share the same (r, theta) coordinates, a line is declared at that (r, theta).
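A minimal accumulator illustrating this voting scheme (the grid sizes and the vote threshold are arbitrary choices for the illustration, not values from the patent):

```python
import numpy as np

def hough_lines(points, max_rho, n_theta=180, n_rho=400, threshold=25):
    """Minimal (r, theta) Hough accumulator: every point votes for all lines through it,
    and cells collecting at least `threshold` votes are reported as detected lines."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-max_rho, max_rho, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=np.int32)
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)              # r = x cos(theta) + y sin(theta)
        rows = np.clip(np.searchsorted(rhos, r), 0, n_rho - 1)
        np.add.at(acc, (rows, np.arange(n_theta)), 1)
    return [(rhos[i], thetas[j]) for i, j in np.argwhere(acc >= threshold)]

# 30 collinear points on y = x all vote for the same cell: r close to 0, theta = 135 deg.
print(hough_lines([(i, i) for i in range(30)], max_rho=50.0))
```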
The image processed in step 2 is subjected to morphological operations and skeletonization, and the Hough transform is then applied to detect straight lines. Since the Hough transform returns multiple lines, the best one is selected according to two parameters, the length of the line and its distance to the dial centre: the line that is long and passes close to the dial centre is the best line sought. An example of the pointer detection result is shown in Fig. 5.
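A possible OpenCV rendering of this step (an illustration under stated assumptions, not the patent's code): skeletonization uses cv2.ximgproc.thinning from opencv-contrib, and the thresholds and the scoring weight that trades line length against distance to the dial centre are placeholders:

```python
import cv2
import numpy as np

def find_pointer(dial_gray, center, radius):
    """Binarize, clean and skeletonize the dial image, detect line segments with the
    probabilistic Hough transform, and keep the long segment closest to the dial centre."""
    _, bw = cv2.threshold(dial_gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    bw = cv2.morphologyEx(bw, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))   # remove specks
    skel = cv2.ximgproc.thinning(bw)                                       # skeletonization

    lines = cv2.HoughLinesP(skel, 1, np.pi / 180, threshold=30,
                            minLineLength=int(0.3 * radius), maxLineGap=5)
    if lines is None:
        return None

    cx, cy = center
    best, best_score = None, -np.inf
    for x1, y1, x2, y2 in lines[:, 0]:
        length = np.hypot(x2 - x1, y2 - y1)
        # distance from the dial centre to the infinite line through the segment
        dist = abs((y2 - y1) * cx - (x2 - x1) * cy + x2 * y1 - y2 * x1) / max(length, 1e-6)
        score = length - 2.0 * dist                 # long and close to the centre wins
        if score > best_score:
            best, best_score = (x1, y1, x2, y2), score
    return best
```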
Step 4: reading;
and 4, obtaining a straight line where the pointer is located, then judging which quadrant the pointer is in relative to the center of the dial plate, and finally solving the number indicated by the pointer according to an angle method.
Taking the dial centre as the origin of coordinates, the quadrant in which the pointer detected in step 3 lies is determined as follows: of the two end points of the pointer, the point (x, y) farther from the origin is selected; with the origin at (x_0, y_0), if x > x_0 and y > y_0 the pointer is in the first quadrant, and so on. The angle between the pointer and the horizontal direction is then calculated. The ratio of the angle between the pointer and the minimum-scale mark to the total scale angle equals the ratio of the current reading to the measuring range, from which the current reading is obtained.
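A sketch of the angle method described above (not taken from the patent). The zero-mark and full-scale directions are hypothetical properties of the particular meter, and atan2 subsumes the quadrant test; image y grows downwards, hence the sign flip:

```python
import numpy as np

def read_value(pointer, center, zero_angle_deg, full_angle_deg, full_scale):
    """Angle method: the reading is proportional to the angle swept clockwise from the
    zero mark. zero_angle_deg / full_angle_deg are the directions of the minimum and
    maximum scale marks, measured counter-clockwise from the positive x-axis."""
    (x1, y1, x2, y2), (cx, cy) = pointer, center
    # the pointer tip is the segment end point farther from the dial centre
    tip = max([(x1, y1), (x2, y2)], key=lambda p: np.hypot(p[0] - cx, p[1] - cy))
    angle = np.degrees(np.arctan2(-(tip[1] - cy), tip[0] - cx)) % 360.0

    swept = (zero_angle_deg - angle) % 360.0          # dial scales usually increase clockwise
    total = (zero_angle_deg - full_angle_deg) % 360.0
    return full_scale * swept / total

# Example: a 0-60 MPa dial whose scale runs clockwise from 225 deg down to -45 deg.
value = read_value(pointer=(100, 100, 160, 40), center=(100, 100),
                   zero_angle_deg=225.0, full_angle_deg=-45.0, full_scale=60.0)
print(round(value, 2))                                # 40.0 MPa
```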
The effect of the invention can be illustrated by the following simulations:
the test data for the first experiment was derived from the factory shot example, as shown in fig. 5. The pointer is fixed at 0.37 position in the shooting picture, and the camera changes the shooting angle, the shooting distance and the shooting light so as to increase the detection difficulty. As can be seen from FIG. 5, when the angle is shifted, the distance is increased, and the light is darkened, the pointer can be accurately detected by the method, the reading error is within 0.0035, and the speed is 0.05s per frame on average.
The second and third experiments use dials with a moving pointer filmed with a mobile phone. To simulate machine shake, the shooting angle of the phone is varied, so the dial in the captured picture may be horizontal or tilted, sharp or blurred. The dial used in the second experiment is shown in Fig. 3; its range is 60 MPa with a minimum division of 1 MPa, and the automatic reading error is below 0.36 MPa. The dial used in the third experiment is also shown in Fig. 3; its range is 0.4 MPa with a minimum division of 0.01 MPa, and the automatic reading error is below 0.003 MPa. The speed averages 0.05 s per frame. Table 1 gives part of the experimental data.
The simulation experiments demonstrate the effectiveness and practicality of the invention: the time cost of existing dial-area positioning algorithms is greatly reduced, the requirement that the dial be horizontal in the captured picture when the reading is taken is removed, and practical application requirements are met.
TABLE 1 Dial automatic identification result and observed value comparison (MPa)
It should be understood that parts of the specification not set forth in detail are well within the prior art.
It should be understood that the above description of the preferred embodiments is given for clarity and not for any purpose of limitation, and that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (4)

1. A method for identifying a dial pointer using computer vision, characterized by comprising the following steps:
step 1: positioning the area of the dial plate;
in a first frame image of a video, framing an area where a dial is located, determining the length and width of the area where the dial is located and coordinates of a top left corner vertex of the area, and solving a central coordinate of the dial; then, tracking the dial by using a KCF algorithm, and determining the area where the dial is located and the center coordinate of the dial in each frame of image;
step 2: correcting dial angle offset;
and step 3: detecting a pointer;
and 4, step 4: and (6) reading.
2. The method for identifying a dial pointer using computer vision according to claim 1, wherein: in step 2, the centre coordinates of the area where the dial is located, obtained for the image sequence in step 1, are used, and the smaller of the length and the width of that area is taken as the radius of the dial, thereby determining the dial area; first, the pixels outside the dial area in the 1st frame image are set to zero and the result is taken as the reference image ("original"), eliminating the interference of the surrounding environment; then, the pixels outside the dial area in each of the images from the 2nd frame to the nth frame are set to zero and the result is taken as the current image ("image"); SURF features are detected and extracted from both the original and the image; feature matching between the original and the image is then carried out to find, in the image, the position of each feature point of the original; the offset angle between the two images is estimated from the feature-matching result; and finally the angular offset is compensated so that the dial becomes horizontal.
3. The method for identifying a dial pointer using computer vision according to claim 1, wherein: in step 3, the image corrected in step 2 is preprocessed by morphological operations and skeletonization, and straight lines are detected with the Hough line detection method; since the Hough transform detects multiple lines, the optimal line is selected according to two parameters, the length of the line and its distance to the dial centre, so that the line that is long and close to the dial centre is the optimal line.
4. The method for identifying a dial pointer using computer vision according to any one of claims 1 to 3, wherein: in step 4, the centre of the dial is taken as the origin of coordinates and the quadrant in which the pointer detected in step 3 lies is determined; the determination method is as follows: of the two end points of the pointer, the point (x, y) farther from the origin is selected; the origin coordinate is (x_0, y_0); if x > x_0 and y > y_0 the pointer is in the first quadrant, and so on; the angle between the pointer and the horizontal direction is then calculated, and the ratio of the angle between the pointer and the minimum-scale mark to the total scale angle is equal to the ratio of the current reading to the measuring range, whereby the current reading is obtained.
CN201710813475.6A 2017-09-11 2017-09-11 Dial plate pointer identification method using computer vision Expired - Fee Related CN109359643B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710813475.6A CN109359643B (en) 2017-09-11 2017-09-11 Dial plate pointer identification method using computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710813475.6A CN109359643B (en) 2017-09-11 2017-09-11 Dial plate pointer identification method using computer vision

Publications (2)

Publication Number Publication Date
CN109359643A CN109359643A (en) 2019-02-19
CN109359643B true CN109359643B (en) 2022-05-13

Family

ID=65349680

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710813475.6A Expired - Fee Related CN109359643B (en) 2017-09-11 2017-09-11 Dial plate pointer identification method using computer vision

Country Status (1)

Country Link
CN (1) CN109359643B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1122521A1 (en) * 2000-02-01 2001-08-08 Setrix AG Method and apparatus for monitoring an analog meter
CN104899609B (en) * 2015-06-19 2019-03-26 四川大学 A kind of mechanical meter recognition methods based on image registration
CN105303168A (en) * 2015-10-14 2016-02-03 南京第五十五所技术开发有限公司 Multi-view pointer type instrument identification method and device
CN105740856B (en) * 2016-01-28 2019-01-18 宁波理工监测科技股份有限公司 A kind of pointer instrument registration read method based on machine vision
CN106650592B (en) * 2016-10-05 2020-08-28 北京深鉴智能科技有限公司 Target tracking system
CN106991695A (en) * 2017-03-27 2017-07-28 苏州希格玛科技有限公司 A kind of method for registering images and device

Also Published As

Publication number Publication date
CN109359643A (en) 2019-02-19

Similar Documents

Publication Publication Date Title
CN107423702B (en) Video target tracking method based on TLD tracking system
CN110580723B (en) Method for carrying out accurate positioning by utilizing deep learning and computer vision
CN108875794B (en) Image visibility detection method based on transfer learning
CN108109162B (en) Multi-scale target tracking method using self-adaptive feature fusion
CN104657711B (en) A kind of readings of pointer type meters automatic identifying method of robust
CN105427298A (en) Remote sensing image registration method based on anisotropic gradient dimension space
CN108121945A (en) A kind of multi-target detection tracking, electronic equipment and storage medium
CN111950396A (en) Instrument reading neural network identification method
CN110956131B (en) Single-target tracking method, device and system
CN110910456B (en) Three-dimensional camera dynamic calibration method based on Harris angular point mutual information matching
CN110659637A (en) Electric energy meter number and label automatic identification method combining deep neural network and SIFT features
CN116229052B (en) Method for detecting state change of substation equipment based on twin network
CN115456956A (en) Method and device for detecting scratches of liquid crystal display and storage medium
CN108596032B (en) Detection method, device, equipment and medium for fighting behavior in video
CN112652020A (en) Visual SLAM method based on AdaLAM algorithm
CN115147418A (en) Compression training method and device for defect detection model
CN106101640A (en) Adaptive video sensor fusion method and device
CN108830828A (en) A kind of method for detecting change of remote sensing image and device
CN114463397A (en) Multi-modal image registration method based on progressive filtering
CN105427276A (en) Camera detection method based on image local edge characteristics
CN109359643B (en) Dial plate pointer identification method using computer vision
Du et al. Grid-based matching for full-field large-area deformation measurement
CN105447524A (en) Image identification method and device
Huang et al. A checkerboard corner detection method using circular samplers
Tong et al. Surface Defect Detection Method Based on Improved Faster-RCNN

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20220513)