CN111832534A - Equipment detection method and device - Google Patents

Equipment detection method and device

Info

Publication number
CN111832534A
CN111832534A
Authority
CN
China
Prior art keywords
state
status
image data
image
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010727478.XA
Other languages
Chinese (zh)
Inventor
陈庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202010727478.XA
Publication of CN111832534A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/211 Selection of the most significant subset of features
    • G06F18/2113 Selection of the most significant subset of features by ranking or filtering the set of features, e.g. using a measure of variance or of feature cross-correlation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a method of detecting a device. The method includes acquiring image data of at least one device; determining the status light characteristics of each device according to the image data; and determining the device state indicated by each status light characteristic by using a preset state recognition model, wherein the status light characteristics include at least one of a color characteristic, a size characteristic, and a position characteristic of the status light. The disclosure also provides a detection apparatus, an electronic device, and a computer-readable storage medium.

Description

Equipment detection method and device
Technical Field
The present disclosure relates to the field of security technologies, and in particular, to a method and an apparatus for detecting a device.
Background
Machine-room inspection is important for ensuring the safe and stable operation of a data center, and having an inspection robot perform remote detection tasks on equipment is one of the common forms of machine-room inspection.
In the course of developing the inventive concept, the inventors found that in related-art equipment detection, the inspection robot compares an acquired equipment image with images in a preset reference image library according to the model information of the equipment, so as to determine whether the equipment state is abnormal.
This detection method requires images of all equipment in the machine room, as well as model information for all of the equipment, to be collected in advance. It therefore involves a large amount of preparatory work, has low detection efficiency, and is unsuitable for detection scenarios involving a large volume of equipment data.
Disclosure of Invention
One aspect of the present disclosure provides a method of detecting a device. The method includes acquiring image data of at least one device; determining status light characteristics of each of the devices based on the image data; and determining the device state indicated by each status light characteristic by using a preset state recognition model, wherein the status light characteristics include at least one of a color characteristic, a size characteristic, and a position characteristic of the status light.
Optionally, the acquiring of the image data of the at least one device includes collecting at least one image of a cabinet in which the at least one device is located; stitching the at least one image to obtain a complete image of the cabinet; and performing recognition processing on the complete image to obtain the image data of the at least one device.
Optionally, performing recognition processing on the complete image to obtain the image data of the at least one device includes recognizing the edges of the at least one device in the complete image by using an image edge recognition algorithm, so as to obtain the image data of the at least one device.
Optionally, the method further includes determining location information of the device with abnormal state in case that the status light feature indicates that the device is abnormal in state; and generating alarm information according to the position information and pushing the alarm information.
Optionally, determining the position information of the device with the abnormal state includes stitching the at least one image to obtain a complete image of the cabinet; and determining, from the complete image, the height information of the abnormal-state device in the cabinet, where the height information constitutes the position information.
Optionally, determining, by using the preset state recognition model, the device state indicated by each status light characteristic includes screening each status light characteristic by using the state recognition model to obtain effective status light characteristics; and determining, based on the effective status light characteristics, the device state they indicate.
Optionally, the training method of the state recognition model includes acquiring image data of at least one sample device, and acquiring device state information of each sample device; determining the status light characteristics of each sample device according to the image data; and performing model training using the state light feature and the device state information of each sample device as training samples to obtain the state recognition model.
Another aspect of the present disclosure provides a detection apparatus. The apparatus includes an acquisition module for acquiring image data of at least one device; a first determining module for determining the status light characteristics of each device according to the image data; and a second determining module for determining, by using a preset state recognition model, the device state indicated by each status light characteristic, where the status light characteristics include at least one of color, size, and position characteristics of the status light.
Optionally, the acquiring module includes an acquiring sub-module, configured to acquire at least one image of a cabinet in which the at least one device is located; the first processing submodule is used for splicing the at least one image to obtain a complete image of the cabinet; and the second processing submodule is used for identifying the complete image to obtain image data of the at least one device.
Optionally, the second processing sub-module includes a first processing unit configured to recognize the edges of at least one device in the complete image by using an image edge recognition algorithm, so as to obtain image data of the at least one device.
Optionally, the apparatus further includes a first processing module, configured to determine, when the status light characteristic indicates that the device is abnormal in status, location information of the device in the abnormal status; and the second processing module is used for generating alarm information and pushing the alarm information according to the position information.
Optionally, the first processing module includes a second processing sub-module, configured to perform stitching processing on the at least one image to obtain a complete image of the cabinet; and a third processing submodule, configured to determine, according to the complete image, height information of the abnormal-state device in the cabinet, where the height information constitutes the position information.
Optionally, the second determining module includes a fourth processing sub-module, configured to perform screening processing on each status light feature by using the status recognition model to obtain an effective status light feature; and a fifth processing submodule, configured to determine, based on the valid status light feature, a device status indicated by the valid status light feature.
Another aspect of the present disclosure provides an electronic device including: one or more processors; memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the methods of embodiments of the present disclosure.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed, implement the method of embodiments of the present disclosure.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which,
fig. 1 schematically illustrates a system architecture of a detection method and apparatus according to an embodiment of the present disclosure;
FIG. 2A schematically illustrates a flow chart of a detection method of a device according to an embodiment of the present disclosure;
fig. 2B schematically illustrates a schematic of a cabinet according to an embodiment of the disclosure;
FIG. 3 schematically shows a flow chart of a detection method of a device according to another embodiment of the present disclosure;
FIG. 4 schematically shows a block diagram of a detection apparatus according to an embodiment of the present disclosure; and
fig. 5 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It is to be understood that such description is merely illustrative and not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, operations, and/or components, but do not preclude the presence or addition of one or more other features, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have a alone, B alone, C alone, a and B together, a and C together, B and C together, and/or A, B, C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data detection apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
Embodiments of the present disclosure provide a detection method for a device, together with an apparatus that can perform the method. The method may include, for example, the following operations: acquiring image data of at least one device; determining the status light characteristics of each device according to the image data; and determining the device state indicated by each status light characteristic by using a preset state recognition model, wherein the status light characteristics include at least one of a color characteristic, a size characteristic, and a position characteristic of the status light.
Fig. 1 schematically shows a system architecture for the detection method and apparatus of a device according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a system architecture to which the embodiments of the present disclosure may be applied, provided to help those skilled in the art understand the technical content of the disclosure; it does not imply that the embodiments cannot be used in other devices, systems, environments, or scenarios.
As shown in fig. 1, the system architecture includes at least one device to be detected (a plurality of devices 101, 102, 103 are shown in the figure) and an inspection robot 104 (specifically, a processor, a server, etc. of the inspection robot, which are not shown in the figure). In the system architecture 100, the inspection robot 104 acquires image data of at least one device, and determines status light characteristics of each device (e.g., devices 101, 102, 103) according to the image data; and then, determining the equipment state indicated by each state light characteristic by using a preset state recognition model, wherein the state light characteristic comprises at least one of a color characteristic, a size characteristic and a position characteristic of the state light.
The present disclosure will be described in detail below with reference to the drawings and specific embodiments.
Fig. 2A schematically shows a flow chart of a detection method of a device according to an embodiment of the present disclosure.
As shown in fig. 2A, the method may include operations S210 to S230.
In operation S210, image data of at least one device is acquired.
In the embodiment of the present disclosure, image data of at least one device to be detected is obtained. The device to be detected includes a device panel in which status lights are disposed. A status light can reflect the state of the device, and in particular the operating state of different components, such as the hard disk, the power supply, a pointer instrument, or a digital instrument. Status lights may differ between equipment models in shape, color, size, location, and so on. For devices of different models, status lights of the same color may indicate different device states; therefore, the status lights of devices of different models need to be identified and interpreted separately. Optionally, the devices to be detected may be different servers integrated in a cabinet, with different servers arranged at different heights in the cabinet. Fig. 2B schematically illustrates a cabinet according to an embodiment of the disclosure: a plurality of servers are integrated in the cabinet, and the status lights on the device panels of the servers differ.
When the image data of at least one device is acquired, the acquired image data as a whole only needs to reflect the image information of all the devices to be detected; each piece of image data is not required to cover exactly one device, which improves the convenience and flexibility of device detection. For example, a number of photographs of at least one server are captured using the industrial camera of the inspection robot, and each photograph may contain image information of several servers.
Optionally, before acquiring the image data of the at least one device, the inspection robot receives a detection instruction and determines the target cabinet identifier indicated by the instruction. Using the target cabinet identifier, it retrieves the position information of the associated target cabinet from a cabinet position database. Based on this position information, the robot locates the target cabinet by sensing environmental features with its lidar and positioning itself against the cabinet environment in real time. The inspection robot then divides the target cabinet into M preset height regions (M being an integer greater than 1) according to a cabinet recognition algorithm, and acquires image data in each height region of the target cabinet by adjusting the pose parameters of its image acquisition device; each piece of image data may contain image information of several devices in the target cabinet. The target cabinet identifier may include, for example, the cabinet number, cabinet ID, or cabinet name of the target cabinet. Alternatively, the instruction acquired by the inspection robot may directly indicate the position of the target cabinet to be detected, or directly indicate the device identifier of the target device to be detected.
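The division of the target cabinet into M preset height regions can be sketched as follows. The patent does not specify how the regions are computed; this is a minimal illustrative assumption of equal-height regions measured in millimetres from the cabinet base.

```python
def height_regions(cabinet_height_mm: float, m: int):
    """Divide a cabinet of the given height into M equal preset height
    regions, returned as (bottom, top) spans in mm. Equal spacing is an
    illustrative assumption; the patent leaves the scheme unspecified."""
    if m < 2:
        raise ValueError("M must be an integer greater than 1")
    step = cabinet_height_mm / m
    return [(round(i * step, 3), round((i + 1) * step, 3)) for i in range(m)]
```

The robot would then aim its camera at each `(bottom, top)` span in turn by adjusting its pose parameters, e.g. `height_regions(2000, 4)` yields four 500 mm bands.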
Next, in operation S220, status light characteristics of the respective devices are determined according to the image data.
In the embodiment of the present disclosure, the status light characteristics of each device are determined from the acquired image data of the at least one device. The status light characteristics include at least one of a color characteristic, a size characteristic, and a position characteristic of the status light, where the position characteristic indicates the position of the status light in the device panel. Optionally, the acquired image data is first processed to recognize at least one image feature, and the status light characteristics of each device are then determined based on those image features. For example, pixel-level binarization is applied to the acquired image data according to preset color and brightness thresholds to obtain the color and brightness features of the image data; the status light characteristics of each device are then determined from these features by combining an edge recognition algorithm with template matching.
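The pixel-level binarization step above can be sketched as follows. This is a minimal illustration, assuming an RGB image and red status lights; the threshold values and the choice of channel are assumptions, not values from the patent.

```python
import numpy as np

def binarize_status_lights(image, color_threshold=200, brightness_threshold=180):
    """Binarize an HxWx3 RGB image: keep pixels whose red channel exceeds
    the preset color threshold AND whose mean brightness exceeds the preset
    brightness threshold. Returns an HxW uint8 mask (1 = candidate status
    light pixel). Thresholds and the red channel are illustrative choices."""
    brightness = image.mean(axis=2)          # per-pixel mean of R, G, B
    red = image[:, :, 0].astype(float)       # red channel
    mask = (red >= color_threshold) & (brightness >= brightness_threshold)
    return mask.astype(np.uint8)
```

Edge recognition and template matching would then run on the resulting mask to localise each light and extract its size and position characteristics.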
Next, in operation S230, the device status indicated by each status light feature is determined using a preset status recognition model.
In the embodiment of the present disclosure, the determined status light characteristics are used as input data to a preset state recognition model, which outputs the device state indicated by each characteristic. Not every status light reflects the state of the device; for example, some status lights only indicate whether a particular function is running, and their characteristics say nothing about the device state. Therefore, the state recognition model can first screen the status light characteristics to obtain the effective ones, that is, those characteristics that actually indicate a device state, and then determine the device state indicated by each effective characteristic.
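The screen-then-classify flow can be sketched as follows. The real model is a trained neural network; `StubStatusModel` below is a hypothetical stand-in with hard-coded rules, used purely to show the control flow of screening for effective characteristics before classification.

```python
from dataclasses import dataclass

@dataclass
class StatusLightFeature:
    color: str
    size: float
    position: tuple

class StubStatusModel:
    """Hypothetical stand-in for the trained state recognition model.
    Validity and classification rules here are illustrative assumptions."""
    VALID_COLORS = {"red", "green", "amber"}

    def is_valid(self, feature):
        # A feature counts as "effective" only if the model can interpret
        # its color as a device-state indicator.
        return feature.color in self.VALID_COLORS

    def predict(self, feature):
        # Toy rule: red indicates an abnormal state, anything else normal.
        return "abnormal" if feature.color == "red" else "normal"

def screen_and_classify(features, model):
    """Screen out non-indicative status light features, then determine the
    device state indicated by each effective feature."""
    effective = [f for f in features if model.is_valid(f)]
    return [(f, model.predict(f)) for f in effective]
```

A feature such as a blue "function running" light is dropped at the screening step and never reaches classification.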
The training method of the state recognition model can comprise the steps of obtaining image data of at least one sample device and obtaining device state information of each sample device; determining the status light characteristics of each sample device according to the image data; and taking the state light characteristics and the equipment state information of each sample equipment as training samples, and performing model training by using the training samples to obtain a state recognition model.
Specifically, when acquiring image data of the at least one sample device, the data should cover, as far as possible, the brands and models of all devices currently in use. The acquired device state information of each sample device includes both normal-state and abnormal-state information. The acquired sample data is divided into training data, used to train the state recognition model, and test data, used to evaluate the training result. Taking the image data of the sample devices in the training data as input and the state information of each sample device as output, model parameters are trained to obtain a neural network model for device state recognition. A loss function is set during training and monitored for convergence: when it converges, the training is judged effective and the neural network model is output as the state recognition model; otherwise, training continues. In addition, the image data of the sample devices in the test data is fed into the neural network model, and the accuracy of its recognition results is evaluated against the state information of each sample device. When the accuracy exceeds a preset threshold, the training is judged effective, and the trained neural network model is used as the state recognition model for automatic recognition of device states.
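The data split and the accuracy check described above can be sketched as follows. The test ratio, seed, and acceptance threshold are illustrative defaults, not values from the patent.

```python
import random

def split_samples(samples, test_ratio=0.2, seed=0):
    """Shuffle the (feature, label) sample pairs and split them into
    training and test sets. Ratio and seed are illustrative defaults."""
    rng = random.Random(seed)
    shuffled = list(samples)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

def accuracy(predictions, labels):
    """Fraction of predictions matching the ground-truth state labels,
    compared against the preset threshold to accept or reject the model."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)
```

In the training loop sketched by the patent, a model whose test-set `accuracy` exceeds the preset threshold is kept as the state recognition model; otherwise training continues.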
Because the status light characteristics of equipment from the same brand are relatively stable, once the sample data contains multiple samples from a given brand, the state recognition model can recognize the status light characteristics of a new model from that brand if its image data is encountered during inspection. Compared with the related art, in which the image data or status light characteristics of new equipment must be added to a preset reference image library, the state recognition model of the embodiment of the disclosure generalizes well, which helps improve detection efficiency and reduce the preparatory workload of equipment detection.
Because the state recognition model is trained on the image data and state information of a large number of sample devices, automatic recognition of device states does not require the brand or model information of each device, nor the position of each device in its cabinet, to be obtained in advance. Only the image data of each device is needed: the status light characteristics are determined from the image data, and the device state indicated by those characteristics is then determined. Since no database associating equipment models with equipment images needs to be built in advance, and in particular no database associating equipment models with status light images of device panels, the method effectively improves both the degree of automation and the efficiency of equipment detection. The status lights of equipment integrated in a cabinet are small, dense, and of many types, so the method can markedly improve detection efficiency, reduce the preparatory workload of equipment detection, and suit application scenarios involving automatic detection of large volumes of equipment data.
In the embodiment of the disclosure, image data of at least one device is acquired, status light characteristics of each device are determined from the image data, and the device state indicated by each status light characteristic is then determined using a preset state recognition model, where the status light characteristics include at least one of the color, size, and position characteristics of the status light. By determining the status light characteristics in the image data and using a preset state recognition model to determine the device state they indicate, the method on the one hand raises the degree of automation and efficiency of equipment detection, and on the other hand removes the need to build a database associating equipment models with status light images of device panels, or to obtain in advance the position of each device in its cabinet.
Fig. 3 schematically shows a flow chart of a detection method of a device according to another embodiment of the present disclosure.
As shown in fig. 3, operation S210 may include operations S310 to S330.
In operation S310, at least one image of a cabinet in which at least one device is located is collected.
In an embodiment of the present disclosure, at least one device to be detected is integrated in a cabinet; for example, the large number of servers in a data center are integrated in different cabinets. Because cabinets are generally tall, the image acquisition device of the inspection robot needs to capture at least one image of the cabinet, with the images together reflecting all of the cabinet's image information. Optionally, recognition algorithms are designed for the different height regions of different cabinets; before capturing the images, the inspection robot adjusts the pose parameters of its image acquisition device according to the recognition algorithm for each height region of the target cabinet and captures the image in that region. In addition, since different cabinets are of essentially the same height, at least one set of pose parameters can be preset for the image acquisition device and reused to capture images in the different height regions of any cabinet.
Next, in operation S320, at least one image is stitched to obtain a complete image of the cabinet.
In the embodiment of the present disclosure, the at least one image of the same cabinet is stitched together: duplicated content across the images is removed, and the non-duplicated content is joined in the order of the cabinet heights corresponding to each image, yielding a complete image of the cabinet that contains the image data of every device in it.
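The stitching step can be sketched as follows. For clarity this toy version works on lists of row identifiers rather than pixels; a real implementation would align overlapping pixel regions (e.g. by feature matching) instead of comparing identifiers exactly.

```python
def stitch_strips(strips):
    """Merge per-height-region image strips into one full cabinet image.
    Each strip is (bottom_height_mm, rows), where rows is a top-to-bottom
    list of row identifiers. Duplicate rows from overlapping regions are
    dropped (first occurrence kept) and the remainder concatenated in
    cabinet-height order. Row identifiers stand in for pixel rows."""
    full, seen = [], set()
    for _, rows in sorted(strips, key=lambda s: s[0]):
        for row in rows:
            if row not in seen:
                seen.add(row)
                full.append(row)
    return full
```

The strips may arrive in any capture order; sorting by the height key restores the cabinet-height sequence before the overlap is removed.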
Next, in operation S330, the entire image is subjected to recognition processing, resulting in image data of at least one device.
In the embodiment of the present disclosure, the complete image is processed to recognize the image data of at least one device in the cabinet, where the image data of each device includes the image information of the status lights on the device panel, from which the status light characteristics of each device can be obtained. Optionally, edge recognition is performed on the complete image using an image edge recognition algorithm; specifically, the edges of the at least one device in the complete image are recognized to obtain its image data. Common image edge recognition algorithms include first-order differential operators (Roberts, Sobel, Prewitt), second-order differential operators (Laplacian, LoG/Marr), and non-differential edge detectors such as Canny.
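One of the first-order operators named above, Sobel, can be sketched directly. The naive nested loop below is for clarity only; production code would use a vectorised convolution.

```python
import numpy as np

# 3x3 Sobel kernels for horizontal and vertical gradients
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_magnitude(gray):
    """Gradient magnitude of a 2-D grayscale array via the Sobel operator.
    The output shrinks by 2 in each dimension because the border is not
    padded (a simplification; real code would pad or use a library)."""
    h, w = gray.shape
    mag = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = gray[i:i + 3, j:j + 3]
            gx = (patch * SOBEL_X).sum()
            gy = (patch * SOBEL_Y).sum()
            mag[i, j] = np.hypot(gx, gy)
    return mag
```

Device edges in the stitched cabinet image show up as ridges of high gradient magnitude, which the later thresholding and thinning steps turn into one-pixel-wide contours.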
Illustratively, edge recognition on the complete image can proceed as follows. First, the complete image of the cabinet is converted from a color image to a grayscale image. The grayscale image is then Gaussian-blurred to remove noise: noise is concentrated in the high-frequency part of the signal and is easily mistaken for edges, so Gaussian blurring is used to suppress it. Next, the gray gradient magnitude and direction are calculated; since image edges may point in different directions, the gradient magnitude is computed in the horizontal, vertical, and diagonal directions. Finally, non-maximum suppression is applied. Non-maximum suppression is an edge-thinning step: the local maximum of the gray gradient is kept and all other gradient values are suppressed, retaining the sharpest position in the gradient change. This thins gradient edges wider than one pixel down to edges that are only one pixel wide.
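The gradient and non-maximum-suppression steps just described can be sketched as follows. Gaussian blurring is omitted for brevity, and the function names and 3×3 Sobel kernels are illustrative choices; a production system would call an optimized detector such as OpenCV's `cv2.Canny`, which additionally applies hysteresis thresholding:

```python
import numpy as np

def sobel_gradients(gray: np.ndarray):
    """Gray gradient magnitude and direction (degrees in [0, 180))."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal change
    ky = kx.T                                                         # vertical change
    pad = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    gx = np.empty((h, w))
    gy = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy), np.degrees(np.arctan2(gy, gx)) % 180

def non_max_suppression(mag: np.ndarray, ang: np.ndarray) -> np.ndarray:
    """Keep only pixels that are local maxima along their gradient direction."""
    out = np.zeros_like(mag)
    for i in range(1, mag.shape[0] - 1):
        for j in range(1, mag.shape[1] - 1):
            a = ang[i, j]
            if a < 22.5 or a >= 157.5:        # gradient ~horizontal
                n1, n2 = mag[i, j - 1], mag[i, j + 1]
            elif a < 67.5:                    # gradient ~45 degrees
                n1, n2 = mag[i - 1, j + 1], mag[i + 1, j - 1]
            elif a < 112.5:                   # gradient ~vertical
                n1, n2 = mag[i - 1, j], mag[i + 1, j]
            else:                             # gradient ~135 degrees
                n1, n2 = mag[i - 1, j - 1], mag[i + 1, j + 1]
            if mag[i, j] >= n1 and mag[i, j] >= n2:
                out[i, j] = mag[i, j]
    return out
```

Note that an exact tie in gradient magnitude keeps both pixels; full Canny resolves such cases with its hysteresis stage.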
As an optional implementation, in a case where a status light feature indicates that the state of a device is abnormal, the position information of the abnormal-state device is determined, and alarm information is generated according to the position information and pushed. Determining the position information of the abnormal-state device includes determining, from the complete image of the cabinet, the height information of the abnormal-state device in the cabinet, where the height information constitutes the position information.
When a status light feature indicates that the state of a device is abnormal, the position information of that device is determined so that the abnormality can be reported accurately. Unlike the prior art, which requires the position information of every device in the cabinet to be collected in advance, this method needs neither a database of device positions nor advance collection of per-device positions: the height of the abnormal-state device in the cabinet is determined by recognition processing of the complete image of the cabinet, specifically by determining the U positions the device occupies in the cabinet, which yields the position information of the abnormal-state device.
Cabinet height is measured in units of U, where 1U = 44.45 mm. Common heights of electronic equipment cabinets are 15U, 20U, 25U, 30U, 35U, and 40U; common heights of network cabinets are 22U, 27U, 32U, 37U, and 42U. The height marked in U is the effective usable height inside the cabinet, and the position of a device can be expressed by the U slots it occupies in the cabinet. As a result, the data center's equipment asset management does not need to track positions down to the U level, and no separate asset information table needs to be maintained and updated whenever a device's installation position changes.
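Given this U convention, the U slots occupied by an abnormal-state device can be derived from its bounding box in the complete image. The sketch below is illustrative (the function name and parameters are our own) and assumes the cabinet's pixel height and total U count are known:

```python
import math

U_MM = 44.45  # height of 1U in millimetres

def device_u_range(top_px: int, bottom_px: int, cabinet_px: int, cabinet_u: int = 42):
    """Map a device's bounding box in the complete image to the U slots it
    occupies. Pixel y grows downward; U slot 1 is at the bottom of the
    cabinet, following the usual rack numbering."""
    px_per_u = cabinet_px / cabinet_u
    bottom_from_base = cabinet_px - bottom_px   # pixels above the cabinet base
    top_from_base = cabinet_px - top_px
    u_low = int(bottom_from_base // px_per_u) + 1
    u_high = math.ceil(top_from_base / px_per_u)
    return u_low, u_high
```

For example, in a 42U cabinet imaged at 4200 px tall, a device whose box spans pixels 3900–4100 sits 100–300 px above the base and therefore occupies U2–U3.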
In the embodiment of the disclosure, at least one image of the cabinet in which at least one device is located is captured; the at least one image is stitched to obtain a complete image of the cabinet; the complete image is recognized to obtain the image data of the at least one device; and the status light features of each device, together with the device state each feature indicates, are then determined from that image data. With this device detection method, no asset management table containing device position information needs to be established, and the position of each device in its cabinet does not need to be collected in advance.
Fig. 4 schematically shows a block diagram of a detection apparatus according to an embodiment of the present disclosure.
As shown in fig. 4, the detection apparatus 400 may include an acquisition module 401, a first determination module 402, and a second determination module 403. The detection apparatus may perform the method described above with reference to the method embodiment, which is not described herein again.
Specifically, the acquiring module 401 is configured to acquire image data of at least one device; a first determining module 402, configured to determine status light characteristics of each device according to the image data; and a second determining module 403, configured to determine, by using a preset state recognition model, a device state indicated by each state light feature, where the state light feature includes at least one of a color feature, a size feature, and a position feature of the state light.
In the embodiment of the disclosure, image data of at least one device is acquired, the status light features of each device are determined from the image data, and the device state indicated by each status light feature is then determined using a preset state recognition model, where a status light feature includes at least one of the color, size, and position features of the status light. Determining the status light features in the image data and using the preset state recognition model to determine the device states they indicate, on the one hand, raises the degree of automation and the efficiency of device detection; on the other hand, it removes the need to build a database associating device models with the status light images of each device panel, and the need to collect the position of each device in its cabinet in advance.
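As a toy illustration of how the color feature alone can already separate common cases, the hand-written rule below maps a status light's mean color to a state. Every hue band and threshold here is an assumption made for illustration only; the preset state recognition model in the disclosure is trained from sample devices, not hand-coded:

```python
import colorsys

# Illustrative hue bands (fractions of the hue circle); a trained state
# recognition model would replace this hand-written table.
STATE_BY_HUE = [
    ((0.00, 0.10), "abnormal"),  # red
    ((0.90, 1.00), "abnormal"),  # red (hue wrap-around)
    ((0.25, 0.45), "normal"),    # green
]

def light_state(rgb):
    """Classify one status light from its mean RGB colour (values in 0..1)."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if v < 0.2:                  # too dark: the light is off
        return "off"
    for (lo, hi), state in STATE_BY_HUE:
        if lo <= h < hi:
            return state
    return "unknown"
```

A red light patch would thus be reported as an abnormal state and trigger the position lookup and alarm push described above.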
As an optional embodiment, the obtaining module includes a collecting sub-module, configured to collect at least one image of a cabinet in which the at least one device is located; the first processing submodule is used for splicing at least one image to obtain a complete image of the cabinet; and the second processing submodule is used for identifying the complete image to obtain image data of at least one device.
As an alternative embodiment, the second processing sub-module includes a first processing unit, configured to perform identification processing on the edge of the at least one device in the complete image by using an image edge identification algorithm, to obtain the image data of the at least one device.
As an optional embodiment, the apparatus further includes a first processing module, configured to determine, in a case that the status light feature indicates that the status of the device is abnormal, location information of the device in the abnormal status; and the second processing module is used for generating and pushing alarm information according to the position information.
As an optional embodiment, the first processing module includes a second processing sub-module, configured to perform stitching processing on at least one image to obtain a complete image of the cabinet; and the third processing submodule is used for determining the height information of the abnormal state equipment in the cabinet according to the complete image, and the height information forms position information.
As an optional embodiment, the second determining module includes a fourth processing sub-module, configured to perform screening processing on each status light feature by using the status recognition model to obtain an effective status light feature; and a fifth processing sub-module for determining the device status indicated by the valid status light feature based on the valid status light feature.
Any of the modules according to embodiments of the present disclosure, or at least part of the functionality of any of them, may be implemented in one module, and any one or more of them may be split into a plurality of modules. Any one or more of these modules may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on chip, a system on substrate, a system in package, or an Application Specific Integrated Circuit (ASIC), by any other reasonable means of integrating or packaging a circuit in hardware or firmware, or by any suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules may be implemented at least partly as a computer program module which, when executed, performs the corresponding function.
For example, any plurality of the obtaining module 401, the first determining module 402 and the second determining module 403 may be combined and implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the obtaining module 401, the first determining module 402, and the second determining module 403 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or may be implemented by any one of three implementations of software, hardware, and firmware, or any suitable combination of any of the three. Alternatively, at least one of the obtaining module 401, the first determining module 402 and the second determining module 403 may be at least partially implemented as a computer program module, which when executed may perform a corresponding function.
Fig. 5 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device 500 includes a processor 510, a computer-readable storage medium 520. The electronic device 500 may perform a method according to an embodiment of the present disclosure.
In particular, processor 510 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 510 may also include on-board memory for caching purposes. Processor 510 may be a single processing module or a plurality of processing modules for performing different actions of a method flow according to embodiments of the disclosure.
Computer-readable storage media 520, for example, may be non-volatile computer-readable storage media, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 520 may include a computer program 521, which computer program 521 may include code/computer-executable instructions that, when executed by the processor 510, cause the processor 510 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 521 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in the computer program 521 may include one or more program modules, such as module 521A, module 521B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, and when these program modules are executed by the processor 510, the processor 510 can perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the obtaining module 401, the first determining module 402 and the second determining module 403 may be implemented as a computer program module described with reference to fig. 5, which, when executed by the processor 510, may implement the respective operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
While the present disclosure has been shown and described with reference to certain exemplary embodiments thereof, those skilled in the art will understand that various changes in form and detail may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined by the appended claims and their equivalents.

Claims (10)

1. A device detection method, comprising:
acquiring image data of at least one device;
determining the status light characteristics of each device according to the image data; and
determining, by using a preset state recognition model, the device state indicated by each status light feature,
wherein the status light characteristic comprises at least one of a color characteristic, a size characteristic, and a position characteristic of the status light.
2. The method of claim 1, wherein the acquiring image data of at least one device comprises:
collecting at least one image of a cabinet where the at least one device is located;
splicing the at least one image to obtain a complete image of the cabinet; and
identifying the complete image to obtain the image data of the at least one device.
3. The method of claim 2, wherein the identifying the complete image to obtain image data of the at least one device comprises:
identifying the edge of the at least one device in the complete image by using an image edge identification algorithm, to obtain the image data of the at least one device.
4. The method of claim 2, further comprising:
in a case where a status light feature indicates that the state of a device is abnormal, determining position information of the abnormal-state device; and
generating alarm information according to the position information and pushing the alarm information.
5. The method of claim 4, wherein the determining the position information of the abnormal-state device comprises:
determining height information of the abnormal-state device in the cabinet according to the complete image, wherein the height information constitutes the position information.
6. The method of claim 1, wherein the determining, by using a preset state recognition model, the device state indicated by each status light feature comprises:
screening each status light feature by using the state recognition model to obtain a valid status light feature; and
based on the valid status light feature, determining a device status indicated by the valid status light feature.
7. The method of any of claims 1 to 6, wherein the training method of the state recognition model comprises:
acquiring image data of at least one sample device, and acquiring device state information of each sample device;
determining the status light characteristics of each sample device according to the image data; and
taking the status light features and the device state information of each sample device as training samples, and performing model training using the training samples to obtain the state recognition model.
8. A detection device, comprising:
an acquisition module for acquiring image data of at least one device;
the first determining module is used for determining the status light characteristics of each device according to the image data; and
a second determining module for determining the device status indicated by the status light characteristics using a preset status recognition model,
wherein the status light characteristic comprises at least one of a color characteristic, a size characteristic, and a position characteristic of the status light.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
10. A computer-readable storage medium storing computer-executable instructions for implementing the method of any one of claims 1 to 7 when executed.
CN202010727478.XA 2020-07-24 2020-07-24 Equipment detection method and device Pending CN111832534A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010727478.XA CN111832534A (en) 2020-07-24 2020-07-24 Equipment detection method and device

Publications (1)

Publication Number Publication Date
CN111832534A true CN111832534A (en) 2020-10-27

Family

ID=72926455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010727478.XA Pending CN111832534A (en) 2020-07-24 2020-07-24 Equipment detection method and device

Country Status (1)

Country Link
CN (1) CN111832534A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116071336A (en) * 2023-02-14 2023-05-05 北京博维仕科技股份有限公司 Intelligent video analysis method and system
CN116071336B (en) * 2023-02-14 2023-08-11 北京博维仕科技股份有限公司 Intelligent video analysis method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110309033A (en) * 2019-07-15 2019-10-08 中国工商银行股份有限公司 Failure monitoring method, device and system
CN110490854A (en) * 2019-08-15 2019-11-22 中国工商银行股份有限公司 Obj State detection method, Obj State detection device and electronic equipment
CN111259892A (en) * 2020-01-19 2020-06-09 福建升腾资讯有限公司 Method, device, equipment and medium for inspecting state of indicator light


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination