CN111797914A - Device identification method, server, system, and computer-readable storage medium - Google Patents


Info

Publication number
CN111797914A
CN111797914A
Authority
CN
China
Prior art keywords
equipment
identified
augmented reality
information
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010584174.2A
Other languages
Chinese (zh)
Inventor
宋书福
李春廷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yuanguang Software Co Ltd
Original Assignee
Yuanguang Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yuanguang Software Co Ltd filed Critical Yuanguang Software Co Ltd
Priority to CN202010584174.2A priority Critical patent/CN111797914A/en
Publication of CN111797914A publication Critical patent/CN111797914A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F18/00 Pattern recognition
                    • G06F18/20 Analysing
                        • G06F18/22 Matching criteria, e.g. proximity measures
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T19/00 Manipulating 3D models or images for computer graphics
                    • G06T19/006 Mixed reality
                • G06T7/00 Image analysis
                    • G06T7/0002 Inspection of images, e.g. flaw detection
                        • G06T7/0004 Industrial image inspection
                            • G06T7/001 Industrial image inspection using an image reference approach
                • G06T2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T2207/10 Image acquisition modality
                        • G06T2207/10004 Still image; Photographic image
                            • G06T2207/10012 Stereo images
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V20/00 Scenes; Scene-specific elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Quality & Reliability (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Disclosed are a device identification method, a server, a system, and a computer-readable storage medium. The device identification method comprises the following steps: a server receives a real-time scene image of a device to be identified, acquired by a mobile terminal; determines whether the real-time scene image matches a reference augmented reality image of the device to be identified; and, if so, confirms that the mobile terminal has correctly identified the device. This scheme improves the efficiency of device identification.

Description

Device identification method, server, system, and computer-readable storage medium
Technical Field
The present application relates to the field of device identification technologies, and in particular, to a device identification method, a server, a system, and a computer-readable storage medium.
Background
With the development of the economy, enterprises have grown in scale, their equipment has become more varied and more widely distributed, and equipment management has become increasingly difficult. Equipment is the foundation of a production enterprise, and keeping it performing normally provides a basic guarantee for the enterprise's product research, development, and production.
Traditional equipment management relies entirely on people. Because of personnel, management, and other factors, information about a device is often incomplete or even lost, so devices are identified inefficiently, which hinders equipment management. To improve the efficiency and accuracy of identification, various solutions have been proposed, such as attaching bar codes, two-dimensional codes, or RFID tags. Although these improve identification efficiency, they still involve difficulties such as the heavy workload of attaching tags and tags falling off, and attaching tags to live (energized) equipment or equipment installed at height carries significant safety hazards.
Disclosure of Invention
The technical problem mainly addressed by the present application is to provide a device identification method, a server, a system, and a computer-readable storage medium that improve the efficiency of device identification.
In order to solve the above problem, a first aspect of the present application provides a device identification method, including: a server receives a real-time scene image of a device to be identified, acquired by a mobile terminal; determines whether the real-time scene image matches a reference augmented reality image of the device to be identified; and, if so, confirms that the mobile terminal has correctly identified the device to be identified.
In order to solve the above problem, a second aspect of the present application provides a server comprising a communication circuit, a memory, and a processor coupled to each other; the communication circuit is used for communicating with a terminal device; the memory is used for storing program data; and the processor executes the program data to implement the device identification method of the first aspect.
In order to solve the above problem, a third aspect of the present application provides a device identification system, including a server and at least one mobile terminal connected to each other, wherein the mobile terminal is used for acquiring a real-time scene image of a device to be identified and sending the real-time scene image to the server, and the server is the server of the second aspect.
In order to solve the above problem, a fourth aspect of the present application provides a computer-readable storage medium having stored thereon program instructions that, when executed by a processor, implement the device identification method of the first aspect.
The beneficial effects of the invention are as follows. Unlike the prior art, the device identification method of the present application includes: a server receives a real-time scene image of a device to be identified, acquired by a mobile terminal; determines whether the real-time scene image matches a reference augmented reality image of the device to be identified; and, if so, confirms that the mobile terminal has correctly identified the device. By determining whether the real-time scene image acquired by the mobile terminal matches the reference augmented reality image of the device to be identified, the method can confirm whether the device has been identified correctly. Identification is particularly effective for devices that are inconvenient or difficult to identify by manual observation, and the efficiency of device identification is improved.
Drawings
Fig. 1 is a schematic flowchart of a first embodiment of a device identification method provided in the present application;
fig. 2 is a schematic flowchart of a second embodiment of a device identification method provided in the present application;
FIG. 3 is a flowchart illustrating an embodiment of step S21 in FIG. 2;
fig. 4 is a schematic flowchart of a third embodiment of a device identification method provided in the present application;
fig. 5 is a schematic flowchart of an application scenario of a device identification method provided in the present application;
FIG. 6 is a schematic block diagram of an embodiment of a server provided in the present application;
FIG. 7 is a schematic diagram of an embodiment of an equipment identification system provided herein;
FIG. 8 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided in the present application.
Detailed Description
The following describes in detail the embodiments of the present application with reference to the drawings attached hereto.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, interfaces, techniques, etc. in order to provide a thorough understanding of the present application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association between related objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the related objects before and after it are in an "or" relationship. Further, the term "plurality" herein means two or more.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a device identification method according to a first embodiment of the present disclosure. The device identification method in the embodiment includes:
step S11: and the server receives the real-time scene image acquired by the equipment to be identified by the mobile terminal.
The mobile terminal is an intelligent terminal with camera and communication functions that the user can carry, for example a smartphone, smart glasses, a smart watch, a handheld computer, or a tablet computer. The device to be identified in the present application can be any device that needs to be managed, for example a transformer, a motor, a boiler, a material cart, a machine tool, or a robot. Keeping such equipment performing normally provides a basic guarantee for an enterprise's product development and production, so managing and maintaining the equipment is very important, and the primary task in managing a device is to identify it correctly. In the embodiment of the present application, because the mobile terminal is a portable intelligent terminal with camera and communication functions, the user can capture a real-time scene image of the device to be identified with the mobile terminal's camera and send it to the server, and the server receives the real-time scene image acquired by the mobile terminal.
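The patent does not specify the transport between the mobile terminal and the server. Purely as an illustration of this step (the /identify endpoint, the field names, and the choice of Flask and requests are assumptions, not the patent's interface), the upload and receipt of a live scene image could be sketched as follows:

```python
# Mobile-terminal side: send a captured frame to the server (illustrative endpoint).
import requests

def upload_scene_image(server_url: str, device_id: str, image_path: str) -> dict:
    """POST a real-time scene image of the device to be identified to the server."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{server_url}/identify",
            data={"device_id": device_id},
            files={"scene_image": f},
            timeout=10,
        )
    resp.raise_for_status()
    return resp.json()  # e.g. {"identified": true}

# Server side: receive the frame (minimal Flask sketch).
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/identify", methods=["POST"])
def identify():
    device_id = request.form["device_id"]
    frame = request.files["scene_image"]       # the real-time scene image
    frame.save(f"/tmp/{device_id}_frame.jpg")  # persist it for the matching step (step S12)
    return jsonify({"identified": False, "device_id": device_id})
```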
Step S12: determine whether the real-time scene image matches the reference augmented reality image of the device to be identified. If yes, execute step S13; if not, the real-time scene image captured by the user with the mobile terminal is wrong, it does not correspond to the device to be identified, and the device has not been identified successfully.
Step S13: confirm that the mobile terminal has correctly identified the device to be identified.
Augmented Reality (AR) is a technology that seamlessly integrates real-world information with virtual information. Entity information that would otherwise be difficult to experience within a certain time and space in the real world, such as visual information, sound, taste, and touch, is simulated by computers and other technology and then superimposed, so that virtual information is applied to the real world and perceived by human senses, producing a sensory experience beyond reality in which the real environment and virtual objects are superimposed in real time on the same picture or in the same space. Augmented reality displays real-world information and virtual information at the same time, and the two kinds of information complement and overlay each other; in visual augmented reality, the user's display device composites the real world and computer graphics, so that the real world is seen surrounding the virtual content, and augmented reality provides information that ordinary human perception would not. In the present application, the reference augmented reality image of the device to be identified is the AR information of that device; it can present the device's structure to the user, in particular parts that are hard for the user to observe, for example the back of a device, the top of a large device, overhead or live (energized) equipment that is difficult to approach, or internal structure that cannot otherwise be shown. Therefore, after receiving the real-time scene image of the device to be identified acquired by the mobile terminal, the server can determine whether it matches the preset reference augmented reality image of the device. If it does not match, the real-time scene image captured by the user with the mobile terminal is wrong, it does not correspond to the device to be identified, and the device has not been identified successfully; if it matches, the server confirms that the mobile terminal has correctly identified the device to be identified.
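The patent does not prescribe a particular matching algorithm. As a minimal sketch, assuming the reference augmented reality image and the live frame can be compared as ordinary photographs, a feature-based check such as ORB keypoint matching with OpenCV could decide whether the two images show the same device (the threshold values are illustrative assumptions):

```python
import cv2

MATCH_THRESHOLD = 40  # illustrative: minimum number of good keypoint matches

def images_match(live_frame_path: str, reference_path: str) -> bool:
    """Rough check that a live scene image shows the same device as the reference AR image."""
    live = cv2.imread(live_frame_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    if live is None or ref is None:
        raise FileNotFoundError("could not read one of the images")

    orb = cv2.ORB_create(nfeatures=1000)            # detect up to 1000 keypoints per image
    _, des_live = orb.detectAndCompute(live, None)
    _, des_ref = orb.detectAndCompute(ref, None)
    if des_live is None or des_ref is None:
        return False                                # not enough texture to match on

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_live, des_ref)
    good = [m for m in matches if m.distance < 50]  # keep only close descriptor pairs
    return len(good) >= MATCH_THRESHOLD

# Usage (hypothetical file names):
# if images_match("live_scene.jpg", "reference_ar.jpg"):
#     print("device identified correctly")
```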
In this embodiment, by determining whether the real-time scene image of the device to be identified, acquired by the mobile terminal, matches the reference augmented reality image of that device, the method can confirm whether the device has been identified correctly. This is particularly effective for devices that are inconvenient or difficult to identify by manual observation, improves identification efficiency, and, compared with traditional approaches such as attaching bar codes, two-dimensional codes, or RFID tags, reduces the user's workload for making and attaching labels and effectively lowers management costs.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a device identification method according to a second embodiment of the present application. The device identification method in the embodiment includes:
step S21: the server establishes a device information database of the device to be identified. The device information database comprises basic archive information of the device to be identified and a reference augmented reality image.
Before the device is identified by determining whether the real-time scene image acquired by the mobile terminal matches the preset reference augmented reality image, the server must establish a device information database for the device to be identified. The database contains the device's basic archive information and its reference augmented reality image, stored in association, so that reference augmented reality images and basic archive information correspond one to one. In one embodiment, the basic archive information and other related data may also be presented together with the reference augmented reality image, for example auxiliary information such as the device number, factory date, manufacturer, a virtual instrument panel, the structure of the part to be repaired, and the device's operating parameters.
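The patent does not prescribe a storage layout for the device information database. A minimal sketch, assuming a relational store (the table and column names are illustrative), could associate each device's basic archive record with one or more reference augmented reality images:

```python
import sqlite3

def create_device_db(path: str = "devices.db") -> sqlite3.Connection:
    """Create an illustrative device information database (schema names are assumptions)."""
    conn = sqlite3.connect(path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS device_archive (
            device_id     TEXT PRIMARY KEY,   -- e.g. asset number
            name          TEXT NOT NULL,
            manufacturer  TEXT,
            factory_date  TEXT,
            location      TEXT
        );
        CREATE TABLE IF NOT EXISTS reference_ar_image (
            image_id   INTEGER PRIMARY KEY AUTOINCREMENT,
            device_id  TEXT NOT NULL REFERENCES device_archive(device_id),
            image_path TEXT NOT NULL,              -- file path or object-store key
            approved   INTEGER NOT NULL DEFAULT 0  -- set to 1 once the audit has passed
        );
    """)
    conn.commit()
    return conn
```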
Referring to fig. 3, fig. 3 is a flowchart illustrating an embodiment of step S21 in fig. 2. In this embodiment, the step S21 may specifically include:
step S211: and the server sends the basic archive information of the equipment to be identified to the information acquisition terminal.
Step S212: and receiving at least one initial augmented reality image acquired by the information acquisition terminal.
To establish the device information database, a reference augmented reality image of the device must be acquired and associated with the device's basic archive information. The server therefore sends the basic archive information to the information acquisition terminal; according to that information, the acquisition terminal captures at least one real image of the device to be identified to serve as an initial augmented reality image, and the server then receives those images. The information acquisition terminal is an intelligent terminal with camera and communication functions that a user can carry, such as a smartphone, smart glasses, a smart watch, a handheld computer, or a tablet computer. In one embodiment the information acquisition terminal is the aforementioned mobile terminal; in other embodiments it may be a different device.
Step S213: after the at least one initial augmented reality image is determined to have passed the audit, obtain the reference augmented reality image of the device to be identified from the approved initial augmented reality images.
Step S214: store the basic archive information and the reference augmented reality image of the device to be identified in the device information database of the device to be identified.
After the initial augmented reality images acquired by the information acquisition terminal are received, they must be audited. One or more initial images may have been acquired, and a user needs to check and screen them one by one to judge whether each was acquired correctly. Once at least one initial augmented reality image has passed the audit, the reference augmented reality image of the device can be obtained from the approved images. Specifically, the server may generate image-text information and a three-dimensional model of the device from the approved initial augmented reality images and the device's basic archive information, so that the image-text information and three-dimensional model contain both the basic archive information and the reference augmented reality image, and then store the basic archive information and the reference augmented reality image in the device information database of the device to be identified.
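Building on the illustrative schema above (the function names and the approved flag are assumptions, not the patent's API), the collection-and-audit flow could be sketched so that a newly collected initial image is recorded as unapproved and only becomes a reference augmented reality image after a reviewer approves it:

```python
import sqlite3
from typing import List

def register_initial_image(conn: sqlite3.Connection, device_id: str, image_path: str) -> int:
    """Store a newly collected initial AR image; it cannot be used until it passes the audit."""
    cur = conn.execute(
        "INSERT INTO reference_ar_image (device_id, image_path, approved) VALUES (?, ?, 0)",
        (device_id, image_path),
    )
    conn.commit()
    return cur.lastrowid

def approve_image(conn: sqlite3.Connection, image_id: int) -> None:
    """Mark an initial image as audited; approved images serve as reference AR images."""
    conn.execute("UPDATE reference_ar_image SET approved = 1 WHERE image_id = ?", (image_id,))
    conn.commit()

def reference_images(conn: sqlite3.Connection, device_id: str) -> List[str]:
    """Return the stored paths of all approved reference AR images for a device."""
    rows = conn.execute(
        "SELECT image_path FROM reference_ar_image WHERE device_id = ? AND approved = 1",
        (device_id,),
    )
    return [path for (path,) in rows]
```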
In one application scenario, device information databases must be established for many devices. In this case, several information acquisition devices connected to the server can be provided and assigned image-acquisition tasks. Each acquisition device obtains the basic archive information of its assigned devices from the server, goes to the site of the device to be identified, captures real images of the device according to its task (the images may cover the whole device or parts of its structure), and returns them to the server as initial augmented reality images for checking and audit. A user then checks and screens the returned initial augmented reality images one by one, discards invalid information and keeps useful information, and once the approved images are confirmed to be correct, they are stored in the device information database of the device to be identified.
Step S22: the server receives a real-time scene image of the device to be identified, acquired by the mobile terminal.
Step S23: determine whether the real-time scene image matches the reference augmented reality image of the device to be identified. If yes, execute step S24; if not, execute step S25.
Step S24: confirm that the mobile terminal has correctly identified the device to be identified.
In this implementation scenario, steps S22 to S24 provided in this embodiment are substantially similar to steps S11 to S13 in the first embodiment of the device identification method provided in this application, and are not described again here.
Step S25: continue to receive new real-time scene images of the device to be identified acquired by the mobile terminal, and with each new image re-execute the step of determining whether the real-time scene image matches the reference augmented reality image of the device to be identified and the subsequent steps.
When the real-time scene image is judged not to match the reference augmented reality image of the device to be identified, the device has not been identified successfully, which may be because the user captured the wrong image with the mobile terminal. The server therefore continues to receive new real-time scene images of the device acquired by the mobile terminal and re-executes the matching step and the subsequent steps with each new image; if a new real-time scene image matches the reference augmented reality image, the server confirms that the mobile terminal has correctly identified the device.
Step S26: after the number of failures to obtain a real-time scene image that matches the augmented reality image exceeds a preset number, stop acquiring real-time scene images of the device to be identified and receive augmented reality abnormality information of the device sent by the mobile terminal. The augmented reality abnormality information of the device to be identified includes a real-time scene image of the device taken after the on-site work has been completed and operation information about the device.
When new real-time scene images are repeatedly received and matched against the reference augmented reality image in step S25, matching may fail many times in a row. This may be because the reference augmented reality image of the device is incorrect, or because the person using the mobile terminal has not found the correct device. Therefore, once the number of failures exceeds the preset number, the server stops acquiring real-time scene images of the device and instead receives the augmented reality abnormality information sent by the mobile terminal. In this embodiment, the purpose of identifying the device is that, once it is correctly identified, a worker can complete on-site work on it, for example inspection and maintenance, repair, or on-site operation and adjustment of working parameters; at this point the reason why no matching real-time scene image could be obtained cannot yet be determined.
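As a sketch of this retry logic (the preset failure count and the callback names are assumptions; the matching function is a stand-in for whatever comparison the server actually uses), the loop over successive live frames might look like this:

```python
from typing import Callable, Optional, Tuple

MAX_FAILURES = 3  # illustrative stand-in for the patent's "preset number" of failures

def identify_with_retries(
    next_frame: Callable[[], str],
    matches_reference: Callable[[str], bool],
    max_failures: int = MAX_FAILURES,
) -> Tuple[bool, Optional[str]]:
    """Request new live frames until one matches the reference AR image, or stop after
    max_failures mismatches so that AR abnormality information can be collected instead."""
    failures = 0
    while failures < max_failures:
        frame_path = next_frame()       # e.g. the next image uploaded by the mobile terminal
        if matches_reference(frame_path):
            return True, frame_path     # device identified correctly (step S24)
        failures += 1                   # mismatch: try again with a new frame (step S25)
    return False, None                  # give up; receive AR abnormality information (step S26)
```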
Step S27: after the augmented reality abnormality information is determined to have passed the audit, update the reference augmented reality image of the device to be identified with the real-time scene image contained in the abnormality information, so as to complete the update of the device information database of the device to be identified.
After the augmented reality abnormality information sent by the mobile terminal is received, it must be audited; in particular, the real-time scene images of the device taken after the on-site work was completed must be checked and screened one by one. Specifically, the server may regenerate the image-text information and the three-dimensional model of the device from the real-time scene images in the abnormality information and the device's basic archive information, or replace those initial augmented reality images whose position information is the same as that of the real-time scene images, thereby updating the image-text information and the three-dimensional model. In this way the updated reference augmented reality image of the device is obtained and the update of the device information database is completed.
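A corresponding sketch of the database update (again using the illustrative schema; retiring the old approved images and inserting the audited scene image is a simplification of the replacement described above):

```python
import sqlite3

def update_reference_image(conn: sqlite3.Connection, device_id: str, audited_image_path: str) -> None:
    """Replace the device's approved reference AR images with a newly audited scene image."""
    conn.execute(
        "UPDATE reference_ar_image SET approved = 0 WHERE device_id = ? AND approved = 1",
        (device_id,),
    )
    conn.execute(
        "INSERT INTO reference_ar_image (device_id, image_path, approved) VALUES (?, ?, 1)",
        (device_id, audited_image_path),
    )
    conn.commit()
```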
In this embodiment, whether the device to be identified has been identified correctly can be determined by judging whether the real-time scene image acquired by the mobile terminal matches the reference augmented reality image of the device, and when identification fails, the augmented reality abnormality information of the device can be sent to the server through the mobile terminal to complete the update of the device information database. In addition, auditing the abnormality information makes it possible to judge whether the identification failure was caused by a mistake of the worker using the information acquisition terminal or of the worker using the mobile terminal, which helps prevent intentional or unintentional errors and facilitates staff management.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating a device identification method according to a third embodiment of the present application. The device identification method in the embodiment includes:
step S41: and the server receives the real-time scene image acquired by the equipment to be identified by the mobile terminal.
Step S42: and judging whether the real-time scene image is matched with the reference augmented reality image of the equipment to be identified. If yes, go to step S43; if not, the fact that the real-time scene image acquired by the user through the mobile terminal is wrong and does not accord with the device to be identified is indicated, and the device is not identified successfully.
Step S43: and confirming that the mobile terminal correctly identifies the equipment to be identified.
In this implementation scenario, steps S41 to S43 provided in this embodiment are substantially similar to steps S11 to S13 in the first embodiment of the device identification method provided in this application, and are not described again here.
Step S44: confirm that the mobile terminal is at the site of the device to be identified, and receive the operation information about the device to be identified sent by the mobile terminal.
After the mobile terminal is confirmed to have correctly identified the device, it can be confirmed that the mobile terminal is at the site of the device. The worker using the mobile terminal can then complete the on-site work on the device, and after that work the device's related information may have changed: for example, a repair may have changed part of its structure, or on-site operation and adjustment of working parameters may have changed its data. The operation information about the device therefore needs to be sent to the server through the mobile terminal, so that completion of the on-site work can be confirmed and the image-text information and three-dimensional model of the device can be updated.
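As an illustration of recording this operation information on the server (the operation_log table and its columns are assumptions introduced for the sketch, not part of the patent), the report sent by the mobile terminal could be stored alongside the device information database:

```python
import sqlite3
import time

def record_operation_info(conn: sqlite3.Connection, device_id: str, operation_note: str) -> None:
    """Store the on-site operation information reported by the mobile terminal."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS operation_log (
            log_id      INTEGER PRIMARY KEY AUTOINCREMENT,
            device_id   TEXT NOT NULL,
            note        TEXT NOT NULL,
            reported_at TEXT NOT NULL
        )
    """)
    conn.execute(
        "INSERT INTO operation_log (device_id, note, reported_at) VALUES (?, ?, ?)",
        (device_id, operation_note, time.strftime("%Y-%m-%d %H:%M:%S")),
    )
    conn.commit()
```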
Fig. 5 is a schematic flowchart of an application scenario of the device identification method provided in the present application, which comprises two parts: collecting the AR information of a device, and identifying the device using that AR information. First, the device's AR information must be collected. A leader of the equipment department assigns the collection task to an information collector, who carries it out with an information acquisition terminal. Specifically, the acquisition terminal holds the basic archive information of the devices; the collector selects the device to be collected on the terminal, opens the terminal's camera to capture picture information and position information of the device, and judges whether collection for that device is complete. If it is, the collector moves on to the next device, and once information for all devices to be collected has been gathered, it is added to the device information database, completing the collection of the devices' AR information. When the device leader sends a work task for a device to a task executor, the device's basic archive information and AR information are sent to the executor's mobile terminal, so that the executor can select the device to be worked on and the terminal automatically presents that device's AR information. The executor then opens the mobile terminal's camera to identify the device, and the system judges whether the captured picture matches the AR information of the device to be worked on. If the match succeeds, the device has been identified correctly and the executor can carry out the work task. If the match fails, pictures must continue to be captured and matched; if the number of failures exceeds the limit, the abnormal AR information and the latest picture information are recorded and the work task is completed. In that case the abnormality is processed after the work task is finished: the device's AR information is re-identified from the abnormal AR information and the latest pictures, and the re-identified AR information is added to the device information database, completing the update of the device's AR information.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a server according to an embodiment of the present disclosure. The server 60 in the present application includes a communication circuit 600, a memory 602, and a processor 604 coupled to each other; the communication circuit 600 is used for communicating with a terminal device; the memory 602 is used for storing program data; and the processor 604 executes the program data to implement the following device identification method:
the server 60 receives, through the communication circuit 600, a real-time scene image of the device to be identified acquired by the mobile terminal; the processor 604 determines whether the real-time scene image matches a reference augmented reality image of the device to be identified; and, if so, confirms that the mobile terminal has correctly identified the device to be identified.
Further, before the step in which the server 60 receives, through the communication circuit 600, the real-time scene image acquired by the mobile terminal, the processor 604 of the server 60 is further configured to establish a device information database of the device to be identified; the device information database comprises basic archive information of the device to be identified and a reference augmented reality image.
Further, when the processor 604 performs the step of establishing the device information database of the device to be identified, it sends the basic archive information of the device to be identified to the information acquisition terminal through the communication circuit 600 of the server 60; receives at least one initial augmented reality image acquired by the information acquisition terminal; after the at least one initial augmented reality image is determined to have passed the audit, obtains the reference augmented reality image of the device to be identified from the approved initial augmented reality images; and stores the basic archive information and the reference augmented reality image of the device to be identified in the device information database of the device to be identified.
As an implementation, after performing the step of determining whether the real-time scene image matches the reference augmented reality image of the device to be identified, the processor 604 is further configured to: if they do not match, continue to receive new real-time scene images of the device to be identified acquired by the mobile terminal, and with each new image re-execute the step of determining whether the real-time scene image matches the reference augmented reality image of the device to be identified and the subsequent steps.
Further, in the device identification method implemented by the processor 604: after the number of failures to obtain a real-time scene image matching the augmented reality image exceeds a preset number, the processor 604 stops acquiring real-time scene images of the device to be identified and receives the augmented reality abnormality information of the device sent by the mobile terminal; the augmented reality abnormality information includes a real-time scene image of the device taken after the on-site work has been completed and operation information about the device; and after the abnormality information is determined to have passed the audit, the processor 604 updates the reference augmented reality image of the device with the real-time scene image in the abnormality information, so as to complete the update of the device information database of the device to be identified.
As an implementation, after performing the step of confirming that the mobile terminal has correctly identified the device to be identified, the processor 604 further confirms that the mobile terminal is at the site of the device to be identified and receives the operation information about the device sent by the mobile terminal.
The specific processes by which the processor 604 and the other components in this server embodiment implement the above functions are described in the method embodiments above.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an embodiment of an equipment identification system provided in the present application. The device identification system 70 in the present application comprises a server 700 and at least one mobile terminal 702 connected to each other: the mobile terminal 702 is configured to acquire a real-time scene image of the device to be identified and send the real-time scene image to the server 700; the server 700 is the server in the above-described server embodiment.
Further, the device identification system 70 further includes at least one information collecting terminal 704 connected to the server 700; the information acquisition terminal 704 is configured to acquire at least one initial augmented reality image and send the initial augmented reality image to the server 700 to obtain a reference augmented reality image of the device to be identified by using the initial augmented reality image.
The server 700 cooperates with at least one mobile terminal 702 and at least one information collecting terminal 704 to implement any one of the above-mentioned device identification methods, and specific processes can refer to the above-mentioned method embodiments.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided in the present application. The computer-readable storage medium 80 stores program instructions 800 that can be executed to implement the device identification method described above. The computer-readable storage medium 80 may be a storage chip in a server, a readable and writable medium such as an SD card, or a server.
In the several embodiments provided in the present application, it should be understood that the disclosed method, server, system and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented as a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence or in the part that contributes beyond the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Claims (10)

1. A method for device identification, the method comprising:
the server receives a real-time scene image of a device to be identified, acquired by a mobile terminal;
determining whether the real-time scene image matches a reference augmented reality image of the device to be identified;
and if so, confirming that the mobile terminal has correctly identified the device to be identified.
2. The method according to claim 1, wherein before the step of the server receiving the real-time scene image of the device to be identified acquired by the mobile terminal, the method comprises:
the server establishes a device information database of the device to be identified, the device information database comprising basic archive information and a reference augmented reality image of the device to be identified.
3. The method according to claim 2, wherein the step of the server establishing the device information database of the device to be identified comprises:
the server sends the basic archive information of the device to be identified to an information acquisition terminal;
receiving at least one initial augmented reality image acquired by the information acquisition terminal;
after the at least one initial augmented reality image is determined to have passed the audit, obtaining a reference augmented reality image of the device to be identified from the at least one approved initial augmented reality image;
and storing the basic archive information and the reference augmented reality image of the device to be identified in the device information database of the device to be identified.
4. The method of claim 2, wherein after the step of determining whether the real-time scene image matches the augmented reality image, the method further comprises:
if not, continuing to receive a new real-time scene image of the device to be identified acquired by the mobile terminal, and re-executing, with the new real-time scene image, the step of determining whether the real-time scene image matches the reference augmented reality image of the device to be identified and the subsequent steps.
5. The method of claim 4, further comprising:
stopping acquisition of real-time scene images of the device to be identified after the number of failures to obtain a real-time scene image matching the augmented reality image exceeds a preset number, and receiving augmented reality abnormality information of the device to be identified sent by the mobile terminal, the augmented reality abnormality information comprising a real-time scene image of the device to be identified taken after on-site work on the device has been completed and operation information about the device to be identified;
and after the augmented reality abnormality information is determined to have passed the audit, updating the reference augmented reality image of the device to be identified with the real-time scene image in the augmented reality abnormality information, so as to complete the update of the device information database of the device to be identified.
6. The method according to claim 1, wherein after the step of confirming that the mobile terminal has correctly identified the device to be identified, the method comprises:
confirming that the mobile terminal is at the site of the device to be identified, and receiving operation information about the device to be identified sent by the mobile terminal.
7. A server, comprising a communication circuit, a memory, and a processor coupled to each other; the communication circuit is configured to communicate with a terminal device; the memory is configured to store program data; and the processor executes the program data to implement the method according to any one of claims 1-6.
8. A device identification system, comprising a server and at least one mobile terminal connected to each other, wherein:
the mobile terminal is configured to acquire a real-time scene image of a device to be identified and send the real-time scene image to the server;
and the server is the server of claim 7.
9. The system of claim 8, further comprising at least one information acquisition terminal connected to the server;
the information acquisition terminal is configured to acquire at least one initial augmented reality image and send the initial augmented reality image to the server, so that a reference augmented reality image of the device to be identified is obtained from the initial augmented reality image.
10. A computer-readable storage medium having stored thereon program instructions, which when executed by a processor, implement the device identification method of any one of claims 1 to 6.
CN202010584174.2A 2020-06-23 2020-06-23 Device identification method, server, system, and computer-readable storage medium Pending CN111797914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010584174.2A CN111797914A (en) 2020-06-23 2020-06-23 Device identification method, server, system, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010584174.2A CN111797914A (en) 2020-06-23 2020-06-23 Device identification method, server, system, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN111797914A true CN111797914A (en) 2020-10-20

Family

ID=72804581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010584174.2A Pending CN111797914A (en) 2020-06-23 2020-06-23 Device identification method, server, system, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN111797914A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103489002A (en) * 2013-09-27 2014-01-01 广州中国科学院软件应用技术研究所 Reality augmenting method and system
CN110246163A (en) * 2019-05-17 2019-09-17 联想(上海)信息技术有限公司 Image processing method and its device, equipment, computer storage medium
CN110580024A (en) * 2019-09-17 2019-12-17 Oppo广东移动通信有限公司 workshop auxiliary operation implementation method and system based on augmented reality and storage medium
CN110909091A (en) * 2019-09-23 2020-03-24 国网天津静海供电有限公司 Marketing and distribution through data acquisition system and method based on AR identification technology

Similar Documents

Publication Publication Date Title
WO2019174009A1 (en) Machine room management method and dynamic environment system
EP2755096A1 (en) Work management system, work management terminal, program and work management method
CN113469378B (en) Maintenance method and maintenance equipment
CN110972519B (en) Information providing system and information providing method
JPWO2017216929A1 (en) Medical device information providing system, medical device information providing method and program
CN111192019A (en) Reimbursement processing method of target bill and related equipment
CN110119915B (en) Object warehousing processing method, device and system
JP2021114700A (en) Work support system and work support method
CN112507945B (en) Method and device for managing and controlling behavior of operator, electronic equipment and storage medium
CN112035325B (en) Text robot automatic monitoring method and device
CN110266994A (en) A kind of video call method, video conversation apparatus and terminal
CN113361468A (en) Business quality inspection method, device, equipment and storage medium
CN111797914A (en) Device identification method, server, system, and computer-readable storage medium
EP1653191A1 (en) Information presentation device and information presentation system using the same
CN108960111B (en) Face recognition method, face recognition system and terminal equipment
KR102469474B1 (en) Automatic construction daily report creation system using video input apparatus at the construction site
CN109872470A (en) A kind of self-help teller machine working method, system and device
JP2003233719A (en) Automatic contract machine and its transaction method
CN111627126B (en) Method and device for checking attendance and preventing cheating, storage medium and terminal
US20210302945A1 (en) Information processing apparatus, information processing system, and non-transitory computer readable medium storing program
JP2019074904A (en) Work instruction system and method
CN113963363A (en) Detection method and device based on AR technology
KR200337484Y1 (en) Apparatus for creating source data for assiduity management
TWM596380U (en) Artificial intelligence and augmented reality system
CN105259846B (en) A kind of intelligent robot for realizing slitless connection between system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination