CN111105506A - Real-time monitoring method and device based on mixed reality - Google Patents


Info

Publication number
CN111105506A
CN111105506A (application CN201911321083.3A)
Authority
CN
China
Prior art keywords
equipment
real
monitored
video data
information
Prior art date
Legal status
Withdrawn
Application number
CN201911321083.3A
Other languages
Chinese (zh)
Inventor
黄锦培
贾沛
陈朝晖
Current Assignee
Guangdong Huazhiyuan Information Engineering Co ltd
Guangzhou Huajia Software Co Ltd
Guangzhou Jiadu Urban Rail Intelligent Operation And Maintenance Service Co Ltd
Guangzhou Xinke Jiadu Technology Co Ltd
Original Assignee
Guangdong Huazhiyuan Information Engineering Co ltd
Guangzhou Huajia Software Co Ltd
Guangzhou Jiadu Urban Rail Intelligent Operation And Maintenance Service Co Ltd
Guangzhou Xinke Jiadu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Huazhiyuan Information Engineering Co ltd, Guangzhou Huajia Software Co Ltd, Guangzhou Jiadu Urban Rail Intelligent Operation And Maintenance Service Co Ltd, Guangzhou Xinke Jiadu Technology Co Ltd filed Critical Guangdong Huazhiyuan Information Engineering Co ltd
Priority application: CN201911321083.3A
Publication: CN111105506A
PCT application: PCT/CN2020/121658 (WO2021120816A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/95: Retrieval from the web
    • G06F16/953: Querying, e.g. by the use of web search engines
    • G06F16/9537: Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/20: Administration of product repair or maintenance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/40: Business processes related to the transportation industry

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • User Interface Of Digital Computer (AREA)
  • Computer Hardware Design (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The embodiment of the application discloses a real-time monitoring method and device based on mixed reality, an electronic device, and a storage medium. In the technical scheme, position information is acquired and the corresponding device to be monitored is determined from it; the device information and real-time status data of that device are extracted and marked in the corresponding video data and graphical interface; and the video data and graphical interface are then encoded and output to a terminal device for display. This addresses two problems in current rail-transit device monitoring: when monitoring remotely, only the status information of a device can be seen, not its actual condition; and during on-site inspection, only the physical device can be seen, not its detailed status information. By marking device status data in the video data of the corresponding device, the convenience and safety of device monitoring and inspection are further improved.

Description

Real-time monitoring method and device based on mixed reality
Technical Field
The embodiment of the application relates to the technical field of equipment monitoring, in particular to a real-time monitoring method and device based on mixed reality.
Background
At present, during the operation of an urban rail transit system, the running state of equipment is monitored in real time so that each device can be better operated and maintained. Real-time device status data is collected and processed by a background server, and a front-end human-machine interface displays it by combining configuration screens with a software interface. To display the data, a configuration development tool draws the human-machine interface according to the physical or logical layout of the equipment; different icons represent the various kinds of devices, and important status data is shown on the icons using colors, text, and the like. When a user needs detailed information, selecting an icon opens the corresponding device control-panel program. In this way, operation and maintenance personnel can check device status data in real time.
However, with this display method, operation and maintenance personnel must operate the front-end human-machine interface many times to obtain the corresponding device status data, so the query flow is cumbersome. In addition, the current video picture of a device cannot be viewed on the human-machine interface, and when a device fails, the real scene at the fault location cannot be presented visually, which seriously affects inspection efficiency.
Disclosure of Invention
The embodiments of the present application provide a real-time monitoring method and device based on mixed reality, an electronic device, and a storage medium, so that device status data can be checked conveniently and the convenience and safety of device monitoring and inspection are improved.
In a first aspect, an embodiment of the present application provides a real-time monitoring method based on mixed reality, including:
acquiring position information, and determining corresponding equipment to be monitored based on the position information, wherein the position information comprises the position information of a current user or the position information input by the current user;
extracting the equipment information and the real-time state data of the equipment to be monitored, and marking the equipment information and the real-time state data in the video data and the graphical interface corresponding to the equipment to be monitored, wherein the graphical interface comprises a three-dimensional model corresponding to the equipment to be monitored and an equipment state display picture;
and encoding the video data and the graphical interface and outputting the encoded video data and the graphical interface to terminal equipment for displaying.
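The three steps of the claimed method can be outlined in code. The sketch below is purely illustrative: every callable passed in (`resolve_device`, `extract`, `annotate`, `encode`, `send`) is a hypothetical stand-in for a subsystem the claim describes only abstractly.

```python
def monitor_once(position, resolve_device, extract, annotate, encode, send):
    """Minimal sketch of the claimed flow: position -> device -> marked
    video/GUI -> encoded output. All callables are hypothetical stand-ins."""
    device = resolve_device(position)            # step 1: position -> device to monitor
    info, state = extract(device)                # step 2a: device info + real-time status
    video, gui = annotate(device, info, state)   # step 2b: mark data in video and GUI
    send(encode(video, gui))                     # step 3: encode and output to terminal
    return device
```

Keeping the steps as injected callables mirrors the patent's separation between the monitoring server (which resolves and annotates) and the terminal device (which merely displays the encoded stream).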
Further, the determining the corresponding device to be monitored based on the location information includes:
determining a corresponding monitoring camera within a set distance range according to the position information;
and extracting real-time video data of the monitoring camera, and determining the corresponding equipment to be monitored from the video data.
Further, the terminal device comprises a background monitoring terminal, a mobile terminal device or VR glasses.
Further, the determining, according to the position information, a corresponding monitoring camera within a set distance range includes:
feeding back a corresponding three-dimensional monitoring scene according to the position information, and feeding back a plurality of camera information within a set distance range to VR glasses of a current user;
the method comprises the steps of obtaining a first focusing direction of a current user through VR glasses, and determining a corresponding monitoring camera from a plurality of cameras according to a camera corresponding to the first focusing direction.
Further, the extracting real-time video data of the monitoring camera and determining the corresponding device to be monitored from the video data includes:
extracting real-time video data of the monitoring camera, embedding the real-time video data into a three-dimensional monitoring scene, and sending the real-time video data to VR glasses of a current user for displaying;
and acquiring a second focusing direction of the current user through VR glasses, and determining the equipment to be monitored according to a position corresponding to the second focusing direction.
Further, the determining the corresponding device to be monitored based on the location information includes:
determining a corresponding monitoring camera within a set distance range according to the position information;
sending the equipment information corresponding to the monitoring camera to the terminal equipment;
and determining the equipment to be monitored according to the selection information returned by the terminal equipment.
Further, the marking the device information and the real-time status data in the video data and the graphical interface corresponding to the device to be monitored, where the graphical interface includes a three-dimensional model corresponding to the device to be monitored and a device status display picture, includes:
determining the position of the corresponding equipment to be monitored in the video data and the graphical interface;
and marking the equipment information and the real-time state data corresponding to the equipment to be monitored in the video picture of the corresponding video data and the three-dimensional model of the graphical interface.
In a second aspect, an embodiment of the present application provides a real-time monitoring apparatus based on mixed reality, including:
the acquisition module is used for acquiring the position information of the current user or the position information input by the user and determining the corresponding equipment to be monitored based on the position information;
the marking module is used for extracting the equipment information and the real-time state data of the equipment to be monitored, and marking the equipment information and the real-time state data in the video data and the graphical interface corresponding to the equipment to be monitored, wherein the graphical interface comprises a three-dimensional model corresponding to the equipment to be monitored and an equipment state display picture;
and the output module is used for encoding the video data and the graphical interface and outputting the video data and the graphical interface to terminal equipment for displaying.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory and one or more processors;
the memory for storing one or more programs;
when executed by the one or more processors, cause the one or more processors to implement the mixed reality based real-time monitoring method of the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium containing computer-executable instructions for performing the mixed reality based real-time monitoring method according to the first aspect when executed by a computer processor.
According to the embodiment of the application, the corresponding equipment to be monitored is determined based on the position information by acquiring the position information, the equipment information and the real-time state data of the equipment to be monitored are extracted and marked in the corresponding video data and the graphical interface, and then the video data and the graphical interface are coded and output to the terminal equipment for displaying. By adopting the technical means, the corresponding equipment can be selected to check the state data according to the position information when the equipment is monitored or patrolled in a centralized manner, so that the convenience of the equipment monitoring or patrolling in a centralized manner is improved.
In addition, by marking device status data in the video data and graphical interface of the corresponding device, the embodiments of the present application make it possible to read the status data directly from the video and the interface. This avoids the danger of checking status data on the spot during inspection and further improves inspection safety.
Drawings
Fig. 1 is a flowchart of a real-time monitoring method based on mixed reality according to an embodiment of the present application;
FIG. 2 is a first flowchart of determining a device to be monitored according to one embodiment of the present application;
FIG. 3 is a second flowchart of determining a device to be monitored according to one embodiment of the present application;
fig. 4 is a schematic diagram of equipment inspection by using VR glasses in the first embodiment of the present application;
fig. 5 is a flowchart of selecting a surveillance camera for VR glasses in an embodiment of the present application;
fig. 6 is a schematic view of a monitoring camera selection interface of VR glasses according to an embodiment of the present application;
fig. 7 is a flowchart illustrating VR glasses selecting a device to be monitored according to an embodiment of the present application;
fig. 8 is a schematic view of a to-be-monitored device selection interface of VR glasses according to an embodiment of the present application;
FIG. 9 is a flowchart of video data annotation according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a real-time monitoring device based on mixed reality according to a second embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, specific embodiments of the present application will be described in detail with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some but not all of the relevant portions of the present application are shown in the drawings. Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The application provides a real-time monitoring method based on mixed reality, which aims to assist the monitoring and inspection of equipment with mixed reality technology. The device to be monitored is determined from the current user's position or a position the user inputs; its device information and real-time status data are extracted and marked on the video picture and three-dimensional model of that device, so that on a visual terminal the user can learn the actual operating condition of the device from every angle. Existing device monitoring systems mostly monitor through video alone, or through a human-machine interface and text, which makes querying device status data inconvenient. Moreover, the monitoring system is usually deployed at the back end, so status queries can only be made from the monitoring center. During inspection, inspectors can see only the outward running condition of field equipment; specific status data is not displayed synchronously, which is very unfavorable for on-site inspection and maintenance. The real-time monitoring method based on mixed reality is therefore provided to solve the cumbersome status-query flow of existing device monitoring systems.
The first embodiment is as follows:
fig. 1 is a flowchart of a real-time monitoring method based on mixed reality according to an embodiment of the present disclosure, where the real-time monitoring method based on mixed reality provided in this embodiment may be executed by a real-time monitoring device based on mixed reality, the real-time monitoring device based on mixed reality may be implemented in a software and/or hardware manner, and the real-time monitoring device based on mixed reality may be formed by two or more physical entities or may be formed by one physical entity. Generally, the real-time monitoring device based on mixed reality can be a computer, a server side of a device monitoring system, and the like.
The following description will be given by taking a real-time monitoring device based on mixed reality as an example of a main body for executing the real-time monitoring method based on mixed reality. Referring to fig. 1, the real-time monitoring method based on mixed reality specifically includes:
s110, position information is obtained, and corresponding equipment to be monitored is determined based on the position information, wherein the position information comprises position information of a current user or position information input by the current user.
Illustratively, when a user monitors or inspects equipment, the current position information of the user is uploaded through terminal equipment, and the position information is used for determining the equipment which the user wants to inquire or the equipment corresponding to the current position of the user. The terminal equipment can be visual terminal equipment such as a background monitoring terminal of the equipment monitoring system, a mobile terminal of a user or VR glasses, and in practical application, different types of terminal equipment are selected to inquire real-time state data of the equipment according to practical monitoring or inspection requirements. For example, when the device is remotely monitored in the background, the background monitoring terminal or the user mobile terminal can be used for inquiring the real-time state data of the device; when the on-site inspection is carried out on the equipment, the inquiry of the real-time state data of the equipment can be carried out by adopting a mobile terminal or VR glasses of a user. There are many types of terminal devices for querying device status data, and this embodiment of the present application is not limited to this, and is not described here in detail.
It should be noted that when the device is monitored from the background, the position information may be input through the terminal device and uploaded to the mixed-reality real-time monitoring equipment, which then obtains it; during on-site inspection, the terminal device may position itself and upload that position as the user's position. Because on-site inspection takes place indoors, where conventional GPS positioning is comparatively inaccurate, the embodiments of the present application obtain the current position using WiFi- or Bluetooth-based indoor positioning. Signal transmitters are installed at various indoor positions, and the WiFi or Bluetooth radio of the terminal device acts as the receiver; the terminal's position is determined from its distance to each transmitter, achieving accurate indoor positioning. There are many existing indoor-positioning implementations, and the embodiments of the present application are not limited to any one of them. In addition, the position of the current user can be determined by a camera installed at the monitoring site: the camera acquires video data containing the user, a target detection and recognition algorithm analyzes the video to determine the user's position, and the position is sent to the mixed-reality real-time monitoring equipment.
The user identity is pre-bound with the terminal equipment, and when the camera identifies the corresponding user, the terminal equipment which has a binding relation with the user identity can be determined, so that the subsequent terminal equipment can conveniently interact with the real-time monitoring equipment based on mixed reality.
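The indoor-positioning step described above, determining the terminal's position from its distances to fixed signal transmitters, can be sketched as planar trilateration. This is only one possible implementation (the patent explicitly leaves the positioning method open); the three-beacon setup and coordinate frame here are assumptions for illustration.

```python
import math

def trilaterate(beacons):
    """Estimate a 2-D position from three (x, y, distance) readings of
    fixed signal transmitters. Subtracting the circle equations pairwise
    yields two linear equations in the unknown position, solved directly."""
    (x1, y1, d1), (x2, y2, d2), (x3, y3, d3) = beacons
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1   # zero if the three beacons are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

In practice the distances would be derived from WiFi or Bluetooth signal strength, which is noisy, so a real deployment would use more beacons and a least-squares or fingerprinting approach rather than this exact-solution sketch.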
Furthermore, the device to be monitored, which needs to perform status data query by the current user, can be determined according to the acquired location information. Referring to fig. 2, the process of determining the device to be monitored includes:
s111, determining a corresponding monitoring camera in a set distance range according to the position information;
and S112, extracting real-time video data of the monitoring camera, and determining the corresponding equipment to be monitored from the video data.
Specifically, according to the embodiment of the application, relevant position data of each camera in a monitoring field is stored in advance, and the cameras perform real-time monitoring corresponding to one or more field devices. After the real-time monitoring equipment based on mixed reality acquires the position information, comparing the position information with the position data of each camera in a pre-stored monitoring field, and determining the monitoring camera which is located in a set distance range and corresponds to the position information, wherein the monitoring camera is used for further determining the equipment to be monitored, which needs to be queried by a user, of the state data. It should be noted that if there are a plurality of corresponding cameras in the set distance range, the terminal device is required to further determine which camera is used as the monitoring camera, and if there is only one camera in the set distance range, the camera is used as the monitoring camera.
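The comparison of the user's position against pre-stored camera positions can be sketched as a simple range query. The dictionary layout and nearest-first ordering are assumptions; the patent only requires finding cameras within the set distance range.

```python
import math

def cameras_in_range(user_pos, camera_positions, max_dist):
    """Return ids of cameras whose stored position lies within max_dist of
    the user's position, nearest first. camera_positions: {id: (x, y)}."""
    ux, uy = user_pos
    hits = []
    for cam_id, (cx, cy) in camera_positions.items():
        d = math.hypot(cx - ux, cy - uy)
        if d <= max_dist:
            hits.append((d, cam_id))
    return [cam_id for _, cam_id in sorted(hits)]
```

If the returned list has exactly one entry, that camera is the monitoring camera; if it has several, the terminal device is asked to choose, as the text above describes.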
Further, according to the determined monitoring camera, since the monitoring camera may correspond to one or more field devices for real-time monitoring, the corresponding device to be monitored needs to be further determined from the monitoring camera. It can be understood that if the cameras and the field devices are in a one-to-one correspondence relationship, after the monitoring cameras are determined, the devices to be monitored, which the user wants to query the status data, can be further determined. If the monitoring camera corresponds to a plurality of field devices for real-time monitoring, the user needs to further select and determine which of the corresponding field devices is to be monitored.
Specifically, the real-time instance of the application extracts the video data of the monitoring camera, sends the video data to the terminal device, and the terminal device displays the real-time video data of the monitoring camera. And selecting one field device from the field devices displayed in the real-time video data as the device to be monitored by the user. When the video data are displayed on the user mobile terminal or the background monitoring terminal, corresponding equipment in the video data can be selected through touch operation of a user, and the corresponding equipment to be monitored can be determined through responding to the selection of the user. When video data is displayed on VR glasses, the device selected by the user may be determined by the user's eye focus direction. It should be noted that, in the embodiment of the present application, the monitoring view angle of each camera is fixed, and the position of each field device in the correspondingly acquired video data is also relatively fixed. When a user performs touch operation or eye focusing on the terminal device, the corresponding field device at the position is determined according to the corresponding position on the video data corresponding to the operation, and then the device to be monitored is determined.
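Because each camera's viewing angle is fixed, each field device occupies a fixed region of that camera's video frame, so mapping a touch point or gaze point to a device reduces to a hit test against pre-calibrated regions. The rectangular-region representation below is an assumption for illustration.

```python
def device_at_point(point, device_regions):
    """Map a touch/gaze point on the fixed camera image to a field device.
    device_regions: {device_id: (x_min, y_min, x_max, y_max)}, calibrated
    once per camera since the viewing angle never changes."""
    px, py = point
    for device_id, (x0, y0, x1, y1) in device_regions.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return device_id
    return None   # the point hit no device region
```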
In addition, referring to fig. 3, another determination method for a device to be monitored is provided, where the process of determining the device to be monitored includes:
s113, determining a corresponding monitoring camera in a set distance range according to the position information;
s114, sending the equipment information corresponding to the monitoring camera to the terminal equipment;
and S115, determining the equipment to be monitored according to the selection information returned by the terminal equipment.
With respect to the above steps S111 to S112, the embodiment of the present application provides another determination method for a device to be monitored. As shown in fig. 3, according to the determined monitoring camera, the device to be monitored is determined through the association relationship between the camera and the device. And setting and storing corresponding association relations between the cameras and the corresponding field devices according to one or more field devices correspondingly monitored by the camera monitoring pictures. And subsequently, after the monitoring camera is determined, extracting the equipment information of each equipment under the monitoring picture of the monitoring camera according to the pre-stored association relation, and sending the equipment information to the terminal equipment for the user to select. And displaying the equipment information on a human-computer interaction interface of the terminal equipment in a mode of an option window. And determining which option is selected by the user according to the touch operation or eye focusing of the user, feeding back the selection information of the user to the real-time monitoring equipment based on mixed reality, and selecting the equipment to be monitored, which the user wants to inquire the state data, according to the selection information.
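The pre-stored camera-to-device association and the subsequent selection round-trip can be sketched as two small lookups. The data shapes (`associations` mapping camera ids to device-id lists, `device_info` records with an `"id"` field) are assumptions for illustration.

```python
def devices_for_camera(camera_id, associations, device_info):
    """Look up the field devices covered by a camera and return their info
    records, ready to display in the terminal's option window."""
    device_ids = associations.get(camera_id, [])
    return [device_info[d] for d in device_ids if d in device_info]

def resolve_selection(options, selected_index):
    """Map the selection index returned by the terminal back to a device id;
    None for an out-of-range (invalid) selection."""
    if 0 <= selected_index < len(options):
        return options[selected_index]["id"]
    return None
```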
In addition, during inspection, the device to be monitored can be determined directly by video analysis, combining the current user's position information with computer vision. Video analysis can detect which device the user is currently approaching: the camera corresponding to the user's position is determined, its video data is extracted and analyzed, and the device whose status data the user is about to query is thereby detected.
For example, embodiments of the present application are provided in which VR glasses are used to determine a monitoring camera and a device to be monitored. Referring to fig. 4, a scene schematic diagram of the rail transit system for polling related devices (gate machines) is provided, when the devices are polled, a user wears VR glasses, position information is provided through the VR glasses, and the device information and real-time state data are checked from a display interface of the VR glasses. Wherein, referring to fig. 5, the flow that the monitoring camera was selected to VR glasses includes:
s1111, feeding back a corresponding three-dimensional monitoring scene according to the position information, and feeding back information of a plurality of cameras within a set distance range to VR glasses of a current user;
s1112, a first focusing direction of the current user is obtained through VR glasses, and a corresponding monitoring camera is determined from a plurality of cameras according to the camera corresponding to the first focusing direction.
Illustratively, referring to fig. 6, according to the position information, on one hand, the three-dimensional monitoring scene at the corresponding position is fed back to VR glasses for display. And the VR glasses can also display the three-dimensional monitoring scene corresponding to the current position according to the self positioning information. The three-dimensional monitoring scene is constructed in advance corresponding to the monitoring scene of each device, and is stored in the real-time monitoring device based on mixed reality or the terminal device (such as VR glasses or mobile phones) of the user according to actual needs. On the other hand, a plurality of cameras are determined through VR glasses, the cameras are displayed on a display interface of the VR glasses in an option window mode, and the camera selected by the user is determined to be used as the camera to be monitored through the focusing direction of the eyes of the user. When a user uses VR glasses to select a monitoring camera, the focus point of the eyes of the user is tracked in real time according to the information of the eyes and the iris of the user, the coordinate position of the current focus point of the eyeballs of the user on the VR glasses is determined through an eyeball tracking function, an option window corresponding to the coordinate position is further determined, and then which camera is used as the monitoring camera is determined. In addition, a plurality of cameras determined according to the position information can also be displayed on the display interface of the VR glasses in a preview window mode. When displayed in the preview window, only the identification information (e.g., the number "video 1") related to the camera is displayed, and the image screen of the camera is not displayed. 
Because the number of cameras so determined may be large, and given the limited display interface of the VR glasses, it may not be possible to display every camera as an option window containing its video image. In addition, when preview windows are used, they can be spread across the display interface of the VR glasses as much as possible, so that the camera selected by the user can be determined more accurately when the user's eyes focus on a selection.
In addition, since the plurality of cameras within the set range determined according to the position information are generally adjacent to one another, the video pictures or a three-dimensional model containing these cameras can be fed back to the VR glasses worn by the user in order to optimize the user experience. Similarly, when a video picture or three-dimensional model is displayed on the VR glasses, the eye-tracking function determines the coordinate position on the VR glasses at which the user's eyeballs are currently focused, and the camera corresponding to that coordinate position is determined as the monitoring camera selected by the user.
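For illustration only, the gaze-to-window mapping described above can be sketched as a simple hit test: the eye tracker yields a coordinate on the VR display, and the option window containing that coordinate identifies the selected camera. All window names, sizes and coordinates below are illustrative assumptions, not values from the disclosure.

```python
# Sketch: resolve a tracked gaze point to the camera option window it falls in.
from dataclasses import dataclass


@dataclass
class OptionWindow:
    camera_id: str  # identification label shown to the user, e.g. "video 1"
    x: int          # top-left corner on the VR display, in pixels
    y: int
    width: int
    height: int

    def contains(self, gx: int, gy: int) -> bool:
        """True if the gaze coordinate (gx, gy) lies inside this window."""
        return (self.x <= gx < self.x + self.width
                and self.y <= gy < self.y + self.height)


def select_camera(windows, gaze_x, gaze_y):
    """Return the camera id of the window the user's gaze falls in, or None."""
    for w in windows:
        if w.contains(gaze_x, gaze_y):
            return w.camera_id
    return None


# Windows deliberately spread across the display, as the text suggests,
# so that gaze jitter is less likely to pick a neighbouring window.
windows = [
    OptionWindow("video 1", 100, 100, 200, 150),
    OptionWindow("video 2", 900, 100, 200, 150),
    OptionWindow("video 3", 500, 600, 200, 150),
]
print(select_camera(windows, 950, 180))  # gaze lands inside "video 2"
```

Spreading the windows apart, as the paragraph above recommends, directly enlarges the gaze distance between any two candidate targets, which is why the hit test can stay this simple.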
Further, after the VR glasses determine the monitoring camera, if the monitoring camera corresponds to a single device, that device is directly determined to be the device to be monitored; if the monitoring camera corresponds to a plurality of devices, the real-time monitoring device based on mixed reality further interacts with the VR glasses to determine the device to be monitored. Referring to fig. 7, the process of selecting the device to be monitored through the VR glasses includes:
S1121, extracting real-time video data of the monitoring camera, embedding the real-time video data into the three-dimensional monitoring scene, and sending the result to the VR glasses of the current user for display;
S1122, a second focusing direction of the current user is obtained through the VR glasses, and the device to be monitored is determined according to the position corresponding to the second focusing direction. The term "second focusing direction" describes the current eye-focusing direction of the user, to distinguish it from the first focusing direction used above when selecting the monitoring camera.
Specifically, the real-time video data of the determined monitoring camera is extracted and sent to the user's VR glasses. Referring to fig. 8, a schematic diagram of the device-selection interface of the VR glasses is provided. Because the viewing angle of the monitoring camera is fixed, the positions of the devices in the video data are essentially fixed within the video picture. These positions are determined in advance, so the device selected by the user can be determined from the focus position of the user's eyes. When the user selects a device with the VR glasses, the video data of the monitoring camera is displayed on the VR glasses, and the focus point of the user's eyes is tracked in real time from eye and iris information. The eye-tracking function determines the coordinate position on the VR glasses at which the user's eyeballs are currently focused, the device in the video picture corresponding to that coordinate position is identified, and that device is taken as the device to be monitored. When the user needs to switch cameras, the user focuses on the screen position corresponding to the "return" option; based on this focus position, the VR glasses exit the current display interface and return to the camera-selection interface shown in fig. 6, where the user can focus again to reselect a camera as the monitoring camera.
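Because the camera viewpoint is fixed, the second gaze selection reduces to looking up which predefined screen region the gaze coordinate falls in, where each region is either a device or the "return" option. The sketch below illustrates that lookup; every region, device name and coordinate is an illustrative assumption rather than data from the disclosure.

```python
# Sketch: resolve a gaze coordinate on the displayed video picture to either
# a device (becoming the device to be monitored) or the "return" option.

# region -> (x, y, width, height) on the displayed video picture (assumed)
DEVICE_REGIONS = {
    "escalator ES-1": (50, 200, 300, 250),
    "gate machine G-4": (420, 220, 180, 230),
}
RETURN_REGION = (1100, 20, 120, 60)  # assumed position of the "return" option


def _inside(region, gx, gy):
    x, y, w, h = region
    return x <= gx < x + w and y <= gy < y + h


def resolve_gaze(gx, gy):
    """Map a gaze point to ('device', name), ('return', None) or (None, None)."""
    if _inside(RETURN_REGION, gx, gy):
        return ("return", None)      # exit to the camera-selection interface
    for name, region in DEVICE_REGIONS.items():
        if _inside(region, gx, gy):
            return ("device", name)  # this device becomes the device to monitor
    return (None, None)              # gaze is on background; no selection made
```

The "return" region is tested first so that it always wins even if a device region were drawn to overlap it, which matches the interface behaviour described above.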
In addition, when the device to be monitored is selected, the device information of each device monitored by the selected monitoring camera can be sent to the VR glasses and displayed in the form of option windows for the user to choose from, and the device to be monitored is finally determined according to the focus position of the user's eyes.
In the embodiment of the present application, the nearest camera and the nearest device may also be selected directly as the monitoring camera and the device to be monitored according to the position information; there are multiple ways of determining the monitoring camera and the device to be monitored, the embodiment of the present application does not fix or limit them, and repeated description is omitted. It should be noted that when the real-time video data is sent to the VR glasses for display, the video data is embedded into the three-dimensional monitoring scene, that is, superimposed on the current three-dimensional monitoring scene, specifically at the position of the corresponding monitoring camera, which optimizes the user's viewing experience. In the embodiment of the present application, the monitoring camera and the device to be monitored are selected on the three-dimensional monitoring scene displayed by the terminal device, which further optimizes the user experience.
S120, extracting the equipment information and the real-time state data of the equipment to be monitored, and marking the equipment information and the real-time state data in the video data and the graphical interface corresponding to the equipment to be monitored, wherein the graphical interface comprises a three-dimensional model corresponding to the equipment to be monitored and an equipment state display picture.
According to the device to be monitored determined in step S110, the background server extracts the real-time status data and device information of the corresponding device and feeds them back to the terminal device for display. The real-time status data varies with the device and may include real-time operating parameters such as voltage, current, temperature and humidity. The device information is basic identification information such as the name and number of the device. The device information and real-time status data are marked on the graphical interface corresponding to the device to be monitored, so that the user can visually check them. The video data is the video picture corresponding to the device to be monitored; the graphical interface includes the three-dimensional model corresponding to the device to be monitored and a device status display picture. When annotating the device information and real-time status data, the embodiment of the present application can annotate the video picture and the three-dimensional model of the device to be monitored together, and also provides a device status display picture for the device to be monitored. According to actual needs, the real-time status data to be annotated may be only the important part of the status data of the device to be monitored, while detailed device information covering all of its status data can be shown on the device status display picture.
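The split described above — a small set of key parameters overlaid on the video and three-dimensional model, with the full parameter set reserved for the device status display picture — can be sketched with a simple record structure. All field names, device names and values below are illustrative assumptions.

```python
# Sketch: a device record holding identification info plus real-time operating
# parameters, where only parameters flagged "key" are overlaid on the video
# picture / 3-D model, and the full set feeds the device status display picture.

device = {
    "info": {"name": "Platform screen door", "number": "PSD-03"},
    "status": {
        "voltage_V":     {"value": 221.7, "key": True},
        "current_A":     {"value": 3.4,   "key": True},
        "temperature_C": {"value": 41.2,  "key": True},
        "humidity_pct":  {"value": 55.0,  "key": False},
    },
}


def overlay_fields(dev):
    """The 'key' subset annotated on the video picture and 3-D model."""
    return {k: v["value"] for k, v in dev["status"].items() if v["key"]}


def status_screen_fields(dev):
    """Every parameter, shown in full on the device status display picture."""
    return {k: v["value"] for k, v in dev["status"].items()}
```

Keeping the key/full distinction in the record itself (rather than in the renderer) means the same backend extraction step can serve both the overlay and the status picture, which matches the single-extraction flow of step S120.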
Referring to fig. 9, the video data annotation process includes:
S1201, determining the position of the corresponding equipment to be monitored in the video data and the graphical interface;
S1202, marking the equipment information and the real-time state data corresponding to the equipment to be monitored in the video picture of the corresponding video data and the three-dimensional model of the graphical interface.
It should be noted that, when the device information and status data are annotated, the video picture and three-dimensional model being annotated may actually contain the device to be monitored together with the nearby background, and this background may include other devices near the device to be monitored. In order to display the device information and status data accurately while keeping the other devices as background reference objects, the specific position of the device to be monitored in the three-dimensional model and the video picture is determined at annotation time, and the device information and status data are annotated at the corresponding positions. The video picture may be the picture shot by the monitoring camera corresponding to the device to be monitored. The three-dimensional model may likewise be a three-dimensional model under the viewing angle of the corresponding monitoring camera, or a three-dimensional model constructed from the main viewing angle of the device to be monitored. When annotating the device information and real-time status data, the display picture corresponding to the three-dimensional model and the video data is extracted and edited, and the corresponding device information and real-time status data are inserted at the corresponding position of the display picture (that is, the position of the device to be monitored), which completes the annotation of the three-dimensional model and the video picture. It should also be noted that different colors may be used for different real-time status data of the device to be monitored during annotation, so that the user can intuitively distinguish its different real-time status conditions.
For example, data in a normal state is marked with white, data in a critical state is marked with yellow, and data in an alarm state is marked with red.
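The white/yellow/red convention in the example above amounts to classifying each reading against per-parameter thresholds and picking the matching annotation colour. The sketch below assumes rising thresholds and (B, G, R) colour tuples as commonly used for video overlays; the threshold values themselves are illustrative.

```python
# Sketch: classify a real-time reading and pick its annotation colour
# (white = normal, yellow = critical, red = alarm, per the example above).

COLOURS = {
    "normal":   (255, 255, 255),  # white
    "critical": (0, 255, 255),    # yellow in BGR
    "alarm":    (0, 0, 255),      # red in BGR
}


def classify(value, warn_at, alarm_at):
    """Classify a reading against per-parameter thresholds (assumed rising)."""
    if value >= alarm_at:
        return "alarm"
    if value >= warn_at:
        return "critical"
    return "normal"


def label_colour(value, warn_at, alarm_at):
    """Colour to draw this reading's annotation with."""
    return COLOURS[classify(value, warn_at, alarm_at)]


print(classify(41.2, warn_at=60.0, alarm_at=80.0))  # -> normal
```

Parameters with falling danger directions (e.g. low voltage) would need their own comparison direction; the disclosure leaves the thresholding scheme open, so this is one plausible instantiation.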
S130, encoding the video data and the graphical interface and outputting the encoded video data and graphical interface to the terminal equipment for display.
Finally, according to the three-dimensional model and video picture annotated with the device information and real-time status data in step S120, together with the device status display picture corresponding to the device to be monitored, the graphical interface corresponding to the three-dimensional model and device status display picture and the video data corresponding to the video picture are sent to the terminal device for display, completing the query of the device's real-time status data.
For example, in the operating scenario of an urban rail transit equipment monitoring system, the device information and real-time status data of each device in the system can be queried through the terminal device according to position information, which optimizes the monitoring or inspection of the various kinds of urban rail transit equipment by operation and maintenance personnel. When devices are monitored remotely from a background monitoring terminal, the position information of the device to be monitored can be entered on the human-computer interaction interface of the background monitoring terminal and uploaded to the real-time monitoring device based on mixed reality; after determining the device to be monitored and annotating the device information and real-time status data, the real-time monitoring device finally outputs the corresponding three-dimensional model, device status display picture and video picture to the background monitoring terminal, where the user can check the device information and real-time status data of the corresponding device. On the other hand, inspection may instead be performed with a device such as a mobile terminal or VR glasses.
The user carries the terminal device to a position near the device to be monitored; the position information is determined by terminal-device positioning, active input of position information, camera video analysis, or similar means; the device to be monitored is determined according to the position information; and after the device information and real-time status data are annotated, the corresponding three-dimensional model, device status display picture and video data are finally output to the user's mobile terminal or VR glasses. This achieves real-time, synchronized querying of device information and real-time status data and further streamlines the data-query flow of device inspection. Further, some equipment, such as high-voltage or high-temperature equipment, carries a certain risk, so close-up field inspection of such equipment is not appropriate. Instead, the position information of the corresponding equipment can be uploaded through the terminal device, and the video data and graphical interface of the corresponding equipment, annotated with device information and real-time status data, are finally obtained based on that position information. The inspection of high-risk equipment can thus be completed through the video data and graphical interface, ensuring the safety of equipment inspection.
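The end-to-end inspection flow just described follows the S110→S120→S130 steps. A high-level sketch of one patrol query is given below; every function body is a stand-in, and all names and values are illustrative assumptions rather than the disclosed implementation.

```python
# Sketch: one pass through the S110 (determine) -> S120 (annotate) ->
# S130 (encode and output) flow for a single patrol query.

def determine_device(position):
    # S110: pick the monitoring camera / device within range of the position
    return {"device_id": "PSD-03", "camera_id": "video 2"}


def fetch_status(device_id):
    # S120 (part 1): pull device info and real-time state from the backend
    return {"name": device_id, "temperature_C": 41.2}


def annotate(frame, status):
    # S120 (part 2): overlay info/state at the device's position in the frame
    return {"frame": frame, "overlay": status}


def encode_and_send(annotated, gui):
    # S130: encode video + graphical interface and push to the terminal device
    return {"video": annotated, "gui": gui}


def monitor(position):
    target = determine_device(position)
    status = fetch_status(target["device_id"])
    annotated = annotate(frame="raw-frame", status=status)
    return encode_and_send(
        annotated, gui={"model": target["device_id"], "status": status}
    )
```

The point of the sketch is the data flow: position information is the only input the terminal must supply, and everything the user sees (annotated video plus graphical interface) is derived server-side from it.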
The method obtains position information, determines the corresponding equipment to be monitored based on the position information, extracts the equipment information and real-time state data of the equipment to be monitored, marks them in the corresponding video data and graphical interface, and then encodes the video data and graphical interface and outputs them to the terminal equipment for display. With these technical means, during centralized monitoring or inspection of equipment, the corresponding equipment can be selected according to position information to check its state data, which improves the convenience of centralized equipment monitoring or inspection. In addition, by marking the equipment state data in the video data and graphical interface of the corresponding equipment, the embodiment of the present application makes it convenient to check equipment state data directly through the video data and graphical interface, avoiding the danger of checking equipment state data on site during inspection and further improving the safety of equipment inspection.
Example two:
On the basis of the foregoing embodiment, fig. 10 is a schematic structural diagram of a real-time monitoring device based on mixed reality according to a second embodiment of the present application. Referring to fig. 10, the real-time monitoring device based on mixed reality provided in this embodiment specifically includes: an acquisition module 21, a labeling module 22 and an output module 23.
The acquiring module 21 is configured to acquire position information of a current user or position information input by the user, and determine a corresponding device to be monitored based on the position information;
the marking module 22 is configured to extract device information and real-time status data of the device to be monitored, and mark the device information and the real-time status data in video data and a graphical interface corresponding to the device to be monitored, where the graphical interface includes a three-dimensional model corresponding to the device to be monitored and a device status display picture;
the output module 23 is configured to encode the video data and the graphical interface and output the encoded video data and the graphical interface to a terminal device for displaying.
The device obtains position information, determines the corresponding equipment to be monitored based on the position information, extracts the equipment information and real-time state data of the equipment to be monitored, marks them in the corresponding video data and graphical interface, and then encodes the video data and graphical interface and outputs them to the terminal equipment for display. With these technical means, during centralized monitoring or inspection of equipment, the corresponding equipment can be selected according to position information to check its state data, which improves the convenience of centralized equipment monitoring or inspection. In addition, by marking the equipment state data in the video data and graphical interface of the corresponding equipment, the embodiment of the present application makes it convenient to check equipment state data directly through the video data and graphical interface, avoiding the danger of checking equipment state data on site during inspection and further improving the safety of equipment inspection.
Specifically, the obtaining module 21 includes:
the first determining unit is used for determining a corresponding monitoring camera in a set distance range according to the position information;
and the extraction unit is used for extracting the real-time video data of the monitoring camera and determining the corresponding equipment to be monitored from the video data.
Specifically, the obtaining module 21 further includes:
the second determining unit is used for determining a corresponding monitoring camera in a set distance range according to the position information;
the sending unit is used for sending the equipment information corresponding to the monitoring camera to the terminal equipment;
and the return unit is used for determining the equipment to be monitored according to the selection information returned by the terminal equipment.
Specifically, the labeling module 22 includes:
the third determining unit is used for determining the position of the corresponding equipment to be monitored in the video data and the graphical interface;
and the marking unit is used for marking the equipment information and the real-time state data corresponding to the equipment to be monitored in the video picture of the corresponding video data and the three-dimensional model of the graphical interface.
The real-time monitoring device based on mixed reality provided by the second embodiment of the application can be used for executing the real-time monitoring method based on mixed reality provided by the first embodiment of the application, and has corresponding functions and beneficial effects.
Example three:
an embodiment of the present application provides an electronic device, and with reference to fig. 11, the electronic device includes: a processor 31, a memory 32, a communication module 33, an input device 34, and an output device 35. The number of processors in the electronic device may be one or more, and the number of memories in the electronic device may be one or more. The processor, memory, communication module, input device, and output device of the electronic device may be connected by a bus or other means.
The memory 32 is a computer readable storage medium, and can be used for storing software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the real-time monitoring method based on mixed reality according to any embodiment of the present application (for example, an obtaining module, an annotating module, and an outputting module in the real-time monitoring device based on mixed reality). The memory can mainly comprise a program storage area and a data storage area, wherein the program storage area can store an operating system and an application program required by at least one function; the storage data area may store data created according to use of the device, and the like. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory may further include memory located remotely from the processor, and these remote memories may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The communication module 33 is used for data transmission.
The processor 31 executes various functional applications and data processing of the device by running software programs, instructions and modules stored in the memory, so as to implement the above-mentioned real-time monitoring method based on mixed reality.
The input device 34 may be used to receive entered numeric or character information and to generate key signal inputs relating to user settings and function controls of the apparatus. The output device 35 may include a display device such as a display screen.
The electronic device provided by the embodiment can be used for executing the real-time monitoring method based on mixed reality provided by the embodiment one, and has corresponding functions and beneficial effects.
Example four:
embodiments of the present application further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a mixed reality-based real-time monitoring method, where the mixed reality-based real-time monitoring method includes: acquiring position information, and determining corresponding equipment to be monitored based on the position information, wherein the position information comprises the position information of a current user or the position information input by the current user; extracting the equipment information and the real-time state data of the equipment to be monitored, and marking the equipment information and the real-time state data in the video data and the graphical interface corresponding to the equipment to be monitored, wherein the graphical interface comprises a three-dimensional model corresponding to the equipment to be monitored and an equipment state display picture; and encoding the video data and the graphical interface and outputting the encoded video data and the graphical interface to terminal equipment for displaying.
Storage medium — any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory, magnetic media (e.g., a hard disk), or optical storage; registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in a first computer system in which the program is executed, or may be located in a different second computer system connected to the first computer system through a network (such as the internet). The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media residing in different locations, e.g., in different computer systems connected by a network. The storage medium may store program instructions (e.g., embodied as a computer program) that are executable by one or more processors.
Of course, the storage medium provided in the embodiments of the present application contains computer-executable instructions, and the computer-executable instructions are not limited to the real-time monitoring method based on mixed reality as described above, and may also perform related operations in the real-time monitoring method based on mixed reality as provided in any embodiment of the present application.
The real-time monitoring device based on mixed reality, the storage medium, and the electronic device provided in the above embodiments may execute the real-time monitoring method based on mixed reality provided in any embodiment of the present application; for technical details not described in detail in the above embodiments, reference may be made to the real-time monitoring method based on mixed reality provided in any embodiment of the present application.
The foregoing is considered as illustrative of the preferred embodiments of the invention and the technical principles employed. The present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the claims.

Claims (10)

1. A real-time monitoring method based on mixed reality is characterized by comprising the following steps:
acquiring position information, and determining corresponding equipment to be monitored based on the position information, wherein the position information comprises the position information of a current user or the position information input by the current user;
extracting the equipment information and the real-time state data of the equipment to be monitored, and marking the equipment information and the real-time state data in the video data and the graphical interface corresponding to the equipment to be monitored, wherein the graphical interface comprises a three-dimensional model corresponding to the equipment to be monitored and an equipment state display picture;
and encoding the video data and the graphical interface and outputting the encoded video data and the graphical interface to terminal equipment for displaying.
2. The mixed reality-based real-time monitoring method according to claim 1, wherein the determining the corresponding device to be monitored based on the location information comprises:
determining a corresponding monitoring camera within a set distance range according to the position information;
and extracting real-time video data of the monitoring camera, and determining the corresponding equipment to be monitored from the video data.
3. The real-time monitoring method based on mixed reality according to claim 2, wherein the terminal device comprises a background monitoring terminal, a mobile terminal device or VR glasses.
4. The real-time monitoring method based on mixed reality according to claim 3, wherein the determining the corresponding monitoring camera within the set distance range according to the position information comprises:
feeding back a corresponding three-dimensional monitoring scene according to the position information, and feeding back a plurality of camera information within a set distance range to VR glasses of a current user;
the method comprises the steps of obtaining a first focusing direction of a current user through VR glasses, and determining a corresponding monitoring camera from a plurality of cameras according to a camera corresponding to the first focusing direction.
5. The real-time monitoring method based on mixed reality according to claim 3, wherein the extracting real-time video data of the monitoring camera and determining the corresponding device to be monitored from the video data comprises:
extracting real-time video data of the monitoring camera, embedding the real-time video data into a three-dimensional monitoring scene, and sending the real-time video data to VR glasses of a current user for displaying;
and acquiring a second focusing direction of the current user through VR glasses, and determining the equipment to be monitored according to a position corresponding to the second focusing direction.
6. The mixed reality-based real-time monitoring method according to claim 1, wherein the determining the corresponding device to be monitored based on the location information comprises:
determining a corresponding monitoring camera within a set distance range according to the position information;
sending the equipment information corresponding to the monitoring camera to the terminal equipment;
and determining the equipment to be monitored according to the selection information returned by the terminal equipment.
7. The real-time monitoring method based on mixed reality according to claim 1, wherein the marking of the device information and the real-time status data in the video data and the graphical interface corresponding to the device to be monitored, the graphical interface including a three-dimensional model corresponding to the device to be monitored and a device status display screen, comprises:
determining the position of the corresponding equipment to be monitored in the video data and the graphical interface;
and marking the equipment information and the real-time state data corresponding to the equipment to be monitored in the video picture of the corresponding video data and the three-dimensional model of the graphical interface.
8. A real-time monitoring device based on mixed reality, comprising:
the acquisition module is used for acquiring the position information of the current user or the position information input by the user and determining the corresponding equipment to be monitored based on the position information;
the marking module is used for extracting the equipment information and the real-time state data of the equipment to be monitored, and marking the equipment information and the real-time state data in the video data and the graphical interface corresponding to the equipment to be monitored, wherein the graphical interface comprises a three-dimensional model corresponding to the equipment to be monitored and an equipment state display picture;
and the output module is used for encoding the video data and the graphical interface and outputting the video data and the graphical interface to terminal equipment for displaying.
9. An electronic device, comprising:
a memory and one or more processors;
the memory for storing one or more programs;
when executed by the one or more processors, cause the one or more processors to implement the mixed reality based real-time monitoring method of any one of claims 1-7.
10. A storage medium containing computer-executable instructions for performing the mixed reality based real-time monitoring method of any one of claims 1-7 when executed by a computer processor.
CN201911321083.3A 2019-12-19 2019-12-19 Real-time monitoring method and device based on mixed reality Withdrawn CN111105506A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911321083.3A CN111105506A (en) 2019-12-19 2019-12-19 Real-time monitoring method and device based on mixed reality
PCT/CN2020/121658 WO2021120816A1 (en) 2019-12-19 2020-10-16 Real-time monitoring method and apparatus based on mixed reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911321083.3A CN111105506A (en) 2019-12-19 2019-12-19 Real-time monitoring method and device based on mixed reality

Publications (1)

Publication Number Publication Date
CN111105506A true CN111105506A (en) 2020-05-05

Family

ID=70423452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911321083.3A Withdrawn CN111105506A (en) 2019-12-19 2019-12-19 Real-time monitoring method and device based on mixed reality

Country Status (2)

Country Link
CN (1) CN111105506A (en)
WO (1) WO2021120816A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102930598B (en) * 2012-10-08 2016-04-13 山东康威通信技术股份有限公司 Three-dimensional model is used to locate and show the system and method for Tunnel testing equipment state
CN105572707A (en) * 2015-12-23 2016-05-11 北京奇虎科技有限公司 Geographical position monitoring method and equipment
CN106331643A (en) * 2016-09-05 2017-01-11 北京凯文盛业建筑技术有限公司 Display method and display device
CN111105506A (en) * 2019-12-19 2020-05-05 广州新科佳都科技有限公司 Real-time monitoring method and device based on mixed reality

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103048965A (en) * 2012-12-17 2013-04-17 江苏省电力公司 Visual centralized monitoring system for loading videos of dynamic machine rooms
CN104618694A (en) * 2015-03-02 2015-05-13 国家电网公司 Three-dimensional dynamic monitoring method and system of machine room
CN105959619A (en) * 2016-04-21 2016-09-21 深圳市盐田港同惠投资股份有限公司 Positioning monitoring system and positioning monitoring method
WO2018000619A1 (en) * 2016-06-29 2018-01-04 乐视控股(北京)有限公司 Data display method, device, electronic device and virtual reality device
CN106375718A (en) * 2016-09-05 2017-02-01 北京凯文盛业建筑技术有限公司 Display method and device
CN109348185A (en) * 2018-11-21 2019-02-15 广东领域集团有限公司 A kind of Traffic monitoring VR camera system and the method being monitored by virtual reality
CN109405895A (en) * 2018-12-29 2019-03-01 广州供电局有限公司 Cable tunnel monitoring management system
CN109872466A (en) * 2019-02-27 2019-06-11 淮海工学院 Deposit terminal management method and device
CN110543344A (en) * 2019-08-22 2019-12-06 上海晋泷科技有限公司 information display method and device in virtual scene, electronic device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yang Hui et al.: "Design and Implementation of a Substation Equipment Real-Time Monitoring *** Based on Virtual Reality Technology", 《电脑知识与技术》 (Computer Knowledge and Technology) *
Wei Yan: "Research and Design of VR-Based Intelligent Substation Inspection Scenarios", 《科技视界》 (Science & Technology Vision) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021120816A1 (en) * 2019-12-19 2021-06-24 广州新科佳都科技有限公司 Real-time monitoring method and apparatus based on mixed reality
CN111988573A (en) * 2020-08-27 2020-11-24 浙江中控技术股份有限公司 Industrial control information display method, device and system
CN112530045A (en) * 2020-11-26 2021-03-19 重庆电子工程职业学院 Intelligent inspection system of factory
CN112804502A (en) * 2021-03-10 2021-05-14 重庆第二师范学院 Video monitoring system, method, storage medium and device based on artificial intelligence
CN113284271A (en) * 2021-04-28 2021-08-20 中国工商银行股份有限公司 Machine room inspection method and device, electronic equipment and computer readable storage medium
CN113347392A (en) * 2021-05-31 2021-09-03 高新兴科技集团股份有限公司 AR-based power distribution room monitoring data visualization display method, system and medium
CN114143460A (en) * 2021-11-29 2022-03-04 青岛歌尔声学科技有限公司 Video display method and device and electronic equipment
CN116071336A (en) * 2023-02-14 2023-05-05 北京博维仕科技股份有限公司 Intelligent video analysis method and system
CN116071336B (en) * 2023-02-14 2023-08-11 北京博维仕科技股份有限公司 Intelligent video analysis method and system
CN116798142A (en) * 2023-06-25 2023-09-22 中路高科交通检测检验认证有限公司 Visual inspection method, system, equipment and storage medium for long bridge

Also Published As

Publication number Publication date
WO2021120816A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
CN111105506A (en) Real-time monitoring method and device based on mixed reality
KR102289745B1 (en) System and method for real-time monitoring field work
US9746913B2 (en) Secured mobile maintenance and operator system including wearable augmented reality interface, voice command interface, and visual recognition systems and related methods
US11867816B2 (en) Laser gas detector and laser gas detection system
CN109271881B (en) Safety management and control method and device for personnel in transformer substation and server
CN110047150B (en) Complex equipment operation on-site simulation system based on augmented reality
US20120019659A1 (en) Video surveillance system and method for configuring a video surveillance system
CN109117684A (en) System and method for the selective scanning in binocular augmented reality equipment
CN110579191A (en) target object inspection method, device and equipment
CN109638959B (en) Power equipment remote signaling function debugging method and system based on AR and deep learning
KR20200068075A (en) Remote guidance apparatus and method capable of handling hyper-motion step based on augmented reality and machine learning
CN113222184A (en) Equipment inspection system and method based on augmented reality AR
CN113066195A (en) Power equipment inspection method and device, AR glasses and storage medium
CN111770450B (en) Workshop production monitoring server, mobile terminal and application
US10719547B2 (en) Image retrieval assist device and image retrieval assist method
AU2020270461B2 (en) Situational Awareness Monitoring
JP2015073191A (en) Image processing system and control method therefor
CN103973738A (en) Method, device and system for locating personnel
CN114374819A (en) Substation personnel monitoring method, device, equipment, storage medium and program product
CN112418140A (en) Electric shock prevention alarm method and system for power distribution construction site
CN108170390B (en) Interaction method and system for switching display by remote control
CN111597940A (en) Method and device for evaluating rendering model, electronic equipment and readable storage medium
KR102221898B1 (en) Method for visualization in virtual object based on real object
KR101702452B1 (en) A method and a system for providing cctv image applied augmented reality
CN115223384B (en) Vehicle data display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40027900

Country of ref document: HK

WW01 Invention patent application withdrawn after publication

Application publication date: 20200505

RJ01 Rejection of invention patent application after publication

Application publication date: 20200505