CN113452962A - Data center enhanced inspection system and method with space collaborative perception - Google Patents

Data center enhanced inspection system and method with space collaborative perception

Info

Publication number
CN113452962A
CN113452962A CN202110690427.9A CN202110690427A
Authority
CN
China
Prior art keywords
inspection
robot
augmented reality
data center
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110690427.9A
Other languages
Chinese (zh)
Other versions
CN113452962B (en)
Inventor
方维
胥小宇
杨大胜
商旭
王刚义
马爱民
李张文驰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Boyu Times Technology Co ltd
Beijing University of Posts and Telecommunications
Original Assignee
Beijing Boyu Times Technology Co ltd
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Boyu Times Technology Co ltd, Beijing University of Posts and Telecommunications filed Critical Beijing Boyu Times Technology Co ltd
Priority to CN202110690427.9A priority Critical patent/CN113452962B/en
Publication of CN113452962A publication Critical patent/CN113452962A/en
Application granted granted Critical
Publication of CN113452962B publication Critical patent/CN113452962B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/156Mixing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/70Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/18Self-organising networks, e.g. ad-hoc networks or sensor networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a data center enhanced inspection system with spatial collaborative perception and a method thereof, in which the inspection robot provides a carrier for intelligent inspection of the data center; a high-definition binocular camera is fixed on the inspection robot system, acquires a 3D video stream during inspection, and transmits it in real time over a wireless sensor network; the augmented reality device drives the virtual model corresponding to the data center equipment to roam based on the pose of the inspection robot in the field environment, and superimposes the virtual model and related information on the real image of the physical data center equipment, forming an augmented visualization of the inspection process. The invention displays the robot's motion trajectory, motion state and other information during inspection in visual form in the wearable smart glasses, prevents and avoids potential risks such as collisions and data center damage caused by robot positioning failure, and helps operation and maintenance managers keep track of the current inspection state.

Description

Data center enhanced inspection system and method with space collaborative perception
Technical Field
The invention discloses a data center enhanced inspection system and method with spatial collaborative perception, relates to the technical field of data inspection, and particularly relates to an intelligent data center robot inspection system and method integrating augmented reality technology.
Background
Whether the parameters and operation data of data center equipment are normal directly affects the normal operation of the whole power communication network, and is of great significance to national economic development and national security. Conventional inspection of power data rooms relies mainly on manual work: during inspection, workers judge equipment indicator lights, smoke, abnormal sounds and the like by eye and ear, which leads to low inspection efficiency and strongly subjective results, and makes it difficult to visually fuse the inspection results with the physical site. An intelligent, visual inspection method is therefore urgently needed to realize intelligent visualization of the data center inspection process.
As a new visualization mode, augmented reality computes the pose of an augmented reality system in the physical environment in real time and fuses virtual information (models, pictures, videos and the like) into the field environment in a geometrically consistent way, so that the user sees the real world augmented with virtual objects. This practically operable, intelligent visualization mode helps people further understand and operate the physical environment, and provides detailed guidance and a new means of interaction for completing complex industrial tasks (such as inspection and maintenance).
Although inspection robots and augmented reality both play important roles in intelligent data center operation and maintenance, most existing systems and methods treat the robot and the augmented reality glasses as two independent modules, for example simply transmitting detection results from the inspection process to the glasses for display. The two modules lack tightly coupled spatial collaborative perception and interaction, so the augmented reality glasses can hardly perceive the current motion state from the inspection robot's point of view, and the augmented visualization information produced at the glasses end is difficult to transmit to, and be understood by, the robot.
Therefore, a new enhanced inspection robot system and method are urgently needed to spatially and collaboratively fuse the data center inspection robot with augmented reality technology, realizing interactive perception and intelligent visual display of the robot's inspection process and the inspection site. This is of great significance for improving the automation and intelligent visualization level of existing data center inspection.
Disclosure of Invention
To address these problems, the invention provides a data center enhanced inspection system with spatial collaborative perception and a method thereof. They solve the problems that existing robots offer a low degree of visualization during inspection and lack spatial collaborative perception between the robot and the augmented reality system, raise the level of intelligent visualization of the inspection process, display the tasks to be inspected in visual form in the coordinate system of the inspection robot, help equipment managers find and solve problems during inspection in time, and ensure normal operation of the data center.
The invention is realized by the following technical scheme:
the utility model provides a data center reinforcing system of patrolling and examining with space perception in coordination, including patrolling and examining robot, augmented reality equipment, high definition binocular camera and wireless sensor network. The inspection robot can display the current direction and motion state of the robot in real time, and carries out analysis and abnormal alarm by collecting data such as abnormal sound, peculiar smell and the like in a field environment, and the inspection robot can be specifically divided into four parts including a motion navigation system, a detection system, a control system and a communication system, and provides a carrier for intelligent inspection of a data center; the high-definition binocular camera is fixed on the inspection robot system, is used for acquiring a 3D video stream in the inspection process and can transmit the 3D video stream in real time through a wireless sensor network; the augmented reality equipment is typically wearable intelligent glasses, a virtual model corresponding to the data center equipment in the inspection process can be driven to roam based on the pose of the inspection robot in the field environment, and information such as the virtual model is overlapped in a real image by combining a data center physical equipment image acquired by a high-definition binocular camera to form the augmented visualization of the inspection process.
For the intelligent visualization of augmented inspection, a three-dimensional data model of the data center equipment is constructed offline, the virtual three-dimensional model corresponding to the data center equipment is loaded and displayed in the augmented reality device, and the augmented reality device estimates its pose in space in real time to realize viewpoint roaming in the virtual scene.
The detection system of the intelligent inspection robot uses an industrial camera mounted on the robot body to acquire visual information such as indicator lights and instrument panels on the data center equipment, trains a convolutional neural network model to classify the colors of different indicator lights, effectively recognizes corresponding equipment data such as pointer gauge readings, and converts unstructured image data into corresponding structured semantic data along the time and space dimensions.
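As an illustration of this classification step, the following is a minimal sketch assuming a small PyTorch CNN and an assumed set of color classes (red/green/yellow/off); the patent does not prescribe a network architecture, so every layer size and class label here is illustrative only.

```python
# Minimal sketch (assumption): a small CNN that classifies cropped indicator-light
# patches by color, standing in for the convolutional classification step above.
import torch
import torch.nn as nn

class IndicatorLightNet(nn.Module):
    """Classifies a 32x32 RGB crop of an indicator light into color classes."""
    def __init__(self, num_classes: int = 4):  # e.g. red / green / yellow / off (assumed classes)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

if __name__ == "__main__":
    model = IndicatorLightNet()
    crop = torch.rand(1, 3, 32, 32)      # a cropped indicator-light patch
    print(model(crop).argmax(dim=1))     # predicted color class index
```

The predicted class, together with the time and place of observation, is what gets stored as structured semantic data.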
Preferably, the inspection robot and the augmented reality device in the invention are arranged mainly in two forms:
one form is that the augmented reality equipment and the inspection robot are fixedly connected together, and the mutual pose relationship isRobotTARAnd in the inspection process, the pose between the augmented reality equipment and the inspection robot is kept unchanged (as shown in figure 1).
The inspection robot localizes itself autonomously from data acquired in real time by its laser sensor against the 2D grid map it has built of the physical data center site, obtaining its pose ${}^{W}T_{Robot}$ in the site's physical coordinate system, from which the pose ${}^{W}T_{AR}$ of the augmented reality device during on-site inspection can be obtained:
${}^{W}T_{AR} = {}^{W}T_{Robot} \cdot {}^{Robot}T_{AR}$
In this way, the augmented inspection result and the virtual model of the data center can be superimposed in the field global coordinate system, so that the inspection robot can 'see' an inspection site with augmented visualization.
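For the rigidly mounted case this reduces to one matrix product per localization update. The sketch below assumes 4x4 homogeneous transforms in NumPy; the extrinsic values and function names are illustrative, not taken from the patent.

```python
# Minimal sketch (assumption): composing the fixed robot-to-glasses extrinsic with the
# robot's world pose to obtain the glasses' world pose, ^W T_AR = ^W T_Robot · ^Robot T_AR.
import numpy as np

def compose(T_a_b: np.ndarray, T_b_c: np.ndarray) -> np.ndarray:
    """Chain two 4x4 homogeneous transforms: ^a T_c = ^a T_b @ ^b T_c."""
    return T_a_b @ T_b_c

# ^Robot T_AR: calibrated once, constant while the glasses are rigidly mounted (values assumed).
T_robot_ar = np.eye(4)
T_robot_ar[:3, 3] = [0.10, 0.0, 0.45]    # e.g. glasses 10 cm ahead of, 45 cm above the robot origin

# ^W T_Robot: reported by the robot's laser-based localization at each time step.
T_w_robot = np.eye(4)                    # placeholder pose from the 2D grid-map localizer

T_w_ar = compose(T_w_robot, T_robot_ar)  # ^W T_AR used to drive the virtual-model roaming
```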
As shown in fig. 2, in the other form the augmented reality device and the inspection robot move freely and independently in three-dimensional space. In this case an a-priori artificial marker is attached to the inspection robot; image segmentation and feature-point extraction are performed on the image acquired by the camera of the augmented reality device, the pose of the current augmented reality device relative to the coordinate system of the inspection robot is computed, and the initial calibration ${}^{Robot}T_{AR_0}$ of the pose between the augmented reality device and the inspection robot is obtained; the coordinate system of the augmented reality device at that instant is defined as the field global coordinate system.
Then, the inspection robot localizes itself autonomously from data acquired in real time by its laser sensor against the 2D grid map it has built of the physical data center site, obtaining its pose ${}^{W}T_{Robot}$ in the site's physical coordinate system.
Meanwhile, thanks to its real-time pose estimation capability on the operating site, the augmented reality device obtains its pose ${}^{AR_0}T_{AR}$ in the initial calibration coordinate system, from which the pose relationship between the inspection robot and the augmented reality device under dynamic inspection conditions can be resolved:
${}^{W}T_{AR} = {}^{W}T_{Robot} \cdot {}^{Robot}T_{AR_0} \cdot {}^{AR_0}T_{AR}$
from this, the administrator accessible is worn augmented reality equipment to visual form observes and patrols and examines current state and the follow-up plan of patrolling and examining of robot. The real-time linkage of the augmented reality equipment and the inspection robot is realized, the real-time roaming of the virtual three-dimensional model of the data center equipment on the augmented reality equipment is driven, and a semantic detection map (shown in figure 2) in the inspection process is constructed by combining the structured detection information acquired by the inspection robot detection system in real time.
Furthermore, information such as the state of equipment indicator lights, dial readings, odors and abnormal sounds detected during inspection is attached to the corresponding cabinet's virtual data model according to the time and place at which it appears during inspection. In this way, a single inspection pass of the data center yields a structured map with spatial inspection semantics, overcoming limitations such as the low degree of visualization of the paper reports produced by existing robot inspection.
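One possible shape for such a time- and space-indexed record is sketched below; the field names and class layout are assumptions for illustration, since the patent specifies only which kinds of information (indicator state, dial value, odor, abnormal sound) are attached to each cabinet's virtual model.

```python
# Minimal sketch (assumption): a structured record that attaches inspection findings
# to a cabinet's virtual model, indexed by the time and place at which they were observed.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class InspectionFinding:
    timestamp: float                    # t from the robot pose (t, x, y, theta)
    position: Tuple[float, float]       # (x, y) on the data-center floor map
    indicator_state: str                # e.g. "green", "red" (assumed labels)
    dial_value: Optional[float] = None  # pointer-gauge reading, if any
    odor_alarm: bool = False
    abnormal_sound: bool = False

@dataclass
class CabinetSemanticNode:
    cabinet_id: str                               # key into the 3D virtual model
    findings: List[InspectionFinding] = field(default_factory=list)

# One inspection pass fills this map, yielding a structured, queryable record
# instead of a paper report.
semantic_map: Dict[str, CabinetSemanticNode] = {}
```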
Based on this structured semantic map, the augmented inspection system uses the physical scene information acquired in real time by the binocular high-definition camera, supported by the high-precision autonomous localization of the inspection robot during motion, to establish a collaborative localization system between the augmented reality device and the inspection robot during inspection, and superimposes virtual augmented information on the physical images acquired by the inspection robot, so that a person wearing the augmented reality device can intuitively understand the running state of the physical equipment in the inspected data center. In addition, when operation and maintenance staff enter the data center wearing the augmented reality device, potential risks found during inspection can be quickly identified and located based on the constructed structured semantic map.
By combining the previously built three-dimensional virtual model of the data center with the pose estimation of the augmented reality device, the method achieves an enhanced inspection effect under two working conditions:
firstly, on the inspection site, an operation and maintenance manager can fuse the virtual scene model with the real operation environment by wearing an augmented reality device (or by the augmented reality device fixedly connected to the robot), so that the inspection-enhanced robot system is realized, and the intelligent visualization degree of information in the inspection process is improved;
and secondly, outside the inspection site, the real-time data center site high-definition video stream sensed by the binocular camera system in real time can be transmitted to the server end through the wireless network communication module, real-time virtual-real fusion is realized at the far end by combining the constructed virtual data center roaming scene, and an enhanced inspection robot system which is personally on the scene is provided for backstage operation and maintenance managers.
In the localization and navigation of the enhanced inspection robot, information such as the robot's motion trajectory and motion state during inspection can be displayed in visual form in the wearable smart glasses based on the collaborative localization capability between the systems, so that potential risks such as collisions and data center damage caused by robot localization failure are prevented and avoided in advance, helping operation and maintenance managers keep track of the current inspection state.
Drawings
Fig. 1 is a schematic diagram illustrating a fixed connection between an augmented reality device and an inspection robot according to an embodiment of the present invention;
wherein, P is a three-dimensional scene point of the inspection scene, { A } is an augmented reality equipment coordinate system, { R } is a mobile robot coordinate system, and { W } is a mobile robot coordinate system.
Fig. 2 is a schematic diagram showing that the augmented reality device and the inspection robot of the embodiment of the invention can move independently;
wherein P is a three-dimensional scene point of the inspection scene, {A0} is the initial coordinate system of the augmented reality device, {A} is the coordinate system of the augmented reality device during motion, {R} is the mobile robot coordinate system, and {W} is the field global (world) coordinate system.
FIG. 3 is a diagram illustrating a data center enhanced inspection system with spatial collaborative awareness according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating inspection results with structured semantic information of an inspection process according to an embodiment of the present invention;
FIG. 5 is a block diagram illustrating the overall flow of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to examples, and the exemplary embodiments and descriptions thereof are only used for explaining the present invention and are not used as limitations of the present invention.
Referring to fig. 1, this embodiment provides a data center enhanced inspection system and method with spatial collaborative perception. The inspection robot displays the robot's current heading and motion state in real time, analyzes and raises abnormality alarms by collecting data such as abnormal sounds and odors in the field environment, can be divided into four parts, namely a motion navigation system, a detection system, a control system and a communication system, and provides a carrier for intelligent inspection of the data center. The high-definition binocular camera is fixed on the robot system, acquires a 3D video stream during inspection, and transmits it in real time over the wireless sensor network. The wearable smart glasses drive the virtual model corresponding to the data center equipment to roam based on the pose of the inspection robot in the field environment, and superimpose the virtual model and related information on the real image of the physical data center equipment acquired by the high-definition binocular camera, forming the augmented visualization of the inspection process. Specifically, the method comprises the following steps:
in one embodiment, based on the environment sensing and positioning navigation capacity of the laser radar on the inspection robot, the global pose information P of the inspection robot in the data center at the moment t can be obtainedRt=(t,x,y,θ)R. Wherein t is the corresponding time stamp in the inspection process, (x, y) is the site position of the robot data center,theta is the orientation of the inspection robot on the site, and the pose of the inspection robot in the site global coordinate system can be obtainedWTRobot
On this basis, the real-time pose ${}^{W}T_{Robot}$ of the inspection robot in the field global coordinate system is transmitted to the augmented reality glasses over the wireless network. Combined with the pose calibration result ${}^{Robot}T_{AR}$ between the coordinate systems of the augmented reality glasses and the inspection robot, the real-time pose ${}^{W}T_{AR}$ of the augmented reality glasses in the data center field environment is obtained. The enhanced-inspection collaborative localization in this embodiment mainly takes two forms:
the intelligent inspection robot can automatically position by utilizing data acquired by a laser sensor in real time according to a constructed 2D grid map of the intelligent inspection robot on a physical site of a data center so as to obtain the pose of the intelligent inspection robot under a site physical coordinate systemWTRobotAnd the pose of the augmented reality glasses in the field inspection process is acquiredWTAR
${}^{W}T_{AR} = {}^{W}T_{Robot} \cdot {}^{Robot}T_{AR}$
In this way, the inspection result and the corresponding 3D virtual model can be augmented, a structured virtual model with semantic information is constructed, and, using the collaborative localization result ${}^{W}T_{AR}$, the structured virtual model is superimposed in the field global coordinate system and displayed in real time in both the augmented reality glasses and the inspection robot system, so that the inspection robot can 'see' an inspection site with enhanced visualization (fig. 1).
In the other form, the augmented reality device and the inspection robot move freely in three-dimensional space. An a-priori artificial marker is attached to the inspection robot; image segmentation and feature-point extraction are performed on the image acquired by the camera of the augmented reality glasses, the pose of the current augmented reality glasses relative to the coordinate system of the inspection robot is computed, and the initial calibration ${}^{Robot}T_{AR_0}$ of the pose between the augmented reality device and the robot system is obtained; the coordinate system of the augmented reality glasses at that instant is defined as the field global coordinate system.
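The patent only states that the initial calibration ${}^{Robot}T_{AR_0}$ comes from segmenting the artificial marker and extracting feature points; one concrete way to realize this, sketched under that assumption, is a PnP solve on the four marker corners with OpenCV. The marker size, its mounting position on the robot, and the function names are illustrative, not prescribed by the patent.

```python
# Minimal sketch (assumption): recovering ^AR0 T_Robot from a square fiducial of known size
# mounted at a pre-measured position on the robot, then inverting it to get ^Robot T_AR0.
import cv2
import numpy as np

# Marker corner coordinates expressed in the robot frame (12 cm marker mounted on the
# robot shell -- all values assumed for illustration).
corners_robot = np.array([[-0.06,  0.06, 0.50],
                          [ 0.06,  0.06, 0.50],
                          [ 0.06, -0.06, 0.50],
                          [-0.06, -0.06, 0.50]], dtype=np.float32)

def initial_calibration(corners_2d: np.ndarray, K: np.ndarray, dist: np.ndarray) -> np.ndarray:
    """corners_2d: the four marker corners found in the glasses camera image (4x2 pixels).
    K, dist: intrinsics and distortion of the glasses camera.
    Returns ^AR0 T_Robot as a 4x4 matrix; invert it to obtain ^Robot T_AR0."""
    ok, rvec, tvec = cv2.solvePnP(corners_robot, corners_2d.astype(np.float32), K, dist)
    if not ok:
        raise RuntimeError("PnP failed -- marker not reliably detected")
    R, _ = cv2.Rodrigues(rvec)
    T_ar0_robot = np.eye(4)
    T_ar0_robot[:3, :3] = R
    T_ar0_robot[:3, 3] = tvec.ravel()
    return T_ar0_robot

# ^Robot T_AR0 = inverse(^AR0 T_Robot); the glasses frame at this instant is then taken
# as the field global coordinate system, matching the text above.
```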
Then, the inspection robot localizes itself autonomously from data acquired in real time by its laser sensor against the 2D grid map it has built of the physical data center site, obtaining its pose ${}^{W}T_{Robot}$ in the site's physical coordinate system.
Meanwhile, thanks to their real-time pose estimation capability on the operating site, the augmented reality glasses obtain their pose ${}^{AR_0}T_{AR}$ in the initial calibration coordinate system, from which the pose relationship between the inspection robot and the augmented reality glasses under dynamic inspection conditions is obtained:
${}^{W}T_{AR} = {}^{W}T_{Robot} \cdot {}^{Robot}T_{AR_0} \cdot {}^{AR_0}T_{AR}$
from this, the administrator accessible is worn augmented reality glasses to visual form observes and patrols and examines current state and the follow-up plan of patrolling and examining of robot. The real-time linkage of the augmented reality glasses and the inspection robot is realized, the real-time roaming of the virtual three-dimensional model of the data center equipment on the wearable intelligent glasses is driven, and a semantic detection map (shown in figure 2) in the inspection process is constructed by combining the structured detection information acquired by the detection system in real time.
The invention can realize the enhanced inspection effect under two working conditions:
firstly, on the inspection site, operation and maintenance managers can fuse the virtual scene model with the real operation environment based on augmented reality glasses, so that an inspection-enhanced robot system is realized, and the intelligent visualization degree of information in the inspection process is improved;
and secondly, outside the inspection site, the real-time data center site high-definition video stream sensed by the binocular camera system in real time can be transmitted to the server end through the wireless network communication module, real-time virtual-real fusion is realized at the far end by combining the constructed virtual data center roaming scene, and an enhanced inspection robot system which is personally on the scene is provided for backstage operation and maintenance managers.
In this embodiment, the detection system of the inspection robot uses an industrial camera on the robot body to acquire visual information such as indicator lights and instrument panels on the data center equipment, trains a convolutional neural network model to classify the colors of different indicator lights, effectively recognizes corresponding device data such as pointer gauge readings, and converts unstructured data such as images into corresponding structured semantic data along the time and space dimensions.
In this embodiment, an inspection result graph carrying the structured semantic information of the inspection process is designed and constructed on the basis of the 3D virtual model corresponding to the data center equipment, as shown in fig. 2. Specifically, information such as the state of equipment indicator lights, dial readings, odors and abnormal sounds detected during inspection is attached to the corresponding cabinet's virtual data model according to the time and place at which it appears during inspection. In this way, a single inspection pass of the data center yields a structured map with spatial inspection semantics, overcoming limitations such as the low degree of visualization of the paper reports produced by existing robot inspection.
In this embodiment, to improve the efficiency and accuracy with which subsequent operation and maintenance staff re-check inspection results, the enhanced inspection system, based on the acquired structured semantic map, uses physical scene information acquired in real time by the binocular high-definition camera, supported by the high-precision global localization of the robot during motion, to establish a collaborative localization system between the augmented reality system and the inspection robot during inspection, and attaches information such as the state of equipment indicator lights, dial readings, odors and abnormal sounds detected during inspection to the corresponding cabinet's virtual data model according to the time and place at which it appears. In this way, a single inspection pass of the data center yields a structured map with spatial inspection semantics (fig. 4), overcoming limitations such as the low degree of visualization of the paper reports produced by existing robot inspection, so that a person wearing the wearable smart glasses can intuitively understand the running state of the physical equipment in the inspected data center. In addition, when operation and maintenance staff enter the data center wearing the augmented reality glasses, potential risks found during inspection can be quickly identified and located based on the constructed structured semantic map.
In the localization and navigation of the enhanced inspection robot, information such as the robot's motion trajectory and motion state during inspection can be displayed in visual form in the wearable smart glasses based on the collaborative localization capability between the systems, so that potential risks such as collisions and data center damage caused by robot localization failure are prevented and avoided in advance, effectively improving the automation and intelligence level of the existing inspection process.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the protection scope of the present invention, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (7)

1. A data center enhanced inspection system with spatial collaborative perception, characterized in that: the system comprises an inspection robot, an augmented reality device, a high-definition binocular camera and a wireless sensor network; the inspection robot displays the robot's current heading and motion state in real time, analyzes and raises abnormality alarms by collecting abnormal-sound and odor data in the field environment, can be divided into four parts, namely a motion navigation system, a detection system, a control system and a communication system, and provides a carrier for intelligent inspection of the data center; the high-definition binocular camera is fixed on the inspection robot system, acquires a 3D video stream during inspection, and transmits it in real time over the wireless sensor network; the augmented reality device drives the virtual model corresponding to the data center equipment to roam based on the pose of the inspection robot in the field environment, and superimposes the virtual model and related information on the real image of the physical data center equipment acquired by the high-definition binocular camera, forming the intelligent visualization of augmented inspection;
the augmented inspection intelligent visualization system is characterized in that a three-dimensional data model of data center equipment is constructed in an off-line mode, a virtual three-dimensional model corresponding to the data center equipment is loaded and displayed in the augmented reality equipment by using the augmented reality equipment, and the model can estimate the real-time position and pose of the augmented reality equipment in space to realize the roaming of a viewpoint in a virtual scene;
the intelligent inspection robot detection system utilizes an industrial camera on a robot body to acquire visual information of an indicator light and an instrument panel on data center equipment, carries out classification training on colors of different indicator lights based on a convolutional neural network model, effectively identifies corresponding pointer and dial plate equipment data, and converts unstructured data such as images into corresponding structured semantic data under time and space dimensions.
2. The data center enhanced inspection system with spatial collaborative perception according to claim 1, wherein: one form of arrangement between the inspection robot and the augmented reality device is that the augmented reality device is rigidly fixed to the inspection robot, their mutual pose relationship is ${}^{Robot}T_{AR}$, and the pose between the augmented reality device and the inspection robot remains unchanged during inspection.
3. The data center enhanced inspection system with spatial collaborative perception according to claim 1, wherein: another form of arrangement between the inspection robot and the augmented reality device is that the augmented reality device and the inspection robot move freely relative to each other in three-dimensional space, and an a-priori artificial marker is attached to the inspection robot.
4. A data center enhanced inspection method based on the system of claims 1 to 3, wherein: when the augmented reality device is rigidly fixed to the inspection robot, their mutual pose relationship is ${}^{Robot}T_{AR}$, and the pose between the augmented reality device and the inspection robot remains unchanged during inspection; the method comprises the following steps:
the inspection robot carries out autonomous positioning by utilizing data acquired by the laser sensor in real time according to the constructed 2D grid map of the inspection robot on the physical site of the data center, and further obtains the pose of the inspection robot under the site physical coordinate systemWTRobotAnd then the pose of the augmented reality equipment in the field inspection process can be obtainedWTAR
${}^{W}T_{AR} = {}^{W}T_{Robot} \cdot {}^{Robot}T_{AR}$
the augmented inspection result obtained by the augmented reality device and the virtual model of the data center are superimposed in the field global coordinate system, so that the inspection robot can 'see' an inspection site with augmented visualization.
5. A data center enhanced inspection method based on the system of claims 1 to 3, wherein: when the augmented reality device and the inspection robot move freely in three-dimensional space, an a-priori artificial marker is attached to the inspection robot, image segmentation and feature-point extraction are performed on the image acquired by the camera of the augmented reality device, the pose of the current augmented reality device relative to the coordinate system of the inspection robot is computed, and the initial calibration ${}^{Robot}T_{AR_0}$ of the pose between the augmented reality device and the inspection robot is obtained; the coordinate system of the augmented reality device at that instant is defined as the field global coordinate system;
then, the inspection robot can perform autonomous positioning by using data acquired by the laser sensor in real time according to the constructed 2D grid map of the inspection robot on the physical site of the data center, and further obtain the pose of the inspection robot under the site physical coordinate systemWTRobot
meanwhile, thanks to its real-time pose estimation capability on the operating site, the augmented reality device obtains its pose ${}^{AR_0}T_{AR}$ in the initial calibration coordinate system, from which the pose relationship between the inspection robot and the augmented reality device under dynamic inspection conditions is obtained:
${}^{W}T_{AR} = {}^{W}T_{Robot} \cdot {}^{Robot}T_{AR_0} \cdot {}^{AR_0}T_{AR}$
therefore, managers can observe the current inspection state and the subsequent inspection plan of the inspection robot in a visual mode by wearing the augmented reality equipment;
the augmented reality equipment is linked with the inspection robot in real time, the real-time roaming of the virtual three-dimensional model of the data center equipment on the augmented reality equipment is driven, and a semantic detection map in the inspection process is constructed in combination with structured detection information acquired by the inspection robot detection system in real time.
6. The data center enhanced inspection method according to claim 4 or 5, characterized in that: the state of equipment indicator lights, dial readings, odors and abnormal-sound information detected during inspection are attached to the corresponding cabinet's virtual data model according to the time and place at which they appear during inspection; in this way, a single inspection pass of the data center yields a structured map with spatial inspection semantics.
7. The data center enhanced inspection method according to claim 6, wherein: based on the structured semantic map, the augmented inspection system uses physical scene information acquired in real time by the binocular high-definition camera, supported by the high-precision autonomous localization of the inspection robot during motion, to establish a collaborative localization system between the augmented reality device and the inspection robot during inspection, and superimposes virtual augmented information on the physical images acquired by the inspection robot, so that a person wearing the augmented reality device can intuitively understand the running state of the physical equipment in the inspected data center.
CN202110690427.9A 2021-06-22 2021-06-22 Data center enhanced inspection system and method with space collaborative perception Active CN113452962B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110690427.9A CN113452962B (en) 2021-06-22 2021-06-22 Data center enhanced inspection system and method with space collaborative perception


Publications (2)

Publication Number Publication Date
CN113452962A true CN113452962A (en) 2021-09-28
CN113452962B CN113452962B (en) 2022-08-05

Family

ID=77812045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110690427.9A Active CN113452962B (en) 2021-06-22 2021-06-22 Data center enhanced inspection system and method with space collaborative perception

Country Status (1)

Country Link
CN (1) CN113452962B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114779679A (en) * 2022-03-23 2022-07-22 北京英智数联科技有限公司 Augmented reality inspection system and method
CN115468560A (en) * 2022-11-03 2022-12-13 国网浙江省电力有限公司宁波供电公司 Quality inspection method, robot, device and medium based on multi-sensor information fusion


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017128934A1 (en) * 2016-01-29 2017-08-03 成都理想境界科技有限公司 Method, server, terminal and system for implementing augmented reality
CN108243247A (en) * 2018-01-03 2018-07-03 广州供电局有限公司 A kind of data center's intelligent robot inspection tour system
CN109341694A (en) * 2018-11-12 2019-02-15 哈尔滨理工大学 A kind of autonomous positioning air navigation aid of mobile sniffing robot
CN112395928A (en) * 2019-08-19 2021-02-23 珠海格力电器股份有限公司 Method for automatically detecting equipment state operation
CN110826549A (en) * 2019-11-04 2020-02-21 山东欧玛嘉宝电气科技有限公司 Inspection robot instrument image identification method and system based on computer vision
CN112366821A (en) * 2020-10-29 2021-02-12 江苏易索电子科技股份有限公司 Three-dimensional video intelligent inspection system and inspection method
CN112688438A (en) * 2020-12-24 2021-04-20 桂林电子科技大学 Intelligent system for recognizing and reading meters


Also Published As

Publication number Publication date
CN113452962B (en) 2022-08-05

Similar Documents

Publication Publication Date Title
CN113452962B (en) Data center enhanced inspection system and method with space collaborative perception
CN106340217B (en) Manufacturing equipment intelligence system and its implementation based on augmented reality
CN110047150B (en) Complex equipment operation on-site simulation system based on augmented reality
WO2022121911A1 (en) Virtual inspection system and visualized factory system in augmented reality environment
CN112650255A (en) Robot indoor and outdoor positioning navigation system method based on vision and laser radar information fusion
CN111813130A (en) Autonomous navigation obstacle avoidance system of intelligent patrol robot of power transmission and transformation station
CN108590664B (en) Multi-functional unattended intelligent tunnel digging change system based on trinocular vision identification technology
CN110490339A (en) A kind of auto repair auxiliary system and method based on augmented reality
CN109491383A (en) Multirobot positions and builds drawing system and method
CN113225212A (en) Data center monitoring system, method and server
CN112153267B (en) Human eye visual angle limitation space operation remote monitoring system based on AR intelligent glasses
CN106125092A (en) A kind of unmanned plane automatic obstacle-avoiding system and method based on two-dimensional laser radar
CN212515475U (en) Autonomous navigation obstacle avoidance system of intelligent patrol robot of power transmission and transformation station
CN110722559A (en) Auxiliary inspection positioning method for intelligent inspection robot
CN111702763B (en) Transformer substation inspection robot repositioning system and method based on Beidou system
CN112462723B (en) System for real-time control and visualization of digital factory under augmented reality environment
CN112669485B (en) Real scene immersion type patrol system for electric power operation site based on Internet of things
CN109746893A (en) Intelligence O&M robot, data center
CN111770450B (en) Workshop production monitoring server, mobile terminal and application
CN110696012B (en) Intelligent robot system for distribution room
CN113990034A (en) Transmission maintenance safety early warning method, system and terminal based on RTK positioning
CN102528811B (en) Mechanical arm positioning and obstacle avoiding system in Tokamak cavity
CN113627005B (en) Intelligent vision monitoring method
CN107644457A (en) A kind of monitoring method applied to industry spot
CN115793673B (en) VR technology-based natural gas station robot inspection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant