CN114779679A - Augmented reality inspection system and method - Google Patents

Info

Publication number
CN114779679A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
point cloud
module
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210286995.7A
Other languages
Chinese (zh)
Inventor
刘瀚诚
陈圣泓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yingzhi Digital Technology Co ltd
Original Assignee
Beijing Yingzhi Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yingzhi Digital Technology Co ltd
Priority to CN202210286995.7A
Publication of CN114779679A
Current legal status: Pending

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 - Input/output
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C1/00 - Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
    • G07C1/20 - Checking timed patrols, e.g. of watchman
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/23 - Pc programming
    • G05B2219/23051 - Remote control, enter program remote, detachable programmer

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an augmented reality inspection system and method. After detecting that the unmanned aerial vehicle has moved, a positioning module collects the preliminary position of the unmanned aerial vehicle and the laser radar point cloud data, retrieves pre-collected point cloud data from a geographic information database, screens the pre-collected point cloud data using the preliminary position of the unmanned aerial vehicle, constructs a point cloud map of the site based on the laser radar point cloud data, and obtains the inspection position of the unmanned aerial vehicle by combining the point cloud map of the site with the screened pre-collected point cloud data. A video streaming module collects a real scene picture of the inspection position of the unmanned aerial vehicle. A rendering module acquires the geographic structure information and spatio-temporal data of the inspection position from the geographic information database, constructs a virtual environment, and renders the virtual environment onto the real scene picture to obtain an augmented reality view of the inspection position of the unmanned aerial vehicle. Visual alignment of the augmented reality space and the real space is thereby achieved.

Description

Augmented reality inspection system and method
Technical Field
The invention belongs to the technical field of on-site inspection, and particularly relates to an augmented reality inspection system and method.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
In inspection work, workers must carry mobile devices to a specific area and report the field conditions back to a central control system; combined with the position information acquired by a position sensor built into the mobile device, the relevant information can be collected into a spatio-temporal database.
However, this inspection process has several problems and limitations:
firstly, the target area may be dangerous and inaccessible to operators, so the field situation cannot be learned;
secondly, some solutions use a remotely controlled robot to enter the site, but when the physical environment is severely disturbed by explosion, fire and the like, spatial features become difficult to recognize, and both humans and robots remain prone to misjudging the real-time situation;
in addition, inspectors are constrained by their own physiology and by the spatial structure: their field of view is narrow, and peripheral conditions outside the field of view are difficult to grasp, which hinders better decision-making.
Disclosure of Invention
To overcome the deficiencies of the prior art, the invention provides an augmented reality inspection system that achieves visual alignment of the augmented reality space and the real space, so that a user can understand the detailed conditions of specific equipment, buildings and production lines.
To achieve the above object, one or more embodiments of the present invention provide the following technical solutions:
In one aspect, an augmented reality inspection system comprises an unmanned aerial vehicle, a field control end, a remote control end and a geographic information database; the unmanned aerial vehicle is provided with a video streaming module and a positioning module; the field control end and the remote control end are both provided with rendering modules;
the positioning module is connected with the geographic information database and is used for, after detecting that the unmanned aerial vehicle has moved, collecting the preliminary position of the unmanned aerial vehicle and the laser radar point cloud data, retrieving the pre-collected point cloud data from the geographic information database, screening the pre-collected point cloud data using the preliminary position of the unmanned aerial vehicle, constructing a point cloud map of the site based on the laser radar point cloud data, and obtaining the inspection position of the unmanned aerial vehicle by combining the point cloud map of the site with the screened pre-collected point cloud data;
the video streaming module is used for acquiring a real scene picture of the inspection position of the unmanned aerial vehicle;
the rendering module is connected with the video streaming module, the positioning module and the geographic information database and used for acquiring geographic structure information and time-space data of the inspection position where the unmanned aerial vehicle is located in the geographic information database, constructing a virtual environment, rendering the virtual environment onto the real scene picture and obtaining an augmented reality view of the inspection position where the unmanned aerial vehicle is located.
Furthermore, the field control end and the remote control end are also provided with UI control modules, and the UI control modules are connected with the rendering module and used for providing UI interfaces in the augmented reality view.
Furthermore, the unmanned aerial vehicle and the field control end are also provided with a flight control module;
the flight control module of the field control end is used for receiving a user control instruction and uploading the user control instruction to the flight control module of the unmanned aerial vehicle;
the flight control module of the unmanned aerial vehicle is used for controlling the unmanned aerial vehicle to move based on the user control instruction.
Further, the viewpoint position and the orientation of the virtual environment are determined according to the patrol inspection position of the unmanned aerial vehicle returned by the positioning module.
Further, the unmanned aerial vehicle, the field control end and the remote control end are each configured with a network communication module, which is used for communication among the unmanned aerial vehicle, the field control end, the remote control end and the geographic information database.
Further, the network communication module also provides voice and video call functions.
In another aspect, an augmented reality inspection method is disclosed, comprising the following steps:
starting the unmanned aerial vehicle;
after detecting that the unmanned aerial vehicle has moved, the positioning module collects the preliminary position of the unmanned aerial vehicle and the laser radar point cloud data, retrieves pre-collected point cloud data from a geographic information database, screens the pre-collected point cloud data using the preliminary position of the unmanned aerial vehicle, constructs a point cloud map of the site based on the laser radar point cloud data, and obtains the inspection position of the unmanned aerial vehicle by combining the point cloud map of the site with the screened pre-collected point cloud data;
a video streaming module collects a real scene picture of the patrol position of the unmanned aerial vehicle;
and the rendering module acquires the geographic structure information and the time-space data of the inspection position of the unmanned aerial vehicle in the geographic information database, constructs a virtual environment, renders the virtual environment onto the real scene picture, and obtains an augmented reality view of the inspection position of the unmanned aerial vehicle.
Further, the method also includes: a UI control module provides a UI interface in the augmented reality view.
Further, the method also comprises the following steps: a flight control module of the field control end receives a user control instruction and uploads the user control instruction to a flight control module of the unmanned aerial vehicle; the flight control module of the unmanned aerial vehicle controls the unmanned aerial vehicle to move based on the user control instruction.
Further, the viewpoint position and the orientation of the virtual environment are determined according to the patrol inspection position of the unmanned aerial vehicle returned by the positioning module.
The above one or more technical solutions have the following beneficial effects:
the rendering module in the technical scheme of the invention renders the real scene picture and the virtual environment collected by the video streaming module on the screen, thus realizing the visual alignment of the augmented reality space and the real space, providing users with the detailed conditions of specific equipment, buildings and production lines, and providing more perfect routing inspection and emergency functions.
In the technical scheme of the invention, a laser radar is added to the positioning module, and the point cloud data is combined with the preliminary position information to locate the unmanned aerial vehicle. This reduces the time complexity of the algorithm: the positioning result of a single frame can be computed within 50 ms, ensuring the real-time performance of the data.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it.
Fig. 1 is a general architecture diagram of an augmented reality inspection system according to a first embodiment of the present invention;
fig. 2 is a diagram of a module connection relationship of the augmented reality inspection system according to the first embodiment of the present invention;
fig. 3 is a module configuration diagram of an augmented reality inspection system according to a first embodiment of the present disclosure;
fig. 4 is a flowchart of an augmented reality inspection method according to an embodiment of the present invention.
Detailed Description
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
Example one
Referring to fig. 1, the embodiment discloses an augmented reality inspection system, including: unmanned aerial vehicle, on-the-spot control end, remote control end and geographic information database.
As shown in fig. 3, the drone is configured with a video streaming module, a positioning module and a flight control module. The field control end is configured with a UI control module, a rendering module and a flight control module, and is connected with the geographic information database. The remote control end is configured with a UI control module and a rendering module, and is connected with the geographic information database. The unmanned aerial vehicle, the field control end and the remote control end are each configured with a network communication module, which is used for communication among the unmanned aerial vehicle, the field control end, the remote control end and the geographic information database.
The unmanned aerial vehicle is used for cruising in a target area, and the configured video streaming module is used for acquiring a real scene picture of a patrol position where the unmanned aerial vehicle is located and transmitting the real scene picture to the field control end through the network communication module. And the field control end transmits the real scene picture back to the remote control end through the network communication module for comprehensive analysis by managers.
The field control end is located near the target area and used for controlling the flight and aerial photography of the unmanned aerial vehicle, executing various kinds of work of inspection, and transmitting data back to the remote control end for comprehensive analysis by managers.
The remote control end is located in an external environment far away from a target area and used for providing comprehensive on-site inspection condition monitoring service, providing analysis service for managers and communicating with the on-site control end.
As shown in fig. 2, the rendering module is connected to the video streaming module, the positioning module and the geographic information database through the network communication modules. Specifically, the rendering module of the field control end is connected to the video streaming module and the positioning module of the unmanned aerial vehicle through the network communication modules configured on the field control end and the unmanned aerial vehicle; the rendering module of the remote control end is connected to the video streaming module and the positioning module of the unmanned aerial vehicle through the network communication modules configured on the remote control end, the field control end and the unmanned aerial vehicle in turn. The UI control module is connected with the rendering module; specifically, the rendering module of the field control end is connected with the UI control module of the field control end, and the rendering module of the remote control end is connected with the UI control module of the remote control end. The positioning module is connected with the geographic information database through the network communication module configured on the unmanned aerial vehicle. The flight control module only sends and receives remote control commands through the network communication module and is not connected with the rendering module.
The geographic information database stores the geographic structure information, spatio-temporal data and pre-collected point cloud data of the whole target area. Specifically, the geographic information database stores the geographic structure information of the target area (including terrain and landform information, planar map information and the three-dimensional spatial structure of equipment/buildings) and the spatio-temporal data associated with it (such as production data and real-time safety alarm records), and forms the data read/write source of the rendering module and the UI control module. In addition, the laser point clouds recorded by surveying personnel when building the three-dimensional spatial structure map (i.e. the pre-collected point cloud data) are also stored in the geographic information database, to provide a reference for the precise positioning of the unmanned aerial vehicle.
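As an illustration of the kind of records such a geographic information database might hold, the following is a minimal sketch; the type and field names (GeoStructure, SpatioTemporalRecord, PreCollectedCloudTile) are hypothetical and are not the schema described in this patent.

```python
# Minimal sketch of plausible geographic-information-database records.
# All names below are illustrative assumptions, not the patent's schema.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class GeoStructure:
    """Geographic structure information for one piece of equipment or building."""
    object_id: str
    footprint_2d: List[Tuple[float, float]]   # planar map outline (lon, lat)
    mesh_path: str                            # three-dimensional spatial structure model
    terrain_tile: str                         # terrain / landform reference

@dataclass
class SpatioTemporalRecord:
    """Spatio-temporal data associated with a geographic structure."""
    object_id: str
    timestamp: float
    production_data: Dict[str, float]         # e.g. throughput, temperature
    safety_alarms: List[str]                  # real-time safety alarm records

@dataclass
class PreCollectedCloudTile:
    """Pre-collected laser point cloud, stored per tile so it can be screened."""
    tile_id: str
    center: Tuple[float, float]               # tile center (lon, lat)
    points: List[Tuple[float, float, float]]  # points recorded during surveying
```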
The flight control module is used for controlling the flight of the unmanned aerial vehicle. Specifically, the flight control module of the field control end receives a user control instruction and uploads it to the flight control module of the unmanned aerial vehicle; the flight control module of the unmanned aerial vehicle controls the flight of the unmanned aerial vehicle based on the user control instruction. The flight control module also forwards the user control instruction to the positioning module and the video streaming module, which use it to detect whether the unmanned aerial vehicle has moved.
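A minimal sketch of this command relay is given below, assuming a simple JSON-over-socket transport; the message fields, addresses and module interfaces are illustrative only and are not specified by the patent.

```python
import json
import socket
import time

def relay_user_command(cmd: dict, drone_addr=("192.0.2.10", 6000)) -> None:
    """Field-control-end flight control module: forward a user control
    instruction to the flight control module on the unmanned aerial vehicle
    (transport and address are assumptions for illustration)."""
    msg = {"type": "user_control", "time": time.time(), "command": cmd}
    with socket.create_connection(drone_addr, timeout=2.0) as s:
        s.sendall(json.dumps(msg).encode("utf-8"))

def on_drone_command(msg: dict, flight_controller, positioning, video_stream) -> None:
    """Drone-side flight control module: execute the motion, then forward the
    instruction so the positioning and video streaming modules can detect
    that the unmanned aerial vehicle has moved."""
    flight_controller.apply(msg["command"])    # control the UAV's movement
    positioning.notify_motion(msg["command"])  # used to detect movement
    video_stream.notify_motion(msg["command"])
```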
The positioning module is connected with the geographic information database through the network communication module and is used for: collecting the preliminary position of the unmanned aerial vehicle and the laser radar point cloud data after detecting that the unmanned aerial vehicle has moved; constructing a point cloud map of the site based on the laser radar point cloud data; retrieving the pre-collected point cloud data from the geographic information database and screening it using the preliminary position of the unmanned aerial vehicle; and combining the point cloud map of the site with the screened pre-collected point cloud data, using a global positioning algorithm (the DELIGHT algorithm), to obtain the corrected position of the unmanned aerial vehicle, i.e. its accurate inspection position. The positioning module computes the inspection position of the unmanned aerial vehicle in real time, including its spatial coordinates (longitude, latitude and altitude), real-time attitude (rotation angle and pitch angle), flight speed and acceleration.
To realize the method, a key problem is obtaining an accurate inspection position of the unmanned aerial vehicle with centimeter-level precision, so that the virtual information in the augmented reality scene is precisely aligned with the real environment. The GPS data collected by the unmanned aerial vehicle (the preliminary position, which contains only spatial coordinates and no real-time attitude) has meter-level precision and cannot satisfy this need. To solve the problem, a laser radar (LiDAR) is added to the positioning module: LiDAR point cloud data is collected in real time to construct the point cloud map of the site, and the position of the unmanned aerial vehicle is corrected with the DELIGHT algorithm in combination with the screened pre-collected point cloud data from the geographic information database. The DELIGHT algorithm takes as input the screened pre-collected point cloud data and the point cloud map of the site built from the real-time LiDAR point cloud, and outputs the corrected coordinates, attitude, flight speed and acceleration; its complexity is positively correlated with the scale of the input data.
The pre-collected point cloud data covers the whole target area. An industrial production area is usually large, so this data set is also large, which affects the efficiency of the algorithm; the pre-collected point cloud data therefore needs to be screened. Since the preliminary position acquired by the GPS module, although not accurate enough, roughly describes the activity range of the unmanned aerial vehicle, it is used to screen the pre-collected point cloud data so that only the data of a local area participates in the DELIGHT computation, reducing the time complexity of the algorithm. The positioning result of a single frame can be computed within 50 ms, ensuring the real-time performance of the data.
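The screening step can be illustrated with a short sketch: the meter-level GPS fix is used only to select nearby pre-collected points, and the centimeter-level correction is then left to a registration step that stands in for the DELIGHT algorithm, which is not reproduced here. The function names, the 100 m screening radius and the use of numpy are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def screen_pre_collected_cloud(pre_cloud_xyz: np.ndarray,
                               gps_xyz: np.ndarray,
                               radius_m: float = 100.0) -> np.ndarray:
    """Keep only pre-collected points within radius_m of the meter-level GPS fix,
    so only a local patch of the survey cloud joins the global localization step.
    pre_cloud_xyz is an (N, 3) array in a local metric frame; the radius is an
    assumed tuning value."""
    dist = np.linalg.norm(pre_cloud_xyz - gps_xyz[None, :], axis=1)
    return pre_cloud_xyz[dist <= radius_m]

def correct_pose(site_map_xyz: np.ndarray,
                 screened_ref_xyz: np.ndarray,
                 initial_pose: np.ndarray) -> np.ndarray:
    """Placeholder for the global positioning step (the patent names the DELIGHT
    algorithm). Any descriptor- or ICP-style registration that aligns the site
    point cloud map with the screened reference cloud and returns a refined
    4x4 pose could be substituted here."""
    refined_pose = initial_pose.copy()          # stand-in: no real alignment
    return refined_pose

# Illustrative use: GPS gives a rough fix, screening shrinks the reference
# cloud, and registration then refines the pose.
gps_fix = np.array([120.0, 85.0, 40.0])
survey_cloud = np.random.rand(200_000, 3) * 500.0   # stand-in for the survey data
local_patch = screen_pre_collected_cloud(survey_cloud, gps_fix)
pose = correct_pose(np.random.rand(5_000, 3), local_patch, np.eye(4))
```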
The video streaming module is used for acquiring a real scene picture of the patrol position of the unmanned aerial vehicle. Specifically, the video streaming module is used for acquiring a high-definition real-time picture of a patrol inspection position of the unmanned aerial vehicle in the target area so as to construct a real scene in the system.
The rendering module is used for acquiring, based on the inspection position of the unmanned aerial vehicle, the geographic structure information and spatio-temporal data of that inspection position from the geographic information database, constructing a virtual environment, and rendering the virtual environment onto the real scene picture of the inspection position collected by the video streaming module, to obtain an augmented reality view of the inspection position of the unmanned aerial vehicle. The viewpoint position and orientation of the virtual environment are determined according to the inspection position of the unmanned aerial vehicle returned by the positioning module. By rendering the real scene picture collected by the video streaming module together with the virtual environment formed from the geographic information data and the production and safety service data on the screen, the rendering module achieves visual alignment of the augmented reality space and the real space, so that the user can understand the detailed conditions of specific equipment, buildings and production lines.
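One possible reading of how the viewpoint tracks the returned inspection position is sketched below: a world-to-camera view matrix is built from the unmanned aerial vehicle's position and attitude. The ENU frame, the yaw/pitch-only attitude and the helper name are assumptions for illustration (the patent lists a rotation angle and a pitch angle).

```python
import numpy as np

def view_matrix_from_pose(position_enu: np.ndarray,
                          yaw_rad: float,
                          pitch_rad: float) -> np.ndarray:
    """Build a 4x4 world-to-camera matrix so the virtual environment is rendered
    from the UAV's inspection position. Assumes ENU world coordinates and a
    yaw-then-pitch camera attitude; roll is ignored for simplicity."""
    cy, sy = np.cos(yaw_rad), np.sin(yaw_rad)
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    yaw = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    r_world_to_cam = (yaw @ pitch).T            # invert the camera-to-world rotation
    view = np.eye(4)
    view[:3, :3] = r_world_to_cam
    view[:3, 3] = -r_world_to_cam @ position_enu
    return view
```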
The UI control module provides a UI interface for the inspection personnel in the augmented reality view, on the basis of the rendering module, to support service functions such as equipment state inspection, data return, inspection plan tracking and emergency planning.
The network communication module is used for communication among the unmanned aerial vehicle, the field control end, the remote control end and the geographic information database. It collects the video data of the video streaming module and the inspection position computed by the positioning module, exchanges data with the geographic information database, and completes data reading and writing. The network communication module also provides voice and video call functions, so that the inspectors controlling the unmanned aerial vehicle on site and the remote dispatchers can communicate in real time.
At the hardware level, an unmanned aerial vehicle, a portable mobile device (mobile phone, tablet computer, notebook computer, etc.) and a workstation need to be prepared. The system is deployed as follows: install the software of the augmented reality inspection system of the invention on the portable mobile device (field control end) and the workstation (remote control end), and mount the laser radar of the positioning module on the unmanned aerial vehicle; start the software to pair the unmanned aerial vehicle with the field control end, whereupon the network communication module automatically connects the unmanned aerial vehicle, the remote control end and the geographic information database and carries out spatial position calibration, content rendering and other work; after the connection succeeds, the user commands the unmanned aerial vehicle to ascend through the field control end software and starts the camera carried by the unmanned aerial vehicle; at this point, system deployment is complete.
Example two
As shown in fig. 4, the embodiment discloses an augmented reality inspection method, which includes:
Step 1: after the unmanned aerial vehicle is started and lifts off, the positioning module, video streaming module, flight control module and network communication module are started to perform their functions. The flight control module of the field control end receives a user control instruction and uploads it to the flight control module of the unmanned aerial vehicle; the flight control module of the unmanned aerial vehicle controls the unmanned aerial vehicle to move based on the user control instruction.
Step 2: after detecting that the unmanned aerial vehicle has moved, the positioning module collects the preliminary position of the unmanned aerial vehicle and the laser radar point cloud data, retrieves the pre-collected point cloud data from the geographic information database, screens the pre-collected point cloud data using the preliminary position of the unmanned aerial vehicle, constructs a point cloud map of the site based on the laser radar point cloud data, and obtains the inspection position of the unmanned aerial vehicle by combining the point cloud map of the site with the screened pre-collected point cloud data. The inspection position of the unmanned aerial vehicle includes its real-time position (longitude, latitude and altitude), attitude information (rotation angle and pitch angle), flight speed and acceleration.
Step 3: the video streaming module acquires a real scene picture (a live video frame) of the inspection position of the unmanned aerial vehicle.
Step 4: the network communication module of the field control end connects to the network communication module on the unmanned aerial vehicle and reads the frame (i.e. the real scene picture of the inspection position of the unmanned aerial vehicle) and the real-time position of the unmanned aerial vehicle. To ensure the picture is clear and recognizable, the resolution of the returned video should be at least 2K. Meanwhile, the real scene picture of the inspection position is relayed to the remote control end through the network communication modules of the field control end and the remote control end.
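A minimal sketch of this read-and-check step is shown below, assuming an OpenCV-readable video stream and a hypothetical pose accessor standing in for the network communication module; interpreting "2K" as a frame at least 2048 pixels wide is likewise an assumption.

```python
import cv2  # OpenCV; the stream URL and pose accessor below are illustrative

def read_frame_and_pose(stream_url: str, pose_source):
    """Field-control-end read of one live frame plus the UAV's real-time
    position. pose_source.latest() is a hypothetical accessor for the
    position returned over the network communication module."""
    cap = cv2.VideoCapture(stream_url)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("no frame received from the UAV video stream")
    height, width = frame.shape[:2]
    if width < 2048:  # assumed reading of the "at least 2K" requirement
        raise RuntimeError(f"returned video is only {width}px wide; 2K or better expected")
    return frame, pose_source.latest()
```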
Step 5: the rendering module retrieves the geographic structure information and spatio-temporal data of the inspection position of the unmanned aerial vehicle from the geographic information database, constructs a virtual environment, and renders the virtual environment onto the real scene picture to obtain an augmented reality view of the inspection position of the unmanned aerial vehicle. The UI control module provides a UI interface in the augmented reality view to support equipment state inspection, data return, inspection plan tracking and emergency planning. The rendering module displays two layers at the field control end and the remote control end, stacked in the same view area to obtain the augmented reality view: the lower layer is the real scene layer (i.e. the real scene picture), showing the real-time picture from the camera of the unmanned aerial vehicle; the upper layer is the augmented reality layer (i.e. the virtual environment), showing augmented reality data, and the viewpoint position and orientation of the augmented reality layer are determined according to the position information and attitude information returned by the unmanned aerial vehicle. Once the viewpoint position and orientation are determined, the rendering module can display important information within the visible area of the picture; for example, the road planning of the production area can be shown in the picture, with important equipment along the roads annotated. This information is stored in the geographic information database, and can also be obtained from a third-party data system through the network communication module according to specific service requirements.
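One way to picture the two stacked layers is simple alpha compositing of the rendered augmented reality layer over the live video frame, as in the sketch below; treating both layers as same-sized image arrays with a per-pixel alpha channel on the upper layer is an illustrative choice, not a detail given by the patent.

```python
import numpy as np

def composite_layers(real_scene: np.ndarray, ar_layer_with_alpha: np.ndarray) -> np.ndarray:
    """Overlay the augmented reality layer (upper, 4-channel with alpha) on the
    real scene layer (lower, 3-channel live UAV frame) within the same view area.
    Both arrays are assumed to share height, width and channel order."""
    alpha = ar_layer_with_alpha[..., 3:4].astype(np.float32) / 255.0
    ar_rgb = ar_layer_with_alpha[..., :3].astype(np.float32)
    base = real_scene.astype(np.float32)
    blended = alpha * ar_rgb + (1.0 - alpha) * base
    return blended.astype(np.uint8)
```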
Step 6: at the field control end, the user controls the unmanned aerial vehicle to move and adjust its attitude through the flight control module in order to inspect important or abnormal areas. While the unmanned aerial vehicle moves, steps 2-5 are executed periodically and repeatedly to keep the data and the picture synchronized.
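The periodic repetition of steps 2-5 could be organized as a simple loop like the one sketched below; the module interfaces (positioning, video_stream, renderer, ui) are hypothetical, and the 50 ms period only echoes the single-frame positioning time mentioned earlier.

```python
import time

def inspection_loop(positioning, video_stream, renderer, ui, period_s: float = 0.05):
    """Repeat steps 2-5 periodically while the UAV is moving so the data and the
    picture stay synchronized. All module objects are illustrative stand-ins."""
    while positioning.uav_is_moving():
        pose = positioning.update()             # step 2: corrected inspection position
        frame = video_stream.read_frame()       # steps 3-4: live frame read over the network
        ar_view = renderer.render(frame, pose)  # step 5: augmented reality view for this pose
        ui.show(ar_view)
        time.sleep(period_s)
```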
Step 7: through the field control end and the remote control end, the user can execute related inspection functions such as data retrieval, uploading, alarming and route navigation, and can communicate instantly between the field control end and the remote control end through the network communication module for collaborative work.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they are not intended to limit the scope of the present invention; it should be understood by those skilled in the art that various modifications and variations can be made, without inventive effort, on the basis of the technical solution of the present invention.

Claims (10)

1. An augmented reality inspection system is characterized by comprising an unmanned aerial vehicle, a field control end, a remote control end and a geographic information database; the unmanned aerial vehicle is provided with a video streaming module and a positioning module; the field control end and the remote control end are both provided with rendering modules;
the positioning module is connected with the geographic information database and is used for, after detecting that the unmanned aerial vehicle has moved, collecting the preliminary position of the unmanned aerial vehicle and the laser radar point cloud data, retrieving the pre-collected point cloud data from the geographic information database, screening the pre-collected point cloud data using the preliminary position of the unmanned aerial vehicle, constructing a point cloud map of the site based on the laser radar point cloud data, and obtaining the inspection position of the unmanned aerial vehicle by combining the point cloud map of the site with the screened pre-collected point cloud data;
the video streaming module is used for acquiring a real scene picture of the patrol position of the unmanned aerial vehicle;
the rendering module is connected with the video streaming module, the positioning module and the geographic information database and used for acquiring geographic structure information and time-space data of the inspection position where the unmanned aerial vehicle is located in the geographic information database, constructing a virtual environment, rendering the virtual environment onto the real scene picture and obtaining an augmented reality view of the inspection position where the unmanned aerial vehicle is located.
2. The augmented reality inspection system according to claim 1, wherein the field control end and the remote control end are further configured with UI control modules, and the UI control modules are connected with the rendering module and used for providing a UI interface in the augmented reality view.
3. The augmented reality inspection system according to claim 1, wherein the unmanned aerial vehicle and the field control end are further configured with flight control modules;
the flight control module of the field control end is used for receiving a user control instruction and uploading the user control instruction to the flight control module of the unmanned aerial vehicle;
the flight control module of the unmanned aerial vehicle is used for controlling the unmanned aerial vehicle to move based on the user control instruction.
4. The augmented reality inspection system according to claim 1, wherein the viewpoint position and orientation of the virtual environment are determined according to the inspection position of the unmanned aerial vehicle returned by the positioning module.
5. The augmented reality inspection system according to claim 1, wherein the unmanned aerial vehicle, the field control end and the remote control end are each configured with a network communication module for mutual communication among the unmanned aerial vehicle, the field control end, the remote control end and the geographic information database.
6. The augmented reality inspection system according to claim 5, wherein the network communication module further provides voice and video call functionality.
7. An augmented reality inspection method is characterized by comprising the following steps:
starting the unmanned aerial vehicle;
after detecting that the unmanned aerial vehicle moves, the positioning module collects the preliminary position of the unmanned aerial vehicle and the point cloud data of the laser radar, acquires pre-collected point cloud data in a geographic information database, screens the pre-collected point cloud data by adopting the preliminary position of the unmanned aerial vehicle, constructs a point cloud map of a site based on the point cloud data of the laser radar, and obtains a routing inspection position of the unmanned aerial vehicle by combining the point cloud map of the site and the screened pre-collected point cloud data;
a video streaming module collects a real scene picture of the patrol position of the unmanned aerial vehicle;
the rendering module acquires the geographic structure information and the time-space data of the patrol position of the unmanned aerial vehicle in the geographic information database, constructs a virtual environment, renders the virtual environment onto the real scene picture, and obtains an augmented reality view of the patrol position of the unmanned aerial vehicle.
8. The augmented reality inspection method according to claim 7, further comprising: the UI control module provides a UI interface in the augmented reality view.
9. The augmented reality inspection method according to claim 7, further comprising: a flight control module of the field control end receives a user control instruction and uploads the user control instruction to a flight control module of the unmanned aerial vehicle; the flight control module of the unmanned aerial vehicle controls the unmanned aerial vehicle to move based on the user control instruction.
10. The augmented reality inspection method according to claim 7, wherein the viewpoint position and the orientation of the virtual environment are determined according to the inspection position of the unmanned aerial vehicle returned by the positioning module.
CN202210286995.7A 2022-03-23 2022-03-23 Augmented reality inspection system and method Pending CN114779679A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210286995.7A CN114779679A (en) 2022-03-23 2022-03-23 Augmented reality inspection system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210286995.7A CN114779679A (en) 2022-03-23 2022-03-23 Augmented reality inspection system and method

Publications (1)

Publication Number Publication Date
CN114779679A 2022-07-22

Family

ID=82424966

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210286995.7A Pending CN114779679A (en) 2022-03-23 2022-03-23 Augmented reality inspection system and method

Country Status (1)

Country Link
CN (1) CN114779679A (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101845796B1 (en) * 2017-11-16 2018-04-05 사단법인 한국선급 A virtual reality-based management method combining drone inspection information
CN108171817A (en) * 2018-01-10 2018-06-15 上海市地下空间设计研究总院有限公司 Method for inspecting based on MR or AR, MR or AR equipment and cruising inspection system
CN108200415A (en) * 2018-03-16 2018-06-22 广州成至智能机器科技有限公司 Unmanned plane image frame processing system and its method based on augmented reality
CN109872401A (en) * 2019-02-18 2019-06-11 中国铁路设计集团有限公司 A kind of UAV Video augmented reality implementation method
CN210090988U (en) * 2019-04-11 2020-02-18 株洲时代电子技术有限公司 Unmanned aerial vehicle system of patrolling and examining
CN112014857A (en) * 2020-08-31 2020-12-01 上海宇航***工程研究所 Three-dimensional laser radar positioning and navigation method for intelligent inspection and inspection robot
CN112366821A (en) * 2020-10-29 2021-02-12 江苏易索电子科技股份有限公司 Three-dimensional video intelligent inspection system and inspection method
CN112767391A (en) * 2021-02-25 2021-05-07 国网福建省电力有限公司 Power grid line part defect positioning method fusing three-dimensional point cloud and two-dimensional image
CN112947512A (en) * 2021-01-27 2021-06-11 昭通亮风台信息科技有限公司 AR-based unmanned aerial vehicle power grid line patrol method and system
CN113014824A (en) * 2021-05-11 2021-06-22 北京远度互联科技有限公司 Video picture processing method and device and electronic equipment
CN113222184A (en) * 2021-01-29 2021-08-06 国网辽宁省电力有限公司大连供电公司 Equipment inspection system and method based on augmented reality AR
CN113452962A (en) * 2021-06-22 2021-09-28 北京邮电大学 Data center enhanced inspection system and method with space collaborative perception
WO2021227359A1 (en) * 2020-05-14 2021-11-18 佳都新太科技股份有限公司 Unmanned aerial vehicle-based projection method and apparatus, device, and storage medium
CN113778137A (en) * 2021-11-09 2021-12-10 北京数字绿土科技有限公司 Unmanned aerial vehicle autonomous inspection method for power transmission line
CN113934802A (en) * 2020-07-13 2022-01-14 中国石油化工股份有限公司 Memory, pipeline inspection video augmented reality display method, device and equipment
WO2022021739A1 (en) * 2020-07-30 2022-02-03 国网智能科技股份有限公司 Humanoid inspection operation method and system for semantic intelligent substation robot
CN114034304A (en) * 2021-11-16 2022-02-11 西安热工研究院有限公司 Wind power plant unmanned aerial vehicle inspection method, device, equipment and readable storage medium
CN114092537A (en) * 2021-09-23 2022-02-25 国电南瑞科技股份有限公司 Automatic inspection method and device for electric unmanned aerial vehicle of transformer substation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination