WO2021159603A1 - UAV indoor navigation method, apparatus, device, and storage medium - Google Patents

UAV indoor navigation method, apparatus, device, and storage medium

Info

Publication number
WO2021159603A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
model
information
point cloud
navigation
Prior art date
Application number
PCT/CN2020/085853
Other languages
English (en)
French (fr)
Inventor
冼志海
陈东亮
Original Assignee
深圳壹账通智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳壹账通智能科技有限公司 filed Critical 深圳壹账通智能科技有限公司
Publication of WO2021159603A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • This application relates to the field of artificial intelligence technology, and in particular to an indoor navigation method, apparatus, device, and storage medium for an unmanned aerial vehicle (UAV).
  • Currently, drones capable of autonomous flight rely on satellite signals for position information and then plan a route on an existing map based on that position.
  • However, a drone cannot receive GPS signals in sheltered environments such as indoor spaces and tunnels. Indoor positioning and navigation of UAVs is therefore usually achieved in one of the following ways:
  • Base-station positioning: this technique requires no front-end network deployment, and any ordinary device that can receive mobile-phone signals can be positioned by comparing the signals from multiple base stations, but positioning accuracy is difficult to guarantee.
  • Visual positioning: the system uses a camera to capture images of the surrounding environment and determines the location by comparing them with information entered in advance. This method cannot distinguish different partitions with the same layout (such as identically decorated rooms in a hotel), and because the environment is changeable, a drone in a relatively small space may be unable to avoid obstacles smoothly.
  • The main purpose of this application is to provide a UAV indoor navigation method, apparatus, device, and storage medium, aiming to solve the technical problem that current UAV indoor navigation is costly and inaccurate.
  • the indoor navigation method for a UAV includes the following steps:
  • The point cloud model is superimposed with a preset BIM model to obtain a superimposed model, a navigation route to the destination position is generated according to the superimposed model, and the drone is controlled to operate according to the navigation route.
  • the indoor navigation device for an unmanned aerial vehicle includes:
  • the request receiving module is used to obtain the initial position and destination position of the drone when a drone navigation request is received;
  • the information collection module is used to collect the operating state information of the UAV through the first collection device in the UAV, and to collect surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the UAV;
  • the model construction module is configured to construct a feature point cloud of the shooting object according to the operating state information and the surrounding environment information, and to process the feature point cloud to obtain a point cloud model of the shooting object;
  • the route generation module is used to superimpose the point cloud model with the preset BIM model to obtain a superimposed model, generate a navigation route to the destination position according to the superimposed model, and control the drone to operate according to the navigation route.
  • This application also provides a UAV indoor navigation device.
  • The UAV indoor navigation device includes a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the UAV indoor navigation method described above.
  • The UAV indoor navigation method includes at least the following steps: when a UAV navigation request is received, obtaining the initial position and the target position of the UAV;
  • collecting operating state information of the UAV through the first collection device in the UAV, and collecting surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the UAV;
  • constructing a feature point cloud of the shooting object according to the operating state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the shooting object; superimposing the point cloud model with a preset BIM model to obtain a superimposed model, generating a navigation route to the destination position according to the superimposed model, and controlling the UAV to operate according to the navigation route.
  • This application also provides a computer storage medium; the computer storage medium stores a computer program, and the computer program, when executed by a processor, implements the steps of the UAV indoor navigation method described above, wherein:
  • the UAV indoor navigation method includes at least the following steps: when a UAV navigation request is received, obtaining the initial position and the target position of the UAV;
  • collecting the operating state information of the UAV through the first collection device in the UAV, and collecting surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the UAV; constructing a feature point cloud of the shooting object according to the operating state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the shooting object; superimposing the point cloud model with the preset BIM model to obtain a superimposed model, generating a navigation route to the destination position according to the superimposed model, and controlling the UAV to operate according to the navigation route.
  • The embodiments of this application propose a UAV indoor navigation method, device, equipment, and storage medium.
  • When a terminal receives a UAV navigation request, it obtains the initial position and the target position of the UAV;
  • the first collection device in the UAV collects the operating state information of the UAV, and the second collection device in the UAV collects surrounding environment information at different heights and different shooting angles at the initial position; a feature point cloud of the subject is constructed according to the operating state information and the surrounding environment information, and the feature point cloud is processed to obtain a point cloud model of the subject; the point cloud model is superimposed with a preset BIM model to obtain a superimposed model, a navigation route to the destination position is generated according to the superimposed model, and the UAV is controlled to operate according to the navigation route.
  • The technical solution of this embodiment requires no additional hardware cost, and information on UAV navigation obstacles such as building components, doors, and windows can be accurately determined through the superimposed model, so that the UAV can better avoid obstacles and the accident rate is reduced.
  • the indoor automatic navigation of the UAV is realized, and the accuracy of the indoor navigation of the UAV is improved.
  • FIG. 1 is a schematic diagram of a device structure of a hardware operating environment involved in a solution of an embodiment of the present application
  • FIG. 2 is a schematic flowchart of the first embodiment of the UAV indoor navigation method of this application;
  • FIG. 3 is a schematic diagram of the functional modules of an embodiment of the UAV indoor navigation device of this application.
  • Figure 1 is a schematic structural diagram of the terminal of the hardware operating environment involved in the solution of the embodiments of this application (also called the UAV indoor navigation device; the UAV indoor navigation device may be a standalone structure or may be formed by combining other apparatuses with the UAV indoor navigation device).
  • The terminal in the embodiments of this application may be a fixed terminal or a mobile terminal, such as a smart air conditioner with a networking function, a smart light, a smart power supply, a smart speaker, an autonomous vehicle, a personal computer (PC), a smart phone, a tablet computer, an e-book reader, a portable computer, and the like.
  • the terminal may include a processor 1001, for example, a central processing unit (CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
  • the communication bus 1002 is used to implement connection and communication between these components.
  • The user interface 1003 may include a display (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface.
  • The network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a Wireless-Fidelity (WiFi) interface).
  • The memory 1005 may be a high-speed RAM memory or a stable non-volatile memory, for example, a magnetic disk memory.
  • the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
  • The terminal may also include a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, an input unit, a display screen, and a touch screen; besides WiFi, the network interface may optionally be Bluetooth, a probe, and so on.
  • Sensors include, for example, light sensors, motion sensors, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor; of course, the mobile terminal may also be equipped with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and so on, which will not be repeated here.
  • The terminal structure shown in FIG. 1 does not constitute a limitation on the terminal, which may include more or fewer components than shown in the figure, combine some components, or arrange the components differently.
  • The computer software product is stored in a storage medium (a storage medium is also called a computer storage medium, computer medium, readable medium, readable storage medium, computer-readable storage medium, or simply a medium; the storage medium can be a non-volatile readable storage medium such as a RAM, a magnetic disk, or an optical disk) and includes several instructions that cause a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the method of this application.
  • the memory 1005 as a computer storage medium may include an operating system, a network communication module, a user interface module, and a computer program.
  • the network interface 1004 is mainly used to connect to a back-end server and communicate with the back-end server;
  • the user interface 1003 is mainly used to connect to a client (user side) to communicate with the client;
  • the processor 1001 can be used to call a computer program stored in the memory 1005 and execute the steps in the UAV indoor navigation method provided in the following embodiments of the present application.
  • The point cloud model generated in real time can be used to automatically plan the UAV indoor navigation route so that the UAV can navigate. Specifically:
  • the indoor navigation method for an unmanned aerial vehicle includes:
  • Step S10: when the drone navigation request is received, the initial position and the target position of the drone are acquired.
  • the UAV indoor navigation method in this embodiment is applied to a terminal (also called UAV indoor navigation equipment).
  • the terminal communicates with the UAV, and the terminal can control the UAV.
  • The UAV is provided with preset collection devices, which are used to collect information about the surrounding environment of the drone and the operating state information of the drone (the operating state of the drone and the relative displacement information of the drone).
  • The number of inertial measurement devices, perception cameras, and depth cameras is not limited.
  • The drone sends the collected surrounding environment information and drone operating state information to the terminal, and the terminal processes the surrounding environment information and the drone operating state information to obtain the UAV's indoor navigation route and realize accurate navigation of the UAV.
  • The terminal receives the drone navigation request.
  • The triggering method of the drone navigation request is not specifically limited; that is, the drone navigation request can be triggered actively by the user, for example, the user clicks the corresponding drone navigation button on the terminal display interface to trigger the request; alternatively, the drone navigation request can be triggered automatically by the terminal.
  • For example, a trigger condition of the drone navigation request is preset in the terminal: trigger the drone navigation request in the early morning every day to collect floor information of building xxx; the terminal then automatically triggers the drone navigation request when that time arrives.
  • When the terminal receives the drone navigation request, it obtains the initial position and destination position of the drone.
  • The initial position and destination position of the drone can be set by the user; for example, the user enters on the terminal that the initial position of the drone is Room 1 on the 3rd floor of Building xxx, and the destination is Room 3 on the 2nd floor of Building xxx.
  • a specific implementation method for determining the initial position and the target position of the drone includes the following steps:
  • Step a1: when the drone navigation request is received, obtain the building identifier of the building where the drone is currently located and the preset BIM model associated with the building identifier;
  • Step a2: construct a three-dimensional coordinate system based on the preset BIM model, use the three-dimensional coordinates of the current position of the drone as the initial position of the drone, obtain the navigation destination corresponding to the drone navigation request, and use the three-dimensional coordinates of the navigation destination as the target position of the drone.
  • Specifically, when the terminal receives the drone navigation request, the terminal obtains the building identifier of the building where the drone is currently located (the building identifier refers to identification information that uniquely identifies the building, such as the name of the building or its location information) and the preset BIM model associated with the building identifier (the preset BIM model refers to a preset Building Information Modeling model associated with the building identifier; the BIM model contains building information). A three-dimensional coordinate system is constructed based on the preset BIM model; the terminal uses the three-dimensional coordinates of the current position of the drone as the initial position of the drone, obtains the navigation destination corresponding to the drone navigation request, and uses the three-dimensional coordinates of the navigation destination as the destination position of the drone.
  • a three-dimensional coordinate system is constructed based on the BIM model to accurately determine the initial position and target position of the drone, so as to realize accurate navigation of the drone.
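Steps a1-a2 above can be sketched as a simple lookup from a building identifier to its BIM model and from named locations to coordinates in the model's frame. This is only an illustrative sketch; the building identifiers, room names, and coordinate values below are hypothetical placeholders, not data from the application.

```python
# Preset BIM models keyed by building identifier; each maps a named location
# to (x, y, z) coordinates in the BIM model's coordinate system (metres).
# All identifiers and coordinates are invented for illustration.
BIM_MODELS = {
    "building-A": {
        "floor3-room1": (4.0, 12.5, 9.0),
        "floor2-room3": (10.0, 3.0, 6.0),
    }
}

def resolve_positions(building_id, current_location, destination):
    """Return (initial_position, target_position) as 3-D coordinates."""
    model = BIM_MODELS[building_id]    # step a1: fetch BIM model by building identifier
    initial = model[current_location]  # step a2: current position -> initial position
    target = model[destination]        # step a2: navigation destination -> target position
    return initial, target

initial, target = resolve_positions("building-A", "floor3-room1", "floor2-room3")
```

In a real system the current location would come from the drone's localization rather than a user-entered room name, but the mapping into a single BIM-based coordinate frame is the same idea.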
  • Step S20: collect the operating state information of the drone through the first collection device in the drone, and collect surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone.
  • Collection devices are preset in the drone and are divided into a first collection device and a second collection device according to their purpose.
  • The first collection device is used to collect the operating state information of the drone and includes at least one inertial measurement device and at least one perception camera;
  • the second collection device is used to collect information about the surrounding environment of the drone and includes at least one depth camera.
  • For example, the drone carries one inertial measurement device, one depth camera, and four environment-aware cameras. The inertial measurement device is responsible for sensing the direction information of the drone, the environment-aware cameras are responsible for obtaining the relative displacement of the drone, and the depth camera is responsible for sensing the depth image information of the drone's subject.
  • Step b1: adjust the height and shooting angle of the drone at the initial position, collect the direction information of the drone through the inertial measurement device, collect characteristic images through the perception camera, analyze the characteristic images to obtain the relative position information of the drone, and use the direction information and the relative position information as the operating state information of the drone;
  • Step b2: emit infrared pulses toward the subject through the depth camera, receive the infrared pulses reflected by the subject and the reflection time of the infrared pulses, process the reflection time to obtain the depth image information of the subject, and use the depth image information as the surrounding environment information.
  • Specifically, the terminal controls the drone to autonomously adjust its height and shooting angle near the initial position and perform multi-angle shooting to obtain characteristic images.
  • The terminal extracts feature points from the characteristic images and analyzes them to obtain the relative position information of the drone.
  • The direction information and relative position information are used as the operating state information of the UAV.
  • The terminal emits infrared pulses through the depth camera and obtains the depth image information, that is, the distance from the surface of the object to the camera, by calculating the reflection time.
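The time-of-flight relation behind step b2 is simple: the distance to the object surface is the measured round-trip reflection time multiplied by the speed of light and halved. A minimal sketch (function name and sample timing are illustrative, not from the application):

```python
# Time-of-flight depth: the depth camera emits an infrared pulse and measures
# the round-trip reflection time; distance = time * speed_of_light / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def depth_from_reflection_time(t_seconds):
    """Distance from the camera to the reflecting surface, in metres."""
    return C * t_seconds / 2.0

# A round-trip time of ~6.67 ns corresponds to roughly 1 metre.
d = depth_from_reflection_time(6.67e-9)
```

Applying this per pixel over the sensor yields the depth image information used as the surrounding environment information.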
  • Step S30: construct a feature point cloud of the shooting object according to the operating state information and the surrounding environment information, and process the feature point cloud to obtain a point cloud model of the shooting object.
  • The terminal converts the depth image information in the surrounding environment information into a three-dimensional feature point cloud and fuses the three-dimensional feature point cloud into a three-dimensional grid to obtain the point cloud model. That is, the terminal casts rays from the current position of the collection device to intersect the feature point cloud of the previous step, obtaining the point cloud under the current frame's perspective; the terminal then calculates its normal vectors to register the input depth image information of the next frame, and loops continuously to obtain feature point clouds under different perspectives.
  • The scene surface of the subject is thereby reconstructed to form the point cloud model.
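The first part of this step, turning a depth image into a 3-D point cloud, is conventionally done by back-projecting each pixel through a pinhole camera model. The sketch below assumes example intrinsics (`fx`, `fy`, `cx`, `cy`); these are not parameters given in the application.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project an HxW depth map (metres) into an (H*W, 3) point cloud.

    Uses the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# A flat surface 2 m in front of the camera becomes a plane of points at z = 2.
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

Successive frames' clouds are then registered and fused (as described above) to build the full point cloud model.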
  • step S30 includes:
  • Step b1: extract the direction information and relative position information from the operating state information, and iterate the direction information and the relative position information to obtain the attitude change value of the first collection device in the UAV;
  • Step b2: extract the depth image information from the surrounding environment information, and iterate the depth image information according to the attitude change value to obtain the feature point cloud of the object photographed by the drone;
  • Step b3: process the feature point cloud with a preset SLAM algorithm to obtain the point cloud model of the shooting object.
  • Specifically, the terminal extracts the direction information and relative position information from the operating state information and iterates them to obtain the attitude change value of the first collection device in the drone; the terminal extracts the depth image information from the surrounding environment information and iterates it according to the attitude change value to obtain the feature point cloud of the object photographed by the drone; the terminal then processes the feature point cloud with a preset SLAM (Simultaneous Localization and Mapping) algorithm to obtain the point cloud model of the subject.
  • The terminal determines the attitude change value of the drone to construct the feature point cloud of the shooting object, and obtains an approximate point cloud model of the shooting target with the SLAM algorithm.
  • The terminal recognizes the corresponding non-building-component obstacle information in real time according to the point cloud model.
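The core of estimating an attitude (pose) change between two frames' point clouds, as used inside ICP-style SLAM registration, can be sketched with the Kabsch/Procrustes solution. This is a generic illustration of the technique, not the application's specific algorithm, and it assumes point correspondences are already known.

```python
import numpy as np

def estimate_pose_change(src, dst):
    """Return rotation R and translation t such that dst ~= src @ R.T + t.

    Closed-form Kabsch solution for corresponding point sets (N x 3 each).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# A cloud rotated 90 degrees about z and shifted should be recovered exactly.
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([1.0, 2.0, 0.5])
dst = src @ R_true.T + t_true
R, t = estimate_pose_change(src, dst)
```

In a full ICP/SLAM pipeline this solve is iterated with re-matched correspondences; the recovered pose changes are what register each new depth frame into the growing point cloud model.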
  • Step S40: superimpose the point cloud model with the preset BIM model to obtain a superimposed model, generate a navigation route to the destination position according to the superimposed model, and control the drone to operate according to the navigation route.
  • The BIM model is preset in the terminal, and the terminal determines the building component information according to the preset BIM model.
  • The terminal can determine the non-building component information based on the point cloud model.
  • The terminal superimposes the BIM model and the point cloud model to obtain the superimposed model.
  • The superimposed model contains UAV navigation obstacles such as building component information and non-building component information; the terminal generates a navigation route to the destination position according to the superimposed model and controls the UAV to operate according to the navigation route.
  • Further, step S40 includes:
  • Step c1: determine the reference position corresponding to the initial position in the preset BIM model, and compare the edge information in the point cloud model with the edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
  • Step c2: superimpose the point cloud model with the preset BIM model according to the minimum distance to obtain the superimposed model;
  • Step c3: trace a path from the initial position according to the superimposed model, obtain a navigation route to the destination position, and control the UAV to operate according to the navigation route.
  • Specifically, the terminal determines the reference position corresponding to the initial position in the preset BIM model.
  • The terminal compares the image of the subject in the point cloud model with the image at the reference position in the preset BIM model to obtain the minimum distance between the edge feature points of the two images.
  • The point cloud model and the preset BIM model are superimposed at the minimum distance to obtain the superimposed model.
  • The terminal traces a path from the initial position according to the superimposed model, obtains the navigation route to the destination position, and controls the drone to operate according to the navigation route.
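One loose reading of steps c1-c2 is: find the closest pair of edge feature points between the two models and use their offset to translate the point cloud onto the BIM model. The sketch below follows that reading under stated assumptions; the application's actual alignment may be more elaborate, and all coordinates are invented.

```python
import numpy as np

def superimpose(cloud_edges, bim_edges):
    """Translate cloud_edges so its closest edge point coincides with the BIM's.

    Returns (aligned_cloud_edges, minimum_edge_distance).
    """
    # Pairwise distances between every cloud edge point and every BIM edge point.
    diff = cloud_edges[:, None, :] - bim_edges[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    i, j = np.unravel_index(np.argmin(dist), dist.shape)  # closest pair
    offset = bim_edges[j] - cloud_edges[i]                # shift that closes the gap
    return cloud_edges + offset, dist[i, j]

cloud = np.array([[1.0, 1.0, 0.0], [2.0, 1.0, 0.0]])
bim = np.array([[1.1, 1.0, 0.0], [5.0, 5.0, 0.0]])
aligned, min_dist = superimpose(cloud, bim)
```

After this superposition, BIM building components and point-cloud obstacles share one coordinate frame, which is what makes the route tracing of step c3 possible.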
  • The technical solution of this embodiment requires no additional hardware cost, and information on UAV navigation obstacles such as building components, doors, and windows can be accurately determined through the superimposed model, so that the UAV can better avoid obstacles and the accident rate is reduced.
  • the indoor automatic navigation of the UAV is realized, and the accuracy of the indoor navigation of the UAV is improved.
  • This embodiment is a refinement of step S40 in the first embodiment.
  • the difference between this embodiment and the first embodiment of this application lies in:
  • Step S41: perform path tracing from the initial position toward the destination position in the superimposed model, and determine whether there are obstacles on the traced path, whether the traced path's repetition rate is greater than a preset repetition rate, and/or whether there are at least two traced paths.
  • Specifically, the terminal traces paths from the initial position toward the destination in the superimposed model; that is, the superimposed model contains building component information and non-building component information, the terminal treats both as obstacles, and the terminal traces paths while avoiding the obstacles.
  • The terminal determines whether there are obstacles on a traced path (obstacles can be walls, lamps, decorations, etc.), whether the repetition rate of the traced path is greater than the preset repetition rate (the preset repetition rate refers to the preset ratio of the repeated path to the full path; for example, the preset repetition rate is set to 30%), and/or whether there are at least two traced paths.
  • Step S42: if there are obstacles on a traced path, change the tracing direction; if a traced path's repetition rate is greater than the preset repetition rate, discard that traced path; and/or if at least two traced paths are obtained, use the traced path with the shortest distance as the UAV navigation route and control the UAV to operate according to the navigation route.
  • Specifically, if there are obstacles on a traced path, the terminal determines that the path has reached an end and changes the tracing direction; if the repetition rate of a traced path is greater than the preset repetition rate, the terminal determines that it is a repetitive path and abandons it; and/or if at least two traced paths are obtained, the terminal uses the shortest traced path as the drone navigation route and controls the drone to operate according to the navigation route.
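The filtering logic of steps S41-S42 can be sketched on a grid of cells: drop any candidate path that crosses an obstacle or whose repetition rate (revisited cells over total cells) exceeds the preset threshold, then pick the shortest survivor. The grid cells, paths, and the 30% threshold below are illustrative values consistent with the example above.

```python
def choose_route(paths, obstacles, max_repetition=0.3):
    """Select the UAV navigation route from candidate traced paths.

    A path is a list of grid cells; obstacles is a set of blocked cells.
    """
    valid = []
    for path in paths:
        if any(cell in obstacles for cell in path):
            continue                                   # obstacle on the path
        repetition = 1 - len(set(path)) / len(path)    # fraction of revisited cells
        if repetition > max_repetition:
            continue                                   # too much backtracking
        valid.append(path)
    return min(valid, key=len) if valid else None      # shortest surviving path

obstacles = {(1, 1)}
paths = [
    [(0, 0), (1, 1), (2, 2)],                          # crosses an obstacle
    [(0, 0), (0, 1), (0, 0), (0, 1), (0, 2)],          # repetition rate 0.4 > 0.3
    [(0, 0), (1, 0), (2, 0), (2, 1)],                  # valid candidate
]
route = choose_route(paths, obstacles)
```

Only the third candidate survives both checks, so it becomes the navigation route.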
  • This embodiment gives the path generation method, which effectively ensures the rationality of the UAV navigation route and makes UAV navigation more accurate.
  • This embodiment is a step after step S40 in the first embodiment.
  • the difference between this embodiment and the first embodiment of the application lies in:
  • Step S50 When it is detected that the UAV deviates from the navigation route, a route control instruction is sent to the UAV to make the UAV return to the navigation route.
  • The terminal monitors the drone's operating path information in real time.
  • The operating path information includes the operating speed, operating route, and operating time.
  • The terminal judges whether the drone deviates from the navigation route based on the drone's operating route information; if it deviates, the terminal sends a route control instruction to the UAV so that the UAV returns to the navigation route according to the instruction.
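A deviation check of this kind can be sketched by modelling the navigation route as a polyline and testing whether the drone's current position lies within some tolerance of it. The 0.5 m tolerance and the 2-D simplification are assumptions for illustration only.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b (2-D coordinates)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def deviates(position, route, tolerance=0.5):
    """True if the position is farther than `tolerance` from every route segment."""
    return min(point_segment_distance(position, a, b)
               for a, b in zip(route, route[1:])) > tolerance

route = [(0.0, 0.0), (10.0, 0.0)]
on_route = deviates((5.0, 0.2), route)    # 0.2 m off: within tolerance
off_route = deviates((5.0, 2.0), route)   # 2 m off: deviation detected
```

When `deviates` returns true, the terminal would issue the route control instruction described in step S50.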
  • Step S60: if the UAV does not return to the navigation route within a preset time period, send an information collection instruction to the UAV so that the UAV feeds back its current operating parameters.
  • Specifically, the terminal sends an information collection instruction to the drone; upon receiving the instruction, the drone obtains its current operating parameters (including operating time and operating route) and feeds them back to the terminal.
  • Step S70: receive the current operating parameters fed back by the drone, and output prompt information if the current operating parameters are abnormal.
  • Specifically, the terminal receives the current operating parameters fed back by the drone.
  • The terminal compares the current operating parameters with preset standard operating parameters to determine whether they meet the standard; if the current operating parameters do not meet the standard operating parameters, the terminal determines that they are abnormal, judges that the drone is faulty, and outputs prompt information.
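The comparison in steps S60-S70 amounts to range-checking each fed-back parameter against a preset standard. The parameter names and ranges below are hypothetical examples, not values from the application.

```python
# Preset standard operating parameter ranges (hypothetical examples).
STANDARD_RANGES = {
    "operating_time_s": (0, 1800),   # e.g. a flight should not exceed 30 minutes
    "speed_m_s": (0.0, 5.0),         # e.g. an indoor speed limit
}

def abnormal_parameters(current):
    """Return the names of parameters that fall outside their standard range."""
    bad = []
    for name, value in current.items():
        low, high = STANDARD_RANGES[name]
        if not (low <= value <= high):
            bad.append(name)
    return bad

# An over-long flight is flagged; the speed is within range.
alerts = abnormal_parameters({"operating_time_s": 2400, "speed_m_s": 1.2})
```

Any non-empty result would trigger the terminal's prompt information so the drone can be maintained in time.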
  • In this embodiment the terminal monitors the drone's operating status: when the drone fails, the terminal can output prompt information in real time so that the drone is maintained promptly.
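The comparison against "preset standard operating parameters" can be pictured as a range check. The parameter names and ranges below are hypothetical, chosen only to illustrate the compare-and-prompt logic; the patent does not specify them.

```python
# Hypothetical standard ranges; the names and limits are illustrative only.
STANDARD_RANGES = {
    "operating_time_s": (0, 1800),  # e.g. a flight should finish within 30 min
    "speed_m_s": (0.0, 5.0),        # e.g. an assumed indoor speed limit
}

def check_parameters(current):
    """Return the names of parameters that fall outside their standard range."""
    abnormal = []
    for name, (low, high) in STANDARD_RANGES.items():
        value = current.get(name)
        if value is None or not (low <= value <= high):
            abnormal.append(name)
    return abnormal

faults = check_parameters({"operating_time_s": 2400, "speed_m_s": 1.2})
if faults:
    print(f"Prompt: abnormal parameters {faults}; the drone may need maintenance")
```

A non-empty result corresponds to the "current operating parameters are abnormal" branch that triggers the prompt in step S70.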
  • In addition, an embodiment of the present application also provides an indoor navigation apparatus for a drone, which includes:
  • the request receiving module 10, used to obtain the initial position and destination position of the drone when a drone navigation request is received;
  • the information collection module 20, configured to collect the drone's operating state information through the first collection device in the drone, and to collect surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone;
  • the model construction module 30, configured to construct a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and to process the feature point cloud to obtain a point cloud model of the photographed objects;
  • the route generation module 40, configured to superimpose the point cloud model on the preset BIM model to obtain a superimposed model, generate a navigation route to the destination position according to the superimposed model, and control the drone to run along the navigation route.
  • the first acquisition device includes at least one inertial measurement device and at least one perception camera, and the second acquisition device includes at least one depth camera;
  • the information collection module 20 includes:
  • the first collection module, used to adjust the drone's height and shooting angle at the initial position, collect the drone's direction information through the inertial measurement unit, collect feature images through the perception camera, analyze the feature images to obtain the drone's relative position information, and use the direction information and the relative position information as the drone's operating state information;
  • the second collection module, used to emit infrared pulses to the photographed object through the depth camera, receive the infrared pulses reflected by the object and their reflection times, process the reflection times to obtain depth image information of the object, and use the depth image information as the surrounding environment information.
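Time-of-flight depth sensing of this kind recovers distance from the pulse's round-trip time: the infrared pulse travels to the object and back, so the depth is c·t/2. A minimal sketch (the 20 ns example value is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def depth_from_round_trip(t_seconds):
    """The pulse covers the camera-object distance twice, so depth = c * t / 2."""
    return C * t_seconds / 2.0

# A reflection time of about 20 ns corresponds to roughly 3 m.
print(round(depth_from_round_trip(20e-9), 2))
```

Applying this per pixel to the measured reflection times yields the depth image that the module treats as surrounding environment information.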
  • the model construction module 30 includes:
  • the attitude calculation unit, used to extract the direction information and relative position information from the operating state information and iterate them to obtain the attitude change values of the first collection device in the drone;
  • the point cloud determination unit, configured to extract the depth image information from the surrounding environment information and iterate the depth image information according to the attitude change values to obtain the feature point cloud of the objects photographed by the drone;
  • the model generation unit, configured to process the feature point cloud with a preset SLAM algorithm to obtain the point cloud model of the photographed objects.
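Turning a depth image plus a device pose into 3D feature points is commonly done by back-projecting each pixel through the pinhole camera intrinsics and then applying the pose. The sketch below assumes that standard formulation; it is not the patent's exact SLAM pipeline, and the intrinsics (`fx`, `fy`, `cx`, `cy`) are invented example values.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to camera-frame 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def transform_points(points, rotation, translation):
    """Apply the collection device's pose (R, t) to move points into the world frame."""
    return points @ rotation.T + translation

depth = np.full((4, 4), 2.0)  # a flat wall 2 m in front of the camera
pts_cam = depth_to_points(depth, fx=2.0, fy=2.0, cx=1.5, cy=1.5)
pts_world = transform_points(pts_cam, np.eye(3), np.array([0.0, 0.0, 1.0]))
print(pts_world.shape)  # one 3D point per depth pixel
```

Repeating this for each frame with the attitude change from the first collection device, and fusing the results, is what accumulates the feature point cloud that the SLAM step refines into the point cloud model.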
  • the route generation module 40 includes:
  • the information comparison sub-module, used to determine the reference position corresponding to the initial position in the preset BIM model, and to compare the edge information in the point cloud model with the edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
  • the model superimposition sub-module, used to superimpose the point cloud model on the preset BIM model according to the minimum distance to obtain a superimposed model;
  • the route generation sub-module, used to trace paths from the initial position according to the superimposed model to obtain the navigation route to the destination position, and to control the drone to run along the navigation route.
  • the route generation sub-module includes:
  • the tracing judgment unit, used to trace paths from the initial position toward the destination position in the superimposed model, and to judge whether a traced path contains obstacles, whether a traced path's repetition rate is greater than a preset repetition rate, and/or whether there are at least two traced paths;
  • the control operation unit, used to change the tracing direction if a traced path contains an obstacle; to abandon a traced path if its repetition rate is greater than the preset repetition rate; and/or, if at least two traced paths are obtained, to use the shortest one as the drone's navigation route and control the drone to run along it.
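Selecting the shortest obstacle-free route among candidates can be illustrated with a breadth-first search over an occupancy grid, which by construction returns a shortest path that avoids obstacle cells. This is a hypothetical stand-in for the patent's path-tracing procedure, under the assumption that the superimposed model has been rasterized into a 2D grid of free and occupied cells.

```python
from collections import deque

def shortest_route(grid, start, goal):
    """BFS over a 2D occupancy grid; 1 = obstacle (wall, fixture), 0 = free."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk the parent links back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no obstacle-free route exists

grid = [[0, 0, 0],
        [1, 1, 0],  # a wall blocks the direct way down
        [0, 0, 0]]
print(shortest_route(grid, (0, 0), (2, 0)))
```

Because BFS explores cells in order of distance, the first path that reaches the goal is a shortest one, mirroring the "take the shortest traced path" rule.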
  • In an embodiment, the indoor navigation apparatus for a drone further includes:
  • a route monitoring module configured to send a route control instruction to the drone when it is detected that the drone deviates from the navigation route, so that the drone returns to the navigation route;
  • an instruction sending module, configured to send an information collection instruction to the drone if the drone does not return to the navigation route within a preset time period, so that the drone feeds back its current operating parameters;
  • the prompt output module is used to receive the current operating parameters fed back by the drone, and output prompt information if the current operating parameters are abnormal.
  • For the steps implemented by the functional modules of the drone indoor navigation apparatus, reference can be made to the embodiments of the drone indoor navigation method of the present application; they are not repeated here.
  • the embodiment of the present application also proposes a computer storage medium, and the computer-readable storage medium may be non-volatile or volatile.
  • The computer storage medium stores a computer program that, when executed by a processor, implements the operations of the drone indoor navigation method provided in the above embodiments, the method including the following steps: when a drone navigation request is received, obtain the initial position and destination position of the drone; collect the drone's operating state information through the first collection device in the drone, and collect surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone; construct a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and process the feature point cloud to obtain a point cloud model of the photographed objects; superimpose the point cloud model on the preset BIM model to obtain a superimposed model, generate a navigation route to the destination position according to the superimposed model, and control the drone to run along the navigation route.
  • the description is relatively simple, and for related parts, please refer to the part of the description of the method embodiment.
  • the device embodiments described above are merely illustrative, and the units described as separate components may or may not be physically separate. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solution of the present application. Those of ordinary skill in the art can understand and implement without creative work.
  • The technical solution of this application, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium as described above (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the method described in each embodiment of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An indoor navigation method based on a drone, and a corresponding apparatus, device, and storage medium. The method includes: when a drone navigation request is received, obtaining the initial position and destination position of the drone (S10); collecting the drone's operating state information through a first collection device in the drone, and collecting surrounding environment information at different heights and different shooting angles at the initial position through a second collection device in the drone (S20); constructing a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the photographed objects (S30); and superimposing the point cloud model on a preset BIM model to obtain a superimposed model, generating a navigation route to the destination position according to the superimposed model, and controlling the drone to run along the navigation route (S40). Automatic indoor navigation of the drone is achieved, and the accuracy of indoor drone navigation is improved.

Description

Indoor navigation method, apparatus, device, and storage medium for a drone
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on February 12, 2020, with application number 202010089061.5 and the invention title "Indoor navigation method, apparatus, device, and storage medium for a drone", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of artificial intelligence, and in particular to an indoor navigation method, apparatus, device, and storage medium for a drone.
Background
Current drones capable of autonomous flight rely on satellite signals for position information and then use existing maps to plan a path from that position. Indoors, in tunnels, and in other sheltered environments, however, a drone cannot receive GPS signals, so indoor drone positioning and navigation is usually done in one of the following ways:
1. Positioning based on Bluetooth Low Energy or wireless networks. For this technique to reach high accuracy, a grid of Bluetooth signal sources must be deployed indoors in advance, and the client then computes signal strengths to determine position, which makes indoor positioning costly.
2. Positioning based on cellular networks. This technique requires no prior network deployment; any ordinary device that can receive a mobile signal can determine its position by comparing the signals of multiple base stations. However, positioning accuracy is hard to guarantee.
3. Positioning based on radio-frequency identification (RFID). This technique requires RFID equipment to be deployed in advance, and the accuracy of indoor positioning and navigation is also unsatisfactory.
4. Visual positioning. A system of this kind uses cameras to capture images of the surroundings and determines position by comparing them with previously recorded information. It cannot tell apart different areas with the same layout (for example, identically decorated hotel rooms), and because the environment is changeable, the drone may be unable to avoid obstacles smoothly in relatively confined spaces.
The inventors realized that with the above approaches a drone cannot obtain the actual state of itself and its surroundings during flight; when an emergency occurs around the drone it cannot react in time, which makes autonomous indoor positioning and navigation very difficult.
Summary of the Invention
Technical Problem
Solution to Problem
Technical Solution
The main purpose of this application is to provide an indoor navigation method, apparatus, device, and storage medium for a drone, aiming to solve the technical problem that current indoor drone navigation is costly and of low accuracy.
To achieve the above purpose, this application provides an indoor drone navigation method, which includes the following steps:
when a drone navigation request is received, obtaining the initial position and destination position of the drone;
collecting the drone's operating state information through a first collection device in the drone, and collecting surrounding environment information at different heights and different shooting angles at the initial position through a second collection device in the drone;
constructing a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the photographed objects;
superimposing the point cloud model on a preset BIM model to obtain a superimposed model, generating a navigation route to the destination position according to the superimposed model, and controlling the drone to run along the navigation route.
In addition, to achieve the above purpose, this application also provides an indoor drone navigation apparatus, which includes:
a request receiving module, configured to obtain the initial position and destination position of the drone when a drone navigation request is received;
an information collection module, configured to collect the drone's operating state information through the first collection device in the drone, and to collect surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone;
a model construction module, configured to construct a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and to process the feature point cloud to obtain a point cloud model of the photographed objects;
a route generation module, configured to superimpose the point cloud model on the preset BIM model to obtain a superimposed model, generate a navigation route to the destination position according to the superimposed model, and control the drone to run along the navigation route.
In addition, to achieve the above purpose, this application also provides an indoor drone navigation device.
The indoor drone navigation device includes a memory, a processor, and a computer program stored in the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the indoor drone navigation method described above, the method including at least the following steps: when a drone navigation request is received, obtaining the initial position and destination position of the drone; collecting the drone's operating state information through the first collection device in the drone, and collecting surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone; constructing a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the photographed objects; and superimposing the point cloud model on a preset BIM model to obtain a superimposed model, generating a navigation route to the destination position according to the superimposed model, and controlling the drone to run along the navigation route.
In addition, to achieve the above purpose, this application also provides a computer storage medium. The computer storage medium stores a computer program which, when executed by a processor, implements the steps of the indoor drone navigation method described above, the method including at least the same steps as listed for the device.
In the indoor drone navigation method, apparatus, device, and storage medium proposed in the embodiments of this application, when the terminal receives a drone navigation request it obtains the initial and destination positions of the drone; it collects the drone's operating state information through the first collection device in the drone and collects surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone; it constructs a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information and processes the feature point cloud to obtain a point cloud model of the photographed objects; and it superimposes the point cloud model on the preset BIM model to obtain a superimposed model, generates a navigation route to the destination position according to the superimposed model, and controls the drone to run along the navigation route. The technical solution of this embodiment needs no additional hardware cost: through the superimposed model it accurately determines information about obstacles to drone navigation such as building components, doors, and windows, so the drone can avoid obstacles better and the accident rate is reduced; automatic indoor drone navigation is achieved and its accuracy is improved.
Beneficial Effects of the Invention
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of the apparatus in the hardware operating environment involved in the embodiments of this application;
FIG. 2 is a schematic flowchart of the first embodiment of the indoor drone navigation method of this application;
FIG. 3 is a schematic diagram of the functional modules of an embodiment of the indoor drone navigation apparatus of this application.
The realization of the purpose, functional features, and advantages of this application will be further explained with reference to the accompanying drawings in conjunction with the embodiments.
Embodiments of the Invention
It should be understood that the specific embodiments described here are intended only to explain this application, not to limit it.
As shown in FIG. 1, FIG. 1 is a schematic structural diagram of the terminal of the hardware operating environment involved in the embodiments of this application (also called an indoor drone navigation device, which may consist of a standalone indoor drone navigation apparatus or of such an apparatus combined with other devices).
The terminal in the embodiments of this application may be a fixed terminal or a mobile terminal, such as a networked smart air conditioner, smart lamp, smart power supply, smart speaker, self-driving car, personal computer (PC), smartphone, tablet, e-book reader, or portable computer.
As shown in FIG. 1, the terminal may include a processor 1001 (for example a CPU, central processing unit), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to realize connection and communication among these components. The user interface 1003 may include a display and an input unit such as a keyboard, and optionally may also include standard wired and wireless interfaces. The network interface 1004 may optionally include standard wired and wireless interfaces (such as a Wi-Fi interface). The memory 1005 may be high-speed RAM or stable non-volatile memory such as disk storage, and may optionally be a storage device independent of the aforementioned processor 1001.
Optionally, the terminal may also include a camera, an RF (radio frequency) circuit, sensors, an audio circuit, and a Wi-Fi module; input units such as a display and a touch screen; and, besides Wi-Fi, optional network interfaces such as Bluetooth and probes. The sensors include, for example, light sensors, motion sensors, and others; specifically, the light sensors may include an ambient light sensor and a proximity sensor. Of course, the mobile terminal may also be equipped with a gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors, which are not detailed here.
Those skilled in the art will understand that the terminal structure shown in FIG. 1 does not limit the terminal, which may include more or fewer components than shown, combine certain components, or arrange components differently.
As shown in FIG. 1, the computer software product is stored in a storage medium (also called a computer storage medium, computer medium, readable medium, readable storage medium, computer-readable storage medium, or simply a medium; the storage medium may be a non-volatile readable storage medium such as RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the embodiments of this application. The memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a computer program.
In the terminal shown in FIG. 1, the network interface 1004 is mainly used to connect to a backend server for data communication with it; the user interface 1003 is mainly used to connect to a client for data communication with it; and the processor 1001 may be used to call the computer program stored in the memory 1005 and execute the steps of the indoor drone navigation method provided in the following embodiments of this application.
Based on the above hardware operating environment, embodiments of the indoor drone navigation method of this application are proposed.
In the embodiments of the indoor drone navigation method of this application, only two points in the indoor space and a point cloud model generated in real time are needed to automatically plan an indoor navigation route along which the drone navigates. Specifically:
Referring to FIG. 2, in a first embodiment of the indoor drone navigation method of this application, the method includes:
Step S10: when a drone navigation request is received, obtain the initial position and destination position of the drone.
In this embodiment the indoor drone navigation method is applied to a terminal (also called an indoor drone navigation device). The terminal is communicatively connected to the drone and can control it. Collection devices are preset in the drone and are used to collect the drone's surrounding environment information and operating state information (the operating state information being the drone's running direction and relative displacement information). The collection devices include, but are not limited to, inertial measurement units, perception cameras, and depth cameras; their specific numbers are not limited. The drone sends the collected surrounding environment information and operating state information to the terminal, and the terminal processes them to obtain the indoor navigation route, achieving accurate drone navigation.
Specifically, the terminal receives a drone navigation request. How the request is triggered is not specifically limited: it may be triggered actively by the user, for example by tapping a drone-navigation button on the terminal's display; or it may be triggered automatically by the terminal, for example according to a preset trigger condition such as "trigger a drone navigation request in the early morning every day to collect information on floor xxx", in which case the terminal triggers the request automatically when that time arrives.
When the terminal receives the drone navigation request, it obtains the initial position and destination position of the drone, which may be set by the user; for example, the user enters on the terminal an initial position of "building xxx, floor 3, room 1" and a destination position of "building xxx, floor 2, room 3".
This embodiment gives a specific way of determining the drone's initial and destination positions, including the following steps:
Step a1: when a drone navigation request is received, obtain the building identifier of the building the drone is currently in, and the preset BIM model associated with that building identifier;
Step a2: construct a three-dimensional coordinate system based on the preset BIM model, take the three-dimensional coordinates of the drone's current position as the drone's initial position, obtain the navigation destination corresponding to the drone navigation request, and take the three-dimensional coordinates of the navigation destination as the drone's destination position.
That is, when the terminal receives a drone navigation request, it obtains the building identifier of the building the drone is currently in (the building identifier is information that uniquely identifies the building, such as its name or its location) and the preset BIM model associated with the identifier (the preset BIM model is a Building Information Modeling model associated with the building identifier in advance, which contains the building's information). The terminal constructs a three-dimensional coordinate system based on the preset BIM model, takes the three-dimensional coordinates of the drone's current position as the drone's initial position, obtains the navigation destination corresponding to the navigation request, and takes the destination's three-dimensional coordinates as the drone's destination position.
In this embodiment a three-dimensional coordinate system is constructed based on the BIM model and the drone's initial and destination positions are determined accurately, enabling accurate drone navigation.
Step S20: collect the drone's operating state information through the first collection device in the drone, and collect surrounding environment information at different heights and different shooting angles at the initial position through the second collection device in the drone.
The collection devices preset in the drone are divided by purpose into a first collection device and a second collection device. The first collection device is used to collect the drone's operating state information and includes at least one inertial measurement unit and at least one perception camera; the second collection device is used to collect the drone's surrounding environment information and includes at least one depth camera. For example, the drone carries one inertial measurement unit, one depth camera, and four environment-perception cameras, where the inertial measurement unit senses the drone's direction information, the environment-perception cameras obtain the drone's relative displacement, and the depth camera senses depth image information of the objects the drone photographs. Specifically, the step includes:
Step b1: adjust the drone's height and shooting angle at the initial position, collect the drone's direction information through the inertial measurement unit, collect feature images through the perception cameras, analyze the feature images to obtain the drone's relative position information, and take the direction information and the relative position information as the drone's operating state information;
Step b2: emit infrared pulses to the photographed objects through the depth camera, receive the infrared pulses reflected by the objects and their reflection times, process the reflection times to obtain depth image information of the objects, and take the depth image information as the surrounding environment information.
In this embodiment the terminal controls the drone to autonomously adjust its height and shooting angle near the initial position and to shoot from multiple angles to obtain feature images. The terminal extracts the feature points of the feature images and analyzes the feature images to obtain the drone's relative position information, then takes the direction information and the relative position information as the drone's operating state information. The terminal emits infrared pulses through the depth camera and obtains the depth image information, i.e. the distance from object surfaces to the camera, by computing the reflection time.
Step S30: construct a feature point cloud of the photographed objects according to the operating state information and the surrounding environment information, and process the feature point cloud to obtain a point cloud model of the photographed objects.
The terminal converts the depth image information in the surrounding environment information into a three-dimensional feature point cloud and fuses the point cloud into a three-dimensional grid to obtain the point cloud model. That is, the terminal casts rays from the current pose of the collection device and intersects them with the previous step's three-dimensional feature point cloud to obtain the point cloud under the current frame's viewpoint, and at the same time computes its normal vectors, which are used to register the next frame's input depth image. By looping continuously it obtains feature point clouds under different viewpoints, thereby reconstructing the scene surface of the photographed objects and forming the point cloud model. Specifically, step S30 includes:
Step b1: extract the direction information and relative position information from the operating state information, and iterate them to obtain the attitude change values of the first collection device in the drone;
Step b2: extract the depth image information from the surrounding environment information, and iterate the depth image information according to the attitude change values to obtain the feature point cloud of the objects photographed by the drone;
Step b3: process the feature point cloud with a preset SLAM algorithm to obtain the point cloud model of the photographed objects.
Specifically, the terminal extracts the direction information and relative position information from the operating state information and iterates them to obtain the attitude change values of the first collection device in the drone; the terminal extracts the depth image information from the surrounding environment information and iterates it according to the attitude change values to obtain the feature point cloud of the photographed objects; the terminal then processes the feature point cloud with a preset SLAM (simultaneous localization and mapping) algorithm to obtain the point cloud model of the photographed objects.
In this embodiment the terminal determines the drone's attitude change values, uses them to construct the feature point cloud of the photographed objects, and obtains a rough point cloud model of the shooting target with the SLAM algorithm; based on the point cloud model the terminal identifies obstacle information corresponding to non-building components in real time.
Step S40: superimpose the point cloud model on the preset BIM model to obtain a superimposed model, generate a navigation route to the destination position according to the superimposed model, and control the drone to run along the navigation route.
A BIM model is preset in the terminal. The terminal determines building-component information from the preset BIM model and can determine non-building-component information from the point cloud model. The terminal superimposes the BIM model and the point cloud model to obtain a superimposed model, which contains both building-component and non-building-component information, i.e. the obstacles to drone navigation. The terminal then generates a navigation route to the destination position according to the superimposed model and controls the drone to run along it. Specifically, step S40 includes:
Step c1: determine the reference position corresponding to the initial position in the preset BIM model, and compare the edge information in the point cloud model with the edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
Step c2: superimpose the point cloud model on the preset BIM model according to the minimum distance to obtain the superimposed model;
Step c3: trace paths from the initial position according to the superimposed model to obtain the navigation route to the destination position, and control the drone to run along the navigation route.
The terminal determines the reference position corresponding to the initial position in the preset BIM model and compares the image of the photographed objects in the point cloud model with the image at the reference position in the preset BIM model to obtain the minimum distance between the edge feature points of the two images; the terminal then superimposes the point cloud model on the preset BIM model according to this minimum distance to obtain the superimposed model. According to the superimposed model the terminal traces paths from the initial position, obtains the navigation route to the destination position, and controls the drone to run along it.
The technical solution of this embodiment needs no additional hardware cost: through the superimposed model it accurately determines information about obstacles to drone navigation such as building components, doors, and windows, so the drone can avoid obstacles better and the accident rate is reduced; automatic indoor drone navigation is achieved and its accuracy is improved.
Further, on the basis of the first embodiment of this application, a second embodiment of the indoor drone navigation method of this application is proposed.
This embodiment is a refinement of step S40 of the first embodiment and differs from the first embodiment of this application as follows:
Step S41: trace paths from the initial position toward the destination position in the superimposed model, and judge whether a traced path contains obstacles, whether a traced path's repetition rate is greater than a preset repetition rate, and/or whether there are at least two traced paths;
The terminal traces paths from the initial position toward the destination position in the superimposed model. That is, the superimposed model contains building-component information and non-building-component information, which the terminal treats as obstacles and avoids while tracing. Specifically, the terminal judges whether a traced path contains obstacles (obstacles may be walls, lamps, decorations, and so on), whether the traced path's repetition rate is greater than a preset repetition rate (the preset repetition rate is a preset ratio of repeated path to total path, for example 30%), and/or whether there are at least two traced paths.
Step S42: if a traced path contains an obstacle, change the tracing direction; if a traced path's repetition rate is greater than the preset repetition rate, abandon that traced path; and/or, if at least two traced paths are obtained, take the shortest one as the drone's navigation route and control the drone to run along the navigation route.
If a traced path contains an obstacle, the terminal determines that the path has reached a dead end and changes the tracing direction; if a traced path's repetition rate is greater than the preset repetition rate, the terminal judges it a repeated path and abandons it; and/or, if at least two traced paths are obtained, the terminal takes the shortest as the drone's navigation route and controls the drone to run along it. The route generation approach given in this embodiment effectively ensures that the drone's navigation route is reasonable, making drone navigation more accurate.
Further, on the basis of the above embodiments of this application, a third embodiment of the indoor drone navigation method of this application is proposed.
This embodiment consists of steps after step S40 of the first embodiment and differs from the first embodiment of this application as follows:
Step S50: when it is detected that the drone has deviated from the navigation route, send a route control instruction to the drone so that the drone returns to the navigation route.
The terminal monitors the drone's running path information in real time; this information includes the running speed, running route, running time, and so on. Based on the running path information the terminal judges whether the drone has deviated from the navigation route; if it has, the terminal sends a route control instruction to the drone so that the drone returns to the navigation route according to the instruction.
Step S60: if the drone does not return to the navigation route within a preset time period, send an information collection instruction to the drone so that the drone feeds back its current operating parameters.
If the drone has not returned to the navigation route within the preset time period (set according to the specific scenario, for example two minutes), the terminal sends an information collection instruction to the drone. On receiving the instruction, the drone obtains its current operating parameters, which include the operating time and operating route, and feeds them back to the terminal.
Step S70: receive the current operating parameters fed back by the drone and, if they are abnormal, output prompt information.
The terminal receives the current operating parameters fed back by the drone and compares them with preset standard operating parameters to judge whether they comply; if the current operating parameters do not meet the standard operating parameters, the terminal determines that they are abnormal, judges that the drone is faulty, and outputs prompt information. In this embodiment the terminal monitors the drone's operating status and can output prompt information in real time when the drone fails, so the drone can be repaired promptly.
此外,参照图3,本申请实施例还提出一种无人机室内导航装置,所述无人机室内导航装置包括:
请求接收模块10,用于在接收到无人机导航请求时,获取无人机的初始位置和目的位置;
信息采集模块20,用于通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息;
模型构建模块30,用于根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型;
路线生成模块40,用于将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行。
在一实施例中,所述第一采集装置包括至少一个惯性测量器和至少一个感知摄像头,所述第二采集装置包括至少一个深度摄像头;
所述信息采集模块20,包括:
第一采集模块,用于在所述初始位置处调整无人机的高度和拍摄角度,通过所述惯性测量器采集所述无人机的方向信息,通过所述感知摄像头采集特征图像,分析所述特征图像得到所述无人机的相对位置信息,将所述方向信息和所述相对位置信息作为所述无人机的运行状态信息;
第二采集模块,用于通过所述深度摄像头发射红外线脉冲至拍摄对象,接收所 述拍摄对象反射的红外线脉冲及所述红外线脉冲的反射时间,处理所述反射时间获得拍摄对象的深度图像信息,将所述深度图像信息作为周边环境信息。
在一实施例中,所述模型构建模块30,包括:
姿态计算单元,用于提取所述运行状态信息中的方向信息和相对位置信息,将所述方向信息和所述相对位置信息进行迭代,得到所述无人机中第一采集装置的姿态变化值;
点云确定单元,用于提取所述周边环境信息中的深度图像信息,按所述姿态变化值迭代所述深度图像信息,获得所述无人机拍摄对象的特征点云;
模型生成单元,用于通过预设的SLAM算法处理所述特征点云,获得所述拍摄对象的点云模型。
在一实施例中,所述路线生成模块40,包括:
信息比对子模块,用于确定预设BIM模型中所述初始位置对应的基准位置,将所述点云模型中的边缘信息与所述预设BIM模型中所述基准位置处的边缘信息进行比对,得到所述点云模型与所述预设BIM模型的最小距离;
模型叠加子模块,用于将所述点云模型与预设BIM模型按所述最小距离进行叠加,得到叠加模型;
路线生成子模块,用于根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行。
在一实施例中,所述路线生成子模块,包括:
追溯判断单元,用于从所述叠加模型中所述初始位置处沿所述目的位置进行路径追溯,判断追溯路径中是否存在障碍物、追溯路径重复率是否大于预设重复率和/或是否存在至少两条追溯路径;
控制运行单元,用于若追溯路径中存在障碍物,则更换路径追溯方向;若追溯路径重复率大于预设重复率,则放弃追溯路径;和/或若得到至少两条追溯路径,则将距离最短的追溯路径作为无人机导航路线,并控制所述无人机按所述导航路线运行。
在一实施例中,所述的无人机室内导航装置,包括:
路线监测模块,用于在监测到所述无人机偏离所述导航路线时,发送路线控制 指令至所述无人机,以使所述无人机回归所述导航路线;
指令发送模块,用于若预设时间段内所述无人机没有回归所述导航路线,则发送信息采集指令至所述无人机,以使所述无人机反馈当前运行参数;
提示输出模块,用于接收所述无人机反馈的当前运行参数,若所述当前运行参数异常,则输出提示信息。
其中,无人机室内导航装置的各个功能模块实现的步骤可参照本申请无人机室内导航方法的各个实施例,此处不再赘述。
此外,本申请实施例还提出一种计算机存储介质,所述计算机可读存储介质可以是非易失性,也可以是易失性。所述计算机存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现上述实施例提供的无人机室内导航方法中的操作,其中,所述无人机室内导航方法包括以下步骤:在接收到无人机导航请求时,获取无人机的初始位置和目的位置;通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息;根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型;将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体/操作/对象与另一个实体/操作/对象区分开来,而不一定要求或者暗示这些实体/操作/对象之间存在任何这种实际的关系或者顺序;术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者***不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者***所固有的要素。在没有更多限制的情况下,由语句“包括一个......”限定的要素,并不排除在包括该要素的过程、方法、物品或者***中还存在另外的相同要素。
对于装置实施例而言,由于其基本相似于方法实施例,所以描述得比较简单,相关之处参见方法实施例的部分说明即可。以上所描述的装置实施例仅仅是示 意性的,其中作为分离部件说明的单元可以是或者也可以不是物理上分开的。可以根据实际的需要选择中的部分或者全部模块来实现本申请方案的目的。本领域普通技术人员在不付出创造性劳动的情况下,即可以理解并实施。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在如上所述的一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本申请各个实施例所述的方法。
以上仅为本申请的优选实施例,并非因此限制本申请的专利范围,凡是利用本申请说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本申请的专利保护范围内。

Claims (20)

  1. 一种无人机室内导航方法,其中,所述无人机室内导航方法包括以下步骤:
    在接收到无人机导航请求时,获取无人机的初始位置和目的位置;
    通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息;
    根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型;
    将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行。
  2. 如权利要求1所述的无人机室内导航方法,其中,所述在接收到无人机导航请求时,获取无人机的初始位置和目的位置的步骤,包括:
    在接收到无人机导航请求时,获取所述无人机当前所处建筑的建筑标识,及所述建筑标识关联的预设BIM模型;
    基于所述预设BIM模型构建三维坐标系,将所述无人机当前位置的三维坐标作为所述无人机的初始位置,获取所述无人机导航请求对应的导航目的地,将所述导航目的地的三维作为所述无人机的目的位置。
  3. 如权利要求1所述的无人机室内导航方法,其中,所述第一采集装置包括至少一个惯性测量器和至少一个感知摄像头,所述第二采集装置包括至少一个深度摄像头;
    所述通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息的步骤,包括:
    在所述初始位置处调整无人机的高度和拍摄角度,通过所述惯性测量器采集所述无人机的方向信息,通过所述感知摄像头采集特征图像,分析所述特征图像得到所述无人机的相对位置信息,将所述方向信息和所述相对位置信息作为所述无人机的运行状态信息;
    通过所述深度摄像头发射红外线脉冲至拍摄对象,接收所述拍摄对象反射的红外线脉冲及所述红外线脉冲的反射时间,处理所述反射时间获得拍摄对象的深度图像信息,将所述深度图像信息作为周边环境信息。
  4. 如权利要求1所述的无人机室内导航方法,其中,所述根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型的步骤,包括:
    提取所述运行状态信息中的方向信息和相对位置信息,将所述方向信息和所述相对位置信息进行迭代,得到所述无人机中第一采集装置的姿态变化值;
    提取所述周边环境信息中的深度图像信息,按所述姿态变化值迭代所述深度图像信息,获得所述无人机拍摄对象的特征点云;
    通过预设的SLAM算法处理所述特征点云,获得所述拍摄对象的点云模型。
  5. 如权利要求1所述的无人机室内导航方法,其中,所述将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行的步骤,包括:
    确定预设BIM模型中所述初始位置对应的基准位置,将所述点云模型中的边缘信息与所述预设BIM模型中所述基准位置处的边缘信息进行比对,得到所述点云模型与所述预设BIM模型的最小距离;
    将所述点云模型与预设BIM模型按所述最小距离进行叠加,得到 叠加模型;
    根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行。
  6. 如权利要求5所述的无人机室内导航方法,其中,所述根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行的步骤,包括:
    从所述叠加模型中所述初始位置处沿所述目的位置进行路径追溯,判断追溯路径中是否存在障碍物、追溯路径重复率是否大于预设重复率和/或是否存在至少两条追溯路径;
    若追溯路径中存在障碍物,则更换路径追溯方向;若追溯路径重复率大于预设重复率,则放弃追溯路径;和/或若得到至少两条追溯路径,则将距离最短的追溯路径作为无人机导航路线,并控制所述无人机按所述导航路线运行。
  7. 如权利要求1至6任意一项所述的无人机室内导航方法,其中,所述将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行的步骤之后,包括:
    在监测到所述无人机偏离所述导航路线时,发送路线控制指令至所述无人机,以使所述无人机回归所述导航路线;
    若预设时间段内所述无人机没有回归所述导航路线,则发送信息采集指令至所述无人机,以使所述无人机反馈当前运行参数;
    接收所述无人机反馈的当前运行参数,若所述当前运行参数异常,则输出提示信息。
  8. 一种无人机室内导航装置,其中,所述无人机室内导航装置包括:
    请求接收模块,用于在接收到无人机导航请求时,获取无人机的 初始位置和目的位置;
    信息采集模块,用于通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息;
    模型构建模块,用于根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型;
    路线生成模块,用于将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行。
  9. 一种无人机室内导航设备,其中,所述无人机室内导航设备包括:存储器、处理器及存储在所述存储器上并可在所述处理器上运行的计算机程序,其中:
    所述计算机程序被所述处理器执行时实现一种无人机室内导航方法的步骤,其中,所述无人机室内导航方法包括以下步骤:
    在接收到无人机导航请求时,获取无人机的初始位置和目的位置;
    通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息;
    根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型;
    将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行。
  10. 如权利要求9所述的无人机室内导航设备,其中,所述在接收到无人机导航请求时,获取无人机的初始位置和目的位置的步骤,包括:
    在接收到无人机导航请求时,获取所述无人机当前所处建筑的建筑标识,及所述建筑标识关联的预设BIM模型;
    基于所述预设BIM模型构建三维坐标系,将所述无人机当前位置的三维坐标作为所述无人机的初始位置,获取所述无人机导航请求对应的导航目的地,将所述导航目的地的三维作为所述无人机的目的位置。
  11. 如权利要求9所述的无人机室内导航设备,其中,所述第一采集装置包括至少一个惯性测量器和至少一个感知摄像头,所述第二采集装置包括至少一个深度摄像头;
    所述通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息的步骤,包括:
    在所述初始位置处调整无人机的高度和拍摄角度,通过所述惯性测量器采集所述无人机的方向信息,通过所述感知摄像头采集特征图像,分析所述特征图像得到所述无人机的相对位置信息,将所述方向信息和所述相对位置信息作为所述无人机的运行状态信息;
    通过所述深度摄像头发射红外线脉冲至拍摄对象,接收所述拍摄对象反射的红外线脉冲及所述红外线脉冲的反射时间,处理所述反射时间获得拍摄对象的深度图像信息,将所述深度图像信息作为周边环境信息。
  12. 如权利要求9所述的无人机室内导航设备,其中,所述根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型的步骤,包括:
    提取所述运行状态信息中的方向信息和相对位置信息,将所述方向信息和所述相对位置信息进行迭代,得到所述无人机中第一采集装置的姿态变化值;
    提取所述周边环境信息中的深度图像信息,按所述姿态变化值迭 代所述深度图像信息,获得所述无人机拍摄对象的特征点云;
    通过预设的SLAM算法处理所述特征点云,获得所述拍摄对象的点云模型。
  13. 如权利要求9所述的无人机室内导航设备,其中,所述将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行的步骤,包括:
    确定预设BIM模型中所述初始位置对应的基准位置,将所述点云模型中的边缘信息与所述预设BIM模型中所述基准位置处的边缘信息进行比对,得到所述点云模型与所述预设BIM模型的最小距离;
    将所述点云模型与预设BIM模型按所述最小距离进行叠加,得到叠加模型;
    根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行。
  14. 如权利要求13所述的无人机室内导航设备,其中,所述根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行的步骤,包括:
    从所述叠加模型中所述初始位置处沿所述目的位置进行路径追溯,判断追溯路径中是否存在障碍物、追溯路径重复率是否大于预设重复率和/或是否存在至少两条追溯路径;
    若追溯路径中存在障碍物,则更换路径追溯方向;若追溯路径重复率大于预设重复率,则放弃追溯路径;和/或若得到至少两条追溯路径,则将距离最短的追溯路径作为无人机导航路线,并控制所述无人机按所述导航路线运行。
  15. 一种计算机存储介质,其中,所述计算机存储介质上存储有计算 机程序,所述计算机程序被处理器执行时实现一种无人机室内导航方法的步骤,其中,所述无人机室内导航方法包括以下步骤:
    在接收到无人机导航请求时,获取无人机的初始位置和目的位置;
    通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息;
    根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型;
    将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行。
  16. 如权利要求15所述的计算机存储介质,其中,所述在接收到无人机导航请求时,获取无人机的初始位置和目的位置的步骤,包括:
    在接收到无人机导航请求时,获取所述无人机当前所处建筑的建筑标识,及所述建筑标识关联的预设BIM模型;
    基于所述预设BIM模型构建三维坐标系,将所述无人机当前位置的三维坐标作为所述无人机的初始位置,获取所述无人机导航请求对应的导航目的地,将所述导航目的地的三维作为所述无人机的目的位置。
  17. 如权利要求15所述的计算机存储介质,其中,所述第一采集装置包括至少一个惯性测量器和至少一个感知摄像头,所述第二采集装置包括至少一个深度摄像头;
    所述通过所述无人机中的第一采集装置采集所述无人机的运行状态信息,通过所述无人机中的第二采集装置采集所述初始位置处不同高度和不同拍摄角度的周边环境信息的步骤,包括:
    在所述初始位置处调整无人机的高度和拍摄角度,通过所述惯性 测量器采集所述无人机的方向信息,通过所述感知摄像头采集特征图像,分析所述特征图像得到所述无人机的相对位置信息,将所述方向信息和所述相对位置信息作为所述无人机的运行状态信息;
    通过所述深度摄像头发射红外线脉冲至拍摄对象,接收所述拍摄对象反射的红外线脉冲及所述红外线脉冲的反射时间,处理所述反射时间获得拍摄对象的深度图像信息,将所述深度图像信息作为周边环境信息。
  18. 如权利要求15所述的计算机存储介质,其中,所述根据所述运行状态信息和所述周边环境信息构建拍摄对象的特征点云,处理所述特征点云得到所述拍摄对象的点云模型的步骤,包括:
    提取所述运行状态信息中的方向信息和相对位置信息,将所述方向信息和所述相对位置信息进行迭代,得到所述无人机中第一采集装置的姿态变化值;
    提取所述周边环境信息中的深度图像信息,按所述姿态变化值迭代所述深度图像信息,获得所述无人机拍摄对象的特征点云;
    通过预设的SLAM算法处理所述特征点云,获得所述拍摄对象的点云模型。
  19. 如权利要求15所述的计算机存储介质,其中,所述将所述点云模型与预设BIM模型叠加得到叠加模型,根据所述叠加模型生成到达所述目的位置的导航路线,并控制所述无人机按所述导航路线运行的步骤,包括:
    确定预设BIM模型中所述初始位置对应的基准位置,将所述点云模型中的边缘信息与所述预设BIM模型中所述基准位置处的边缘信息进行比对,得到所述点云模型与所述预设BIM模型的最小距离;
    将所述点云模型与预设BIM模型按所述最小距离进行叠加,得到叠加模型;
    根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行。
  20. 如权利要求19所述的计算机存储介质,其中,所述根据所述叠加模型从所述初始位置处进行路径追溯,得到到达所述目的位置处的导航路线,并控制所述无人机按所述导航路线运行的步骤,包括:
    从所述叠加模型中所述初始位置处沿所述目的位置进行路径追溯,判断追溯路径中是否存在障碍物、追溯路径重复率是否大于预设重复率和/或是否存在至少两条追溯路径;
    若追溯路径中存在障碍物,则更换路径追溯方向;若追溯路径重复率大于预设重复率,则放弃追溯路径;和/或若得到至少两条追溯路径,则将距离最短的追溯路径作为无人机导航路线,并控制所述无人机按所述导航路线运行。
PCT/CN2020/085853 2020-02-12 2020-04-21 无人机室内导航方法、装置、设备和存储介质 WO2021159603A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010089061.5 2020-02-12
CN202010089061.5A CN111272172A (zh) 2020-02-12 2020-02-12 无人机室内导航方法、装置、设备和存储介质

Publications (1)

Publication Number Publication Date
WO2021159603A1 true WO2021159603A1 (zh) 2021-08-19

Family

ID=70997022

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085853 WO2021159603A1 (zh) 2020-02-12 2020-04-21 无人机室内导航方法、装置、设备和存储介质

Country Status (2)

Country Link
CN (1) CN111272172A (zh)
WO (1) WO2021159603A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706716A (zh) * 2021-10-21 2021-11-26 湖南省交通科学研究院有限公司 一种利用无人机倾斜摄影的公路bim建模方法

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111741263B (zh) * 2020-06-18 2021-08-31 广东电网有限责任公司 一种变电站巡检无人机的多目态势感知导航方法
CN111880566A (zh) * 2020-07-28 2020-11-03 中国银行股份有限公司 基于无人机的上门收送款方法、装置、存储介质及设备
CN113485438B (zh) * 2021-07-30 2022-03-18 南京石知韵智能科技有限公司 一种无人机空间监测路径智能规划方法及***
WO2023173409A1 (zh) * 2022-03-18 2023-09-21 深圳市大疆创新科技有限公司 信息的显示方法、模型的对比方法、装置及无人机***
CN117130392B (zh) * 2023-10-26 2024-02-20 深圳森磊弘泰消防科技有限公司 基于bim数据进行室内定位导航的无人机及控制方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104236548A (zh) * 2014-09-12 2014-12-24 清华大学 一种微型无人机室内自主导航方法
US20170193830A1 (en) * 2016-01-05 2017-07-06 California Institute Of Technology Controlling unmanned aerial vehicles to avoid obstacle collision
US20170206648A1 (en) * 2016-01-20 2017-07-20 Ez3D, Llc System and method for structural inspection and construction estimation using an unmanned aerial vehicle
WO2018048353A1 (en) * 2016-09-09 2018-03-15 Nanyang Technological University Simultaneous localization and mapping methods and apparatus
CN108303099A (zh) * 2018-06-14 2018-07-20 江苏中科院智能科学技术应用研究院 基于三维视觉slam的无人机室内自主导航方法
CN109410327A (zh) * 2018-10-09 2019-03-01 鼎宸建设管理有限公司 一种基于bim和gis的三维城市建模方法
CN109540142A (zh) * 2018-11-27 2019-03-29 达闼科技(北京)有限公司 一种机器人定位导航的方法、装置、计算设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160034013A (ko) * 2014-09-19 2016-03-29 한국건설기술연구원 무인 항공기를 이용한 건설현장 관리 시스템 및 방법
CN106441286B (zh) * 2016-06-27 2019-11-19 上海大学 基于bim技术的无人机隧道巡检***
CN109410330A (zh) * 2018-11-12 2019-03-01 中国十七冶集团有限公司 一种基于bim技术无人机航拍建模方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104236548A (zh) * 2014-09-12 2014-12-24 清华大学 一种微型无人机室内自主导航方法
US20170193830A1 (en) * 2016-01-05 2017-07-06 California Institute Of Technology Controlling unmanned aerial vehicles to avoid obstacle collision
US20170206648A1 (en) * 2016-01-20 2017-07-20 Ez3D, Llc System and method for structural inspection and construction estimation using an unmanned aerial vehicle
WO2018048353A1 (en) * 2016-09-09 2018-03-15 Nanyang Technological University Simultaneous localization and mapping methods and apparatus
CN108303099A (zh) * 2018-06-14 2018-07-20 江苏中科院智能科学技术应用研究院 基于三维视觉slam的无人机室内自主导航方法
CN109410327A (zh) * 2018-10-09 2019-03-01 鼎宸建设管理有限公司 一种基于bim和gis的三维城市建模方法
CN109540142A (zh) * 2018-11-27 2019-03-29 达闼科技(北京)有限公司 一种机器人定位导航的方法、装置、计算设备

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706716A (zh) * 2021-10-21 2021-11-26 湖南省交通科学研究院有限公司 一种利用无人机倾斜摄影的公路bim建模方法
CN113706716B (zh) * 2021-10-21 2022-01-07 湖南省交通科学研究院有限公司 一种利用无人机倾斜摄影的公路bim建模方法

Also Published As

Publication number Publication date
CN111272172A (zh) 2020-06-12

Similar Documents

Publication Publication Date Title
WO2021159603A1 (zh) 无人机室内导航方法、装置、设备和存储介质
CN109643127B (zh) 构建地图、定位、导航、控制方法及***、移动机器人
CN110446159B (zh) 一种室内无人机精确定位与自主导航的***及方法
AU2019356907B2 (en) Automated control of image acquisition via use of acquisition device sensors
US11632602B2 (en) Automated determination of image acquisition locations in building interiors using multiple data capture devices
CN106168805A (zh) 基于云计算的机器人自主行走的方法
KR20210086072A (ko) 실시간 현장 작업 모니터링 방법 및 시스템
WO2018045538A1 (zh) 无人机及其避障方法和避障***
US10896327B1 (en) Device with a camera for locating hidden object
WO2019051832A1 (zh) 可移动物体控制方法、设备及***
US11455771B2 (en) Venue survey using unmanned aerial vehicle
Cui et al. Search and rescue using multiple drones in post-disaster situation
US20230206491A1 (en) Information processing device, mobile device, information processing system, method, and program
Feng et al. Three-dimensional robot localization using cameras in wireless multimedia sensor networks
Kato A remote navigation system for a simple tele-presence robot with virtual reality
Heiniz et al. Landmark-based navigation in complex buildings
US20220060640A1 (en) Server and method for displaying 3d tour comparison
CN112991440A (zh) 车辆的定位方法和装置、存储介质和电子装置
US20240069203A1 (en) Global optimization methods for mobile coordinate scanners
US20220415193A1 (en) Image processing device, image processing method, and program
Strecker et al. MR Object Identification and Interaction: Fusing Object Situation Information from Heterogeneous Sources
CN112050814A (zh) 一种室内变电站无人机视觉导航***及方法
EP4207100A1 (en) Method and system for providing user interface for map target creation
CN117589153B (zh) 一种地图更新的方法及机器人
WO2022227090A1 (zh) 点云数据处理方法、装置、终端设备、控制***和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20918418

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/12/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20918418

Country of ref document: EP

Kind code of ref document: A1