CN113625764A - Unmanned aerial vehicle autonomous exploration method based on boundary driving, navigation system and unmanned aerial vehicle - Google Patents


Info

Publication number
CN113625764A
CN113625764A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
point
boundary
map
Prior art date
Legal status
Pending
Application number
CN202111005722.2A
Other languages
Chinese (zh)
Inventor
唐嘉宁
刘雨晴
周思达
李丁奎
张新磊
Current Assignee
Yunnan Minzu University
Original Assignee
Yunnan Minzu University
Priority date
Filing date
Publication date
Application filed by Yunnan Minzu University filed Critical Yunnan Minzu University
Priority to CN202111005722.2A
Publication of CN113625764A


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a boundary-driven unmanned aerial vehicle autonomous exploration method, a navigation system, and an unmanned aerial vehicle. The method comprises the following steps: according to a preset evaluation function, select the local boundary point with the largest information gain and the smallest yaw angle as a first target point, and drive the unmanned aerial vehicle to fly at the yaw angle determined by the evaluation function until it reaches the first target point; repeat this process until no local boundary point remains in the unmanned aerial vehicle's field of view. Then, according to the same evaluation function, select the global boundary point with the largest information gain and the smallest yaw angle as a second target point, and drive the unmanned aerial vehicle to fly at the yaw angle determined by the evaluation function until it reaches the second target point; repeat this process until all global boundary points have been explored, at which point the exploration ends. With the technical scheme provided by the invention, the unmanned aerial vehicle explores faster, takes less time, and rarely retraces its path, so it covers a larger area in the same amount of time.

Description

Unmanned aerial vehicle autonomous exploration method based on boundary driving, navigation system and unmanned aerial vehicle
Technical Field
The invention relates to the field of unmanned aerial vehicle autonomous exploration, real-time mapping, and path planning, and in particular to a method for selecting boundary-driven target points.
Background
Unmanned aerial vehicles were first used in the military field, but with the development of science and technology and the progress of society, their spread from military to police and civilian use is inevitable. An unmanned aerial vehicle has advantages such as high flight speed, wide detection range, small size, and light weight, so in many fields it can accomplish tasks that are impossible or difficult for humans. Its application fields are broad. In public security, it can be applied to aerial monitoring, target search and recording, target tracking, and guard surveillance. In forestry, it can be applied to forest security, forest-zone patrol, forest-land area measurement, wild animal and plant protection, forest fire-prevention monitoring, aerial reconnaissance of fire accidents, mountain-forest search and rescue, and emergency command during forest disaster relief. In traffic management, it can be applied to traffic supervision, road-condition monitoring, traffic-violation evidence collection, road inspection, and aerial duty. In agriculture, it can be used for pesticide spraying, pest monitoring, and farm management. In land and resources, it can be used for aerial survey and mapping, geological survey, urban planning, and engineering construction. In just a few years, unmanned aerial vehicles have flown above cities large and small around the world, and against the backdrop of the development of Internet-of-Things transmission technology, their applications have rapidly expanded into every aspect of social life.
For an unmanned aerial vehicle, exploring an unknown environment and constructing a map of it is of great significance for realizing autonomous navigation. Exploration and obstacle avoidance are basic components of the autonomous exploration problem, and how large a coverage area can be explored in a given time is an important index of exploration quality. One popular exploration method is the rapidly-exploring random tree (RRT). To find the optimal branch in the random tree, this method spends a large amount of time on computation, and the unmanned aerial vehicle may choose to hover while waiting for the optimal branch, which not only wastes time but also does nothing to reduce map entropy (map entropy measures how unknown or uncertain the map is; higher entropy means more unexplored places, and exploration is a process of reducing the map's uncertainty). Another classic method is the boundary-driven method, which makes the unmanned aerial vehicle fly to the boundary point nearest its current position while neglecting boundary points farther away. This can cause the vehicle to shuttle back and forth between two regions, lengthening both the exploration time and the path, so that exploration is inefficient and time-consuming.
Disclosure of Invention
In order to overcome the problems in the related art, the invention provides an unmanned aerial vehicle autonomous exploration method based on boundary driving, a navigation system and an unmanned aerial vehicle.
According to a first aspect of the embodiments of the present invention, there is provided a boundary-driven unmanned aerial vehicle autonomous exploration method, wherein a boundary is the edge between the known area and the unknown area in a map, and boundary points are points formed by discretizing that edge. Boundary points are divided into two types: local boundary points, which lie within the unmanned aerial vehicle's field of view, and global boundary points, which lie outside it. The method comprises:
in response to local boundary points existing in the field of view of the unmanned aerial vehicle, selecting the local boundary point with the largest information gain and the smallest yaw angle as a first target point according to a preset evaluation function, driving the unmanned aerial vehicle to fly at the yaw angle determined by the evaluation function until it reaches the first target point, and repeating this process until no local boundary point exists in the field of view;
in response to global boundary points existing outside the field of view of the unmanned aerial vehicle, selecting the global boundary point with the largest information gain and the smallest yaw angle as a second target point according to the preset evaluation function, driving the unmanned aerial vehicle to fly at the yaw angle determined by the evaluation function until it reaches the second target point, and repeating this process until all global boundary points have been explored, at which point the exploration process ends.
Further, the map is an octree map, and its generation process comprises:
generating a point cloud from data collected by a depth camera; and
converting the point cloud into an octree map.
Further, after all global boundary points in the map have been explored and before the exploration process ends, the method further comprises:
checking whether new local boundary points have been generated; if so, driving the unmanned aerial vehicle to explore them, and repeating this process until neither local nor global boundary points remain in the field of view, at which point the exploration process ends.
Further, the formula of the evaluation function is:

V(q) = I(m, q) \cdot e^{-\lambda \theta_q}, \qquad q^{*} = \arg\max_{q \in F} V(q)

where V(q) is the evaluation value of a point q, λ is a weighting coefficient, θ_q is the yaw angle of the unmanned aerial vehicle, F is the set of boundary points, and I(m, x_i) is the information gain of a point x_i in the map m, calculated as:

I(m, x_i) = H(m) - H(m \mid x_i)

where H(m) represents the information entropy of the map m, calculated as:

H(m) = -\sum_{i,j} p(m_{i,j}) \log p(m_{i,j})

and H(m | x_i) represents the conditional entropy of the map m given the known point x_i, calculated as:

H(m \mid x_i) = -\sum_{i,j} p(x_i)\, p(m_{i,j} \mid x_i) \log p(m_{i,j} \mid x_i)

where p(m_{i,j}) is the probability of any point m_{i,j} in the map, p(x_i) is the probability of the point x_i, and p(m_{i,j} | x_i) is the probability of any point m_{i,j} given the known point x_i.
Further, the method further comprises: while the unmanned aerial vehicle flies toward the first or second target point, generating instantaneous flight instructions according to the obstacles in its field of view, so that it flies according to those instructions until it reaches the first or second target point.
Further, generating an instantaneous flight instruction according to the obstacles in the field of view of the unmanned aerial vehicle specifically comprises:
if there is no obstacle in the field of view, controlling the unmanned aerial vehicle to fly straight.
Further, generating an instantaneous flight instruction according to the obstacles in the field of view of the unmanned aerial vehicle specifically comprises:
if a wall or a continuous obstacle occupies one side of the field of view, controlling the unmanned aerial vehicle to fly one step toward the midpoint of the remaining field of view, then at each subsequent step flying in the direction with the smallest change in yaw angle, until no obstacle remains in the field of view, after which it flies straight.
Further, generating an instantaneous flight instruction according to the obstacles in the field of view of the unmanned aerial vehicle specifically comprises:
if the field of view is blocked by discontinuous obstacles, discarding the blocked portion and controlling the unmanned aerial vehicle to fly toward the midpoint of the remaining fan-shaped field of view.
According to a second aspect of embodiments of the present invention, there is provided a drone navigation system, comprising a processor, and a memory having stored thereon executable code that, when executed by the processor, causes the processor to perform the method as described above.
According to a third aspect of embodiments of the present invention, there is provided a drone comprising a drone navigation system as described above.
According to the technical scheme provided by the embodiments of the present invention, the optimal point among all boundary points is selected, by an evaluation function composed of information gain and yaw angle, to guide the unmanned aerial vehicle's flight. Exploration is faster and takes less time, repeated paths are rarely taken, and a larger area is explored in the same amount of time.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
FIG. 1 is a schematic block diagram of real-time mapping and path planning;
FIG. 2 is a schematic diagram of an exploration flow of local boundary points and global boundary points;
FIG. 3 is a schematic view of the unmanned aerial vehicle flying after selecting the next flying target point;
fig. 4 is a schematic diagram of 6 environment types designed according to the relative positions of the drone and the indoor environment;
fig. 5 is a schematic view of the next flight of the drone in mode 2.
Detailed Description
Preferred embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that, although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present invention. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The technical solutions of the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
The embodiment of the invention provides a boundary-driven unmanned aerial vehicle autonomous exploration method applied to an unmanned aerial vehicle navigation system; the unmanned aerial vehicle must construct a map during exploration. The system works as follows: an RGB-D camera emits a beam of infrared light toward the detection target and calculates the distance between the object and the camera from the returned structured-light pattern. After measuring depth, the RGB-D camera pairs depth pixels with color-image pixels according to the placement of each camera, outputs a color image and a depth image in one-to-one correspondence, computes the 3-D camera coordinates of each pixel, and generates a point cloud.
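The depth-to-point-cloud step described above can be sketched with the standard pinhole back-projection. This is an illustrative sketch, not code from the patent; the intrinsics (fx, fy, cx, cy) and the tiny depth image are assumed values.

```python
# Back-project a depth image into 3-D camera coordinates (pinhole model).
# A depth of 0 marks pixels with no sensor return and is skipped.

def depth_to_points(depth, fx, fy, cx, cy):
    """depth: 2-D list of depths in meters (rows of pixels); returns (x, y, z) tuples."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:
                continue
            x = (u - cx) * z / fx   # horizontal offset scaled by depth
            y = (v - cy) * z / fy   # vertical offset scaled by depth
            points.append((x, y, z))
    return points

# A 2x2 "depth image" with two valid returns (illustrative values).
depth = [[0.0, 2.0],
         [1.0, 0.0]]
cloud = depth_to_points(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

In a real pipeline the intrinsics come from the camera's calibration, and each point would also be transformed by the drone's pose before map insertion.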
Although point cloud data is obtained from the RGB-D camera, point clouds are large, occupy much memory, and update slowly, so this embodiment uses a map form with better compression: the octree map (OctoMap). The octree map is a flexible, compressed, and updatable map form; unlike a point cloud map, when all child nodes of a block are uniformly occupied or uniformly free, the node need not be expanded. For example, physical objects are usually connected together and empty regions are usually connected together, so most octree nodes never need to expand down to the leaf level. Octrees therefore save a great deal of storage space compared with point clouds.
Therefore, in this embodiment the unmanned aerial vehicle uses an octree map and builds it while exploring: it first generates a point cloud from the data of its onboard depth camera, then converts the point cloud into an octree map.
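The point-cloud-to-octree conversion can be sketched as follows. This is a minimal fixed-depth octree with binary occupancy, purely to illustrate the data structure; production systems use the OctoMap library with probabilistic updates, and the map extent and depth here are assumed values.

```python
# Minimal octree: each node either is a leaf or has 8 children, one per octant.
# Inserting a point marks the leaf voxel that contains it as occupied.

class OctreeNode:
    __slots__ = ("children", "occupied")
    def __init__(self):
        self.children = None   # None means this node is a leaf
        self.occupied = False

def _octant(p, center):
    """Child index and child center: one bit per axis."""
    idx, new_center = 0, list(center)
    return idx, new_center  # placeholder overwritten below

def insert_point(node, p, center, half, depth):
    if depth == 0:
        node.occupied = True
        return
    if node.children is None:
        node.children = [OctreeNode() for _ in range(8)]
    idx, new_center = 0, list(center)
    for axis in range(3):
        if p[axis] >= center[axis]:
            idx |= 1 << axis
            new_center[axis] += half / 2
        else:
            new_center[axis] -= half / 2
    insert_point(node.children[idx], p, new_center, half / 2, depth - 1)

def is_occupied(node, p, center, half):
    if node.children is None:
        return node.occupied
    idx, new_center = 0, list(center)
    for axis in range(3):
        if p[axis] >= center[axis]:
            idx |= 1 << axis
            new_center[axis] += half / 2
        else:
            new_center[axis] -= half / 2
    return is_occupied(node.children[idx], p, new_center, half / 2)

# Build a map from a (hypothetical) point cloud in a 16 m cube, depth 5.
root = OctreeNode()
cloud = [(1.0, 2.0, 0.5), (1.1, 2.1, 0.5), (-3.0, 0.0, 1.0)]
for p in cloud:
    insert_point(root, p, center=(0.0, 0.0, 0.0), half=8.0, depth=5)
```

The compression benefit described above comes from the `children is None` case: a region whose children would all agree is never subdivided.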
Fig. 1 is a schematic framework of real-time mapping and path planning. The left part shows how the depth camera realizes real-time mapping: the instantaneous environmental depth information collected by the depth camera is output as a depth map by a visual depth-estimation method; meanwhile, key-frame images and data from the inertial measurement unit (IMU) carried with the depth camera are fused by a visual-inertial state-estimation method to output pose information. The octree map is then built from the instantaneous depth map and the pose information.
The right side of fig. 1 shows the path-planning principle: boundary points are detected in the octree map, where boundary points are formed by discretizing the boundary between the known and unknown areas. The method divides boundary points into two types according to their characteristics: local boundary points, which lie within the unmanned aerial vehicle's field of view, and global boundary points, which lie outside it, i.e., points relatively far from the vehicle; for example, when the unmanned aerial vehicle is in a living room, some unexplored points in the bedroom lie outside its field of view. Combining the respective advantages of the two types yields composite boundary points: among all boundary points, the point that most reduces map entropy is obtained through an evaluation function and taken as the target point, the unmanned aerial vehicle is guided to it by control instructions, and the process repeats until the unknown map is completely explored.
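The local/global split described above can be sketched by testing each boundary point against a sector-shaped field of view. This is an illustrative 2-D sketch; the sensor range and viewing angle are assumptions, not values from the patent.

```python
import math

def classify_frontiers(pos, heading, fov, max_range, frontiers):
    """A point is local if inside the sector (range + half-angle), else global."""
    local, global_ = [], []
    for p in frontiers:
        dx, dy = p[0] - pos[0], p[1] - pos[1]
        dist = math.hypot(dx, dy)
        # Smallest absolute angle between the bearing to p and the heading.
        bearing = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi)
        if dist <= max_range and bearing <= fov / 2:
            local.append(p)
        else:
            global_.append(p)
    return local, global_

loc, glob = classify_frontiers(
    pos=(0.0, 0.0), heading=0.0, fov=math.radians(90), max_range=5.0,
    frontiers=[(2.0, 0.5), (10.0, 0.0), (0.0, 3.0)])
```

Here `(2.0, 0.5)` is ahead and within range (local), `(10.0, 0.0)` is out of range, and `(0.0, 3.0)` is outside the 90° sector (both global).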
FIG. 2 shows the composite boundary-point exploration plan. Because local boundary points lie near the unmanned aerial vehicle's current field of view, exploring them is the most convenient, time-saving, and efficient choice whenever any exist. So in this embodiment, after the unmanned aerial vehicle detects boundary points, it first looks for local boundary points; if any exist, it is controlled to explore them, repeating the process until all local boundary points have been explored. It then looks for global boundary points. Because global boundary points may lie anywhere in the whole map, they are more dispersed and farther from the vehicle, so exploring them once the local points are exhausted is the efficient choice. The vehicle then explores global boundary points until all have been visited; at that moment both local and global boundary points are exhausted and the map has been globally explored.
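The two-phase loop above can be sketched as follows. This is a deliberately simplified, self-contained sketch: frontier sets are plain lists, "flying" just removes the chosen point, and `score` stands in for the evaluation function V(q) (in the real method, reaching a target reveals new frontiers from sensor data).

```python
def explore(local_pts, global_pts, score):
    """Exhaust local frontiers first, then global ones; return the visit order."""
    path = []
    while local_pts or global_pts:
        pool = local_pts if local_pts else global_pts   # local points take priority
        target = max(pool, key=score)                   # best point per the evaluation
        pool.remove(target)
        path.append(target)
    return path

# Hypothetical score: prefer points nearer the origin (negative squared distance).
route = explore([(1, 0), (0, 2)], [(9, 9)],
                score=lambda p: -(p[0] ** 2 + p[1] ** 2))
```

Both local points are visited before the lone global point, mirroring the priority described above.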
When exploring local and global boundary points, there are many candidates, so the point with the largest information gain and the smallest yaw angle is selected by the evaluation function. In this embodiment, the unmanned aerial vehicle constructs an octree map during exploration; the octree map is a grid of variable resolution, as shown in fig. 3. Occupied voxels are places where an obstacle occupies the space; unknown voxels are places the unmanned aerial vehicle has not yet observed; free voxels are obstacle-free places where the vehicle can fly. Boundary points are the points between free voxels and unknown voxels.
In a specific embodiment, as shown in fig. 3, the drone is in the lower left corner, currently at (x_curr, y_curr, z_curr). The local points of the map have been explored, but some global points remain in the upper right corner. A point (x_att, y_att, z_att), obtained from the evaluation function, balances maximum information gain against minimum yaw angle, and is therefore taken as the target point to which the drone is guided.
Specifically, the formula of the evaluation function is as follows:

V(q) = I(m, q) \cdot e^{-\lambda \theta_q}, \qquad q^{*} = \arg\max_{q \in F} V(q)

where V(q) is the evaluation value of a point q, λ is a weighting coefficient, θ_q is the yaw angle of the drone, F is the set of boundary points, and I(m, x_i) is the information gain of a point x_i in the map m, calculated as:

I(m, x_i) = H(m) - H(m \mid x_i)

where H(m) represents the information entropy of the map m, calculated as:

H(m) = -\sum_{i,j} p(m_{i,j}) \log p(m_{i,j})

and H(m | x_i) represents the conditional entropy of the map m given the known point x_i, calculated as:

H(m \mid x_i) = -\sum_{i,j} p(x_i)\, p(m_{i,j} \mid x_i) \log p(m_{i,j} \mid x_i)

where p(m_{i,j}) is the probability of any point m_{i,j} in the map, p(x_i) is the probability of the point x_i, and p(m_{i,j} | x_i) is the probability of any point m_{i,j} given the known point x_i.
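The quantities above can be made concrete with a small numeric sketch. The map is reduced to a flat list of binary occupancy probabilities, and the "after observing x_i" probabilities are supplied directly, since the sensor model is outside this example's scope. The multiplicative gain-times-yaw-penalty form of V is one plausible reading of the evaluation function (the patent's equation images are not recoverable here).

```python
import math

def H(probs):
    """Shannon entropy (bits) of independent binary cells with occupancy prob p."""
    total = 0.0
    for p in probs:
        for q in (p, 1.0 - p):
            if q > 0.0:
                total -= q * math.log2(q)
    return total

def info_gain(prior, posterior):
    """I = H(m) - H(m | x_i): entropy removed by the observation."""
    return H(prior) - H(posterior)

def V(gain, yaw, lam):
    """Evaluation value: reward information gain, penalize yaw (lam = lambda)."""
    return gain * math.exp(-lam * yaw)

prior = [0.5, 0.5, 0.5]       # three fully uncertain cells: 1 bit each
posterior = [0.0, 1.0, 0.5]   # two cells resolved by observing x_i
gain = info_gain(prior, posterior)
```

Resolving two of three uncertain cells yields a gain of 2 bits, and a larger yaw angle strictly lowers a point's evaluation value.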
To avoid collisions caused by the unmanned aerial vehicle flying too fast during exploration and failing to adjust its flight mode in time, this embodiment presets, according to the characteristics of indoor scenes, the situations the vehicle may encounter, dividing the obstacle conditions in its field of view into six types. Fig. 4 shows these relative-position types of the drone and its environment, with a corresponding mode set for each: mode 1 is the obstacle-free mode, mode 2 is the wall-on-the-right mode, mode 3 is the wall-on-the-left mode, mode 4 is the right-obstacle mode, mode 5 is the left-obstacle mode, and mode 6 is the multi-obstacle mode.
For example, in mode 1 there is no obstacle in the drone's field of view, and it may fly straight. In modes 2 and 3, a wall or wall-like continuous obstacle lies to the drone's right or left. To keep the field of view as close to its initial size as possible, and to avoid the drone being unable to explore for lack of a view, the blocked portion of the field is discarded and the midpoint of the remaining field is found; in effect the drone yaws by a certain angle, and after flying for a while its field of view is entirely freed from the blocked region. The yaw angle at this time is computed as follows:
as shown in FIG. 5, let the intersection point of the fan-shaped field of view of the unmanned aerial vehicle and the wall surface be A (x)A,yA,zA) The current coordinate of the unmanned plane is B (x)B,yB,zB) Original flight direction and vector of unmanned aerial vehicle
Figure BDA0003236973120000074
The included angle between the two is phi, and the tangent angle can be obtained:
Figure BDA0003236973120000081
at this time phi satisfies:
Figure BDA0003236973120000082
let the yaw angle between the next flight direction and the current flight direction of the unmanned aerial vehicle be theta, and the size of the fan-shaped view angle of the unmanned aerial vehicle be theta
Figure BDA0003236973120000083
The following can be obtained:
Figure BDA0003236973120000084
The drone is controlled to fly one step toward the midpoint of the remaining field of view according to the computed yaw angle θ, and thereafter, at each step, to fly in the direction with the smallest change in yaw angle, until no obstacle remains in its field of view. The yaw angle at each step is computed as:

\theta_{candidates} = \arg\min_{x \in F_{free}} \left| \theta_x - \theta_{goal} \right|

R_{t+1} = R_t + \Delta R

\theta_{t+1} = \theta_t + \theta_{goal}

t \in \{0, 1, \ldots, n\}

R_{min} \le \Delta R \le R_{max}

0 \le \theta_t \le 2\pi

where θ_candidates is the candidate yaw angle computed for each point, F_free is the set of boundary points whose voxels in the current field of view are free, θ_goal is the yaw offset of the target point, θ_t is the flight heading at time t, R_t is the step length at time t, ΔR is a predefined increment used to update R_t, and R_min and R_max are the minimum and maximum step lengths of the drone.
Because the size of the drone's turning angle affects its flight speed, in this embodiment the direction with the smallest change in yaw angle is chosen, so the drone can keep flying fast. When the drone has yawed until its field of view is no longer blocked, there is no obstacle in view; it returns to mode 1 and flies according to mode 1's instruction. In modes 4, 5, and 6, the drone's field of view is blocked by discontinuous obstacles; in that case the blocked portion of the field is discarded and the drone flies toward the midpoint of the remaining fan-shaped field of view, so that its view gradually escapes the obstruction.
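The mode-based instantaneous commands can be sketched as a single dispatch reduced to three behaviors: fly straight (mode 1), yaw toward the midpoint of the field remaining beside a wall (modes 2-3, using the Θ/4 − φ/2 reading of the geometry above as an assumption), or aim at the midpoint of the unblocked sector (modes 4-6). The mode names and the free-sector representation are illustrative, not from the patent.

```python
def instant_command(mode, fov, phi=None, free_sector=None):
    """Return a yaw command (radians, positive = toward the free side)."""
    if mode == "clear":                       # mode 1: no obstacle
        return 0.0                            # keep current heading
    if mode in ("wall_left", "wall_right"):   # modes 2-3: continuous wall
        yaw = fov / 4 - phi / 2               # midpoint of remaining field
        return -yaw if mode == "wall_left" else yaw
    # modes 4-6: discontinuous obstacles; aim at the free sector's midpoint
    lo, hi = free_sector
    return (lo + hi) / 2.0

cmd = instant_command("wall_right", fov=1.2, phi=0.2)
```

With a 1.2 rad field of view and the wall edge at 0.2 rad, the drone yaws 0.2 rad away from the wall; once no obstacle remains in view, the caller would switch back to the `"clear"` mode.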
In this embodiment, multiple environment types are designed so that the unmanned aerial vehicle can respond quickly to unknown environments; combined with the cooperation of the two kinds of boundary points and the boundary-point evaluation function, the vehicle can explore an unknown environment rapidly while covering as much of it as possible. The vehicle's autonomous exploration efficiency is greatly improved, with exploration speeds of up to 2.5 m/s. Finally, the exploration is visualized and the octree map is displayed, making it convenient to check whether exploration is complete and whether the built map is the correct size, and to track the flight path, so that an operator can see the path directly and analyze whether the drone re-explores the same area and whether the exploration path is smooth.
Simulation experiments on the technical scheme of this embodiment show that the method achieves a higher exploration speed and coverage rate than existing methods, and has good feasibility and effectiveness.
The embodiment of the invention provides an unmanned aerial vehicle navigation system which comprises a processor and a memory.
The Processor may be a Central Processing Unit (CPU), other general-purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory may include various types of storage units such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions required by the processor or other modules of the computer. The persistent storage device may be a read-write storage device, and may be a non-volatile device that does not lose stored instructions and data even after the computer is powered off. In some embodiments, the persistent storage is a mass storage device (e.g., magnetic or optical disk, flash memory); in other embodiments, it may be a removable storage device (e.g., floppy disk, optical drive). The system memory may be a read-write memory device or a volatile read-write memory device, such as dynamic random-access memory, and may store instructions and data that some or all of the processors require at runtime. Further, the memory may comprise any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and magnetic and/or optical disks. In some embodiments, the memory may include a removable storage device that is readable and/or writable, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., SD card, mini-SD card, Micro-SD card), or a magnetic floppy disc. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory has stored thereon executable code which, when processed by the processor, causes the processor to perform some or all of the methods described above.
The embodiment of the invention provides an unmanned aerial vehicle, which comprises the unmanned aerial vehicle navigation system.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out some or all of the steps of the above-described method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform part or all of the various steps of the above-described method according to the invention.
Aspects of the invention have been described in detail above with reference to the drawings. The descriptions of the respective embodiments each have their own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments. Those skilled in the art will also appreciate that the acts and modules referred to in the specification are not necessarily required by the invention. In addition, the steps of the method according to embodiments of the present invention may be reordered, combined, or deleted according to actual needs, and the modules of the device according to embodiments of the present invention may likewise be combined, divided, or deleted according to actual needs.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A boundary-driven unmanned aerial vehicle autonomous exploration method, wherein a boundary is an edge between a known area and an unknown area in a map, and boundary points are points obtained by discretizing that edge; the boundary points are divided into two types: local boundary points, which lie within the field of view of the unmanned aerial vehicle, and global boundary points, which lie outside the field of view of the unmanned aerial vehicle, the method comprising:
in response to local boundary points existing within the field of view of the unmanned aerial vehicle, selecting the local boundary point with the largest information gain and the smallest yaw angle as a first target point according to a preset evaluation function, and driving the unmanned aerial vehicle to fly at the yaw angle determined by the evaluation function until the unmanned aerial vehicle reaches the first target point, repeating this process until no local boundary points remain within the field of view of the unmanned aerial vehicle; and
in response to global boundary points existing outside the field of view of the unmanned aerial vehicle, selecting the global boundary point with the largest information gain and the smallest yaw angle as a second target point according to the preset evaluation function, and driving the unmanned aerial vehicle to fly at the yaw angle determined by the evaluation function until the unmanned aerial vehicle reaches the second target point, repeating this process until all global boundary points have been explored, and then ending the exploration process.
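The two-stage selection loop of claim 1 can be sketched in Python. This is an illustrative sketch only, not the patented implementation: the `BoundaryPoint` class, the `gain` callback, and the weighting `lam` are assumptions standing in for the patent's evaluation function.

```python
import math
from dataclasses import dataclass

@dataclass
class BoundaryPoint:
    x: float
    y: float
    in_view: bool  # True -> local boundary point, False -> global

def evaluate(point, drone_pos, drone_yaw, gain, lam=1.0):
    """Score a boundary point: larger information gain and smaller
    required yaw change give a larger evaluation value."""
    yaw = math.atan2(point.y - drone_pos[1], point.x - drone_pos[0])
    # wrap the yaw difference into [-pi, pi] and take its magnitude
    yaw_change = abs((yaw - drone_yaw + math.pi) % (2 * math.pi) - math.pi)
    return gain(point) - lam * yaw_change

def pick_target(points, drone_pos, drone_yaw, gain):
    """Prefer local boundary points; fall back to global boundary points
    once none remain in view. Returns None when exploration is finished."""
    local = [p for p in points if p.in_view]
    pool = local if local else points
    if not pool:
        return None
    return max(pool, key=lambda p: evaluate(p, drone_pos, drone_yaw, gain))
```

In use, the planner would call `pick_target` repeatedly, removing boundary points as the map around them becomes known, until it returns `None`.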
2. The method of claim 1, wherein the map is an octree map, and wherein the octree map is generated by a process comprising:
generating a point cloud according to data collected by a depth camera;
and converting the point cloud into the octree map.
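The map-building step of claim 2 can be illustrated with a minimal voxel-hashing sketch. A real system would typically use an octree implementation such as OctoMap with log-odds updates; the fixed resolution and the single occupancy value below are assumptions made for illustration.

```python
def voxel_key(point, resolution=0.1):
    """Quantize a 3-D point (x, y, z) to its integer voxel index."""
    return tuple(int(c // resolution) for c in point)

def insert_cloud(occupancy, cloud, resolution=0.1, hit=0.85):
    """Mark every voxel containing at least one depth-camera point as
    (probably) occupied. `occupancy` maps voxel index -> probability."""
    for p in cloud:
        occupancy[voxel_key(p, resolution)] = hit
    return occupancy
```

Points that fall inside the same 0.1 m voxel collapse to one map cell, which is the compression an octree map exploits hierarchically.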
3. The method of claim 1, wherein after the exploration of all global boundary points in the map is completed and before the exploration process is finished, the method further comprises:
checking whether new local boundary points have been generated; if so, driving the unmanned aerial vehicle to explore them, and repeating this process until neither local boundary points nor global boundary points remain, whereupon the exploration process ends.
4. The method of claim 3, wherein the evaluation function is formulated as:
V(q) = I(m, x_q) - λθ_q, q ∈ F
where V(q) is the evaluation value of a boundary point q, λ is a weighting coefficient, θ_q is the yaw angle of the unmanned aerial vehicle toward q, F is the set of boundary points, and I(m, x_i) is the information gain of a point x_i in the map m, calculated by the formula:
I(m,xi)=H(m)-H(m|xi)
wherein H(m) represents the information entropy of the map m, calculated by the formula:
H(m) = -Σ_{i,j} p(m_{i,j}) log p(m_{i,j})
H(m|x_i) represents the conditional entropy of the map m given the known point x_i, calculated by the formula:
H(m|x_i) = -Σ_{i,j} p(x_i) p(m_{i,j}|x_i) log p(m_{i,j}|x_i)
wherein p(m_{i,j}) is the probability of any point m_{i,j} in the map, p(x_i) is the probability of the point x_i, and p(m_{i,j}|x_i) is the probability of any point m_{i,j} in the map given the known point x_i.
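The information gain I(m, x_i) = H(m) - H(m|x_i) of claim 4 can be illustrated on a toy occupancy grid. This sketch treats each map cell as a Bernoulli (occupied/free) variable with base-2 logarithms; the grid probabilities are invented for illustration and the binary-entropy form is an assumption about how the entropy over cells is evaluated.

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a map given per-cell occupancy
    probabilities, treating each cell as an independent binary variable."""
    h = 0.0
    for p in probs:
        for q in (p, 1.0 - p):
            if q > 0.0:  # 0 * log 0 is taken as 0
                h -= q * math.log2(q)
    return h

def information_gain(prior, posterior):
    """H(m) - H(m|x_i): the entropy reduction expected from observing
    the cells from viewpoint x_i (posterior = probabilities after the
    hypothetical observation)."""
    return entropy(prior) - entropy(posterior)
```

For example, a viewpoint that would resolve four completely unknown cells (probability 0.5 each) to certainty yields a gain of 4 bits, making it a strong candidate target.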
5. The method according to any one of claims 1-4, further comprising:
generating, while the unmanned aerial vehicle flies toward the first target point or the second target point, an instantaneous flight instruction according to the obstacles within the field of view of the unmanned aerial vehicle, so that the unmanned aerial vehicle flies according to the instantaneous flight instruction until it reaches the first target point or the second target point.
6. The method according to claim 5, wherein the generating of the instantaneous flight command according to the condition of the obstacle in the field of view of the unmanned aerial vehicle comprises:
if no obstacle exists within the field of view of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to fly in a straight line.
7. The method according to claim 5, wherein the generating of the instantaneous flight command according to the condition of the obstacle in the field of view of the unmanned aerial vehicle comprises:
if a wall surface or continuous obstacles exist on one side of the field of view of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to fly one step length toward the midpoint of the remaining field of view, then, at each subsequent step length, controlling it to fly in the direction with the smallest change in yaw angle, and controlling it to fly in a straight line once no obstacle remains in its field of view.
8. The method according to claim 5, wherein the generating of the instantaneous flight command according to the condition of the obstacle in the field of view of the unmanned aerial vehicle comprises:
if the field of view of the unmanned aerial vehicle is blocked by discontinuous obstacles, discarding the blocked portions of the field of view and controlling the unmanned aerial vehicle to fly toward the midpoint of the remaining fan-shaped field of view.
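The reactive rules of claims 6-8 can be sketched as a single heading selector: fly straight through the sector when it is clear, otherwise steer toward the midpoint of an unobstructed sub-sector. This is an illustrative sketch, not the patented controller; representing obstacles as sorted, non-overlapping angular intervals and choosing the widest free sub-sector are assumptions.

```python
def instantaneous_heading(fov_min, fov_max, blocked):
    """Return a yaw command within the sector [fov_min, fov_max].
    `blocked` is a sorted list of non-overlapping (lo, hi) angular
    intervals occluded by obstacles, all inside the sector."""
    if not blocked:
        # clear view: head straight through the middle of the sector
        return 0.5 * (fov_min + fov_max)
    # Build the free sub-intervals between consecutive blocked intervals.
    edges = [fov_min]
    for lo, hi in blocked:
        edges += [lo, hi]
    edges.append(fov_max)
    free = [(edges[i], edges[i + 1]) for i in range(0, len(edges), 2)
            if edges[i + 1] > edges[i]]
    if not free:
        raise ValueError("field of view fully blocked")
    # Discard the blocked portions and aim at the midpoint of the
    # widest remaining fan-shaped sub-sector.
    lo, hi = max(free, key=lambda iv: iv[1] - iv[0])
    return 0.5 * (lo + hi)
```

With a wall filling the left half of a symmetric sector, the command steers to the midpoint of the right half; with a small obstacle in the middle, it favors the wider free side.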
9. A drone navigation system comprising a processor, and a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any one of claims 1-8.
10. A drone comprising the drone navigation system of claim 9.
CN202111005722.2A 2021-08-30 2021-08-30 Unmanned aerial vehicle autonomous exploration method based on boundary driving, navigation system and unmanned aerial vehicle Pending CN113625764A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111005722.2A CN113625764A (en) 2021-08-30 2021-08-30 Unmanned aerial vehicle autonomous exploration method based on boundary driving, navigation system and unmanned aerial vehicle


Publications (1)

Publication Number Publication Date
CN113625764A true CN113625764A (en) 2021-11-09

Family

ID=78388401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111005722.2A Pending CN113625764A (en) 2021-08-30 2021-08-30 Unmanned aerial vehicle autonomous exploration method based on boundary driving, navigation system and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN113625764A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117606489A (en) * 2024-01-22 2024-02-27 华南农业大学 Unmanned aerial vehicle flight survey method, unmanned aerial vehicle flight survey equipment and unmanned aerial vehicle flight survey medium
CN117606489B (en) * 2024-01-22 2024-05-28 华南农业大学 Unmanned aerial vehicle flight survey method, unmanned aerial vehicle flight survey equipment and unmanned aerial vehicle flight survey medium

Similar Documents

Publication Publication Date Title
US11393216B2 (en) Method of computer vision based localisation and navigation and system for performing the same
CN107015559B (en) Probabilistic inference of target tracking using hash weighted integration and summation
Azpúrua et al. Multi-robot coverage path planning using hexagonal segmentation for geophysical surveys
KR20200134313A (en) Relative Atlas and Its Creation for Autonomous Vehicles
KR102286005B1 (en) Cruise control system and cruise control method thereof
EP2818957A1 (en) System and method for UAV landing
US11370115B2 (en) Path planning for an unmanned vehicle
US11237269B2 (en) Localization technique
CN110515390B (en) Autonomous landing method and device of aircraft, electronic equipment and storage medium
CN113924459B (en) Dynamic sensor range detection for vehicle navigation
KR102241584B1 (en) Method and device for detecting emergency vehicles in real time and planning driving routes to cope with situations to be expected to be occurred by the emergency vehicles
US11970185B2 (en) Data structure for storing information relating to an environment of an autonomous vehicle and methods of use thereof
US11645775B1 (en) Methods and apparatus for depth estimation on a non-flat road with stereo-assisted monocular camera in a vehicle
Wallar et al. Foresight: Remote sensing for autonomous vehicles using a small unmanned aerial vehicle
Smith et al. Real-time egocentric navigation using 3d sensing
US11474255B2 (en) System and method for determining optimal lidar placement on autonomous vehicles
Collins et al. Using a DEM to determine geospatial object trajectories
CN113625764A (en) Unmanned aerial vehicle autonomous exploration method based on boundary driving, navigation system and unmanned aerial vehicle
CN114379802A (en) Automatic safe landing place selection for unmanned flight system
Kamat et al. A survey on autonomous navigation techniques
Hong et al. Hierarchical world model for an autonomous scout vehicle
CN116310743A (en) Method, device, mobile device and storage medium for determining expansion strategy
TWI809727B (en) Method for searching a path by using a three-dimensional reconstructed map
Moustafa et al. Towards Autonomous Drone-Based Dynamic and Seismic Response Monitoring of Bridges
Gal et al. Motion Planning in 3D Environments Using Visibility Velocity Obstacles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination