CN109752004B - Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle - Google Patents

Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle

Info

Publication number
CN109752004B
CN109752004B (application no. CN201811641419.XA)
Authority
CN
China
Prior art keywords
dimensional code
aerial vehicle
unmanned aerial
area information
flying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811641419.XA
Other languages
Chinese (zh)
Other versions
CN109752004A (en)
Inventor
朱宁莉
邓海
Current Assignee
Aisino Corp
Original Assignee
Aisino Corp
Priority date
Filing date
Publication date
Application filed by Aisino Corp filed Critical Aisino Corp
Priority to CN201811641419.XA priority Critical patent/CN109752004B/en
Publication of CN109752004A publication Critical patent/CN109752004A/en
Application granted granted Critical
Publication of CN109752004B publication Critical patent/CN109752004B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The invention provides an indoor unmanned aerial vehicle navigation method, which comprises the following steps: when the area information of the flying target and the area information of the takeoff origin are compared and fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset slow tracking mode, controlling the unmanned aerial vehicle to fly in the slow tracking mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset accurate positioning mode, controlling the unmanned aerial vehicle to fly in the accurate positioning mode until the flying target is reached, the flying target being a storage unit in an indoor warehouse. The invention uses two-dimensional codes for unmanned aerial vehicle navigation and applies the idea of the environment serving the system, avoiding drawbacks such as system complexity and increased unmanned aerial vehicle dead weight caused by complex visual image processing.

Description

Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle navigation, in particular to an indoor unmanned aerial vehicle navigation method and device and an indoor unmanned aerial vehicle.
Background
Autonomous control of unmanned aerial vehicles is strongly constrained by mission energy consumption and the weight of airborne sensors. When an unmanned aerial vehicle flies autonomously in an indoor environment, it generally needs a vision sensor to acquire indoor environment information for positioning. Commonly used vision sensors include laser sensors, monocular cameras, binocular cameras, and RGB-D (color and depth) cameras.
On the other hand, in an industrial environment such as a large warehouse, positioning algorithms based on visual sensors may fail to find distinguishing features at all because local environments are highly similar, making positioning impossible.
Disclosure of Invention
The invention provides an indoor unmanned aerial vehicle navigation method and device and an indoor unmanned aerial vehicle, and aims to solve the problems of complexity, insufficient accuracy and the like of a positioning method of a small unmanned aerial vehicle in autonomous flight in an indoor environment.
In a first aspect, the invention provides an indoor unmanned aerial vehicle navigation method, which comprises the following steps:
step S1: when the area information of the flying target and the area information of the takeoff origin are compared and fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode;
step S2: analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a fast cruise mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset slow tracking mode, controlling the unmanned aerial vehicle to fly in the slow tracking mode;
step S3: analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a slow tracking mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset accurate positioning mode, the unmanned aerial vehicle is controlled to fly in the accurate positioning mode until the flying target is reached, and the flying target is a storage unit in an indoor warehouse.
Preferably, the method further comprises the following steps:
according to the multiple static two-dimensional code images, identifying the region information of the takeoff origin where the unmanned aerial vehicle is located, wherein the static two-dimensional code images are generated when the camera device shoots adjacent regions from different angles before flying.
Preferably, the step S1 includes:
extracting the area information of a flying target from a flying target two-dimensional code image provided in a flying instruction, wherein the flying target two-dimensional code image is the same as a two-dimensional code image arranged at the flying target;
comparing the two position marks in the two-dimensional code map according to the position mark in the area information of the takeoff origin and the position mark in the area information of the flying target, when determining that the two position marks do not belong to the pre-divided position areas with the same number,
and when the unmanned aerial vehicle is determined to fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode, wherein the two-dimensional code map is generated in advance according to the position relation of two-dimensional codes arranged on each storage unit in the indoor warehouse.
Preferably, the step S2 includes:
when the unmanned aerial vehicle flies in a fast cruise mode, a camera device shoots two-dimensional code pictures of adjacent areas in real time according to a first updating frequency to generate dynamic two-dimensional code images, and the time interval of each two-dimensional code image in the dynamic two-dimensional code images is the first updating frequency;
if the position identifier in the area information of the flying target and the position identifier in the area information of the current flying area both belong to the pre-divided position areas with the same number in the two-dimensional code map in the continuous n detection periods,
determining to fall into a preset slow tracking mode, and controlling the unmanned aerial vehicle to fly in the slow tracking mode.
Preferably, the step S3 includes:
when the unmanned aerial vehicle flies in a slow tracking mode, the camera device shoots two-dimensional code pictures of adjacent areas in real time according to a second updating frequency to generate dynamic two-dimensional code images, wherein the time interval of each two-dimensional code image in the dynamic two-dimensional code images is the second updating frequency, and the second updating frequency is greater than the first updating frequency;
if the position identifier in the area information of the flying target and the position identifier in the area information of the current flying area both belong to the pre-divided shelf areas with the same number in the two-dimensional code map in the continuous m detection periods,
and when the unmanned aerial vehicle is determined to fall into the preset accurate positioning mode, controlling the unmanned aerial vehicle to fly in the accurate positioning mode until the unmanned aerial vehicle reaches a flying target.
Preferably, when the unmanned aerial vehicle flies in the accurate positioning mode, the camera device shoots the two-dimensional code pictures of the adjacent area in real time according to a third updating frequency to generate a dynamic two-dimensional code image, the time interval of each two-dimensional code image in the dynamic two-dimensional code image is the third updating frequency, and the third updating frequency is greater than the second updating frequency.
Preferably, in the fast cruise mode, the unmanned aerial vehicle flies fast according to the two-azimuth path strategy provided in the two-dimensional code map;
in the slow tracking mode, the unmanned aerial vehicle flies slowly according to the three-azimuth path strategy provided in the two-dimensional code map;
in the accurate positioning mode, the unmanned aerial vehicle flies at a low speed according to the five-azimuth path strategy provided in the two-dimensional code map.
In a second aspect, the present invention provides an indoor unmanned aerial vehicle navigation apparatus, including:
a fast cruise control module to:
when the area information of the flying target and the area information of the takeoff origin are compared and fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode;
a slow tracking control module to:
analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a fast cruise mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset slow tracking mode, controlling the unmanned aerial vehicle to fly in the slow tracking mode;
a fine positioning control module for:
analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a slow tracking mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset accurate positioning mode, the unmanned aerial vehicle is controlled to fly in the accurate positioning mode until the flying target is reached, and the flying target is a storage unit in an indoor warehouse.
Preferably, the method further comprises the following steps:
a takeoff origin determination module to:
according to the multiple static two-dimensional code images, identifying the region information of the takeoff origin where the unmanned aerial vehicle is located, wherein the static two-dimensional code images are generated when the camera device shoots adjacent regions from different angles before flying.
In a third aspect, the present invention provides an indoor unmanned aerial vehicle provided with the navigation device explained in the second aspect.
The indoor unmanned aerial vehicle navigation method provided by the invention performs navigation based on an indoor map system that combines two-dimensional codes with the two-dimensional code images arranged in the indoor warehouse; it is simple to implement and accurate in positioning. By combining accurate planar or three-dimensional map information of the indoor environment in which the unmanned aerial vehicle routinely operates with the two-dimensional codes to form an indoor map system, the method achieves high robustness and provides unique position identification information, helping the unmanned aerial vehicle obtain accurate positioning.
Drawings
A more complete understanding of exemplary embodiments of the present invention may be had by reference to the following drawings in which:
fig. 1 is a schematic flow chart of an indoor unmanned aerial vehicle navigation method according to a preferred embodiment of the present invention;
fig. 2 is a schematic composition diagram of an indoor unmanned aerial vehicle navigation device according to a preferred embodiment of the present invention;
fig. 3 is a schematic view of a navigation path between position identifiers in a two-dimensional code map according to a preferred embodiment of the present invention.
Detailed Description
The exemplary embodiments of the present invention will now be described with reference to the accompanying drawings; however, the present invention may be embodied in many different forms and is not limited to the embodiments described herein, which are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. The terms used in the exemplary embodiments shown in the drawings are not intended to limit the present invention. In the drawings, the same units/elements are denoted by the same reference numerals.
Unless otherwise defined, terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Further, it will be understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense.
Although airborne sensors for unmanned aerial vehicles are developing toward combining accuracy and precision, light weight, and high cost effectiveness, autonomous flight of small unmanned aerial vehicles in indoor environments still poses technical problems.
For example, registering and positioning the scanning dot matrix acquired by a laser sensor with the Iterative Closest Point (ICP) algorithm offers good computational real-time performance and stable positioning output, and many laboratories at home and abroad have used this method to achieve positioning and autonomous unmanned aerial vehicle flight in certain specific indoor environments. However, the method can only obtain two-dimensional scanning information, so it is suited to environments with many vertical planes; its insufficient perception capability becomes apparent in complex three-dimensional environments.
For a monocular camera, a Structure From Motion (SFM) method is generally used to calculate a fundamental matrix and thereby obtain the motion direction of the camera; however, this method cannot recover the motion distance, so it cannot be used in an unknown complex indoor environment.
A binocular vision system can recover the depth information of many points in the image: pixel points in the image are mapped into three-dimensional space to obtain three-dimensional depth information, and the motion direction and distance of the camera system are then solved using the associations among the three-dimensional depth information. The method imposes strict calibration requirements on the cameras and is therefore expensive.
The environmental information obtained by an RGB-D camera is similar to that of a binocular camera: the three-dimensional position of a space point relative to the camera and ordinary two-dimensional image information can be obtained directly, and the camera's 6 degrees of freedom of motion (direction and distance) can be recovered with methods similar to those used for binocular cameras. Compared with a binocular camera, an RGB-D camera has the advantage of low price; however, it suffers from problems such as poor data quality, high noise, and inherent data delay.
Thanks to the development of encoding technology, two-dimensional codes are widely used in many applications. A two-dimensional code records data symbol information using black and white patterns distributed according to a certain rule in two dimensions, and has the advantages of high information density, large information capacity, strong fault tolerance, and reliable decoding. Moreover, two-dimensional code images are cheap to generate, use, and copy, easy to manufacture, and durable.
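The patent does not specify the payload layout of the navigation codes. As a purely illustrative sketch, assume each code carries a position identifier of the form warehouse/shelf/unit that can be parsed into the area information later used for mode comparison; the `PositionId` type, the field names, and the `W..`/`S..`/`U..` payload format are all hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PositionId:
    """Hypothetical area information decoded from one navigation QR code."""
    warehouse: int  # pre-divided position area number
    shelf: int      # shelf area number within the warehouse
    unit: int       # storage unit number on the shelf

def parse_position_id(payload: str) -> PositionId:
    """Parse a 'warehouse/shelf/unit' payload such as 'W03/S12/U07'."""
    w, s, u = payload.split("/")
    return PositionId(int(w.lstrip("W")), int(s.lstrip("S")), int(u.lstrip("U")))

print(parse_position_id("W03/S12/U07"))
```

Any real deployment would of course use whatever encoding the warehouse's two-dimensional code map actually defines.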
The invention uses two-dimensional codes for unmanned aerial vehicle navigation and applies the idea of the environment serving the system, avoiding drawbacks such as system complexity and increased unmanned aerial vehicle dead weight caused by complex visual image processing.
The indoor unmanned aerial vehicle navigation method provided by the invention combines the two-dimensional codes preset in the warehouse, on the multilayer shelves, and on each storage unit with the two-dimensional code map generated from those codes to realize on-line navigation of the indoor unmanned aerial vehicle, with low cost and high robustness.
As shown in fig. 1, the indoor unmanned aerial vehicle navigation method of the embodiment of the present invention includes the following steps:
step S1: when the area information of the flying target and the area information of the takeoff origin are compared and fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode;
step S2: analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a fast cruise mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset slow tracking mode, controlling the unmanned aerial vehicle to fly in the slow tracking mode;
step S3: analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a slow tracking mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset accurate positioning mode, the unmanned aerial vehicle is controlled to fly in the accurate positioning mode until the flying target is reached, and the flying target is a storage unit in an indoor warehouse.
Specifically, the navigation method further includes:
according to the multiple static two-dimensional code images, identifying the region information of the takeoff origin where the unmanned aerial vehicle is located, wherein the static two-dimensional code images are generated when the camera device shoots adjacent regions from different angles before flying.
It should be understood that a static two-dimensional code image refers to a picture of a two-dimensional code taken while the image pickup apparatus is relatively stationary with respect to the photographic target.
Specifically, in the navigation method, step S1 specifically includes:
extracting the area information of a flying target from a flying target two-dimensional code image provided in a flying instruction, wherein the flying target two-dimensional code image is the same as a two-dimensional code image arranged at the flying target;
comparing the two position marks in the two-dimensional code map according to the position mark in the area information of the takeoff origin and the position mark in the area information of the flying target, when determining that the two position marks do not belong to the pre-divided position areas with the same number,
and when the unmanned aerial vehicle is determined to fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode, wherein the two-dimensional code map is generated in advance according to the position relation of two-dimensional codes arranged on each storage unit in the indoor warehouse.
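The mode decision of step S1 can be sketched as a comparison of the pre-divided area numbers in the two position identifiers. The `PositionId` fields, mode names, and the two fall-through branches (for an origin already inside the target's position or shelf area) are my extrapolation from the three mode conditions, not text from the patent:

```python
from dataclasses import dataclass
from enum import Enum

class FlightMode(Enum):
    FAST_CRUISE = 1       # origin and target in differently numbered position areas
    SLOW_TRACKING = 2     # same position area, different shelf area
    FINE_POSITIONING = 3  # same shelf area

@dataclass(frozen=True)
class PositionId:
    warehouse: int  # pre-divided position area number
    shelf: int      # shelf area number
    unit: int       # storage unit number

def initial_mode(origin: PositionId, target: PositionId) -> FlightMode:
    """Step S1: fall into fast cruise when the takeoff origin and the flying
    target do not belong to position areas with the same number."""
    if origin.warehouse != target.warehouse:
        return FlightMode.FAST_CRUISE
    if origin.shelf != target.shelf:
        return FlightMode.SLOW_TRACKING
    return FlightMode.FINE_POSITIONING

print(initial_mode(PositionId(1, 2, 3), PositionId(4, 1, 1)))
```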
Specifically, in the navigation method, the step S2 includes:
when the unmanned aerial vehicle flies in a fast cruise mode, a camera device shoots two-dimensional code pictures of adjacent areas in real time according to a first updating frequency to generate dynamic two-dimensional code images, and the time interval of each two-dimensional code image in the dynamic two-dimensional code images is the first updating frequency;
if the position identifier in the area information of the flying target and the position identifier in the area information of the current flying area both belong to the pre-divided position areas with the same number in the two-dimensional code map in the continuous n detection periods,
determining to fall into a preset slow tracking mode, and controlling the unmanned aerial vehicle to fly in the slow tracking mode.
It should be understood that a dynamic two-dimensional code image refers to a picture of a two-dimensional code taken while the image pickup apparatus is moving relative to the photographic target.
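The n-consecutive-detection-period condition of step S2 behaves like a debounce counter over the stream of per-period comparison results. A minimal sketch, with the boolean stream and the value of n invented for illustration:

```python
def reaches_slow_tracking(same_area_per_period, n):
    """Return True once the target's and current area's position identifiers
    have belonged to the same-numbered pre-divided position area for n
    consecutive detection periods (the switching condition of step S2)."""
    streak = 0
    for same_area in same_area_per_period:
        streak = streak + 1 if same_area else 0  # any mismatch resets the run
        if streak >= n:
            return True
    return False

# Two interrupted runs, then three consecutive matches -> mode switch.
print(reaches_slow_tracking([False, True, True, False, True, True, True], n=3))  # True
```

The same counter, with m in place of n and shelf-area numbers in place of position-area numbers, would implement the step S3 switch into the accurate positioning mode.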
Specifically, in the navigation method, the step S3 includes:
when the unmanned aerial vehicle flies in a slow tracking mode, the camera device shoots two-dimensional code pictures of adjacent areas in real time according to a second updating frequency to generate dynamic two-dimensional code images, wherein the time interval of each two-dimensional code image in the dynamic two-dimensional code images is the second updating frequency, and the second updating frequency is greater than the first updating frequency;
if the position identifier in the area information of the flying target and the position identifier in the area information of the current flying area both belong to the pre-divided shelf areas with the same number in the two-dimensional code map in the continuous m detection periods,
and when the unmanned aerial vehicle is determined to fall into the preset accurate positioning mode, controlling the unmanned aerial vehicle to fly in the accurate positioning mode until the unmanned aerial vehicle reaches a flying target.
Specifically, in the navigation method, when the unmanned aerial vehicle flies in the accurate positioning mode, the camera device shoots two-dimensional code pictures of adjacent areas in real time according to a third updating frequency to generate dynamic two-dimensional code images, wherein the time interval of each two-dimensional code image in the dynamic two-dimensional code images is the third updating frequency, and the third updating frequency is greater than the second updating frequency.
Specifically, in the navigation method, in the fast cruise mode, the unmanned aerial vehicle flies fast according to the two-azimuth path strategy provided in the two-dimensional code map;
in the slow tracking mode, the unmanned aerial vehicle flies slowly according to the three-azimuth path strategy provided in the two-dimensional code map;
in the accurate positioning mode, the unmanned aerial vehicle flies at a low speed according to the five-azimuth path strategy provided in the two-dimensional code map.
It should be understood that as the unmanned aerial vehicle gradually approaches the flight target, the update cycle of the two-dimensional code image is gradually shortened and the update frequency gradually increased. This provides higher-resolution navigation near the flight target while saving the computing resources consumed by picture processing farther from it.
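The relation between the three modes and their capture rates can be expressed as a simple lookup. The patent only fixes the ordering (first update frequency < second < third); the concrete frame rates below are invented for illustration:

```python
# Hypothetical capture rates in frames per second; only the ordering
# fast cruise < slow tracking < accurate positioning comes from the method.
UPDATE_HZ = {
    "fast_cruise": 2.0,       # first update frequency
    "slow_tracking": 5.0,     # second update frequency
    "fine_positioning": 10.0, # third update frequency
}

def capture_interval_s(mode: str) -> float:
    """Time interval between two frames of the dynamic QR image stream."""
    return 1.0 / UPDATE_HZ[mode]

assert UPDATE_HZ["fast_cruise"] < UPDATE_HZ["slow_tracking"] < UPDATE_HZ["fine_positioning"]
print(capture_interval_s("slow_tracking"))  # 0.2
```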
As shown in fig. 2, an indoor drone navigation device according to an embodiment of the present invention includes:
a fast cruise control module 10 for:
when the area information of the flying target and the area information of the takeoff origin are compared and fall into a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode;
a slow tracking control module 20 for:
analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a fast cruise mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset slow tracking mode, controlling the unmanned aerial vehicle to fly in the slow tracking mode;
a fine positioning control module 30 for:
analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time when the unmanned aerial vehicle flies in a slow tracking mode; when the area information of the flying target and the area information of the current flying area are compared and fall into a preset accurate positioning mode, the unmanned aerial vehicle is controlled to fly in the accurate positioning mode until the flying target is reached, and the flying target is a storage unit in an indoor warehouse.
Specifically, the apparatus further comprises:
a takeoff origin determination module to:
according to the multiple static two-dimensional code images, identifying the region information of the takeoff origin where the unmanned aerial vehicle is located, wherein the static two-dimensional code images are generated when the camera device shoots adjacent regions from different angles before flying.
In a specific implementation, at least two cameras for reading the positioning and navigation two-dimensional codes are arranged on the unmanned aerial vehicle. Preferably, one camera is arranged at the front of the unmanned aerial vehicle to acquire image information ahead of it and to shoot the two-dimensional code pictures arranged on each storage unit in the warehouse during straight flight; one is arranged on the side of the unmanned aerial vehicle to acquire image information on both sides of it, suitable for shooting the two-dimensional code pictures arranged on adjacent warehouses or adjacent shelves during straight flight; and one is arranged on top of the unmanned aerial vehicle to acquire image information above it.
By fusing the two-dimensional code image information acquired by the multiple cameras, the unmanned aerial vehicle determines the position on the two-dimensional code map corresponding to its current physical location.
In a specific implementation, the navigation two-dimensional codes can be divided into two categories according to their orientation: two-dimensional code pictures arranged on the top layer of the warehouse, and two-dimensional code pictures arranged on the side faces, in each direction, of the warehouse's horizontal layers.
Matching the arrangement positions of the two-dimensional codes on the warehouse, shelves, and storage units, each two-dimensional code picture corresponds to a position identifier on the two-dimensional code map. According to the role of each two-dimensional code in the navigation path through the cargo area, it is bound to the two-azimuth, three-azimuth, or five-azimuth path strategy shown in figures 3(a), (b) and (c), which correspond in turn to the first, second, and third update frequencies of the dynamic two-dimensional code image.
Figures 3(a), (b) and (c) show navigation paths along the position identifiers, their junctions, and the connecting lines on the two-dimensional code map; each solid rectangle corresponds to the position identifier of one two-dimensional code, and the path intersections take three typical forms: straight-line (一-shaped), Y-shaped, and eight-way (米-shaped).
In the fast cruise mode, the unmanned aerial vehicle flies fast according to the two-azimuth path strategy provided in the two-dimensional code map, and the required update frequency of the dynamic two-dimensional code is low. Accordingly, once the two-azimuth path strategy is selected, the path can be kept for a long time between updates, and each update needs only 1 additional two-dimensional code picture to provide the position identifier of 1 other direction. This is sufficient for crossing different warehouse areas and approaching the flight target area, with a yaw-angle resolution of 180 degrees.
In the slow tracking mode, the unmanned aerial vehicle flies slowly according to the three-azimuth path strategy provided in the two-dimensional code map, and the required update frequency of the dynamic two-dimensional code rises. Accordingly, the three-azimuth path strategy must be updated frequently at short time intervals, and each update needs at least 3 two-dimensional code pictures to provide the position identifiers of 3 other directions, ensuring that the unmanned aerial vehicle crosses different shelves and approaches the flight target area with a yaw-angle resolution of 120 degrees.
In the accurate positioning mode, the unmanned aerial vehicle flies at low speed according to the five-azimuth path strategy provided in the two-dimensional code map, and the required update frequency of the dynamic two-dimensional code rises further. Accordingly, the five-azimuth path strategy must be updated frequently at short time intervals, and each update needs at least 4 two-dimensional code pictures to provide the position identifiers of 4 other directions, ensuring that the unmanned aerial vehicle crosses different storage units and approaches the flight target area with a yaw-angle resolution of 90 degrees.
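The three path strategies above differ in how many additional position identifiers each update consults and in the resulting yaw-angle resolution. The table below simply restates those stated values in code form; the dictionary keys are illustrative names:

```python
# Values restated from the description: minimum additional QR pictures
# consulted per path update, and the stated yaw-angle resolution.
PATH_STRATEGIES = {
    "two_azimuth":   {"extra_codes_per_update": 1, "yaw_resolution_deg": 180},
    "three_azimuth": {"extra_codes_per_update": 3, "yaw_resolution_deg": 120},
    "five_azimuth":  {"extra_codes_per_update": 4, "yaw_resolution_deg": 90},
}

for name, strategy in PATH_STRATEGIES.items():
    print(name, strategy["yaw_resolution_deg"])
```

A navigation controller switching from fast cruise to slow tracking to accurate positioning would thus move down this table, trading update cost for finer heading resolution.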
In conclusion, the indoor unmanned aerial vehicle navigation method combines the two-dimensional codes preset in the warehouse, on the multilayer shelves, and on each storage unit with the two-dimensional code map generated from those codes to realize on-line navigation of the indoor unmanned aerial vehicle, with low cost and high robustness.
The invention has been described above by reference to a few embodiments. However, other embodiments of the invention than the one disclosed above are equally possible within the scope of the invention, as would be apparent to a person skilled in the art from the appended patent claims.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the [ device, component, etc ]" are to be interpreted openly as referring to at least one instance of said device, component, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.

Claims (4)

1. An indoor unmanned aerial vehicle navigation method is characterized by comprising the following steps:
step S1: when comparison of the area information of the flying target with the area information of the takeoff origin indicates a preset fast cruise mode, controlling the unmanned aerial vehicle to fly in the fast cruise mode;
the step S1 includes:
extracting the area information of a flying target from a flying target two-dimensional code image provided in a flying instruction, wherein the flying target two-dimensional code image is the same as a two-dimensional code image arranged at the flying target;
comparing, in the two-dimensional code map, the position identifier in the area information of the takeoff origin with the position identifier in the area information of the flying target; and when it is determined that the two position identifiers do not belong to pre-divided position areas with the same number,
determining that the preset fast cruise mode applies and controlling the unmanned aerial vehicle to fly in the fast cruise mode, wherein the two-dimensional code map is generated in advance according to the position relation of the two-dimensional codes arranged on each storage unit in the indoor warehouse;
step S2: analyzing a dynamic two-dimensional code image of an adjacent area acquired by a camera device in real time while the unmanned aerial vehicle flies in the fast cruise mode; and when comparison of the area information of the flying target with the area information of the current flying area indicates a preset slow tracking mode, controlling the unmanned aerial vehicle to fly in the slow tracking mode;
the step S2 includes:
when the unmanned aerial vehicle flies in the fast cruise mode, the camera device shoots two-dimensional code pictures of adjacent areas in real time at a first updating frequency to generate a dynamic two-dimensional code image, wherein successive two-dimensional code images in the dynamic two-dimensional code image are captured at the first updating frequency;
if, for n consecutive detection periods, the position identifier in the area information of the flying target and the position identifier in the area information of the current flying area both belong to a pre-divided position area with the same number in the two-dimensional code map,
determining that the preset slow tracking mode applies and controlling the unmanned aerial vehicle to fly in the slow tracking mode;
step S3: analyzing a dynamic two-dimensional code image of an adjacent area acquired by the camera device in real time while the unmanned aerial vehicle flies in the slow tracking mode; and when comparison of the area information of the flying target with the area information of the current flying area indicates a preset accurate positioning mode, controlling the unmanned aerial vehicle to fly in the accurate positioning mode until the flying target is reached, wherein the flying target is a storage unit in the indoor warehouse;
in the fast cruise mode, the unmanned aerial vehicle flies fast according to two azimuth path strategies provided in the two-dimensional code map;
in a slow tracking mode, the unmanned aerial vehicle flies slowly according to a three-azimuth path strategy provided in the two-dimensional code map;
in the accurate positioning mode, the unmanned aerial vehicle flies at a low speed according to the five-azimuth path strategy provided in the two-dimensional code map.
2. The navigation method of claim 1, further comprising:
identifying, from a plurality of static two-dimensional code images, the area information of the takeoff origin where the unmanned aerial vehicle is located, wherein the static two-dimensional code images are generated by the camera device shooting adjacent areas from different angles before flight.
3. The navigation method according to claim 1, wherein the step S3 includes:
when the unmanned aerial vehicle flies in the slow tracking mode, the camera device shoots two-dimensional code pictures of adjacent areas in real time at a second updating frequency to generate a dynamic two-dimensional code image, wherein successive two-dimensional code images in the dynamic two-dimensional code image are captured at the second updating frequency, and the second updating frequency is greater than the first updating frequency;
if, for m consecutive detection periods, the position identifier in the area information of the flying target and the position identifier in the area information of the current flying area both belong to a pre-divided shelf area with the same number in the two-dimensional code map,
determining that the preset accurate positioning mode applies and controlling the unmanned aerial vehicle to fly in the accurate positioning mode until the flying target is reached.
4. The navigation method of claim 3,
when the unmanned aerial vehicle flies in the accurate positioning mode, the camera device shoots two-dimensional code pictures of adjacent areas in real time at a third updating frequency to generate a dynamic two-dimensional code image, wherein successive two-dimensional code images in the dynamic two-dimensional code image are captured at the third updating frequency, and the third updating frequency is greater than the second updating frequency.
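Claims 1, 3 and 4 together impose a strict ordering on the three updating frequencies (third > second > first), so the capture interval between successive two-dimensional code pictures shrinks as the mode tightens. A minimal sketch with frequency values assumed purely for illustration:

```python
# Hypothetical frequency values in Hz; the claims only require the
# ordering third > second > first, not these particular numbers.
update_hz = {
    "fast_cruise": 2.0,            # first updating frequency
    "slow_tracking": 5.0,          # second updating frequency
    "accurate_positioning": 10.0,  # third updating frequency
}

# The interval between successive pictures is the reciprocal of the
# updating frequency, so it decreases as the frequency increases.
intervals_s = {mode: 1.0 / hz for mode, hz in update_hz.items()}
```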
CN201811641419.XA 2018-12-29 2018-12-29 Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle Active CN109752004B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811641419.XA CN109752004B (en) 2018-12-29 2018-12-29 Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN109752004A CN109752004A (en) 2019-05-14
CN109752004B true CN109752004B (en) 2022-07-08

Family

ID=66404450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811641419.XA Active CN109752004B (en) 2018-12-29 2018-12-29 Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN109752004B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022040942A1 (en) * 2020-08-25 2022-03-03 深圳市大疆创新科技有限公司 Flight positioning method, unmanned aerial vehicle and storage medium
TWI756844B (en) 2020-09-25 2022-03-01 財團法人工業技術研究院 Automated guided vehicle navigation device and method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834307A (en) * 2015-04-23 2015-08-12 杨珊珊 Control method and control device of unmanned aerial vehicle
CN105841694A (en) * 2016-06-14 2016-08-10 杨珊珊 Beacon navigation device of unmanned vehicle, beacons and navigation method of beacon navigation device of unmanned vehicle
CN107036602A (en) * 2017-06-15 2017-08-11 北京大学 Autonomous navigation system and method in mixing unmanned plane room based on environmental information code
CN108388245A (en) * 2018-01-26 2018-08-10 温州大学瓯江学院 A kind of AGV trolleies indoor positioning navigation system and its control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406343B (en) * 2016-09-23 2020-07-10 北京小米移动软件有限公司 Control method, device and system of unmanned aerial vehicle


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of an Autonomous Flight Control System for a Small UAV; Mei Wujun et al.; Electronic Science and Technology (《电子科技》); 2017-07-15 (No. 07); pp. 106-108 *


Similar Documents

Publication Publication Date Title
CN103886107B (en) Robot localization and map structuring system based on ceiling image information
US11906983B2 (en) System and method for tracking targets
CN117310739A (en) Technique for sharing drawing data between movable objects
CN106444837A (en) Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
RU2656711C2 (en) Method and system for detecting and tracking of moving objects based on three-dimensional sensor data
CN111670419A (en) Active supplemental exposure settings for autonomous navigation
EP2166375B1 (en) System and method of extracting plane features
US11430199B2 (en) Feature recognition assisted super-resolution method
CN102298070A (en) Method for assessing the horizontal speed of a drone, particularly of a drone capable of hovering on automatic pilot
CN111192318B (en) Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle
CN112577517A (en) Multi-element positioning sensor combined calibration method and system
US11069080B1 (en) Collaborative airborne object tracking systems and methods
WO2019144289A1 (en) Systems and methods for calibrating an optical system of a movable object
CN113095154A (en) Three-dimensional target detection system and method based on millimeter wave radar and monocular camera
US10109074B2 (en) Method and system for inertial measurement having image processing unit for determining at least one parameter associated with at least one feature in consecutive images
CN109752004B (en) Indoor unmanned aerial vehicle navigation method and device and indoor unmanned aerial vehicle
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
Wang et al. Autonomous landing of multi-rotors UAV with monocular gimbaled camera on moving vehicle
CN111273701A (en) Visual control system and control method for holder
Duan et al. Image digital zoom based single target apriltag recognition algorithm in large scale changes on the distance
CN116977806A (en) Airport target detection method and system based on millimeter wave radar, laser radar and high-definition array camera
CN111222586A (en) Inclined image matching method and device based on three-dimensional inclined model visual angle
Kang et al. Development of a peripheral-central vision system for small UAS tracking
Kang et al. Development of a peripheral–central vision system for small unmanned aircraft tracking
Hao et al. Assessment of an effective range of detecting intruder aerial drone using onboard EO-sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant