CN110309883A - A UAV autonomous positioning method based on visual SLAM - Google Patents
A UAV autonomous positioning method based on visual SLAM
- Publication number
- CN110309883A CN110309883A CN201910596824.2A CN201910596824A CN110309883A CN 110309883 A CN110309883 A CN 110309883A CN 201910596824 A CN201910596824 A CN 201910596824A CN 110309883 A CN110309883 A CN 110309883A
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial vehicle (UAV)
- location information
- information
- module
- aircraft
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/30—Interpretation of pictures by triangulation
- G01C11/34—Aerial triangulation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/08—Position of single direction-finder fixed by determining direction of a plurality of spaced sources of known location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
Abstract
A UAV autonomous positioning method based on visual SLAM, belonging to the technical fields of computer vision and image processing. The location information of the UAV and the relative position information of its surroundings are acquired; the UAV's location is determined using ultra-wideband (UWB) positioning; the UAV transmits its location information and the relative position information of the surroundings to a ground station; the ground station receives and processes both and sends control instructions back to the UAV; the UAV executes the instructions to fly autonomously, then performs the motion solving, fusion, and depth estimation parts. The invention needs no additional auxiliary sensors, achieves high positioning accuracy and strong autonomy, satisfies the demands of high-speed UAV flight, and provides accurate location information.
Description
Technical field
The present invention relates to a UAV autonomous positioning method based on visual SLAM, and belongs to the technical fields of computer vision and image processing.
Background technique
As social demand grows, people expect ever more functionality from UAVs. UAVs are highly flexible and autonomous: they can execute tasks with little or no human intervention and help people complete dangerous or repetitive labor. In recent years, the rapid development of UAV technology has brought wide application in both military and civilian fields, where UAVs play an important role in reconnaissance, battlefield rescue, earthquake relief, and fire response. Because UAV application scenarios are diverse and flight missions are complex, intelligent UAVs with autonomous positioning capability, typified by small and micro UAVs, are becoming an important trend in UAV development.
Autonomous positioning is the premise and key to a UAV's safe operation, trajectory planning, target tracking, and similar tasks. UAV positioning currently falls into two categories. The first relies on external positioning systems, such as global satellite navigation systems (GPS, BeiDou) or indoor positioning systems; however, GPS precision is limited, and indoor systems require external infrastructure to be deployed in advance, so both have clear limitations. The second relies on the UAV's own onboard sensors to perceive the surrounding environment: by processing the collected sensor data and modeling the environment, the UAV solves its own positioning problem. This second mode is what is currently called autonomous positioning, and it works even when GPS signals are missing or the environment is unknown. Autonomous positioning usually uses devices such as lidar, visual sensors, and inertial measurement units to acquire environmental information and the UAV's own state, and then applies Simultaneous Localization And Mapping (SLAM) to model the environment and localize the vehicle.
UAV autonomous positioning is currently a major research focus in academia. Because UAV application scenarios are complex and varied, estimating the vehicle's own position is one of the key factors restricting UAV development; only by solving this problem can UAVs develop further. SLAM is presently the preferred approach to this problem and a frontier research topic; studying it can better solve UAV autonomous positioning in complex environments with many obstacles, so it has great research significance.
Summary of the invention
To solve the problems in the background art, the present invention provides a UAV autonomous positioning method based on visual SLAM.
To achieve the above purpose, the present invention adopts the following technical scheme: a UAV autonomous positioning method based on visual SLAM, the method being composed of an aircraft module, an ultra-wideband positioning module, a ground station processing module, an aircraft data transmission module, a motion solving part, a fusion part, and a depth part. The method comprises the following steps:
Step 1: acquire the location information of the UAV and the relative position information of the UAV's surroundings;
Step 2: locate the UAV using ultra-wideband positioning;
Step 3: the UAV transmits its location information and the relative position information of the surroundings to the ground station;
Step 4: the ground station receives and processes the location information of the UAV and the relative position information of its surroundings, and sends control instructions to the UAV;
Step 5: the UAV receives the control instructions and performs autonomous navigation flight;
Step 6: run the motion solving part: adopt a strategy combining the feature-point method and the direct method, select keyframes, extract point and line features, and minimize the error to compute the relative pose;
Step 7: run the fusion part: fuse the image with the inertial measurement unit (IMU) by jointly minimizing the photometric error after feature matching and the error between the prior translation and rotation obtained from IMU pre-integration and the current translation and rotation;
Step 8: run the depth part: estimate the depth of 3D points; on the basis of the matched feature points, triangulate to solve each point's 3D position and thereby obtain its depth value.
Compared with the prior art, the beneficial effects of the present invention are:
1. The present invention needs no additional auxiliary sensors; its autonomous positioning accuracy is high and its autonomy is strong.
2. The present invention reduces algorithmic complexity, saves computation space, and to a certain extent shortens computation time and improves algorithmic efficiency; it can satisfy the demands of high-speed UAV flight and provide accurate location information.
Detailed description of the invention
Fig. 1 is a schematic block diagram of the vision-based-SLAM UAV autonomous positioning system of the invention;
Fig. 2 is a schematic block diagram of the ground station processing module.
Specific embodiment
The technical solutions of the present invention will be described clearly and completely below in conjunction with the drawings of the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Specific embodiment 1: as shown in Fig. 1 and Fig. 2, the invention discloses a UAV autonomous positioning method based on visual SLAM. The method is composed of an aircraft module 1, an ultra-wideband positioning module 2, a ground station processing module 3, an aircraft data transmission module 4, a motion solving part 5, a fusion part 6, and a depth part 7. The method comprises the following steps:
Step 1: acquire the location information of the UAV and the relative position information of the UAV's surroundings;
Step 2: locate the UAV using ultra-wideband positioning;
Step 3: the UAV transmits its location information and the relative position information of the surroundings to the ground station;
Step 4: the ground station receives and processes the location information of the UAV and the relative position information of its surroundings, and sends control instructions to the UAV;
Step 5: the UAV receives the control instructions and performs autonomous navigation flight;
Step 6: run the motion solving part 5: adopt a strategy combining the feature-point method and the direct method, select keyframes, extract point and line features, and minimize the error to compute the relative pose;
Step 7: run the fusion part 6: fuse the image with the inertial measurement unit (IMU) by jointly minimizing the photometric error after feature matching and the error between the prior translation and rotation obtained from IMU pre-integration and the current translation and rotation;
Step 8: run the depth part 7: estimate the depth of 3D points; on the basis of the matched feature points, triangulate to solve each point's 3D position and thereby obtain its depth value.
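Step 7's vision/IMU fusion jointly minimizes a photometric residual and an IMU pre-integration prior residual. As an illustration only: if each source is reduced to a translation estimate with a scalar information weight, the least-squares fusion has a closed form (a per-axis weighted average). The actual method optimizes full poses, so the reduction and the weights below are assumptions, not the patent's implementation.

```python
def fuse_translation(t_visual, t_imu_prior, w_visual, w_imu):
    """Fuse a vision-derived translation with the IMU pre-integration prior.

    Minimizing  w_visual * ||t - t_visual||^2 + w_imu * ||t - t_imu_prior||^2
    over t gives the per-axis weighted average computed below.
    """
    return [
        (w_visual * tv + w_imu * ti) / (w_visual + w_imu)
        for tv, ti in zip(t_visual, t_imu_prior)
    ]
```

With equal weights the fused estimate is simply the midpoint of the two translations; a well-calibrated system would weight each source by the inverse of its error covariance.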
Specific embodiment 2: this embodiment further explains specific embodiment 1. Locating the UAV using ultra-wideband positioning comprises:
Step 1: arrange locating base stations at four indoor positions;
Step 2: arrange the four locating base stations in a rectangle;
Step 3: keep the height of one locating base station different from the heights of the other three, and use the base-station coordinates as the reference coordinates of the UAV.
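Raising one of the four rectangular base stations out of the common plane is what makes the tag's height observable: with all four anchors at the same height, the linearized range equations become singular in z. A minimal multilateration sketch (pure Python; the anchor layout, helper names, and linear-least-squares solver are assumptions — the patent does not specify a solver):

```python
def solve3x3(A, b):
    """Solve a 3x3 linear system A x = b by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x


def uwb_position(anchors, ranges):
    """Multilaterate a 3D position from 4 anchors and their measured ranges.

    Subtracting the first range equation |p - a_i|^2 = d_i^2 from the other
    three cancels the quadratic terms, leaving a 3x3 linear system in (x, y, z).
    """
    (x0, y0, z0), d0 = anchors[0], ranges[0]
    A, b = [], []
    for (xi, yi, zi), di in zip(anchors[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(d0 ** 2 - di ** 2
                 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2 + zi ** 2 - z0 ** 2)
    return solve3x3(A, b)
```

For example, with anchors at (0,0,0), (6,0,0), (6,4,0) and a raised one at (0,4,2), exact ranges to a tag recover the tag's coordinates; real UWB ranges are noisy, so a practical system would solve the overdetermined version in a least-squares sense.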
Specific embodiment 3: as shown in Fig. 2, this embodiment further explains specific embodiment 1. The ground station processing module 3 includes a data transmission unit 31 and an information processing unit 32.
Specific embodiment 4: as shown in Fig. 2, this embodiment further explains specific embodiment 3. The data transmission unit 31 receives the location information of the aircraft module 1 transmitted by the ultra-wideband positioning module 2 and the relative position information, and sends the control instructions to the aircraft module 1. The information processing unit 32 generates the control instructions for the aircraft module 1 according to the location information of the aircraft module 1 and the relative position information.
Specific embodiment 5: as shown in Fig. 2, this embodiment further explains specific embodiment 4. The information processing unit 32 includes a two-dimensional coordinate point determining subunit 321, a flight distance calculating subunit 322, and a three-dimensional coordinate point determining subunit 323. The two-dimensional coordinate point determining subunit 321 determines, according to the ultra-wideband positioning module, the two-dimensional reference coordinate point of an aircraft module in two-dimensional space. The flight distance calculating subunit 322 generates two independent timestamps from the data signal transmitted by the ultra-wideband positioning module, and calculates the transmission time of the data signal from those two timestamps. The three-dimensional coordinate point determining subunit 323 determines the three-dimensional reference coordinate point of the aircraft module in three-dimensional space according to the coordinate point determining subunit and the ultra-wideband positioning module.
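The flight distance calculating subunit's two-timestamp scheme amounts to a time-of-flight range: the signal's travel time multiplied by the speed of light. A minimal one-way sketch (this assumes the two clocks are synchronized, which production UWB systems typically avoid by using two-way ranging; the function name is illustrative):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of the UWB pulse, m/s

def tof_distance(t_emitted_s, t_received_s):
    """Range from two independent timestamps (seconds): when the data
    signal was sent and when it arrived. Their difference times the
    speed of light is the flight distance in meters."""
    return SPEED_OF_LIGHT_M_S * (t_received_s - t_emitted_s)
```

Note the precision requirement this implies: 1 ns of timestamp error is roughly 30 cm of range error, which is why UWB hardware timestamps at sub-nanosecond resolution.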
Specific embodiment 6: this embodiment further explains specific embodiment 1. The aircraft module 1 carries the UAV and acquires the location information of the UAV and the relative position information of the UAV's surroundings; the ultra-wideband positioning module 2 locates the aircraft module using ultra-wideband positioning; the ground station processing module 3 processes the location information of the aircraft module and the relative position information of the UAV's surroundings, and sends control instructions to the aircraft module; the aircraft data transmission module 4 transmits the relative position information of the surroundings and the location information of the aircraft module to the ground station processing module, and receives the control instructions.
Specific embodiment 7: this embodiment further explains specific embodiment 1. The keyframe selection strategy is: when the number of feature blocks on a new image frame falls below a certain threshold, that frame becomes a new keyframe.
The UAV carries a camera and a Jetson TX1 processor. The Jetson TX1 is a relatively advanced embedded vision computing system and the world's first modular supercomputer; based on the NVIDIA Maxwell architecture with 256 CUDA cores, it is an excellent development platform for computer vision, deep learning, GPU computing, and image processing. The localization method runs on this processor while the camera acquires images of the surroundings.
In the motion solving part 5, a strategy combining the feature-point method and the direct method is adopted: keyframes are selected, point and line features are extracted, and the error is minimized to compute the relative pose. Next is the fusion part 6, which fuses image information with IMU information: the photometric error after feature matching, together with the error between the prior translation and rotation obtained from IMU pre-integration and the current translation and rotation, is minimized to fuse vision and IMU, improving the accuracy and precision of the localization method. The final step is the depth part 7, which is mainly responsible for computing the depth values of points in space: on the basis of the matched feature points, each point is triangulated to solve its 3D position and thereby obtain its depth value. The keyframe selection strategy is: when the number of feature blocks on a new image frame falls below a certain threshold, that frame becomes a new keyframe. After the processor runs the method, the accurate pose of the UAV is finally obtained.
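The depth part's triangulation can be sketched with the midpoint method: given the two camera centers and the viewing rays of a matched feature, find the 3D point closest to both rays. The patent does not name a particular triangulation algorithm, so the midpoint variant and the helper names below are assumptions:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint triangulation of a matched feature.

    c1, c2 are the two camera centers; d1, d2 are the world-frame viewing
    ray directions of the feature. Minimizing |c1 + s*d1 - (c2 + t*d2)|^2
    over (s, t) gives a 2x2 normal-equation system; the 3D point is the
    midpoint of the two closest ray points, and s is the depth along ray 1
    when d1 is a unit vector.
    """
    r = [a - b for a, b in zip(c1, c2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    e, f = -dot(d1, r), dot(d2, r)
    det = a * c - b * b          # zero when the rays are parallel
    s = (e * c + b * f) / det
    t = (a * f + b * e) / det
    p1 = [ci + s * di for ci, di in zip(c1, d1)]
    p2 = [ci + t * di for ci, di in zip(c2, d2)]
    point = [(u + v) / 2.0 for u, v in zip(p1, p2)]
    return point, s
```

With noisy image measurements the two rays do not intersect exactly; the midpoint splits the residual, while production SLAM pipelines more often use a linear (DLT) or inverse-depth formulation.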
It is obvious to a person skilled in the art that the invention is not limited to the details of the above exemplary embodiments, and that the invention can be realized in other specific forms without departing from its spirit or essential attributes. Therefore, from whatever point of view, the embodiments are to be considered illustrative and not restrictive; the scope of the invention is defined by the appended claims rather than by the above description, and all changes falling within the meaning and range of equivalency of the claims are intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claims involved.
In addition, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution. This manner of description is merely for clarity; those skilled in the art should consider the specification as a whole, and the technical solutions in the various embodiments may be suitably combined to form other embodiments understandable to those skilled in the art.
Claims (7)
1. A UAV autonomous positioning method based on visual SLAM, the method being composed of an aircraft module (1), an ultra-wideband positioning module (2), a ground station processing module (3), an aircraft data transmission module (4), a motion solving part (5), a fusion part (6), and a depth part (7), characterized in that the method comprises the following steps:
Step 1: acquire the location information of the UAV and the relative position information of the UAV's surroundings;
Step 2: locate the UAV using ultra-wideband positioning;
Step 3: the UAV transmits its location information and the relative position information of the surroundings to the ground station;
Step 4: the ground station receives and processes the location information of the UAV and the relative position information of the UAV's surroundings, and sends control instructions to the UAV;
Step 5: the UAV receives the control instructions and performs autonomous navigation flight;
Step 6: run the motion solving part (5): adopt a strategy combining the feature-point method and the direct method, select keyframes, extract point and line features, and minimize the error to compute the relative pose;
Step 7: run the fusion part (6): fuse the image with the inertial measurement unit (IMU) by jointly minimizing the photometric error after feature matching and the error between the prior translation and rotation obtained from IMU pre-integration and the current translation and rotation;
Step 8: run the depth part (7): estimate the depth of 3D points; on the basis of the matched feature points, triangulate to solve each point's 3D position and thereby obtain its depth value.
2. The UAV autonomous positioning method based on visual SLAM according to claim 1, characterized in that locating the UAV using ultra-wideband positioning comprises:
Step 1: arrange locating base stations at four indoor positions;
Step 2: arrange the four locating base stations in a rectangle;
Step 3: keep the height of one locating base station different from the heights of the other three, and use the base-station coordinates as the reference coordinates of the UAV.
3. The UAV autonomous positioning method based on visual SLAM according to claim 1, characterized in that the ground station processing module (3) includes a data transmission unit (31) and an information processing unit (32).
4. The UAV autonomous positioning method based on visual SLAM according to claim 3, characterized in that the data transmission unit (31) is configured to receive the location information of the aircraft module (1) transmitted by the ultra-wideband positioning module (2) and the relative position information, and to send the control instructions to the aircraft module (1); and the information processing unit (32) is configured to generate the control instructions for the aircraft module (1) according to the location information of the aircraft module (1) and the relative position information.
5. The UAV autonomous positioning method based on visual SLAM according to claim 4, characterized in that the information processing unit (32) includes a two-dimensional coordinate point determining subunit (321), a flight distance calculating subunit (322), and a three-dimensional coordinate point determining subunit (323).
6. The UAV autonomous positioning method based on visual SLAM according to claim 1, characterized in that the aircraft module (1) carries the UAV and acquires the location information of the UAV and the relative position information of the UAV's surroundings; the ultra-wideband positioning module (2) locates the aircraft module using ultra-wideband positioning; the ground station processing module (3) processes the location information of the aircraft module and the relative position information of the UAV's surroundings, and sends control instructions to the aircraft module; and the aircraft data transmission module (4) transmits the relative position information of the UAV's surroundings and the location information of the aircraft module to the ground station processing module, and receives the control instructions.
7. The UAV autonomous positioning method based on visual SLAM according to claim 1, characterized in that the keyframe selection strategy is: when the number of feature blocks on a new image frame falls below a certain threshold, that frame becomes a new keyframe.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910586599 | 2019-07-01 | ||
CN2019105865994 | 2019-07-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110309883A true CN110309883A (en) | 2019-10-08 |
Family
ID=68078961
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910596824.2A Pending CN110309883A (en) | 2019-07-01 | 2019-07-02 | A UAV autonomous positioning method based on visual SLAM
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110309883A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110954066A (en) * | 2019-12-19 | 2020-04-03 | 陕西长岭电子科技有限责任公司 | Helicopter hanging swing monitoring system and method based on ultra wide band positioning |
CN112562052A (en) * | 2020-12-03 | 2021-03-26 | 广东工业大学 | Real-time positioning and mapping method for near-shore water area |
CN112985410A (en) * | 2021-03-02 | 2021-06-18 | 哈尔滨理工大学 | Indoor robot self-map-building navigation system based on laser SLAM |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150312774A1 (en) * | 2014-04-25 | 2015-10-29 | The Hong Kong University Of Science And Technology | Autonomous robot-assisted indoor wireless coverage characterization platform |
CN109211241A (en) * | 2018-09-08 | 2019-01-15 | 天津大学 | The unmanned plane autonomic positioning method of view-based access control model SLAM |
CN109270956A (en) * | 2018-11-19 | 2019-01-25 | 深圳大学 | A kind of unmanned vehicle independent Position Fixing Navigation System based on UWB |
CN109520497A (en) * | 2018-10-19 | 2019-03-26 | 天津大学 | The unmanned plane autonomic positioning method of view-based access control model and imu |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110446159B (en) | System and method for accurate positioning and autonomous navigation of indoor unmanned aerial vehicle | |
EP3971674B1 (en) | Systems and methods for uav flight control | |
CN104808675B (en) | Body-sensing flight control system and terminal device based on intelligent terminal | |
CN108453738B (en) | Control method for four-rotor aircraft aerial autonomous grabbing operation based on Opencv image processing | |
CN105492985B | System and method for controlling a movable object within an environment | |
CN108351649B (en) | Method and apparatus for controlling a movable object | |
García Carrillo et al. | Combining stereo vision and inertial navigation system for a quad-rotor UAV | |
CN105022401B | Vision-based cooperative SLAM method for multiple quadrotor UAVs | |
Heng et al. | Autonomous obstacle avoidance and maneuvering on a vision-guided mav using on-board processing | |
EP2029970B1 (en) | Beacon-augmented pose estimation | |
CN109211241A | UAV autonomous positioning method based on visual SLAM | |
CN107450577A (en) | UAV Intelligent sensory perceptual system and method based on multisensor | |
CN109596118A (en) | It is a kind of for obtaining the method and apparatus of the spatial positional information of target object | |
CN102190081B (en) | Vision-based fixed point robust control method for airship | |
CN106017463A (en) | Aircraft positioning method based on positioning and sensing device | |
CN107144281A (en) | Unmanned plane indoor locating system and localization method based on cooperative target and monocular vision | |
Carloni et al. | Robot vision: Obstacle-avoidance techniques for unmanned aerial vehicles | |
CN110309883A | A UAV autonomous positioning method based on visual SLAM | |
CN113625774A (en) | Multi-unmanned aerial vehicle cooperative positioning system and method for local map matching and end-to-end distance measurement | |
Perez et al. | Autonomous collision avoidance system for a multicopter using stereoscopic vision | |
Irfan et al. | Vision-based guidance and navigation for autonomous mav in indoor environment | |
CN112991440A (en) | Vehicle positioning method and device, storage medium and electronic device | |
CN108227749A (en) | Unmanned plane and its tracing system | |
CN207379510U (en) | Unmanned plane indoor locating system based on cooperative target and monocular vision | |
Cai et al. | Multi-source information fusion augmented reality benefited decision-making for unmanned aerial vehicles: An effective way for accurate operation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20191008 |