CN111667531B - Positioning method and device - Google Patents

Positioning method and device

Info

Publication number
CN111667531B
Authority
CN
China
Prior art keywords
image
flying device
basic image
determining
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910166898.2A
Other languages
Chinese (zh)
Other versions
CN111667531A (en)
Inventor
杨东方
赵彦杰
胡若同
刘洋
李永飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Tianmu Tuhang Technology Co.,Ltd.
Original Assignee
Xi'an Yuanzhi Electronic Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Yuanzhi Electronic Technology Co., Ltd.
Priority to CN201910166898.2A
Publication of CN111667531A
Application granted
Publication of CN111667531B

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The disclosure provides a positioning method and a positioning device, relates to the field of electronic information technology, and addresses the problem of inaccurate positioning of a flying device in flight areas with poor electromagnetic-wave signals. The specific technical scheme is as follows: acquiring at least one basic image captured by a flying device, wherein the at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located; acquiring a stitched image according to the at least one basic image; determining a flight area of the flying device in a reference image according to the stitched image; determining a position of the flying device relative to the target basic image; and determining a position of the flying device in the reference image according to the position of the flying device relative to the target basic image. The present disclosure is used for positioning a flying device.

Description

Positioning method and device
Technical Field
The disclosure relates to the field of electronic information technology, and in particular to a positioning method and device.
Background
With the development of flying-device technology, flying devices have been applied in many fields, such as aerial photography, transportation and monitoring. During flight, a flying device usually needs to be positioned and navigated, for example by GPS (Global Positioning System) or by other radio signals. These positioning methods have an obvious defect, however: they require that electromagnetic-wave signals can be received and transmitted, so positioning of the flying device cannot be achieved in a flight area with poor electromagnetic-wave signals.
Disclosure of Invention
The embodiments of the disclosure provide a positioning method and a positioning device, which solve the problem of inaccurate positioning of a flying device in flight areas with poor electromagnetic-wave signals. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided a positioning method, the method comprising:
acquiring at least one basic image captured by a flying device, wherein the at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located;
acquiring a stitched image according to the at least one basic image;
determining a flight area of the flying device in a reference image according to the stitched image;
determining a position of the flying device relative to the target basic image;
determining a position of the flying device in the reference image according to the position of the flying device relative to the target basic image.
Because the position of the flying device in the reference image is determined from images captured by the flying device itself, positioning does not depend on the strength of electromagnetic-wave signals, and the flying device can be positioned with guaranteed accuracy even in areas with poor signals.
In one embodiment, determining the position of the flying device relative to the target basic image comprises:
determining the position of the flying device relative to the target basic image using an inverse imaging transformation algorithm according to the target basic image.
In one embodiment, acquiring a stitched image according to the at least one basic image comprises:
stitching the at least one basic image in order of capture time to obtain the stitched image.
In one embodiment, determining the flight area of the flying device in the reference image according to the stitched image comprises:
comparing the stitched image with each region in the reference image, wherein the reference image comprises at least one region;
determining the region in the reference image that is the same as the stitched image as the flight area of the flying device.
In one embodiment, acquiring at least one basic image captured by the flying device comprises:
receiving the at least one basic image transmitted by the flying device.
According to a second aspect of embodiments of the present disclosure, there is provided a positioning device, comprising: an acquisition module, a stitching module, a region determining module, a first positioning module and a second positioning module;
the acquisition module is configured to acquire at least one basic image captured by a flying device, wherein the at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located;
the stitching module is configured to obtain a stitched image according to the at least one basic image;
the region determining module is configured to determine a flight area of the flying device in a reference image according to the stitched image;
the first positioning module is configured to determine a position of the flying device relative to the target basic image;
the second positioning module is configured to determine a position of the flying device in the reference image according to the position of the flying device relative to the target basic image.
In one embodiment, the first positioning module is further configured to determine the position of the flying device relative to the target basic image using an inverse imaging transformation algorithm according to the target basic image.
In one embodiment, the stitching module is further configured to stitch the at least one basic image in order of capture time to obtain the stitched image.
In one embodiment, the region determining module comprises: a comparing unit and a judging unit;
the comparing unit is configured to compare the stitched image with each region in the reference image, wherein the reference image comprises at least one region;
the judging unit is configured to determine the region in the reference image that is the same as the stitched image as the flight area of the flying device.
In one embodiment, the acquisition module comprises a receiving unit;
the receiving unit is configured to receive the at least one basic image transmitted by the flying device.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of a positioning method provided by an embodiment of the present disclosure;
FIG. 2 is a schematic illustration of a flight area of a flying device provided in an embodiment of the present disclosure;
FIG. 3 is a schematic illustration of a position of a flying device provided by an embodiment of the present disclosure;
FIG. 4 is an illustrative diagram of an inverse imaging transformation provided by an embodiment of the present disclosure;
FIG. 5 is a block diagram of a positioning device provided by an embodiment of the present disclosure;
FIG. 6 is a block diagram of a positioning device provided by an embodiment of the present disclosure;
FIG. 7 is a block diagram of a positioning device according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as recited in the appended claims.
An embodiment of the disclosure provides a positioning method applied to a positioning device. As shown in FIG. 1, a flowchart of the positioning method provided by an embodiment of the present disclosure, the method comprises the following steps:
101. At least one basic image captured by the flying device is acquired.
The at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located. The flying device may be an unmanned aerial vehicle.
In one embodiment, acquiring at least one basic image captured by the flying device comprises: receiving the at least one basic image transmitted by the flying device.
It should be noted that the at least one basic image may be obtained by photographing the ground from above at a top-down angle; in one application scenario, the flying device photographs the ground continuously at a top-down angle, obtaining a continuous sequence of basic images.
102. A stitched image is acquired according to the at least one basic image.
In one embodiment, acquiring a stitched image according to the at least one basic image comprises:
stitching the at least one basic image in order of capture time to obtain the stitched image. In one application scenario, the roads in the stitched image can be connected to form a road network, which facilitates the positioning of mobile devices on the roads.
103. The flight area of the flying device in the reference image is determined according to the stitched image.
The reference image is a pre-stored image of the place where the flying device flies. For example, if the flying device performs its flight task in city A, the reference image may be an image of city A photographed from above; unlike a map, the reference image is formed by stitching photographs taken by a flying device. The flying device first builds the reference image of city A by stitching captured images; later, when performing a flight task in city A, it photographs the ground in real time to obtain basic images, which are stitched into a stitched image, and the current flight area of the flying device can be determined by comparing the stitched image with the pre-generated reference image.
In one embodiment, determining the flight area of the flying device in the reference image according to the stitched image comprises:
comparing the stitched image with each region in the reference image, wherein the reference image comprises at least one region; and determining the region in the reference image that is the same as the stitched image as the flight area of the flying device.
For example, suppose the reference image is divided into 9 regions, region 1 to region 9, and the stitched image is part of region 2 or contains all of region 2. The position of the stitched image in the reference image can then be determined, and the current flight area of the flying device can be determined from the position of the basic image most recently captured by the flying device in time order.
Taking FIG. 2 as an example, a schematic view of the flight area of a flying device according to an embodiment of the disclosure: the reference image is divided into 4 regions, and the stitched image, which is part of the second region, comprises 3 basic images, numbered the 1st, 2nd and 3rd basic image in capture order. According to the most recently captured basic image in the stitched image, the current flight area of the flying device can be determined to be the area shown in the 3rd basic image.
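As one way to carry out the comparison in step 103, the sketch below locates the stitched image inside the reference image by normalized cross-correlation and maps the best match to one of the grid regions. The grid division, the template-matching choice, and the assumption that the mosaic and the reference image share roughly the same scale and orientation are all illustrative; the disclosure does not prescribe a particular comparison algorithm.

```python
import cv2

def locate_in_reference(reference, stitched, grid=(3, 3)):
    """Find which grid region of the reference image contains the stitched
    image. Requires the stitched image to be smaller than the reference and
    roughly matched in scale and orientation (an assumption of this sketch).

    Returns (region_index, top_left), where top_left is the pixel corner of
    the best match in the reference image.
    """
    ref_gray = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    mos_gray = cv2.cvtColor(stitched, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(ref_gray, mos_gray, cv2.TM_CCOEFF_NORMED)
    _, _, _, top_left = cv2.minMaxLoc(scores)        # maximum correlation
    # Map the centre of the matched window to a region of the grid.
    cx = top_left[0] + mos_gray.shape[1] // 2
    cy = top_left[1] + mos_gray.shape[0] // 2
    rows, cols = grid
    region_row = cy * rows // ref_gray.shape[0]
    region_col = cx * cols // ref_gray.shape[1]
    return region_row * cols + region_col, top_left
```

With grid=(3, 3) this reproduces the nine-region example above; the region index runs row by row from 0 to 8.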
104. The position of the flying device relative to the target basic image is determined.
It should be noted that the target basic image may be the basic image most recently captured by the flying device, so determining the position of the flying device relative to the target basic image determines where the flying device is currently flying. The position of the flying device may lie inside or outside the target basic image, and there are various ways to determine it. Two specific implementations are described here, although these are merely illustrative and do not limit the present disclosure:
in a first implementation, the position of the flying device in the target basic image is fixed, and the position of the flying device in the target basic image can be determined according to the pre-stored positioning coordinates.
As shown in fig. 3, fig. 3 is a schematic view of a position of a flying device according to an embodiment of the present disclosure, where the flying device may be positioned at the center of the target basic image or at the midpoint of the bottom edge of the target basic image. The edge which is intersected and extends to the direction opposite to the flight direction of the flight device from the center of the target basic image is taken as the bottom edge.
In a second implementation, determining the position of the flying device relative to the target basic image comprises:
determining the position of the flying device relative to the target basic image using an inverse imaging transformation algorithm according to the target basic image.
For example, as shown in FIG. 4, an illustrative diagram of the inverse imaging transformation provided by an embodiment of the disclosure: the geographic coordinates of each reference point in the target basic image are obtained from the matching result of the stitched image and the reference image in step 103. The position of the flying device is then deduced in reverse from the pixel coordinates of the reference points (i.e., their pixel positions in the target basic image) and their geographic coordinates, using the geometric relations of the monocular-vision imaging process.
Assume the reference point is point P and its pixel coordinate on the airborne imaging platform is I. The coordinate conversion relation between them can be described by formula (1):

$$ Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{f}{dx} & 0 & u_0 \\ 0 & \frac{f}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (1) $$

where $(u, v)$ are the pixel coordinates of I, $(X_w, Y_w, Z_w)$ are the world coordinates of P, and $Z_c$ is the depth of P in the camera frame. In the above formula, $(u_0, v_0)$ is the principal point offset of the camera, $dx$ and $dy$ are the physical dimensions of each pixel in the x-axis and y-axis directions, and $f$ is the focal length of the camera; $O_c$, $O_w$ and $O_I$ are the coordinate origins of the camera optical-center coordinate system, the world coordinate system and the image coordinate system, respectively. The application assumes that in practical use the focal length $f$, the principal point offset $(u_0, v_0)$ and the pixel sizes $dx$ and $dy$ are known.
In FIG. 4, a ray passing through the optical center C of the camera can be uniquely determined from the position of point P and the coordinates at which P is imaged in the camera. Likewise, if another reference point $P_1$ exists, that point and its imaging coordinates $I_1$ determine a second ray, and the intersection of the two rays is the coordinate position of the camera optical center. From this position, the geographic coordinates of the air-based aircraft platform can be determined.
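The sketch below implements this back-projection as a least-squares intersection of N rays (N = 2 in the figure). As stated above, the intrinsic parameters f, dx, dy, u0 and v0 are known; the camera orientation R is additionally assumed to be available (for example from the attitude sensors of the platform), a detail the description leaves open.

```python
import numpy as np

def camera_center_from_rays(pixels, world_points, K, R):
    """Recover the camera optical centre as the least-squares intersection
    of the back-projected rays through known reference points.

    pixels:       (N, 2) pixel coordinates of the reference points.
    world_points: (N, 3) geographic coordinates of the same points.
    K:            3x3 intrinsic matrix built from f/dx, f/dy, u0, v0.
    R:            3x3 camera-to-world rotation (assumed known, e.g. from
                  attitude sensors; not specified by the description).
    """
    K_inv = np.linalg.inv(K)
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for (u, v), p_w in zip(pixels, world_points):
        d = R @ K_inv @ np.array([u, v, 1.0])    # ray direction, world frame
        d /= np.linalg.norm(d)
        # The centre C minimises the sum of squared distances to all rays:
        # sum_i (I - d_i d_i^T)(C - P_i) = 0.
        M = np.eye(3) - np.outer(d, d)
        A += M
        b += M @ np.asarray(p_w, dtype=float)
    return np.linalg.solve(A, b)                  # optical centre position
```

Two non-parallel rays already make the system well-posed; additional reference points simply over-determine it and average out pixel noise.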
105. The position of the flying device in the reference image is determined according to the position of the flying device relative to the target basic image.
Having determined the position of the flying device relative to the target basic image, and the position of the target basic image in the reference image (i.e., the flight area of the flying device in the reference image), the position of the flying device in the reference image can be determined.
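Step 105 thus reduces to a coordinate composition. A minimal sketch, assuming the region matching yields the top-left placement of the target basic image in the reference image and that the pixel-scale ratio between the two images is known (both parameter names here are illustrative):

```python
def position_in_reference(base_origin_xy, relative_xy, scale=1.0):
    """Combine the placement of the target basic image in the reference
    image with the flying device's position inside that basic image.

    base_origin_xy: top-left pixel of the basic image in the reference image.
    relative_xy:    device position in basic-image pixels (from step 104).
    scale:          basic-image-to-reference-image pixel ratio (assumed known).
    """
    bx, by = base_origin_xy
    rx, ry = relative_xy
    return bx + rx * scale, by + ry * scale
```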
In practical application, after a target basic image captured by the flying device is acquired, the road area is extracted, for example by deep learning (the disclosure does not limit the road-extraction method here), and skeletonized to obtain a road-network vector diagram; this diagram is then matched against OpenStreetMap, a GIS, or a global road-network vector diagram obtained by earlier aerial photography (i.e., a reference image); the position of the flying device in the global road network (i.e., the road network presented by the global road-network vector diagram) is determined according to the matching result for the target basic image currently obtained by the flying device.
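As an illustration of this road-network variant, the sketch below thins a binary road mask into a one-pixel-wide skeleton and scores its alignment against an equally sized window of the global road-network raster using a chamfer-style distance. The segmentation producing the mask is assumed to happen upstream, since the disclosure explicitly leaves the road-extraction method open, and the chamfer score is one plausible matching criterion rather than the one prescribed here.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize

def skeletonize_roads(road_mask):
    """Thin a binary road mask (True on road pixels) to a one-pixel-wide
    skeleton, the raster precursor of a road-network vector diagram."""
    return skeletonize(road_mask.astype(bool))

def chamfer_score(local_skeleton, reference_window):
    """Mean distance from each local road pixel to the nearest road pixel
    in a same-sized window of the global road network (lower is better)."""
    dist_to_road = distance_transform_edt(~reference_window)
    return dist_to_road[local_skeleton].mean()
```

Sliding the window over the global road-network raster and keeping the minimum score gives a candidate position, which can then be refined as in steps 104-105.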
According to the positioning method provided by the embodiments of the disclosure, the position of the flying device in the reference image is determined from images captured by the flying device itself; positioning therefore does not depend on the strength of electromagnetic-wave signals, and the flying device can be positioned with guaranteed accuracy even in areas with poor signals.
Based on the positioning method described in the embodiment corresponding to FIG. 1, an embodiment of the disclosure provides a positioning device for performing that method. As shown in FIG. 5, the positioning device 50 comprises: an acquisition module 501, a stitching module 502, a region determining module 503, a first positioning module 504, and a second positioning module 505;
the acquisition module 501 is configured to acquire at least one basic image captured by the flying device, wherein the at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located;
the stitching module 502 is configured to obtain a stitched image according to the at least one basic image;
the region determining module 503 is configured to determine a flight area of the flying device in a reference image according to the stitched image;
the first positioning module 504 is configured to determine a position of the flying device relative to the target basic image;
the second positioning module 505 is configured to determine a position of the flying device in the reference image according to the position of the flying device relative to the target basic image.
In one embodiment, the first positioning module 504 is further configured to determine the position of the flying device relative to the target basic image using an inverse imaging transformation algorithm according to the target basic image.
In one embodiment, the stitching module 502 is further configured to stitch the at least one basic image in order of capture time to obtain the stitched image.
In one embodiment, as shown in FIG. 6, the region determining module 503 comprises: a comparing unit 5031 and a judging unit 5032;
the comparing unit 5031 is configured to compare the stitched image with each region in the reference image, wherein the reference image comprises at least one region;
the judging unit 5032 is configured to determine the region in the reference image that is the same as the stitched image as the flight area of the flying device.
In one embodiment, as shown in FIG. 6, the acquisition module 501 comprises a receiving unit 5011;
the receiving unit 5011 is configured to receive the at least one basic image transmitted by the flying device.
According to the positioning device provided by the embodiments of the disclosure, the position of the flying device in the reference image is determined from images captured by the flying device itself; positioning therefore does not depend on the strength of electromagnetic-wave signals, and the flying device can be positioned with guaranteed accuracy even in areas with poor signals.
Based on the positioning method described in the embodiment corresponding to FIG. 1, the disclosure further provides a computer-readable storage medium. The non-transitory computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, or an optical data storage device. The storage medium stores computer instructions for executing the positioning method described in the embodiment corresponding to FIG. 1, which is not repeated here.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (8)

1. A positioning method, the method comprising:
acquiring at least one basic image captured by a flying device,
wherein the at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located;
acquiring a stitched image according to the at least one basic image;
determining a flight area of the flying device in a reference image according to the stitched image;
determining a position of the flying device relative to the target basic image;
determining a position of the flying device in the reference image according to the position of the flying device relative to the target basic image;
wherein determining the flight area of the flying device in the reference image according to the stitched image comprises: comparing the stitched image with each region in the reference image, and determining a position of the stitched image in the reference image, wherein the reference image comprises at least one region; and determining the current flight area of the flying device according to the position of the basic image most recently captured by the flying device in time order.
2. The method according to claim 1, wherein determining the position of the flying device relative to the target basic image comprises:
determining the position of the flying device relative to the target basic image using an inverse imaging transformation algorithm according to the target basic image.
3. The method according to claim 1, wherein acquiring a stitched image according to the at least one basic image comprises:
stitching the at least one basic image in order of capture time to obtain the stitched image.
4. The method according to any one of claims 1-3, wherein acquiring at least one basic image captured by the flying device comprises:
receiving the at least one basic image transmitted by the flying device.
5. A positioning device, comprising: an acquisition module, a stitching module, a region determining module, a first positioning module and a second positioning module;
the acquisition module is configured to acquire at least one basic image captured by a flying device, wherein the at least one basic image comprises a target basic image, and the target basic image is an image of the area where the flying device is currently located;
the stitching module is configured to obtain a stitched image according to the at least one basic image;
the region determining module is configured to determine a flight area of the flying device in a reference image according to the stitched image;
the first positioning module is configured to determine a position of the flying device relative to the target basic image;
the second positioning module is configured to determine a position of the flying device in the reference image according to the position of the flying device relative to the target basic image;
wherein the region determining module comprises: a comparing unit and a judging unit;
the comparing unit is configured to compare the stitched image with each region in the reference image and determine a position of the stitched image in the reference image, wherein the reference image comprises at least one region;
the judging unit is configured to determine the current flight area of the flying device according to the position of the basic image most recently captured by the flying device in time order.
6. The apparatus according to claim 5, wherein
the first positioning module is further configured to determine the position of the flying device relative to the target basic image using an inverse imaging transformation algorithm according to the target basic image.
7. The apparatus according to claim 5, wherein
the stitching module is further configured to stitch the at least one basic image in order of capture time to obtain the stitched image.
8. The apparatus according to any one of claims 5-7, wherein the acquisition module comprises a receiving unit;
the receiving unit is configured to receive the at least one basic image transmitted by the flying device.
CN201910166898.2A 2019-03-06 2019-03-06 Positioning method and device Active CN111667531B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910166898.2A CN111667531B (en) 2019-03-06 2019-03-06 Positioning method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910166898.2A CN111667531B (en) 2019-03-06 2019-03-06 Positioning method and device

Publications (2)

Publication Number Publication Date
CN111667531A (en) 2020-09-15
CN111667531B (en) 2023-11-24

Family

ID=72381661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910166898.2A Active CN111667531B (en) 2019-03-06 2019-03-06 Positioning method and device

Country Status (1)

Country Link
CN (1) CN111667531B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6044522B2 (en) * 2013-11-19 2016-12-14 横河電機株式会社 Slow change detection system
US20160019421A1 (en) * 2014-07-15 2016-01-21 Qualcomm Incorporated Multispectral eye analysis for identity authentication
WO2017127711A1 (en) * 2016-01-20 2017-07-27 Ez3D, Llc System and method for structural inspection and construction estimation using an unmanned aerial vehicle

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005099423A2 (en) * 2004-04-16 2005-10-27 Aman James A Automatic event videoing, tracking and content generation system
CN103185880A (en) * 2012-01-03 2013-07-03 国家空间研究中心 Method for calibrating alignment errors of an earth observation system making use of symmetrical exposures
CN102706352A (en) * 2012-05-21 2012-10-03 南京航空航天大学 Vector map matching navigation method for linear target in aviation
CN102855629A (en) * 2012-08-21 2013-01-02 西华大学 Method and device for positioning target object
CN104835115A (en) * 2015-05-07 2015-08-12 中国科学院长春光学精密机械与物理研究所 Imaging method for aerial camera, and system thereof
CN104867125A (en) * 2015-06-04 2015-08-26 北京京东尚科信息技术有限公司 Image obtaining method and image obtaining device
CN105159319A (en) * 2015-09-29 2015-12-16 广州极飞电子科技有限公司 Spraying method of unmanned plane and unmanned plane
CN106791294A (en) * 2016-11-25 2017-05-31 益海芯电子技术江苏有限公司 Motion target tracking method
CN108459597A (en) * 2017-07-26 2018-08-28 炬大科技有限公司 A kind of mobile electronic device and method for handling the task of mission area
TWI632528B (en) * 2017-09-15 2018-08-11 林永淵 System and method for unmanned aircraft image analysis
CN107516294A (en) * 2017-09-30 2017-12-26 百度在线网络技术(北京)有限公司 The method and apparatus of stitching image
CN107728633A (en) * 2017-10-23 2018-02-23 广州极飞科技有限公司 Obtain object positional information method and device, mobile device and its control method
CN108535321A (en) * 2018-03-30 2018-09-14 吉林建筑大学 A kind of building thermal technique method for testing performance based on three-dimensional infrared thermal imaging technique

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Abdullah Tariq, "Heritage preservation using aerial imagery from light weight low cost Unmanned Aerial Vehicle (UAV)", 2017 International Conference on Communication Technologies (ComTech), 2017, full text. *
Huang Hongfei (黄鸿飞), "Research on synchronous stitching and change detection technology for UAV images" (无人机影像同步拼接与变化发现技术研究), China Masters' Theses Full-text Database, Basic Sciences series, 2018, full text. *

Also Published As

Publication number Publication date
CN111667531A (en) 2020-09-15

Similar Documents

Publication number and title
US11704869B2 (en) System and method for determining geo-location(s) in images
EP3469306B1 (en) Geometric matching in visual navigation systems
US10634500B2 (en) Aircraft and obstacle avoidance method and system thereof
KR101105795B1 (en) Automatic processing of aerial images
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
Yahyanejad et al. Incremental mosaicking of images from autonomous, small-scale uavs
US9651384B2 (en) System and method for indoor navigation
US11625851B2 (en) Geographic object detection apparatus and geographic object detection method
CN106408601B (en) A kind of binocular fusion localization method and device based on GPS
CN106444837A (en) Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
JP4978615B2 (en) Target identification device
JP2009053059A (en) Object specifying device, object specifying method, and object specifying program
CN110703805B (en) Method, device and equipment for planning three-dimensional object surveying and mapping route, unmanned aerial vehicle and medium
CN109883433B (en) Vehicle positioning method in structured environment based on 360-degree panoramic view
US20160169662A1 (en) Location-based facility management system using mobile device
CN108195359B (en) Method and system for acquiring spatial data
KR102195040B1 (en) Method for collecting road signs information using MMS and mono camera
CN111667531B (en) Positioning method and device
CN116228860A (en) Target geographic position prediction method, device, equipment and storage medium
CN111412898B (en) Large-area deformation photogrammetry method based on ground-air coupling
CN113361552B (en) Positioning method and device
CN111581322A (en) Method, device and equipment for displaying interest area in video in map window
CN111666959A (en) Vector image matching method and device
KR100745105B1 (en) Image display method and image display apparatus
CN116051628B (en) Unmanned aerial vehicle positioning method and device, electronic equipment and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
TA01: Transfer of patent application right
Effective date of registration: 20231030
Address after: 710075 room 18, 12202, 22/F, unit 1, building 2, leading Times Plaza (block B), No. 86, Gaoxin Road, high tech Zone, Xi'an, Shaanxi Province
Applicant after: Xi'an Yuanzhi Electronic Technology Co., Ltd.
Address before: 710121 Xi'an University of Posts and Telecommunications (Chang'an campus), Chang'an District, Xi'an City, Shaanxi Province
Applicant before: Xi'an University of Posts & Telecommunications
GR01: Patent grant
TR01: Transfer of patent right
Effective date of registration: 20240611
Address after: Room A-115-3, iChuangtu Zhongchuang Park, No. 14 Gaoxin Second Road, High tech Zone, Xi'an City, Shaanxi Province, 710082
Patentee after: Xi'an Tianmu Tuhang Technology Co., Ltd.
Country or region after: China
Address before: 710075 room 18, 12202, 22/F, unit 1, building 2, leading Times Plaza (block B), No. 86, Gaoxin Road, high tech Zone, Xi'an, Shaanxi Province
Patentee before: Xi'an Yuanzhi Electronic Technology Co., Ltd.
Country or region before: China