CN109618134A - UAV dynamic-video and three-dimensional geographic information real-time fusion system and method - Google Patents

UAV dynamic-video and three-dimensional geographic information real-time fusion system and method

Info

Publication number
CN109618134A
CN109618134A (application CN201811505245.4A)
Authority
CN
China
Prior art keywords
UAV
data
dynamic video
geographic information
attitude space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811505245.4A
Other languages
Chinese (zh)
Inventor
陈虹旭
刘卫华
刘丽娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhihui Yunzhou Technology Co Ltd
Original Assignee
Beijing Zhihui Yunzhou Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhihui Yunzhou Technology Co Ltd filed Critical Beijing Zhihui Yunzhou Technology Co Ltd
Priority to CN201811505245.4A priority Critical patent/CN109618134A/en
Publication of CN109618134A publication Critical patent/CN109618134A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C1/00 Measuring angles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/52 Determining velocity

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Instructional Devices (AREA)

Abstract

Embodiments of the invention disclose a UAV dynamic-video and three-dimensional geographic information real-time fusion system and method. The system includes: a first acquisition module, for acquiring the UAV position and attitude space data during UAV flight; a second acquisition module, for acquiring the UAV dynamic video data from the payload gimbal carried on the UAV; a spatial position solving module, for obtaining the UAV position and attitude space data and fusing them, by projection, into the three-dimensional geographic information scene; a video dynamic fusion module, for fusing the UAV dynamic video data, by projection, into the three-dimensional geographic information scene; a synchronization module, for aligning, against a common time axis, the fused UAV dynamic video data with the fused UAV position and attitude space data; and a matching module, for matching the location information of the aligned UAV dynamic video data with the aligned UAV position and attitude space data.

Description

UAV dynamic-video and three-dimensional geographic information real-time fusion system and method
Technical field
The present invention relates to the technical field of virtual reality, and in particular to a UAV dynamic-video and three-dimensional geographic information real-time fusion system and method.
Background technique
An unmanned aerial vehicle (UAV), commonly called a drone, is an unpiloted aircraft operated by radio remote control or by an onboard autonomous flight-control program. As UAV technology has matured and become easier to apply, UAVs have rapidly spread across many fields and now perform, in place of humans, many tasks that were previously difficult to accomplish.
UAVs are widely used in fields such as reconnaissance and on-site surveying and mapping. In recent years, natural disasters have occurred frequently in China, and with the rapid development of UAV technology, UAV remote sensing has provided new technical means for disaster information acquisition, command and decision-making, rescue and relief, and post-disaster reconstruction. UAV remote sensing also offers advantages such as flexibility, strong task focus, high efficiency, low cost and wide applicability. It is therefore increasingly applied in industries such as policing, agriculture, traffic monitoring, rescue and relief, and video capture, and is gradually becoming an important complement to aeronautical and space remote sensing. UAV-based surveillance in particular has found deep industrial application and plays a significant role.
In the policing field, for example, police UAVs are applied to counter-terrorism surveillance, investigation of sudden incidents, riot-control pursuit and arrest, crowd dispersal, monitoring of large gatherings, search and rescue, and traffic monitoring. Their strengths include rapid response: they can reach a scene quickly and observe the development of the entire situation from a commanding height; they record the course of an incident promptly and from the best angle, providing the best evidence for subsequent handling; in violent confrontations, a UAV can enter the disturbed area directly to issue warnings, avoiding casualties as far as possible; during armed standoffs or hostage situations, the actual circumstances of the suspects can be learned without risk; and in encirclement and capture operations, an unmanned helicopter can monitor the situation inside the cordon from high altitude and provide commanders with real-time, reliable information.
However, in current UAV applications the video captured by the UAV remains isolated from the three-dimensional geographic information scene. Unifying the two, so that the location of the target area, the position of the UAV and the video coverage can be combined and the dynamic picture can be monitored more intuitively and more accurately, is currently one of the open problems in UAV applications.
Summary of the invention
The purpose of embodiments of the present invention is to provide a UAV dynamic-video and three-dimensional geographic information real-time fusion system and method, so as to solve the problems existing in the prior art.
To achieve the above object, an embodiment of the present invention provides a UAV dynamic-video and three-dimensional geographic information real-time fusion system. The system includes: a first acquisition module, a second acquisition module, a spatial position solving module, a video dynamic fusion module, a synchronization module and a matching module. The first acquisition module is used to acquire the UAV position and attitude space data during UAV flight. The second acquisition module is used to acquire the UAV dynamic video data from the payload gimbal carried on the UAV. The spatial position solving module is used to obtain the UAV position and attitude space data and to fuse them, by projection, into the three-dimensional geographic information scene. The video dynamic fusion module is used to fuse the UAV dynamic video data, by projection, into the three-dimensional geographic information scene. The synchronization module is used to align, against a common time axis, the fused UAV dynamic video data with the fused UAV position and attitude space data. The matching module is used to match the location information of the aligned UAV dynamic video data with the aligned UAV position and attitude space data.
Optionally, the attitude space data include the pitch angle (Pitch), roll angle (Roll) and yaw angle (Yaw).
Optionally, the UAV position data include GPS/BeiDou positioning data (longitude, latitude, altitude, speed).
Optionally, the GPS/BeiDou positioning data include longitude, latitude, altitude and speed.
Optionally, the spatial position solving module is specifically used to obtain the pitch angle, roll angle, yaw angle and GPS/BeiDou positioning data, and to fuse the obtained pitch angle, roll angle, yaw angle and GPS/BeiDou positioning data into the three-dimensional geographic information scene.
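The patent does not specify how the GPS/BeiDou coordinates and the Euler angles are converted into a pose usable by a three-dimensional scene. A minimal sketch of one common approach is given below, assuming WGS-84 geodetic coordinates converted to Earth-centred Earth-fixed (ECEF) metres and a Z-Y-X (yaw, pitch, roll) rotation order; the axis conventions and function names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def geodetic_to_ecef(lon_deg, lat_deg, h_m):
    """Convert WGS-84 longitude/latitude/height to ECEF coordinates in metres."""
    a, f = 6378137.0, 1.0 / 298.257223563   # WGS-84 semi-major axis and flattening
    e2 = f * (2.0 - f)
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    n = a / np.sqrt(1.0 - e2 * np.sin(lat) ** 2)   # prime-vertical radius of curvature
    x = (n + h_m) * np.cos(lat) * np.cos(lon)
    y = (n + h_m) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - e2) + h_m) * np.sin(lat)
    return np.array([x, y, z])

def attitude_to_rotation(yaw_deg, pitch_deg, roll_deg):
    """Body-to-local rotation matrix, assuming a Z-Y-X (yaw, pitch, roll) convention."""
    y, p, r = np.radians([yaw_deg, pitch_deg, roll_deg])
    rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    return rz @ ry @ rx

# Example: a UAV over Beijing at 120 m altitude, heading 30 deg, pitched 10 deg down.
position_ecef = geodetic_to_ecef(116.39, 39.91, 120.0)
rotation = attitude_to_rotation(yaw_deg=30.0, pitch_deg=-10.0, roll_deg=0.0)
```

The resulting position and rotation give one concrete form of the "UAV position and attitude space data" that the solving module could hand to the scene engine for projection.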
Optionally, the video dynamic fusion module is specifically used to resolve the UAV dynamic video data, and to fuse the resolved data, together with the azimuth, pitch-angle parameters and lens focal length of the payload gimbal, into the three-dimensional geographic information scene by projection.
To achieve the above object, an embodiment of the present invention further provides a UAV dynamic-video and three-dimensional geographic information real-time fusion method. The method includes: acquiring the UAV position and attitude space data during UAV flight; acquiring the UAV dynamic video data from the payload gimbal carried on the UAV; obtaining the UAV position and attitude space data, and fusing them, by projection, into the three-dimensional geographic information scene; fusing the UAV dynamic video data, by projection, into the three-dimensional geographic information scene; aligning, against a common time axis, the fused UAV dynamic video data with the fused UAV position and attitude space data; and matching the location information of the aligned UAV dynamic video data with the aligned UAV position and attitude space data.
Optionally, fusing the UAV dynamic video data into the three-dimensional geographic information scene by projection includes: resolving the UAV dynamic video data, and fusing the resolved data, together with the azimuth, pitch-angle parameters and lens focal length of the payload gimbal, into the three-dimensional geographic information scene by projection.
Embodiments of the present invention have the following advantages:
The UAV dynamic video can be browsed within the three-dimensional geographic information scene, which solves the prior-art problem that the position of the target captured in the video cannot be obtained and displayed. Real-time unification of the dynamic video with the spatio-temporal datum of the three-dimensional geographic information scene is achieved, the real site environment is restored, and beyond-visual-range synchronized situation awareness and spatially correlated command are attained, rapidly converting a dispersed information advantage into a systematic decision advantage. This is particularly useful for incident tracking and emergency command.
Detailed description of the invention
Fig. 1 is a schematic structural diagram of a UAV dynamic-video and three-dimensional geographic information real-time fusion system provided by Embodiment 1 of the present invention.
Fig. 2 is a schematic diagram of a UAV dynamic-video and three-dimensional geographic information real-time fusion method provided by Embodiment 2 of the present invention.
Specific embodiment
Embodiments of the present invention are described below by way of specific examples, and those skilled in the art can readily understand other advantages and effects of the present invention from the content disclosed in this specification.
It should be noted that the structures, proportions and sizes depicted in the drawings of this specification are provided only to accompany the content disclosed in the specification, for the understanding and reading of those skilled in the art, and are not intended to limit the conditions under which the invention can be implemented; they therefore have no essential technical significance. Any modification of structure, change of proportional relationship or adjustment of size that does not affect the effects the invention can produce and the purposes it can achieve shall still fall within the scope covered by the technical content disclosed by the invention. Meanwhile, terms such as "upper", "lower", "left", "right" and "middle" cited in this specification are used only for clarity of description and are not intended to limit the implementable scope of the invention; changes or adjustments of their relative relationships, without substantive changes to the technical content, shall also be regarded as within the implementable scope of the invention.
Embodiment 1
Embodiment 1 of the present invention provides a UAV dynamic-video and three-dimensional geographic information real-time fusion system and method. The system ultimately achieves real-time matched fusion of the dynamic video picture captured during UAV flight with the three-dimensional geographic information scene. It solves problems such as the isolation of the UAV video picture and the lack of clear location information, achieves spatio-temporal unification of three-dimensional geographic data and dynamic video, and improves the intuitiveness of monitoring and the interpretation efficiency of the video picture, which has significant application value in fields such as urban safety, emergency command and disaster monitoring.
Fig. 1 is a schematic structural diagram of a UAV dynamic-video and three-dimensional geographic information real-time fusion system provided by Embodiment 1 of the present invention. As shown in Fig. 1, the UAV dynamic-video and three-dimensional geographic information real-time fusion system includes: a first acquisition module 11, a second acquisition module 12, a spatial position solving module 13, a video dynamic fusion module 14, a synchronization module 15 and a matching module 16.
The first acquisition module 11 is used to acquire the UAV position and attitude space data during UAV flight. The second acquisition module 12 is used to acquire the UAV dynamic video data from the payload gimbal during UAV flight. The spatial position solving module 13 is used to load the UAV position and attitude space data into the three-dimensional geographic information scene by projection and, using a camera perspective-projection positioning and solving model, to project the real-time video onto the three-dimensional spatial surface where the video source is located. The video dynamic fusion module 14 is used to resolve the payload gimbal data during UAV flight, obtain parameters such as the azimuth, pitch angle and lens focal-length variation of the payload camera, and achieve spatio-temporal fusion with the three-dimensional geographic information scene. The synchronization module 15 is used to align the UAV position and attitude space data and the payload-camera video data against a common time axis, so that a common time series and a unified time frame are achieved and the UAV spatial position and the dynamic video are fused with the three-dimensional geographic information scene in a synchronized, matched way. The matching module 16 is used to achieve synchronized, matched fusion of the UAV spatial position information and the dynamic video picture content with the three-dimensional geographic information scene.
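The patent names a camera perspective-projection positioning and solving model but does not give its equations. The sketch below illustrates one plausible realization under simplifying assumptions: a pinhole camera at the UAV position, oriented by the gimbal azimuth and a depression angle, whose image corners are intersected with a flat ground plane to obtain the video footprint in local east-north-up coordinates. The flat-terrain assumption, the default sensor size, the axis conventions and all function and parameter names are illustrative, not taken from the patent.

```python
import numpy as np

def video_footprint(cam_enu, azimuth_deg, depression_deg, focal_mm,
                    sensor_w_mm=6.17, sensor_h_mm=4.55):
    """Project the four image corners onto the ground plane z = 0.

    cam_enu        camera position (east, north, up) in metres
    azimuth_deg    gimbal azimuth, clockwise from north
    depression_deg boresight angle below the horizon (positive = downward)
    focal_mm       lens focal length; default sensor size is a 1/2.3-inch chip
    """
    az, dep = np.radians(azimuth_deg), np.radians(depression_deg)
    # Boresight direction in ENU; the negative vertical component points at the ground.
    fwd = np.array([np.cos(dep) * np.sin(az), np.cos(dep) * np.cos(az), -np.sin(dep)])
    right = np.cross(fwd, [0.0, 0.0, 1.0])
    right /= np.linalg.norm(right)
    up = np.cross(right, fwd)
    half_w, half_h = sensor_w_mm / (2 * focal_mm), sensor_h_mm / (2 * focal_mm)
    corners = []
    for sx, sy in [(-1, -1), (1, -1), (1, 1), (-1, 1)]:
        ray = fwd + sx * half_w * right + sy * half_h * up
        if ray[2] >= 0:            # this corner ray points at or above the horizon
            corners.append(None)   # no ground intersection
            continue
        t = -cam_enu[2] / ray[2]   # ray-plane intersection with z = 0
        corners.append(cam_enu + t * ray)
    return corners

# Example: UAV 150 m above the local origin, camera facing north-east, tilted 40 deg down.
print(video_footprint(np.array([0.0, 0.0, 150.0]), 45.0, 40.0, focal_mm=8.8))
```

In a full system the returned footprint polygon would be textured with the decoded video frame inside the three-dimensional scene; intersecting the rays with a digital elevation model instead of a flat plane is the natural extension.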
Optionally, the attitude space data include one or more of the pitch angle (Pitch), roll angle (Roll) and yaw angle (Yaw).
Optionally, the UAV position data include GPS/BeiDou positioning data (longitude, latitude, altitude, speed).
Optionally, the video dynamic fusion module 14 is specifically used to resolve the payload gimbal data during UAV flight, obtain parameters such as the azimuth, pitch angle and lens focal-length variation of the payload camera, and achieve, through the dynamic video fusion, spatio-temporal fusion with the three-dimensional geographic information scene.
Optionally, the synchronization module 15 aligns the UAV position and attitude space data and the payload-camera video data against a common time axis, achieving a common time series and a unified time frame, so that the UAV spatial position and the dynamic video are fused with the three-dimensional geographic information scene in a synchronized, matched way.
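The common-time-axis alignment is described only at the level of principle. One straightforward way to realize it, sketched below, is to interpolate the lower-rate telemetry samples (position and attitude, each carrying a timestamp on the same clock as the video) onto the timestamp of every video frame. The field names and the use of simple linear interpolation, with the yaw angle unwrapped to avoid the 359-to-0 degree discontinuity, are assumptions for illustration only.

```python
import numpy as np

def telemetry_at_frame_times(frame_ts, telem_ts, telemetry):
    """Linearly interpolate telemetry fields onto the video-frame timestamps.

    frame_ts   1-D array of frame timestamps (seconds, common clock)
    telem_ts   1-D array of telemetry timestamps (seconds, same clock)
    telemetry  dict of 1-D arrays keyed by field name, one value per telemetry sample
    """
    aligned = {}
    for name, values in telemetry.items():
        if name == "yaw_deg":
            # Unwrap so interpolation does not jump across the 0/360 degree seam.
            unwrapped = np.degrees(np.unwrap(np.radians(values)))
            aligned[name] = np.interp(frame_ts, telem_ts, unwrapped) % 360.0
        else:
            aligned[name] = np.interp(frame_ts, telem_ts, values)
    return aligned

# Example: 5 Hz telemetry aligned to 25 fps video frames over two seconds of flight.
telem_ts = np.arange(0.0, 2.0, 0.2)
telemetry = {"lat_deg": 39.91 + 1e-5 * telem_ts,
             "lon_deg": 116.39 + 2e-5 * telem_ts,
             "alt_m": 150.0 + telem_ts,
             "yaw_deg": (350.0 + 10.0 * telem_ts) % 360.0}
frame_ts = np.arange(0.0, 2.0, 0.04)
aligned = telemetry_at_frame_times(frame_ts, telem_ts, telemetry)
```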
Optionally, the matching module 16 achieves synchronized, matched fusion of the UAV spatial position information and the dynamic video picture content with the three-dimensional geographic information scene.
Embodiment 1 of the present invention is mainly used for unified management and invocation of the various modules in the whole system. At the same time it creates the three-dimensional geographic information scene, for example by loading remote-sensing imagery, digital elevation data, vector maps and three-dimensional models. Within the three-dimensional geographic information scene, the UAV spatial position is dynamically matched in real time with the coverage position of the video picture, the dynamic picture is projected into the three-dimensional geographic information scene in real time, and real-time matched fusion of the dynamic video picture with the three-dimensional geographic information scene is achieved.
Video fusion technology can generally be divided into three levels: pre-processing, information fusion and the application layer. Pre-processing is mainly used for geometric correction, noise removal, colour and brightness adjustment, and registration of the video image. Video image registration refers to finding the maximal correlation between the video image and the three-dimensional virtual scene, so as to eliminate differences in space, phase and resolution and achieve a more faithful and more accurate fusion of information. The information fusion layer is the fusion of the video images themselves. Video image fusion can be divided, from low to high degrees of intelligence, into pixel-level, feature-level and decision-level fusion. Pixel-level fusion stitches and fuses images on the basis of image pixels, merging two or more images into a single whole. Feature-level fusion stitches and fuses images on the basis of salient features of the image, such as lines and building features. Decision-level fusion uses mathematical methods such as Bayesian inference and Dempster-Shafer evidence theory to make probabilistic decisions and fuse the video or images accordingly, which better suits subjective requirements.
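As an illustration of the lowest of these levels, the sketch below blends a projected video frame into a rendered view of the three-dimensional scene with a fixed alpha weight inside a coverage mask. It is a minimal pixel-level example, assuming both images have already been registered onto the same raster; the array shapes and the blending weight are illustrative choices, not part of the patent.

```python
import numpy as np

def pixel_level_fuse(scene_rgb, video_rgb, coverage_mask, alpha=0.7):
    """Blend the projected video into the rendered scene where the mask is set.

    scene_rgb, video_rgb  float arrays of shape (H, W, 3), values in [0, 1]
    coverage_mask         bool array of shape (H, W), True where the video projects
    alpha                 weight given to the video inside the covered region
    """
    m = coverage_mask[..., None]   # broadcast the mask over the colour channels
    return np.where(m, alpha * video_rgb + (1.0 - alpha) * scene_rgb, scene_rgb)

# Example with synthetic 4x4 images: the video covers only the upper-left 2x2 block.
scene = np.zeros((4, 4, 3))
video = np.ones((4, 4, 3))
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
print(pixel_level_fuse(scene, video, mask)[..., 0])
```

Feature-level and decision-level fusion would replace this per-pixel blend with matching of extracted features or with probabilistic combination of higher-level detections, as the paragraph above describes.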
The UAV dynamic-video and three-dimensional geographic information real-time fusion system provided by Embodiment 1 of the present invention enables the UAV dynamic video to be browsed within the three-dimensional geographic information scene, solving the prior-art problem that the position of the target captured in the video cannot be obtained and displayed. It achieves real-time unification of the dynamic video with the spatio-temporal datum of the three-dimensional geographic information scene, restores the real site environment, attains beyond-visual-range synchronized situation awareness and spatially correlated command, and rapidly converts a dispersed information advantage into a systematic decision advantage. It is particularly useful for incident tracking and emergency command.
Embodiment 2
Fig. 2 is a schematic diagram of a UAV dynamic-video and three-dimensional geographic information real-time fusion method provided by Embodiment 2 of the present invention. As shown in Fig. 2, the UAV dynamic-video and three-dimensional geographic information real-time fusion method includes:
Step S201: acquiring the UAV position and attitude space data during UAV flight;
Step S202: acquiring the UAV dynamic video data from the payload gimbal carried on the UAV;
Step S203: obtaining the UAV position and attitude space data, and fusing the UAV position and attitude space data, by projection, into the three-dimensional geographic information scene;
Step S204: fusing the UAV dynamic video data, by projection, into the three-dimensional geographic information scene;
Step S205: aligning, against a common time axis, the fused UAV dynamic video data with the fused UAV position and attitude space data;
Step S206: matching the location information of the aligned UAV dynamic video data with the aligned UAV position and attitude space data.
Optionally, fusing the UAV dynamic video data into the three-dimensional geographic information scene by projection includes: resolving the UAV dynamic video data, and fusing the resolved data, together with the azimuth, pitch-angle parameters and lens focal length of the payload gimbal, into the three-dimensional geographic information scene by projection.
Although the present invention has been described in detail above by means of general descriptions and specific embodiments, it will be apparent to those skilled in the art that modifications or improvements can be made on the basis of the present invention. Accordingly, such modifications or improvements made without departing from the spirit of the present invention fall within the scope of protection claimed by the present invention.

Claims (8)

1. A UAV dynamic-video and three-dimensional geographic information real-time fusion system, characterized in that the system comprises: a first acquisition module, a second acquisition module, a spatial position solving module, a video dynamic fusion module, a synchronization module and a matching module; wherein,
the first acquisition module is used to acquire the UAV position and attitude space data during UAV flight;
the second acquisition module is used to acquire the UAV dynamic video data from the payload gimbal carried on the UAV;
the spatial position solving module is used to obtain the UAV position and attitude space data, and to fuse the UAV position and attitude space data, by projection, into the three-dimensional geographic information scene;
the video dynamic fusion module is used to fuse the UAV dynamic video data, by projection, into the three-dimensional geographic information scene;
the synchronization module is used to align, against a common time axis, the fused UAV dynamic video data with the fused UAV position and attitude space data;
the matching module is used to match the location information of the aligned UAV dynamic video data with the aligned UAV position and attitude space data.
2. The system according to claim 1, characterized in that the attitude space data comprise: the pitch angle (Pitch), roll angle (Roll) and yaw angle (Yaw).
3. The system according to claim 1 or 2, characterized in that the UAV position data comprise: GPS/BeiDou positioning data (longitude, latitude, altitude, speed).
4. The system according to claim 3, characterized in that the GPS/BeiDou positioning data comprise: longitude, latitude, altitude and speed.
5. The system according to claim 3, characterized in that the spatial position solving module is specifically used to:
obtain the pitch angle, roll angle, yaw angle and GPS/BeiDou positioning data, and fuse the obtained pitch angle, roll angle, yaw angle and GPS/BeiDou positioning data into the three-dimensional geographic information scene.
6. The system according to claim 1, characterized in that the video dynamic fusion module is specifically used to:
resolve the UAV dynamic video data, and fuse the resolved data, together with the azimuth, pitch-angle parameters and lens focal length of the payload gimbal, into the three-dimensional geographic information scene by projection.
7. A UAV dynamic-video and three-dimensional geographic information real-time fusion method, characterized in that the method comprises:
acquiring the UAV position and attitude space data during UAV flight;
acquiring the UAV dynamic video data from the payload gimbal carried on the UAV;
obtaining the UAV position and attitude space data, and fusing the UAV position and attitude space data, by projection, into the three-dimensional geographic information scene;
fusing the UAV dynamic video data, by projection, into the three-dimensional geographic information scene;
aligning, against a common time axis, the fused UAV dynamic video data with the fused UAV position and attitude space data;
matching the location information of the aligned UAV dynamic video data with the aligned UAV position and attitude space data.
8. The method according to claim 7, characterized in that fusing the UAV dynamic video data into the three-dimensional geographic information scene by projection comprises:
resolving the UAV dynamic video data, and fusing the resolved data, together with the azimuth, pitch-angle parameters and lens focal length of the payload gimbal, into the three-dimensional geographic information scene by projection.
CN201811505245.4A 2018-12-10 2018-12-10 UAV dynamic-video and three-dimensional geographic information real-time fusion system and method Pending CN109618134A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811505245.4A CN109618134A (en) 2018-12-10 2018-12-10 UAV dynamic-video and three-dimensional geographic information real-time fusion system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811505245.4A CN109618134A (en) 2018-12-10 2018-12-10 UAV dynamic-video and three-dimensional geographic information real-time fusion system and method

Publications (1)

Publication Number Publication Date
CN109618134A true CN109618134A (en) 2019-04-12

Family

ID=66008546

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811505245.4A Pending CN109618134A (en) 2018-12-10 2018-12-10 UAV dynamic-video and three-dimensional geographic information real-time fusion system and method

Country Status (1)

Country Link
CN (1) CN109618134A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110113569A * 2019-04-22 2019-08-09 苏州天地衡遥感科技有限公司 UAV gimbal and video stream processing method thereof
CN111415416A (en) * 2020-03-31 2020-07-14 武汉大学 Method and system for fusing monitoring real-time video and scene three-dimensional model
CN111586360A (en) * 2020-05-14 2020-08-25 佳都新太科技股份有限公司 Unmanned aerial vehicle projection method, device, equipment and storage medium
CN113415433A (en) * 2021-07-30 2021-09-21 成都纵横大鹏无人机科技有限公司 Pod attitude correction method and device based on three-dimensional scene model and unmanned aerial vehicle
CN113673360A (en) * 2021-07-28 2021-11-19 浙江大华技术股份有限公司 Human body distribution detection method, aerial photography device, electronic device, and storage medium
CN115297308A (en) * 2022-07-29 2022-11-04 东风汽车集团股份有限公司 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle
CN116821414A (en) * 2023-05-17 2023-09-29 成都纵横大鹏无人机科技有限公司 Method and system for forming view field projection map based on unmanned aerial vehicle video
CN117237438A (en) * 2023-09-18 2023-12-15 共享数据(福建)科技有限公司 Range matching method and terminal for three-dimensional model and unmanned aerial vehicle video data

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493699A * 2009-03-04 2009-07-29 北京航空航天大学 Beyond-visual-range remote control method for an aerial UAV
KR101183866B1 (en) * 2011-04-20 2012-09-19 서울시립대학교 산학협력단 Apparatus and method for real-time position and attitude determination based on integration of gps, ins and image at
CN103716586A (en) * 2013-12-12 2014-04-09 中国科学院深圳先进技术研究院 Monitoring video fusion system and monitoring video fusion method based on three-dimension space scene
CN105424010A (en) * 2015-11-17 2016-03-23 中国人民解放军信息工程大学 Unmanned aerial vehicle video geographic space information registering method
CN105635616A (en) * 2016-01-27 2016-06-01 中测新图(北京)遥感技术有限责任公司 Method and device for fusing video data and geographic position information
CN105872496A (en) * 2016-07-01 2016-08-17 黄岩 Ultrahigh-definition video fusion method
CN106454209A (en) * 2015-08-06 2017-02-22 航天图景(北京)科技有限公司 Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology
CN108253966A * 2016-12-28 2018-07-06 昊翔电能运动科技(昆山)有限公司 Three-dimensional simulation display method for UAV flight

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493699A * 2009-03-04 2009-07-29 北京航空航天大学 Beyond-visual-range remote control method for an aerial UAV
KR101183866B1 (en) * 2011-04-20 2012-09-19 서울시립대학교 산학협력단 Apparatus and method for real-time position and attitude determination based on integration of gps, ins and image at
CN103716586A (en) * 2013-12-12 2014-04-09 中国科学院深圳先进技术研究院 Monitoring video fusion system and monitoring video fusion method based on three-dimension space scene
CN106454209A (en) * 2015-08-06 2017-02-22 航天图景(北京)科技有限公司 Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology
CN105424010A (en) * 2015-11-17 2016-03-23 中国人民解放军信息工程大学 Unmanned aerial vehicle video geographic space information registering method
CN105635616A (en) * 2016-01-27 2016-06-01 中测新图(北京)遥感技术有限责任公司 Method and device for fusing video data and geographic position information
CN105872496A (en) * 2016-07-01 2016-08-17 黄岩 Ultrahigh-definition video fusion method
CN108253966A * 2016-12-28 2018-07-06 昊翔电能运动科技(昆山)有限公司 Three-dimensional simulation display method for UAV flight

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李言俊 et al.: "《景象匹配与目标识别技术》" (Scene Matching and Target Recognition Technology), 31 August 2009 *
杜军平 et al.: "《多源运动图像的跨尺度融合研究》" (Research on Cross-scale Fusion of Multi-source Moving Images), 31 July 2018 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110113569A * 2019-04-22 2019-08-09 苏州天地衡遥感科技有限公司 UAV gimbal and video stream processing method thereof
CN111415416A (en) * 2020-03-31 2020-07-14 武汉大学 Method and system for fusing monitoring real-time video and scene three-dimensional model
CN111415416B (en) * 2020-03-31 2023-12-15 武汉大学 Method and system for fusing monitoring real-time video and scene three-dimensional model
CN111586360A (en) * 2020-05-14 2020-08-25 佳都新太科技股份有限公司 Unmanned aerial vehicle projection method, device, equipment and storage medium
CN111586360B (en) * 2020-05-14 2021-09-10 佳都科技集团股份有限公司 Unmanned aerial vehicle projection method, device, equipment and storage medium
WO2021227359A1 (en) * 2020-05-14 2021-11-18 佳都新太科技股份有限公司 Unmanned aerial vehicle-based projection method and apparatus, device, and storage medium
CN113673360A (en) * 2021-07-28 2021-11-19 浙江大华技术股份有限公司 Human body distribution detection method, aerial photography device, electronic device, and storage medium
CN113415433B (en) * 2021-07-30 2022-11-29 成都纵横大鹏无人机科技有限公司 Pod attitude correction method and device based on three-dimensional scene model and unmanned aerial vehicle
CN113415433A (en) * 2021-07-30 2021-09-21 成都纵横大鹏无人机科技有限公司 Pod attitude correction method and device based on three-dimensional scene model and unmanned aerial vehicle
CN115297308A (en) * 2022-07-29 2022-11-04 东风汽车集团股份有限公司 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle
CN115297308B (en) * 2022-07-29 2023-05-26 东风汽车集团股份有限公司 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle
CN116821414A (en) * 2023-05-17 2023-09-29 成都纵横大鹏无人机科技有限公司 Method and system for forming view field projection map based on unmanned aerial vehicle video
CN117237438A (en) * 2023-09-18 2023-12-15 共享数据(福建)科技有限公司 Range matching method and terminal for three-dimensional model and unmanned aerial vehicle video data
CN117237438B (en) * 2023-09-18 2024-06-28 共享数据(福建)科技有限公司 Range matching method and terminal for three-dimensional model and unmanned aerial vehicle video data

Similar Documents

Publication Publication Date Title
CN109618134A (en) UAV dynamic-video and three-dimensional geographic information real-time fusion system and method
US10475209B2 (en) Camera calibration
US10599149B2 (en) Salient feature based vehicle positioning
US20200007746A1 (en) Systems, methods, and devices for setting camera parameters
ES2874506T3 (en) Selective processing of sensor data
EP3446190B1 (en) Systems and methods for coordinating device actions
CN106454209B UAV emergency quick-reaction data link system and method based on spatio-temporal information fusion
JP5349055B2 (en) Multi-lens array system and method
US11644839B2 (en) Systems and methods for generating a real-time map using a movable object
Sato et al. Spatio-temporal bird's-eye view images using multiple fish-eye cameras
CN107966136B (en) Slave unmanned aerial vehicle position display method, device and system based on vision of master unmanned aerial vehicle
CN102190081B (en) Vision-based fixed point robust control method for airship
WO2018120350A1 (en) Method and device for positioning unmanned aerial vehicle
CN110333735B (en) System and method for realizing unmanned aerial vehicle water and land secondary positioning
CN110542407A (en) Method for acquiring positioning information of any pixel point of aerial image
CN114240769A (en) Image processing method and device
JP6482855B2 (en) Monitoring system
CN116385504A (en) Inspection and ranging method based on unmanned aerial vehicle acquisition point cloud and image registration
JP2016118996A (en) Monitoring system
JP6473188B2 (en) Method, apparatus and program for generating depth map
CN110267087B (en) Dynamic label adding method, device and system
CN103823470A (en) Panoramic real-time dynamic monitoring system of unmanned aerial vehicle
CN112985398A (en) Target positioning method and system
CN111947623A (en) Method for rapidly obtaining site map according to surf-scan
US11176190B2 (en) Comparative geolocation and guidance system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20190412