CN108769569A - 360-degree stereoscopic panoramic observation system and method for an unmanned aerial vehicle (UAV) - Google Patents
360-degree stereoscopic panoramic observation system and method for a UAV
- Publication number
- CN108769569A CN108769569A CN201810314065.1A CN201810314065A CN108769569A CN 108769569 A CN108769569 A CN 108769569A CN 201810314065 A CN201810314065 A CN 201810314065A CN 108769569 A CN108769569 A CN 108769569A
- Authority
- CN
- China
- Prior art keywords
- observation
- image
- UAV
- subsystem
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims description 26
- 238000012545 processing Methods 0.000 claims abstract description 48
- 238000013507 mapping Methods 0.000 claims description 18
- 230000004927 fusion Effects 0.000 claims description 12
- 238000004891 communication Methods 0.000 claims description 9
- 238000012937 correction Methods 0.000 claims description 9
- 230000008569 process Effects 0.000 claims description 9
- 238000007781 pre-processing Methods 0.000 claims description 4
- 238000003786 synthesis reaction Methods 0.000 claims description 2
- 230000007613 environmental effect Effects 0.000 abstract description 5
- 230000000007 visual effect Effects 0.000 abstract description 4
- 238000001514 detection method Methods 0.000 abstract description 3
- 238000010586 diagram Methods 0.000 description 3
- 238000012544 monitoring process Methods 0.000 description 3
- 230000008901 benefit Effects 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000001914 filtration Methods 0.000 description 2
- 230000001360 synchronised effect Effects 0.000 description 2
- 238000003491 array Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 238000009499 grossing Methods 0.000 description 1
- 238000003702 image correction Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 238000013178 mathematical model Methods 0.000 description 1
- 230000000750 progressive effect Effects 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
Abstract
A 360-degree stereoscopic panoramic observation system for a UAV, comprising an observation subsystem and a processing subsystem, wherein the observation subsystem is composed of multiple cameras observing the six spatial directions around the UAV, and the processing subsystem communicates with the observation subsystem. Compared with the prior art, the system can observe the environmental information around the UAV at altitude, including the image information above the UAV, covering a full 360° horizontal and 360° vertical field of view. In addition, the invention adds a position-and-elevation sensing module compatible with BeiDou/GPS satellite information and inertial sensor information, so that the detection of position and elevation in the UAV's observed environment is more accurate. Images are observed using 2D and 3D modeling, and elevation information is fused with the environmental image information, describing the environment more accurately.
Description
Technical field
The present invention relates to the field of unmanned aerial vehicles (UAVs), and more particularly to a 360-degree stereoscopic panoramic observation system and method for a UAV, used to realize 360-degree stereoscopic panoramic observation from a UAV.
Background art
With rapid economic development and continuous social progress, environment-sensing technology based on UAVs carrying multi-sensor systems has been widely applied. In environmental monitoring for environmental protection, UAVs are valued for their strong timeliness, good maneuverability, and wide inspection range, and can be used to verify environmental conditions over a large area. In surveying and mapping, the UAV has emerged as a low-cost, high-precision, easy-to-operate remote-sensing image acquisition platform, and has achieved good results in traditional surveying, digital city construction, monitoring of national geographic conditions, and disaster emergency response.
Existing UAV observation systems mainly shoot and stitch scenes looking down at the ground. Such a system can observe the ground scene comprehensively, but cannot collect the images beside and above the UAV, and therefore cannot reflect the three-dimensional state of the environment during monitoring. When the flight altitude is low, it is difficult to observe the surrounding conditions comprehensively.
Application No. 201610969823.4 discloses a UAV panoramic-vision tracking method, a UAV, and a control terminal. The method comprises: obtaining images shot by multiple cameras at the same time point; stitching the images shot by the multiple cameras at that time point to form a panoramic image; and sending each stitched panoramic image to a control terminal wirelessly connected to the UAV.
The above patent has two problems: 1. The cameras are arranged below the UAV, so image information at and above the UAV's own altitude cannot be perceived; there is panoramic information only in the top-down view and none in the vertical direction. 2. The image information is not combined with the UAV's elevation information, so the stereoscopic information of the actual environment at each elevation cannot be examined in detail.
Summary of the invention
In view of the above problems, according to one aspect of the present invention, a 360-degree stereoscopic panoramic observation system for a UAV is disclosed, comprising an observation subsystem and a processing subsystem, wherein the observation subsystem is composed of multiple cameras observing below the UAV, above the UAV, and around the UAV; the processing subsystem communicates with the observation subsystem and is used to process the images obtained by the multiple cameras of the observation subsystem.
Preferably, the observation subsystem further comprises a structure for fixing the multiple cameras.
Preferably, the processing subsystem comprises a core processor module, a system memory module, a video storage module, an image data synchronization module, and a position-and-elevation sensing module, wherein the system memory module, the video storage module, the image data synchronization module, and the position-and-elevation sensing module are each connected to the core processor module.
More preferably, the core processor module is a DSP, GPU, FPGA, or CPU, used to control the reading of scene image data, image stitching, image processing, and interaction with the user.
More preferably, the image data synchronization module is used for the synchronized processing of the multiple camera channels; the position-and-elevation sensing module is used to obtain the longitude/latitude and elevation of the system; the communication module is used to communicate with the outside world; and the system memory module and the video storage module are respectively used to store the control program and the synthesized panoramic image data, wherein the video storage module is an SD card or a TF card.
According to another aspect of the present invention, a method using the above 360-degree stereoscopic panoramic observation system for a UAV is disclosed, characterized by comprising:
The observation subsystem acquires image data;
The processing subsystem performs image preprocessing on the image data;
The processing subsystem performs image correction on the preprocessed image data according to the intrinsic and extrinsic parameters of the cameras;
The processing subsystem builds a panoramic projection model from the corrected image data and the camera intrinsic/extrinsic parameters, obtaining the panoramic projection model;
The processing subsystem performs panoramic projection texture mapping on the panoramic projection model, obtaining a panoramic projection texture-mapped image;
The processing subsystem performs image texture fusion on the panoramic projection texture-mapped image.
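The steps above form a sequential pipeline. The following sketch only illustrates that ordering; the function bodies are placeholders (the patent does not specify algorithms at this point), and all names are illustrative, not the patent's implementation:

```python
import numpy as np

def preprocess(frames):
    # placeholder for de-noising / de-interlacing
    return [f.astype(float) for f in frames]

def correct(frames, params):
    # placeholder for distortion correction using camera parameters
    return frames

def build_model(frames, params):
    # placeholder 2D model: one grid cell per pixel of the stitched view
    return np.concatenate(frames, axis=1)

def texture_map(model, frames):
    # placeholder: the model already carries its texture values
    return model

def fuse(mapped):
    # placeholder for weighted texture fusion in overlap regions
    return mapped

def panorama_pipeline(frames, params):
    # acquisition -> preprocessing -> correction -> modeling -> mapping -> fusion
    pre = preprocess(frames)
    cor = correct(pre, params)
    model = build_model(cor, params)
    mapped = texture_map(model, cor)
    return fuse(mapped)
```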
Further, the intrinsic and extrinsic parameters of the cameras include the focal lengths of the multiple cameras and the relative position information between the multiple cameras.
Further, building the panoramic projection model includes: according to observation needs and the relative position information between the multiple cameras, performing 2D or 3D modeling on the image data, wherein the 2D model is composed of an image grid in which each cell corresponds to one pixel, and the 3D model is composed of multiple faces forming a solid model in which each face corresponds to one pixel, the pixels coming from the corrected image data.
Further, performing panoramic projection texture mapping on the panoramic projection model includes: according to the relative position information of the multiple cameras, mapping the position information of the cells or faces that form the 2D or 3D model into the corrected image information, and then re-mapping it into the original image information, thereby completing the panoramic projection texture mapping, wherein each position may correspond to one or more pieces of original image information.
Further, performing image texture fusion on the panoramic projection texture-mapped image includes: fusing the multiple original images corresponding to one pixel position by assigning them different weights.
Compared with the prior art, the invention can observe the environmental information around the UAV at altitude and the image information above the UAV, covering a full 360° horizontal and 360° vertical field of view. In addition, the invention adds a position-and-elevation sensing module compatible with BeiDou/GPS satellite information and inertial sensor information, making the detection of position and elevation in the observed environment more accurate. Images are observed using 2D and 3D modeling, and elevation information is fused with the environmental image information, describing the environment more accurately.
Description of the drawings
By reading the following detailed description, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating specific embodiments and are not to be regarded as limiting the invention. Throughout the drawings, the same reference numerals denote the same parts. In the drawings:
Fig. 1 is the system structure diagram of the present invention.
Fig. 2 is the image processing flowchart of the present invention.
Fig. 3 is a schematic diagram of the projective texture mapping of the present invention.
Fig. 4 is a structure diagram of the observation subsystem of one alternative embodiment of the present invention.
Fig. 5 is a structure diagram of the observation subsystem of another alternative embodiment of the present invention.
Fig. 6 is a schematic diagram of an embodiment in which the elevation sensing module is used to estimate the area of the downward-looking observation region.
Detailed description of embodiments
Exemplary embodiments of the disclosure are described more fully below with reference to the drawings. Although exemplary embodiments of the disclosure are shown in the drawings, it should be understood that the disclosure may be embodied in various forms and should not be limited by the embodiments set forth here. On the contrary, these embodiments are provided so that the disclosure will be better understood and its scope fully conveyed to those skilled in the art.
The invention discloses a 360-degree stereoscopic panoramic observation system for a UAV, comprising an observation subsystem and a processing subsystem. Multiple cameras observe the six spatial directions around the UAV, and the pictures from the six directions are combined to realize 360-degree stereoscopic panoramic observation of the space around the UAV. The multiple cameras form a camera group, which serves as the observation subsystem of the invention. The observation subsystem sends the collected image information of all directions of the space around the UAV to the processing subsystem for modeling and image fusion, finally realizing 360-degree stereoscopic panoramic observation of the space around the UAV.
As shown in Fig. 1, the system structure of the present invention includes an observation subsystem (camera group) and a processing subsystem. The observation subsystem is composed of multiple cameras observing below the UAV, above the UAV, and around the UAV. The processing subsystem comprises a core processor module, a system memory module, a video storage module, an image data synchronization module, and a position-and-elevation sensing module, wherein the system memory module, the video storage module, the image data synchronization module, and the position-and-elevation sensing module are each connected to the core processor module.
Specifically, the core processor module is the central processing unit and the control center of the whole system, responsible for controlling the reading of scene image data, the processing and stitching of images, and interaction with the user. The central processing unit may be a DSP digital signal processor, a GPU image processing unit, an FPGA programmable logic array, or a CPU. In addition, to ensure normal operation of the processor, the invention also includes a power module, a clock module, and a communication interface module. The power module transforms voltage to provide stable power to the other modules; the clock module provides timing or counting functions; both are connected to the core processor module. The communication interface module connects the core processor module to the observation subsystem (camera group). The system memory module and the video storage module are SD cards, SDHC (high-capacity SD) cards, or TF cards; the system memory module stores the control program and temporary data, while the video storage module stores video data, including the original video data, the corrected video data, and the video data after image texture fusion. The image data synchronization module handles the complex synchronization of the multiple camera channels, such as reading, synchronizing, encoding, and transmission control of the multi-camera data. In addition, when the image output by a camera is an analog signal, the image data synchronization module also needs to perform analog-to-digital conversion on the transmitted image. The position-and-elevation sensing module obtains the longitude/latitude and elevation of the system from satellite and related sensor information; it is compatible with BeiDou satellites, GPS satellites, and inertial sensors such as accelerometers and gyroscopes. The introduction of the position-and-elevation sensing module allows the invention, for example, to calculate the area of the observation region, or to control the UAV's altitude accurately beyond ordinary remote control. The communication module communicates with control or display equipment outside the invention. The working process of the above modules is as follows: under the control of the clock module, the multiple cameras pass image data to the core processor module through the image data synchronization module; the core processor module simultaneously receives elevation information from the position-and-elevation sensing module, processes the image data according to the control commands stored in the system memory module, combines it with the elevation information, and then generates 360-degree stereoscopic image data. The original image data, the corrected image data, and the generated fusion image data are preserved, and the fused image data is sent through the communication module to a remote monitoring device outside the invention for display, thereby realizing 360-degree panoramic stereoscopic observation of the space around the UAV. The implementation method of the invention is explained below.
As shown in Fig. 2, the image processing flow of the present invention includes: the observation subsystem acquires image data; the processing subsystem performs image preprocessing on the image data; the processing subsystem performs image correction on the preprocessed image data according to the intrinsic and extrinsic parameters of the cameras; the processing subsystem builds a panoramic projection model from the corrected image data and the camera parameters; the processing subsystem performs panoramic projection texture mapping on the panoramic projection model, obtaining a panoramic projection texture-mapped image; and the processing subsystem performs image texture fusion on the texture-mapped image.
Specifically, the invention uses multiple cameras to observe the six directions of the space around the UAV simultaneously, obtaining image data of the six directions; the processing subsystem acquires the image data of the six directions through the communication interface module and processes it as follows. Image preprocessing mainly performs definition enhancement, image de-noising, and video de-interlacing. In the invention, Gaussian filtering is used for de-noising; its principle is to take a weighted average over the whole image, so that the image becomes smoother. De-interlacing may be handled with an inter-field median filtering algorithm, whose process is as follows: taking the first three of six rows (row1-row6) of an image as an example, suppose the odd-field pixel values of the first row row1 are A, B, C, the even-field pixels of the second row row2 are D, E, F, and the odd-field pixels of the third row row3 are G, H, I. After de-interlacing, the first output row is filled with A, B, C, and the second output row is completed with median values, calculated as:
Value1 = median(first value of row1, first value of row2, first value of row3)
Value2 = median(second value of row1, second value of row2, second value of row3)
Value3 = median(third value of row1, third value of row2, third value of row3)
For each triple, the three values are sorted from small to large and the median is taken, as tabulated in the original description.
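The inter-field median step described above can be sketched directly. This is an illustrative reading of the (garbled) translated description: odd-field rows are kept, and each even-field row is replaced by the column-wise median of itself and its two neighbors; the function name is an assumption, not from the patent:

```python
import numpy as np

def interfield_median(img):
    # De-interlacing sketch: keep odd-field rows unchanged, and replace each
    # even-field row with the column-wise median of the row above, the even
    # row itself, and the row below.
    out = img.astype(float).copy()
    for r in range(1, img.shape[0] - 1, 2):
        out[r] = np.median(img[r - 1:r + 2].astype(float), axis=0)
    return out
```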
Image correction: most camera images exhibit distortion, and in general the larger the camera's field-of-view angle, the more severe the distortion. Therefore, by observing the distortion characteristics of the images, the invention uses a calibration-object-based method to establish a mathematical model of the image distortion and thereby obtain the distortion law; then, using the known distortion law of the camera, the distorted image is restored along the inverse of the distortion process. In this process, the camera's intrinsic parameters, such as the focal length, and extrinsic parameters, such as the relative positions of the multiple cameras, also affect the distortion of the image, so the intrinsic and extrinsic camera parameters are also needed during image correction.
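The patent does not give the distortion formulas. As a minimal sketch under an assumed single-coefficient radial model (x_d = x·(1 + k1·r²), a common calibration convention), the "inverse of the distortion process" can be computed by fixed-point iteration:

```python
import numpy as np

def distort(xy, k1):
    # Forward radial distortion on normalized coordinates: x_d = x * (1 + k1 * r^2)
    r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
    return xy * (1.0 + k1 * r2)

def undistort(xy_d, k1, iters=10):
    # Invert the distortion numerically: iterate x <- x_d / (1 + k1 * r^2(x)),
    # starting from the distorted coordinates.
    xy = xy_d.copy()
    for _ in range(iters):
        r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
        xy = xy_d / (1.0 + k1 * r2)
    return xy
```

Real lens models add higher-order radial and tangential terms, but the inverse-iteration idea is the same.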
Panoramic projection modeling: for convenient panoramic observation, the invention can process the panoramic image in two ways according to the observation requirements: 2D modeling or 3D modeling. The 2D model is composed of multiple grid cells, each cell representing one pixel. The 3D model is built in a virtual 3D world coordinate system so that more viewpoints can be observed; it is presented as a 3-dimensional point cloud, and the 3D projection model may be a sphere or another solid model. In the invention, the surface of the 3D model is divided into multiple sub-faces, and each sub-face of the 3D model corresponds to one pixel. The planar 2D model is suitable for observing a planar panoramic scene from a looking-down or looking-up angle. When 2D modeling is selected, for an upward observation the images shooting the space above the UAV and the images shooting the space around the UAV are stitched to form a complete upward-looking scene; similarly, for a downward observation the images shooting the space below the UAV and the images shooting the space around the UAV are stitched to form a complete downward-looking scene.
Panoramic projection texture mapping: this step assigns a specific texture value to each pixel on the panoramic projection model. As described above, each grid cell or sub-face on the panoramic projection model represents one pixel. Through the intrinsic and extrinsic camera parameters, the position of this pixel (cell or sub-face) is first mapped into the distortion-corrected image, and then re-mapped into the original image (the unprocessed image shot by the camera), finally obtaining the corresponding pixel position in the original camera image, also called the texture coordinate. Because multiple cameras observe simultaneously, and their observation areas overlap to guarantee a complete picture, one pixel position in the panoramic model may have several corresponding texture coordinates in the original images; the texture mapping process is shown in Fig. 3. The invention therefore performs image texture fusion for the case where one model pixel corresponds to multiple texture coordinates, as described below.
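The mapping from a model position to a texture coordinate is, at its core, a camera projection. A minimal sketch under an assumed pinhole model without distortion (K is the intrinsic matrix, R and t the extrinsic pose; the patent does not specify these equations):

```python
import numpy as np

def project(point_3d, K, R, t):
    # Map a 3D model point into a camera image: world -> camera coordinates,
    # perspective divide, then apply the intrinsic matrix.
    p_cam = R @ point_3d + t
    uv1 = K @ (p_cam / p_cam[2])
    return uv1[:2]  # texture coordinate (pixel position) in that camera
```

Running `project` against each camera's parameters, and keeping the cameras for which the point lands inside the image, yields the one-or-more texture coordinates per model pixel described above.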
Image texture fusion: the invention analyzes the pixels in the model that have multiple texture coordinates, determines the weight of each texture at each such pixel, and fuses the multiple textures according to the weights, expressed mathematically as:
F(x, y) = w1·f1(x, y) + w2·f2(x, y) + … + wn·fn(x, y)
where F(x, y) denotes the fused texture, fi(x, y) denotes the i-th texture present at a pixel, and wi denotes the weight of the i-th texture at that pixel.
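The weighted sum F = w1·f1 + … + wn·fn can be sketched as follows; the weight normalization (so the weights sum to 1) is an added assumption, a standard choice for blending that the patent does not state explicitly:

```python
import numpy as np

def fuse_textures(textures, weights):
    # Weighted blend of n overlapping texture samples per pixel:
    # F = w1*f1 + w2*f2 + ... + wn*fn, with weights normalized to sum to 1.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * np.asarray(fi, dtype=float) for wi, fi in zip(w, textures))
```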
Specific embodiment
As shown in Fig. 4, in the observation subsystem of one alternative embodiment of the invention, six individual cameras are fixed on a hexahedral structure to form a camera group, through which the panoramic stereoscopic acquisition of images is realized. Specifically, one camera is installed on the top face, the bottom face, and each side face of the hexahedral structure: the four side cameras cover the 360-degree horizontal field of view, the top and bottom cameras together with the side cameras form the 2D fields of view covering the vertical-direction scene, and the six cameras jointly form a 3D stereoscopic scene. The cameras may be digital cameras or analog cameras.
As shown in Fig. 5, in the observation subsystem of another alternative embodiment of the invention, the observation subsystem (camera group) uses a pentahedral structure, and the 360-degree horizontal view is observed by three wide-angle cameras whose observation angles exceed 120 degrees.
As shown in Fig. 6, an embodiment uses the position-and-elevation sensing module to estimate the area of the downward-looking observation region, where α denotes the observation angle of the camera, h denotes the altitude of the UAV, and r denotes the radius of the observation region. Since the altitude of the UAV is far greater than the diameter of the camera lens, α and β are approximately equal, so the observation radius is approximately r = h·tan(α), and the area of the observation region is approximately π·[h·tan(α)]².
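A minimal sketch of this area estimate, taking α as the half view angle measured from the vertical so that r = h·tan(α) (the geometrically consistent reading; the raw machine translation is ambiguous between tan and arctan):

```python
import math

def observation_area(h, alpha_deg):
    # Estimated downward-looking coverage: r ~ h * tan(alpha), area ~ pi * r^2,
    # assuming the UAV altitude h dominates the lens diameter.
    r = h * math.tan(math.radians(alpha_deg))
    return math.pi * r * r
```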
The above are only illustrative specific embodiments of the invention, but the protection scope of the invention is not limited thereto; any change or replacement that can readily occur to those familiar with the art within the technical scope disclosed by the invention is covered by the protection scope of the invention. Therefore, the protection scope of the invention shall be subject to the protection scope of the claims.
Claims (10)
1. A 360-degree stereoscopic panoramic observation system for a UAV, characterized by comprising: an observation subsystem and a processing subsystem, wherein the observation subsystem is composed of multiple cameras observing below the UAV, above the UAV, and around the UAV, the processing subsystem communicates with the observation subsystem, and the processing subsystem is used to process the images obtained by the multiple cameras of the observation subsystem.
2. The observation system according to claim 1, characterized in that the observation subsystem further comprises a structure for fixing the multiple cameras.
3. The observation system according to claim 1, characterized in that the processing subsystem comprises: a core processor module, a system memory module, a video storage module, an image data synchronization module, and a position-and-elevation sensing module, wherein the system memory module, the video storage module, the image data synchronization module, and the position-and-elevation sensing module are each connected to the core processor module.
4. The observation system according to claim 3, characterized in that the core processor module is a DSP, GPU, FPGA, or CPU, used to control the reading of scene image data, image stitching, image processing, and interaction with the user.
5. The observation system according to claim 3, characterized in that the image data synchronization module is used for the synchronized processing of the multiple camera channels; the position-and-elevation sensing module is used to obtain the longitude/latitude and elevation of the system; the communication module is used to communicate with the outside world; and the system memory module and the video storage module are respectively used to store the control program and the synthesized panoramic image data, wherein the video storage module is an SD card or a TF card.
6. A method of using the 360-degree stereoscopic panoramic observation system for an unmanned aerial vehicle according to any one of claims 1 to 5, characterized by comprising:
the observation subsystem acquiring image data;
the processing subsystem performing image preprocessing on the image data;
the processing subsystem performing image correction on the preprocessed image data according to intrinsic and extrinsic parameters of the cameras;
the processing subsystem performing panoramic projection model building on the corrected image data according to the camera intrinsic and extrinsic parameters, to obtain a panoramic projection model;
the processing subsystem performing panoramic projection texture mapping on the panoramic projection model, to obtain a panoramic projection texture-mapped image; and
the processing subsystem performing image texture fusion on the panoramic projection texture-mapped image.
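The claimed processing chain (acquire, preprocess, correct, build the projection model, texture-map, fuse) can be sketched end to end as follows. Every step body is a hypothetical placeholder for illustration only, not the patented implementation; the camera count, frame size, and side-by-side layout are assumptions.

```python
import numpy as np

def acquire(n_cameras=6, h=4, w=4):
    """One toy frame per camera: below, above, and four around the UAV."""
    rng = np.random.default_rng(0)
    return [rng.integers(0, 256, (h, w)).astype(np.float64) for _ in range(n_cameras)]

def preprocess(frames):
    """Example preprocessing: scale intensities into [0, 1]."""
    return [f / 255.0 for f in frames]

def correct(frames):
    """Placeholder for lens correction driven by intrinsic/extrinsic parameters."""
    return frames

def build_model(frames):
    """2D case of claim 8: one grid cell per panorama pixel, cameras side by side."""
    h, w = frames[0].shape
    return {"shape": (h, w * len(frames)), "sources": frames}

def map_texture(model):
    """Paste each corrected frame into its slot of the panorama grid."""
    pano = np.zeros(model["shape"])
    h, w = model["sources"][0].shape
    for i, f in enumerate(model["sources"]):
        pano[:, i * w:(i + 1) * w] = f
    return pano

def fuse(pano):
    """Placeholder for weighted seam blending (identity here)."""
    return pano

panorama = fuse(map_texture(build_model(correct(preprocess(acquire())))))
print(panorama.shape)  # (4, 24)
```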
7. The observation method according to claim 6, characterized in that the intrinsic and extrinsic parameters of the cameras comprise: the focal lengths of the plurality of cameras and the relative position information between the plurality of cameras.
8. The observation method according to claim 6, characterized in that the building of the panoramic projection model comprises: performing 2D or 3D modeling on the image data according to observation needs and the relative position information between the plurality of cameras, wherein the 2D model is composed of a square grid of cells, each cell in the grid corresponding to one pixel, and the 3D model is a three-dimensional model composed of a plurality of faces, each constituent face of the three-dimensional model corresponding to one pixel, wherein the pixels come from the corrected image data.
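The 2D modeling of claim 8 amounts to building a lattice in which each cell holds exactly one pixel drawn from the corrected image data. A minimal sketch, assuming a row-major flat input layout (that layout is an illustrative assumption, not stated in the claim):

```python
# Build claim 8's 2D model: an h x w grid of cells, one corrected pixel per cell.
def build_2d_grid(corrected_pixels, h, w):
    # corrected_pixels: flat list of pixel values, row-major, length h * w (assumed)
    assert len(corrected_pixels) == h * w
    return [[corrected_pixels[r * w + c] for c in range(w)] for r in range(h)]

grid = build_2d_grid(list(range(12)), 3, 4)
print(grid[2][3])  # 11
```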
9. The observation method according to claim 6, characterized in that the performing of panoramic projection texture mapping on the panoramic projection model comprises:
according to the relative position information of the plurality of cameras, mapping the position information of the cells or faces of the 2D or 3D model composed of the pixels into the corrected image information, and then re-mapping it into the original image information, thereby completing the panoramic projection texture mapping, wherein each pixel position corresponds to one or more pieces of original image information.
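The two-stage lookup of claim 9, from a model cell into the corrected image and then back into the original image, can be sketched as below. Both transforms are hypothetical toys (a per-camera offset and a crop border standing in for the inverse correction), not the patented mappings:

```python
def model_to_corrected(cell, cam_offset):
    """Map a model cell into the corrected image of the camera that sees it."""
    r, c = cell
    return (r, c + cam_offset)       # toy extrinsic shift between cameras

def corrected_to_original(pos, border=2):
    """Re-map a corrected-image position into the original image."""
    r, c = pos
    return (r + border, c + border)  # undo a toy crop-style correction

def texture_lookup(cell, cam_offset):
    """Full claim-9 chain: model position -> corrected image -> original image."""
    return corrected_to_original(model_to_corrected(cell, cam_offset))

print(texture_lookup((1, 3), cam_offset=10))  # (3, 15)
```

In an overlap region the same cell would be run through this chain once per camera, yielding the one-to-many correspondence the claim describes.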
10. The observation method according to claim 6, characterized in that the performing of image texture fusion on the panoramic projection texture-mapped image comprises: performing image texture fusion by assigning different weights to the plurality of original images corresponding to one pixel position.
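The weighted fusion of claim 10 reduces, per panorama pixel, to a normalized weighted average of the candidate values from the overlapping cameras. A minimal sketch; the weights are illustrative (in practice they might depend on, e.g., distance to the seam, which the claim does not specify):

```python
def fuse_pixel(values, weights):
    """Blend candidate values for one pixel position with normalized weights."""
    total = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total

# One panorama pixel seen by two overlapping cameras.
print(fuse_pixel([100.0, 200.0], [3.0, 1.0]))  # 125.0
```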
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810314065.1A CN108769569B (en) | 2018-04-10 | 2018-04-10 | 360-degree three-dimensional panoramic observation system and method for unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108769569A true CN108769569A (en) | 2018-11-06 |
CN108769569B CN108769569B (en) | 2021-04-13 |
Family
ID=63981565
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810314065.1A Active CN108769569B (en) | 2018-04-10 | 2018-04-10 | 360-degree three-dimensional panoramic observation system and method for unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108769569B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110113571A (en) * | 2019-05-07 | 2019-08-09 | 合肥芃明科技有限公司 | Information management method based on virtual reality and video fusion |
CN111064947A (en) * | 2019-12-04 | 2020-04-24 | 广东康云科技有限公司 | Panoramic-based video fusion method, system, device and storage medium |
CN112567308A (en) * | 2020-01-21 | 2021-03-26 | 深圳市大疆创新科技有限公司 | Airspace detection method, movable platform, equipment and storage medium |
CN112712462A (en) * | 2019-10-24 | 2021-04-27 | 上海宗保科技有限公司 | Unmanned aerial vehicle image acquisition system based on image stitching |
WO2022140970A1 (en) * | 2020-12-28 | 2022-07-07 | 深圳市大疆创新科技有限公司 | Panoramic image generation method and apparatus, movable platform and storage medium |
CN115861070A (en) * | 2022-12-14 | 2023-03-28 | 湖南凝服信息科技有限公司 | Three-dimensional video fusion and stitching method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010151311A1 (en) * | 2009-06-26 | 2010-12-29 | Flight Landata, Inc. | Dual-swath imaging system |
CN104834784A (en) * | 2015-05-13 | 2015-08-12 | 西南交通大学 | Railway emergency auxiliary rescue three-dimensional virtual electronic sand table system |
CN105139350A (en) * | 2015-08-12 | 2015-12-09 | 北京航空航天大学 | Ground real-time reconstruction processing system for unmanned aerial vehicle reconnaissance images |
CN105627991A (en) * | 2015-12-21 | 2016-06-01 | 武汉大学 | Real-time panoramic stitching method and system for unmanned aerial vehicle images |
CN105775151A (en) * | 2016-01-29 | 2016-07-20 | 上海云舞网络科技有限公司 | 360-degree panoramic aerial photography and video recording unmanned aerial vehicle and frame |
CN206251247U (en) * | 2016-11-10 | 2017-06-13 | 广西师范大学 | Three-dimensional panoramic video remote control system based on unmanned aerial vehicle |
CN107240065A (en) * | 2017-04-19 | 2017-10-10 | 中科院微电子研究所昆山分所 | 3D panoramic image generation system and method |
Also Published As
Publication number | Publication date |
---|---|
CN108769569B (en) | 2021-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108769569A (en) | 360-degree stereoscopic panoramic observation system and method for an unmanned aerial vehicle | |
AU2011312140B2 (en) | Rapid 3D modeling | |
US7944547B2 (en) | Method and system of generating 3D images with airborne oblique/vertical imagery, GPS/IMU data, and LIDAR elevation data | |
EP3132231B1 (en) | A method and system for estimating information related to a vehicle pitch and/or roll angle | |
CN104748728B (en) | Intelligent machine attitude matrix calculation method and its applied to photogrammetric method | |
WO2015096806A1 (en) | Attitude determination, panoramic image generation and target recognition methods for intelligent machine | |
KR102295809B1 (en) | Apparatus for acquisition distance for all directions of vehicle | |
CN110930508B (en) | Two-dimensional photoelectric video and three-dimensional scene fusion method | |
CN112686877B (en) | Binocular camera-based three-dimensional house damage model construction and measurement method and system | |
KR101150510B1 (en) | Method for Generating 3-D High Resolution NDVI Urban Model | |
US11057566B2 (en) | Image synthesis system | |
CN110706273B (en) | Real-time collapse area measurement method based on unmanned aerial vehicle | |
CN112184786B (en) | Target positioning method based on synthetic vision | |
CN112288637A (en) | Unmanned aerial vehicle aerial image rapid splicing device and rapid splicing method | |
JP2013518339A (en) | Three-dimensional model method based on the combination of ground-based images and images taken from above | |
CN108269234A (en) | Panoramic camera lens attitude estimation method and panoramic camera | |
WO2018052100A1 (en) | Image processing device, image processing method, and image processing program | |
CN117576343B (en) | Three-dimensional MESH model manufacturing method based on high-resolution satellite stereoscopic image | |
Wang et al. | Automated mosaicking of UAV images based on SFM method | |
CN107784666A (en) | The detection of terrain and its features three dimensional change and update method based on stereopsis | |
CN113961068A (en) | Close-distance real object eye movement interaction method based on augmented reality helmet | |
CN112365591A (en) | Space and ground collaborative comprehensive situation generation method based on synthetic vision | |
WO2020247399A1 (en) | Spherical image based registration and self-localization for onsite and offsite viewing | |
Mirisola et al. | 3D Map Registration using Vision/Laser and Inertial Sensing. | |
CN113066154B (en) | Method and system for real-time superposition of earth surface image and underground space image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
CB02 | Change of applicant information | Address after: Room 707, complex building, Kunshan Industrial Technology Research Institute, No. 1699, Zuchongzhi South Road, Kunshan, Suzhou, Jiangsu, 215399; Applicant after: Kunshan Microelectronics Technology Research Institute. Address before: 215347 7th floor, IIR complex, 1699 Weicheng South Road, Kunshan City, Suzhou City, Jiangsu Province; Applicant before: KUNSHAN BRANCH, INSTITUTE OF MICROELECTRONICS OF CHINESE ACADEMY OF SCIENCES |
GR01 | Patent grant | |