CN108734685B - Splicing method for unmanned aerial vehicle-mounted hyperspectral line array remote sensing images

Info

Publication number
CN108734685B
Application number
CN201810444565.7A
Authority
CN (China)
Legal status
Active (granted)
Other languages
Chinese (zh)
Other versions
CN108734685A (published 2018-11-02)
Inventor
易俐娜
Assignee
China University of Mining and Technology Beijing (CUMTB)
Filing date
2018-05-10
Priority date
2018-05-10
Grant publication date
2022-06-03

Classifications

    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/90: Determination of colour characteristics
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • Y02A 40/10: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Abstract

The invention discloses a splicing method for unmanned aerial vehicle-mounted hyperspectral line array remote sensing images. The method first collects the hyperspectral remote sensing images to be spliced and selects and matches feature points between them; an image transformation model is then selected on the basis of the matched feature points to perform image registration and obtain registered images; feathering and color equalization are applied to the overlapping area of the registered images, after which pixel-based mosaicking is performed to obtain the spliced hyperspectral image; finally, the spliced hyperspectral image is geographically registered against an area array orthoimage with geographic coordinates to obtain a hyperspectral image with real geographic coordinates. The method solves the problem of the small coverage area of a single unmanned aerial vehicle image, preserves the visual effect while keeping the spectral distortion between the images before and after splicing small, and gives the final splicing result real geographic coordinates.

Description

Splicing method for unmanned aerial vehicle-mounted hyperspectral line array remote sensing images
Technical Field
The invention relates to the technical field of remote sensing image processing, in particular to a splicing method of an unmanned aerial vehicle-mounted hyperspectral line array remote sensing image.
Background
Hyperspectral technology uses a large number of very narrow electromagnetic bands to acquire data about an object of interest and carries rich spatial, radiometric and spectral information; its most important characteristic is that spectrum and image are combined into one, which places it at the frontier of current international remote sensing technology. Because hyperspectral remote sensing images have very high spectral resolution and can provide richer surface information, they have attracted great attention from scholars at home and abroad and are widely applied; their applications cover every aspect of earth science, and they have become an effective technical means in geological mapping, vegetation survey, ocean remote sensing, agricultural remote sensing, atmospheric research, environmental monitoring and other fields, playing an increasingly important role. In recent years, imaging spectrometer hardware has developed continuously: instruments have become smaller, lighter and cheaper, so that acquiring hyperspectral images is more convenient. With the development of highly maneuverable, low-cost, high-precision unmanned aerial vehicle (UAV) aerial survey remote sensing systems, integrating an imaging spectrometer with a UAV to acquire hyperspectral data has become an emerging field of research. Compared with satellite remote sensing images and traditional aerial remote sensing images, low-altitude UAV data acquisition is flexible, and the images are precise, clear and rich in ground-feature detail. UAV remote sensing acquires large-area ground-feature information through continuous shooting along the same flight strip and back-and-forth shooting of different flight strips; a panoramic image of a single flight strip can be built to obtain a wide-view-angle panorama, but the ground-feature information covered by a single strip is still limited, and multiple strips need to be spliced to cover the study area effectively.
In UAV remote sensing, building a multi-strip panoramic image essentially means synthesizing multiple single images with overlapping areas, taken from several adjacent flight strips, into one seamless, high-resolution image with wide coverage, so that a series of UAV aerial images can complete a mapping task that presents accurate and realistic ground-feature information for later application; the fundamental part of this task is splicing. Image splicing refers to superimposing two or more sequential images according to their common parts to obtain a large seamless image with a wide viewing angle. The spliced image not only lets people intuitively observe the overall appearance of the surveyed area but also retains the detail of the original images. Image splicing involves two key technologies, image registration and image fusion: the purpose of registration is to bring the images into the same coordinate system according to a geometric motion model, and fusion combines the registered images into one large spliced image. The image fusion techniques commonly used in the prior art are generally pixel-based methods, such as direct averaging, weighted averaging and multi-resolution pyramid fusion, but these prior-art techniques have certain limitations.
Disclosure of Invention
The invention aims to provide a splicing method for unmanned aerial vehicle-mounted hyperspectral line array remote sensing images that solves the problem of the small coverage area of a single UAV image, preserves the visual effect while keeping the spectral distortion of the images before and after splicing small, and ensures that the final splicing result has real geographic coordinates.
The purpose of the invention is realized by the following technical scheme:
a splicing method for remote sensing images of an unmanned aerial vehicle-mounted hyperspectral line array comprises the following steps:
step 1, collecting hyperspectral remote sensing images to be spliced, and selecting and matching characteristic points of the hyperspectral remote sensing images;
step 2, selecting an image transformation model based on the matched feature points to carry out image registration to obtain a registered image;
step 3, performing feathering and color equalization processing on the overlapped area based on the registered image, and then performing pixel-based mosaic to obtain a spliced hyperspectral image;
and step 4, carrying out geographic registration on the spliced hyperspectral images based on the area array orthoimages with geographic coordinates to obtain hyperspectral images with real geographic coordinates.
The process of the step 1 specifically comprises the following steps:
Firstly, the hyperspectral images to be spliced are collected, and the flight strips are cropped and rotated;
homologous (corresponding) feature points are then selected on the processed flight strips.
The process of the step 2 specifically comprises the following steps:
based on the matched feature points, one flight strip is taken as the reference, and the other flight strip is subjected to image transformation by the curved surface spline function method to obtain an image with the correct relative position, namely the registered image.
The formula of the surface spline function method is expressed as follows:
W(x, y) = a0 + a1·x + a2·y + Σ(i=1..n) Fi·ri²·ln(ri² + ε)
where a0, a1, a2 and Fi (i = 1, 2, …, n) are undetermined coefficients; ri² = (x - xi)² + (y - yi)²; and ε is an empirical parameter used to adjust the curvature of the fitted surface;
the specific process of image transformation by adopting the curved surface spline function method comprises the following steps:
firstly, the feature point coordinates of the first image are obtained as (x1, y1), (x2, y2), …, (xn, yn) (n ≥ 3), corresponding in sequence to the feature point coordinates (w1, v1), (w2, v2), …, (wn, vn) of the second image, and the first image is corrected with the second image as the reference;
then, based on the two data groups (xi, yi, wi) and (xi, yi, vi) (i = 1, 2, …, n), matrix equations are constructed, and the fitted surface coefficients Ww(x, y) and Wv(x, y) for the geographic abscissa and ordinate are calculated respectively;
And then carrying out coordinate transformation on the first image to be corrected, and resampling pixels by using a bilinear interpolation method.
In step 3, the color equalization process includes:
firstly, calculating a gray level histogram of an original image;
then, obtaining an accumulated probability distribution function of each gray level of the original image, and constructing a gray level conversion function;
and mapping all pixel gray values of the original image to an output image according to the gray conversion function.
In step 3, the process of feathering is as follows:
let I(i, j) be the gray value of the new image after feathering, and let I1 and I2 be the gray values of the two images to be spliced; then:
I(i, j) = e·I1(i, j) + (1 - e)·I2(i, j),  (i, j) ∈ I1 ∩ I2,  0 ≤ e ≤ 1
where e is a weighting coefficient; if the maximum and minimum values of the overlapping area of the two images in the X-axis direction are xmax and xmin respectively, the weighting coefficient e is expressed as:
e = (xmax - x) / (xmax - xmin)

where x is the X-axis coordinate of the current pixel in the overlapping area.
the process of the step 4 specifically comprises the following steps:
and taking the area array orthoimage with the geographic coordinates as a reference image, and performing geographic registration on the spliced line array hyperspectral image by adopting an affine transformation model to obtain a hyperspectral image with real geographic coordinates.
The formula of the affine transformation model is expressed as:
w = m0·x + m1·y + m2
v = m3·x + m4·y + m5
where m0, m1, m3 and m4 represent the scale change and rotation of the image, and m2 and m5 represent the horizontal and vertical displacement;
the specific process of geographic registration by adopting the affine transformation model comprises the following steps:
firstly, the feature point coordinates of the hyperspectral image to be registered are obtained as (x1, y1), (x2, y2), …, (xn, yn) (n ≥ 3), corresponding in sequence to the feature point coordinates (w1, v1), (w2, v2), …, (wn, vn) of the orthoimage serving as the reference image;
Then substituting the coordinate point pairs of the hyperspectral images to be registered into the affine transformation model, and solving unknown parameters of the model by adopting a least square method;
and transforming the hyperspectral image to be registered to the coordinate system of the orthoimage according to the obtained transformation model, and resampling the pixel value by adopting a bilinear interpolation method.
The technical scheme provided by the invention solves the problem of the small coverage area of a single UAV image, preserves the visual effect while keeping the spectral distortion of the images before and after splicing small, and gives the final splicing result real geographic coordinates.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
Fig. 1 is a schematic flow chart of a splicing method of an unmanned aerial vehicle-mounted hyperspectral line array remote sensing image provided by an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
An embodiment of the present invention is described in further detail below with reference to the accompanying drawings. Fig. 1 is a schematic flow chart of the splicing method for unmanned aerial vehicle-mounted hyperspectral line array remote sensing images provided by the embodiment of the present invention; the method includes:
step 1, collecting hyperspectral remote sensing images to be spliced, and selecting and matching characteristic points of the hyperspectral remote sensing images;
in this step, the hyperspectral images to be spliced are first collected, and the flight strips are cropped and rotated;
homologous (corresponding) feature points are then selected on the processed flight strips.
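The patent does not prescribe how the homologous feature points are obtained, and they may well be picked manually. Purely as an illustration of how this step could be automated, the following Python sketch uses OpenCV's SIFT detector with Lowe's ratio test; the function name, the choice of SIFT and the use of OpenCV are assumptions of this sketch, not part of the patent.

# Illustrative sketch only: automatic homologous feature-point matching between
# two flight strips, assuming OpenCV SIFT (the patent does not name a detector).
import cv2
import numpy as np

def match_feature_points(strip_a_gray, strip_b_gray, ratio=0.75):
    """Return matched homologous point coordinates between two 8-bit gray strips."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(strip_a_gray, None)
    kp_b, des_b = sift.detectAndCompute(strip_b_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des_a, des_b, k=2)                        # two nearest neighbours per descriptor
    good = [m for m, n in raw if m.distance < ratio * n.distance]    # Lowe's ratio test
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in good])          # (x, y) in strip A
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in good])          # corresponding (x, y) in strip B
    return pts_a, pts_b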
Step 2, selecting an image transformation model based on the matched feature points to carry out image registration to obtain a registered image;
in this step, based on the matched feature points, one flight band may be used as a reference, and a curved surface spline function method is used to perform image transformation on the other flight band, so as to obtain an image with a correct relative position, that is, the registered image.
In a specific implementation, the formula of the curved surface spline function method is represented as:
W(x, y) = a0 + a1·x + a2·y + Σ(i=1..n) Fi·ri²·ln(ri² + ε)
where a0, a1, a2 and Fi (i = 1, 2, …, n) are undetermined coefficients; ri² = (x - xi)² + (y - yi)²; and ε is an empirical parameter used to adjust the curvature of the fitted surface, chosen according to the actual situation: in general, when the surface is required to be smooth with small distortion, ε is taken as 1 to 10⁻², while for surfaces with large distortion ε may even be taken as 10⁻⁵ to 10⁻⁶.
The undetermined coefficients a0, a1, a2 and Fi (i = 1, 2, …, n) can be determined from the following system of equations:
Σ(i=1..n) Fi = 0,  Σ(i=1..n) xi·Fi = 0,  Σ(i=1..n) yi·Fi = 0
Wj = a0 + a1·xj + a2·yj + Cj·Fj + Σ(i=1..n) Fi·rij²·ln(rij² + ε),  j = 1, 2, …, n,  with rij² = (xj - xi)² + (yj - yi)²
where Cj is a parameter related to the elastic coefficient. According to the value of Cj, surface fitting can be divided into three application modes: a. flexible fitting (e.g. generating a DEM, or establishing the relation between geographic coordinates and map coordinates); b. elastic fitting (e.g. surface smoothing); c. rigid fitting (e.g. correction of a digitized map). In this embodiment Cj = 0.
Further, the specific process of image transformation by using the curved surface spline function method is as follows:
firstly, the feature point coordinates of the first image are obtained as (x1, y1), (x2, y2), …, (xn, yn) (n ≥ 3), corresponding in sequence to the feature point coordinates (w1, v1), (w2, v2), …, (wn, vn) of the second image, and the first image is corrected with the second image as the reference;
then, based on the two data groups (xi, yi, wi) and (xi, yi, vi) (i = 1, 2, …, n), matrix equations are constructed, and the fitted surface coefficients Ww(x, y) and Wv(x, y) for the geographic abscissa and ordinate are calculated respectively;
And then carrying out coordinate transformation on the first image to be corrected, and resampling pixels by using a bilinear interpolation method.
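To make the surface spline step concrete, the following Python sketch solves the linear system above with Cj = 0 (rigid fitting, as in this embodiment) and then warps the image to be corrected with bilinear resampling. NumPy/SciPy, the function names and the dense-grid evaluation (written for clarity rather than memory efficiency) are assumptions of this sketch, not part of the patent.

import numpy as np
from scipy.ndimage import map_coordinates

def fit_surface_spline(xk, yk, wk, eps=1.0):
    """Solve for [F1..Fn, a0, a1, a2] of W(x, y) = a0 + a1*x + a2*y + sum Fi*ri^2*ln(ri^2 + eps)."""
    n = len(xk)
    r2 = (xk[:, None] - xk[None, :]) ** 2 + (yk[:, None] - yk[None, :]) ** 2
    K = r2 * np.log(r2 + eps)                        # spline kernel matrix (n x n); Cj = 0 on the diagonal
    P = np.column_stack([np.ones(n), xk, yk])        # polynomial part a0 + a1*x + a2*y
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])  # last 3 rows enforce sum Fi = sum xi*Fi = sum yi*Fi = 0
    b = np.concatenate([wk, np.zeros(3)])
    return np.linalg.solve(A, b)

def eval_surface_spline(coef, xk, yk, xq, yq, eps=1.0):
    """Evaluate the fitted surface W(x, y) at query points (xq, yq)."""
    F, (a0, a1, a2) = coef[:-3], coef[-3:]
    r2 = (xq[:, None] - xk[None, :]) ** 2 + (yq[:, None] - yk[None, :]) ** 2
    return a0 + a1 * xq + a2 * yq + (r2 * np.log(r2 + eps)) @ F

def register_to_reference(img1, pts1, pts2, out_shape, eps=1.0):
    """Warp one band of image 1 so that its points pts1 = (x, y) land on pts2 = (w, v) of the reference strip."""
    # Fit the inverse mapping: from reference coordinates (w, v) back to image-1 coordinates (x, y)
    coef_x = fit_surface_spline(pts2[:, 0], pts2[:, 1], pts1[:, 0], eps)
    coef_y = fit_surface_spline(pts2[:, 0], pts2[:, 1], pts1[:, 1], eps)
    vv, ww = np.mgrid[0:out_shape[0], 0:out_shape[1]]     # output rows (v) and columns (w)
    xs = eval_surface_spline(coef_x, pts2[:, 0], pts2[:, 1], ww.ravel(), vv.ravel(), eps)
    ys = eval_surface_spline(coef_y, pts2[:, 0], pts2[:, 1], ww.ravel(), vv.ravel(), eps)
    # order=1 selects bilinear resampling, as in the patent
    return map_coordinates(img1, [ys.reshape(out_shape), xs.reshape(out_shape)], order=1)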
Step 3, performing feathering and color equalization processing on the overlapped area based on the registered image, and then performing pixel-based mosaic to obtain a spliced hyperspectral image;
in this step, the process of performing color equalization processing is:
firstly, calculating a gray level histogram of an original image;
then, obtaining an accumulated probability distribution function of each gray level of the original image, and constructing a gray level conversion function;
and mapping all pixel gray values of the original image to an output image according to the gray conversion function.
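A minimal sketch of this gray-level transform is given below, assuming 8-bit bands and NumPy; the second function, which maps a strip onto a reference strip's cumulative distribution instead of equalizing it, is one possible reading of the color equalization between strips and is the editor's assumption rather than something the patent spells out.

import numpy as np

def equalize_gray(band):
    """Histogram-equalize one 8-bit band via its cumulative gray-level distribution."""
    hist, _ = np.histogram(band, bins=256, range=(0, 256))
    cdf = hist.cumsum() / band.size                  # cumulative probability of each gray level
    lut = np.round(255.0 * cdf).astype(np.uint8)     # gray transformation function
    return lut[band]                                 # map every pixel through the transform

def match_to_reference(band, ref_band):
    """Assumed variant: remap band so that its CDF follows the reference strip's CDF."""
    cdf_s = np.histogram(band, bins=256, range=(0, 256))[0].cumsum() / band.size
    cdf_r = np.histogram(ref_band, bins=256, range=(0, 256))[0].cumsum() / ref_band.size
    lut = np.searchsorted(cdf_r, cdf_s).clip(0, 255).astype(np.uint8)
    return lut[band]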
The process of performing the feathering process is:
let I(i, j) be the gray value of the new image after feathering, and let I1 and I2 be the gray values of the two images to be spliced; then:
I(i, j) = e·I1(i, j) + (1 - e)·I2(i, j),  (i, j) ∈ I1 ∩ I2,  0 ≤ e ≤ 1
where e is a weighting coefficient; if the maximum and minimum values of the overlapping area of the two images in the X-axis direction are xmax and xmin respectively, the weighting coefficient e is expressed as:
e = (xmax - x) / (xmax - xmin)

where x is the X-axis coordinate of the current pixel in the overlapping area.
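The feathering formula can be sketched as follows, assuming the two registered images already share one coordinate frame and the same size, that the overlap runs along the X (column) axis between xmin and xmax, and that I1 lies on the low-x side; these layout assumptions and the function name are illustrative only.

import numpy as np

def feather_blend(img1, img2, x_min, x_max):
    """Linearly blend two aligned, same-size bands across their X-overlap [x_min, x_max]."""
    out = img1.astype(np.float64).copy()
    cols = np.arange(x_min, x_max + 1)
    e = (x_max - cols) / float(x_max - x_min)    # e = (xmax - x) / (xmax - xmin): 1 at x_min, 0 at x_max
    out[:, cols] = e * img1[:, cols] + (1.0 - e) * img2[:, cols]
    out[:, x_max + 1:] = img2[:, x_max + 1:]     # right of the overlap comes from image 2 (assumed layout)
    return np.rint(out).astype(img1.dtype)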
Step 4, carrying out geographic registration on the spliced hyperspectral images based on the area array orthoimages with geographic coordinates to obtain hyperspectral images with real geographic coordinates.
In this step, an area array orthoimage with geographic coordinates may be specifically used as a reference image, and an affine transformation model is used to perform geographic registration on the stitched linear array hyperspectral image, so as to obtain a hyperspectral image with real geographic coordinates.
In a specific implementation, the above formula of the affine transformation model is expressed as:
w = m0·x + m1·y + m2
v = m3·x + m4·y + m5
where m0, m1, m3 and m4 represent the scale change and rotation of the image, and m2 and m5 represent the horizontal and vertical displacement;
further, the specific process of performing geographic registration by using the affine transformation model comprises the following steps:
firstly, the feature point coordinates of the hyperspectral image to be registered are obtained as (x1, y1), (x2, y2), …, (xn, yn) (n ≥ 3), corresponding in sequence to the feature point coordinates (w1, v1), (w2, v2), …, (wn, vn) of the orthoimage serving as the reference image;
then substituting the coordinate point pairs of the hyperspectral images to be registered into the affine transformation model, and solving unknown parameters of the model by adopting a least square method;
and transforming the hyperspectral image to be registered to the coordinate system of the orthoimage according to the obtained transformation model, and resampling the pixel value by adopting a bilinear interpolation method.
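As an illustration of this step, the sketch below estimates the six affine parameters by least squares from the matched point pairs and then resamples one band of the stitched image into the orthoimage grid with bilinear interpolation. Treating (w, v) as column/row pixel coordinates of the reference orthoimage, as well as the NumPy/SciPy calls and function names, are assumptions of the sketch rather than details fixed by the patent.

import numpy as np
from scipy.ndimage import map_coordinates

def fit_affine(pts_src, pts_ref):
    """Least-squares estimate of (m0..m5) mapping image (x, y) to reference (w, v)."""
    x, y = pts_src[:, 0], pts_src[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])             # one row per matched point
    m0, m1, m2 = np.linalg.lstsq(A, pts_ref[:, 0], rcond=None)[0]
    m3, m4, m5 = np.linalg.lstsq(A, pts_ref[:, 1], rcond=None)[0]
    return m0, m1, m2, m3, m4, m5

def georegister_band(band, params, out_shape):
    """Resample one band of the stitched image into the reference orthoimage grid."""
    m0, m1, m2, m3, m4, m5 = params
    vv, ww = np.mgrid[0:out_shape[0], 0:out_shape[1]]        # reference rows (v) and columns (w)
    det = m0 * m4 - m1 * m3
    # invert w = m0*x + m1*y + m2, v = m3*x + m4*y + m5 to find the source pixel (x, y)
    xs = ( m4 * (ww - m2) - m1 * (vv - m5)) / det
    ys = (-m3 * (ww - m2) + m0 * (vv - m5)) / det
    return map_coordinates(band, [ys, xs], order=1)          # order=1 selects bilinear resampling

In practice the same parameters (m0..m5) would be applied to every band of the hyperspectral cube so that all bands remain co-registered.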
It should be noted that what is not described in detail in the embodiments of the present invention belongs to technology well known to those skilled in the art.
In summary, the splicing method for unmanned aerial vehicle-mounted hyperspectral line array remote sensing images provided by the invention can effectively remove the obvious seam lines caused by exposure differences, achieve high registration accuracy over the region, give the final splicing result real geographic coordinates, and thereby facilitate the application of the hyperspectral data.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (6)

1. A splicing method for remote sensing images of an unmanned aerial vehicle-mounted hyperspectral line array is characterized by comprising the following steps:
step 1, collecting hyperspectral remote sensing images to be spliced, and selecting and matching characteristic points of the hyperspectral remote sensing images;
step 2, selecting an image transformation model based on the matched feature points to carry out image registration to obtain a registered image, wherein the specific process is as follows:
based on the matched characteristic points, one flight band is taken as a reference, and the other flight band is subjected to image transformation by adopting a curved surface spline function method to obtain an image with a correct relative position, namely the registered image;
wherein the formula of the surface spline function method is expressed as:
W(x, y) = a0 + a1·x + a2·y + Σ(i=1..n) Fi·ri²·ln(ri² + ε)
where a0, a1, a2 and Fi (i = 1, 2, …, n) are undetermined coefficients; ri² = (x - xi)² + (y - yi)²; and ε is an empirical parameter used to adjust the curvature of the fitted surface;
the undetermined coefficients a0, a1, a2 and Fi (i = 1, 2, …, n) are determined from the following system of equations:
Σ(i=1..n) Fi = 0,  Σ(i=1..n) xi·Fi = 0,  Σ(i=1..n) yi·Fi = 0
Wj = a0 + a1·xj + a2·yj + Cj·Fj + Σ(i=1..n) Fi·rij²·ln(rij² + ε),  j = 1, 2, …, n,  with rij² = (xj - xi)² + (yj - yi)²
where Cj is a parameter related to the elastic coefficient;
the specific process of image transformation by adopting the curved surface spline function method comprises the following steps:
firstly, the feature point coordinates of the first image are obtained as (x1, y1), (x2, y2), …, (xn, yn) (n ≥ 3), corresponding in sequence to the feature point coordinates (w1, v1), (w2, v2), …, (wn, vn) of the second image, and the first image is corrected with the second image as the reference;
then, based on the two data groups (xi, yi, wi) and (xi, yi, vi) (i = 1, 2, …, n), matrix equations are constructed, and the fitted surface coefficients Ww(x, y) and Wv(x, y) are calculated respectively;
Then, carrying out coordinate transformation on the first image to be corrected, and resampling pixels by using a bilinear interpolation method;
step 3, performing feathering and color equalization processing on the overlapped area based on the registered image, and then performing pixel-based mosaic to obtain a spliced hyperspectral image;
and step 4, carrying out geographic registration on the spliced hyperspectral images based on the area array orthoimages with geographic coordinates to obtain hyperspectral images with real geographic coordinates.
2. The method for splicing the unmanned aerial vehicle-mounted hyperspectral line array remote sensing images according to claim 1, wherein the process of the step 1 specifically comprises the following steps:
firstly, the hyperspectral images to be spliced are collected, and the flight strips are cropped and rotated;
homologous (corresponding) feature points are then selected on the processed flight strips.
3. The method for splicing the remote sensing images of the unmanned aerial vehicle-mounted hyperspectral line array according to claim 1, wherein in the step 3, the color equalization processing process comprises the following steps:
firstly, calculating a gray level histogram of an original image;
then, obtaining an accumulated probability distribution function of each gray level of the original image, and constructing a gray level conversion function;
and mapping all pixel gray values of the original image to an output image according to the gray conversion function.
4. The method for splicing the remote sensing images of the unmanned aerial vehicle-mounted hyperspectral line array according to claim 1, wherein in the step 3, the process of the feathering is as follows:
let I(i, j) be the gray value of the new image after feathering, and let I1 and I2 be the gray values of the two images to be spliced; then:
I(i, j) = e·I1(i, j) + (1 - e)·I2(i, j),  (i, j) ∈ I1 ∩ I2,  0 ≤ e ≤ 1
where e is a weighting coefficient; if the maximum and minimum values of the overlapping area of the two images in the X-axis direction are xmax and xmin respectively, the weighting coefficient e is expressed as:
e = (xmax - x) / (xmax - xmin)

where x is the X-axis coordinate of the current pixel in the overlapping area.
5. the method for splicing the unmanned aerial vehicle-mounted hyperspectral line array remote sensing image according to claim 1, wherein the process of the step 4 specifically comprises the following steps:
and taking the area array orthoimage with the geographic coordinates as a reference image, and carrying out geographic registration on the spliced line array hyperspectral image by adopting an affine transformation model to obtain a hyperspectral image with real geographic coordinates.
6. The method for splicing the unmanned aerial vehicle-mounted hyperspectral line array remote sensing image according to claim 5, wherein the specific process of carrying out geographic registration by adopting an affine transformation model comprises the following steps:
firstly, the feature point coordinates of the hyperspectral image to be registered are obtained, corresponding in sequence to the feature point coordinates of the orthoimage serving as the reference image;
then substituting the coordinate point pairs of the hyperspectral images to be registered into an affine transformation model, and solving unknown parameters of the model by adopting a least square method;
and transforming the hyperspectral image to be registered to the coordinate system of the orthoimage according to the obtained transformation model, and resampling the pixel value by adopting a bilinear interpolation method.


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009065003A1 (en) * 2007-11-14 2009-05-22 Intergraph Software Technologies Company Method and apparatus of taking aerial surveys
CN105844587A (en) * 2016-03-17 2016-08-10 河南理工大学 Low-altitude unmanned aerial vehicle-borne hyperspectral remote-sensing-image automatic splicing method
CN107274380A (en) * 2017-07-07 2017-10-20 北京大学 A kind of quick joining method of unmanned plane multispectral image
CN107563964A (en) * 2017-08-22 2018-01-09 长光卫星技术有限公司 The quick joining method of large area array sub-meter grade night scene remote sensing image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Hui Wang et al., "Automated mosaicking of UAV images based on SFM method", 2014 IEEE Geoscience and Remote Sensing Symposium, 2014-07-18 *
易俐娜 (Yi Lina), "Application of UAV aerial photography technology in the field of surveying and mapping" (无人机航摄技术在测绘领域的应用), Education and Teaching Forum (教育教学论坛), 2017-12-31, pp. 75-76 *


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant