CN116579960B - Geospatial data fusion method - Google Patents
Geospatial data fusion method Download PDFInfo
- Publication number
- CN116579960B CN202310501962.4A
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000007500 overflow downdraw method Methods 0.000 title claims abstract description 28
- 230000004927 fusion Effects 0.000 claims abstract description 66
- 238000000034 method Methods 0.000 claims abstract description 34
- 238000004590 computer program Methods 0.000 claims description 12
- 238000001914 filtration Methods 0.000 claims description 6
- 238000009499 grossing Methods 0.000 claims description 6
- 238000009825 accumulation Methods 0.000 claims description 3
- 238000012544 monitoring process Methods 0.000 claims description 3
- 238000012545 processing Methods 0.000 abstract description 11
- 238000007499 fusion processing Methods 0.000 abstract description 4
- 230000000694 effects Effects 0.000 description 6
- 230000009286 beneficial effect Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 4
- 230000010354 integration Effects 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 238000003491 array Methods 0.000 description 1
- 238000013135 deep learning Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000007726 management method Methods 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000012216 screening Methods 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 238000012876 topography Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The application relates to the field of data processing, and provides a geospatial data fusion method and system. The method can fuse geospatial data efficiently without manually adjusting part of the data, which greatly improves the data fusion speed; it can more accurately reflect the geospatial information and ground-feature changes in the region, improve the quality and precision of the geospatial data, and greatly reduce the data deviations or errors introduced when a plurality of different remote sensing images are fused.
Description
Technical Field
The application relates to the field of data processing, in particular to a geospatial data fusion method.
Background
Geospatial data refers to various data related to geospatial location, including geographic location, geographic area, geographic phenomena, topography, and the like. Geospatial data generally has spatial, temporal, and attribute dimensions, and is mainly collected and acquired through remote sensing technology, global positioning systems, ground measurement, and the like. Geospatial data can be used to describe and analyze geographic phenomena, geographic laws, and geographic processes, and is widely applied in fields such as urban planning, natural resource management, and disaster early warning.
Geospatial data fusion refers to integrating and processing geospatial data of different sources and different types to obtain more accurate and complete geographic information data. With the wide application of geographic information systems, more and more geospatial data are collected and generated, including remote sensing images, geographic position data, satellite images, and the like, and geospatial data fusion methods have tended to become diversified and complicated. Traditional data fusion methods mainly include model fusion, feature fusion, decision fusion, and the like, but because of differences in data format, data precision, and so on, they often suffer from low data processing efficiency and poor fusion effect when fusing and integrating geospatial data. The current mainstream data fusion methods, such as multi-source data fusion, multi-scale data fusion, and deep learning, offer high processing efficiency and high fusion speed, but still perform poorly with respect to information loss and computational complexity. A geospatial data fusion method and application are therefore provided, which are key to improving the precision and reliability of geospatial data fusion in the current field of geographic information technology research.
Disclosure of Invention
The application aims to provide a geospatial data fusion method and application thereof, which are used to solve one or more technical problems in the prior art, and at least to provide a beneficial alternative or create advantageous conditions.
The application provides a geospatial data fusion method, which comprises the steps of obtaining a plurality of remote sensing images, extracting the geospatial data in the plurality of remote sensing images, integrating the geospatial data in the plurality of remote sensing images to obtain a data source, calculating the geometric degree of overlap of the data source, and carrying out data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source. The method can fuse geospatial data efficiently without manually adjusting part of the data, which greatly improves the data fusion speed; it can more accurately reflect the geospatial information and ground-feature changes in the region, improve the quality and precision of the geospatial data, and greatly reduce the data deviations or errors introduced when a plurality of different remote sensing images are fused.
To achieve the above object, according to an aspect of the present application, there is provided a geospatial data fusion method and application, the method comprising the steps of:
s100, acquiring a plurality of remote sensing images, and extracting geographic space data in the plurality of remote sensing images;
s200, integrating the geospatial data in the plurality of remote sensing images to obtain a data source;
s300, calculating the geometric degree of overlap of the data source;
s400, carrying out data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source.
Further, in step S100, the remote sensing image is a remote sensing digital image, the remote sensing digital image is stored in a digital form, and a basic unit of the remote sensing digital image is a pixel, and the pixel has a corresponding brightness value.
Further, in step S100, the steps of acquiring a plurality of remote sensing images and extracting the geospatial data in the plurality of remote sensing images specifically include: acquiring a plurality of remote sensing images through remote sensing monitoring; taking the brightness value of each pixel point in a remote sensing image and the spatial coordinates of that pixel point as the geospatial data of that remote sensing image; and sequentially extracting and storing the geospatial data of each of the plurality of remote sensing images.
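As a minimal illustrative sketch (not part of the claimed method), the extraction in step S100 can be read as collecting, for every pixel point, its spatial coordinates together with its brightness value. The helper name and the synthetic 2×3 image below are assumptions for illustration only:

```python
def extract_geospatial_data(image):
    """Return one (row, col, brightness) record per pixel point."""
    records = []
    for r, row in enumerate(image):
        for c, brightness in enumerate(row):
            records.append((r, c, brightness))
    return records

# Synthetic stand-in for a remote sensing digital image (brightness values).
image = [
    [10, 20, 30],
    [40, 50, 60],
]
data = extract_geospatial_data(image)
print(len(data))   # → 6
print(data[4])     # → (1, 1, 50)
```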
Further, in step S200, the step of integrating the geospatial data in the plurality of remote sensing images to obtain the data source specifically includes:
s201, representing the i-th remote sensing image among the plurality of remote sensing images by rem(i), and recording the number of the plurality of remote sensing images as N (namely, the specific number of the plurality of remote sensing images is N); initializing an integer variable j1 with initial value 1 and value range [1, N]; traversing j1 from j1 = 1, creating a blank set lan{ }, and turning to S202;
s202, recording the number of all pixel points in the current rem(j1) as M_j1; letting alr(j) denote the brightness value of the j-th pixel point in the current rem(j1), j = 1, 2, …, M_j1; representing by tha(j1) the average of the brightness values of all pixel points in the current rem(j1), namely tha(j1) = (alr(1) + alr(2) + … + alr(M_j1))/M_j1; adding the value of the current tha(j1) to the set lan{ }, and turning to S203;
s203, if the value of the current j1 is smaller than N, the value of the current j1 is increased by 1, and the process goes to S202; if the value of current j1 is equal to or greater than N, go to S204;
s204, representing the i-th element in the set lan{ } by lan(i), i = 1, 2, …, N; recording the element with the largest value in the set lan{ } as lan(M1), and the element with the smallest value in the set lan{ } as lan(M2); creating a blank set mis{ }, and adding to the set mis{ } all the elements remaining after removing the elements lan(M1) and lan(M2) from the set lan{ }; recording tow = mis_a/(lan(M1) − lan(M2)), where mis_a represents the sum of all the elements in the set mis{ }; resetting the value of the variable j1 to 1, creating a blank set und{ }, and turning to S205;
s205, if the value of the current lan(j1) is larger than the value of rounddup(tow), adding the value of the current variable j1 to the set und{ } and turning to S206; if the value of the current lan(j1) is less than or equal to the value of rounddup(tow), turning directly to S206; wherein rounddup(tow) is the value obtained by rounding tow up to the nearest integer;
s206, if the value of the current j1 is smaller than N, the value of the current j1 is increased by 1, and the process goes to S205; if the value of current j1 is equal to or greater than N, go to S207;
s207, denoting the number of all elements in the set und{ } by N1, and denoting the i1-th element in the set und{ } by und(i1), i1 = 1, 2, …, N1; rem(und(1)), rem(und(2)), …, rem(und(N1)) are sequentially stored as the data source.
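Steps S201–S207 above can be sketched as follows. This is a hedged reading of the procedure, not a definitive implementation of the claims: the function name, the list-of-lists image representation, and the behavior when several images share the extreme mean (or when all means are equal, which would divide by zero) are assumptions:

```python
import math

def select_key_samples(images):
    """Sketch of S201-S207: return the 1-based indices (the set und{ })
    of the key samples among the input images, each image given as a
    2-D list of pixel brightness values."""
    # S202-S203: mean brightness tha(j1) of every image rem(j1)
    lan = [sum(sum(row) for row in img) / sum(len(row) for row in img)
           for img in images]
    # S204: remove the largest and smallest means, form the threshold tow
    lan_m1, lan_m2 = max(lan), min(lan)        # lan(M1), lan(M2)
    mis_a = sum(lan) - lan_m1 - lan_m2         # sum of the set mis{ }
    tow = mis_a / (lan_m1 - lan_m2)            # assumes lan_m1 != lan_m2
    # S205-S207: keep the indices whose mean exceeds rounddup(tow)
    return [j1 for j1, tha in enumerate(lan, start=1) if tha > math.ceil(tow)]

# Four tiny images with mean brightness 2, 100, 50, 60 respectively.
images = [[[2, 2]], [[100, 100]], [[50, 50]], [[60, 60]]]
print(select_key_samples(images))  # → [2, 3, 4]
```

Here tow = (50 + 60)/(100 − 2) ≈ 1.12, so rounddup(tow) = 2 and only the images with mean brightness above 2 are kept as key samples.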
The beneficial effects of this step are: because both spectral information and spatial information exist in a remote sensing image, the brightness values of its pixel points best reflect the geospatial information of the target area. Meanwhile, for a plurality of remote sensing images of the same area, when the capture angles of the images are similar, the average brightness values of all pixel points in the images are close; when the capture angles differ greatly, the brightness values of all pixel points in the images fluctuate strongly. Therefore, the key samples among the plurality of remote sensing images (namely rem(und(1)), rem(und(2)), …, rem(und(N1))) are selected through screening and stored as the data source. The brightness-value changes of the pixel points in these key samples reflect the key information of the target area, so screening out the key samples provides a high-quality data fusion effect and also accelerates the subsequent smoothing of part of the pixel points.
Further, in step S300, the step of calculating the geometric degree of overlap of the data source specifically includes:
s301, initializing an integer variable k1, wherein the initial value of the variable k1 is 1, the value range of the variable k1 is [1, N1], N1 is the number of all elements in a set und { }, and turning to S302;
s302, selecting rem(und(k1)) in the data source, namely the und(k1)-th remote sensing image among the plurality of remote sensing images; marking the pixel point at the upper left corner of rem(und(k1)) as par1, the pixel point at the upper right corner as par2, the pixel point at the lower right corner as par3, and the pixel point at the lower left corner as par4; connecting the pixel points par1 and par2 to obtain a straight line cap1, connecting the pixel points par2 and par3 to obtain a straight line cap2, connecting the pixel points par3 and par4 to obtain a straight line cap3, and connecting the pixel points par4 and par1 to obtain a straight line cap4; and turning to S303;
s303, selecting the pixel point with the minimum brightness value in the current rem(und(k1)) and marking it as soc; selecting, from the straight lines cap1, cap2, cap3, and cap4, the straight line with the shortest distance to the pixel point soc and marking it as capA; selecting, from the straight lines cap1, cap2, cap3, and cap4, the two straight lines perpendicular to the straight line capA and marking them as capC1 and capC2 respectively; selecting, from the straight lines capC1 and capC2, the straight line with the shorter distance to the pixel point soc and marking it as capB; and turning to S304;
s304, drawing a perpendicular from the pixel point soc to the straight line capA to obtain the foot of perpendicular exaA, and drawing a perpendicular from the pixel point soc to the straight line capB to obtain the foot of perpendicular exaB; recording the intersection point of the straight line capA and the straight line capB as dau; connecting soc, exaA, dau, exaB in sequence to obtain a rectangular region gro; recording all pixel points of the current rem(und(k1)) within the rectangular region gro as geometric pixel points; creating a blank set fut{ } and sequentially adding the brightness values corresponding to all geometric pixel points to the set fut{ } (each pixel point corresponds to one brightness value); recording the number of all elements in the set fut{ } as M2, and recording the k2-th element in the set fut{ } as fut(k2), k2 = 1, 2, …, M2; removing all geometric pixel points from the current rem(und(k1)) and marking the remaining pixel points as first pixel points; the geometric degree of overlap Geo_Re(rem(und(k1))) of the current rem(und(k1)) is calculated by the following formula:
wherein fut_A is the element with the smallest value in the set fut{ }, soc_B is the brightness value of the pixel with the smallest brightness value among all the first pixel points, k3 is an accumulation variable, fut(k3) is the k3-th element in the set fut{ }, hav is the average of the brightness values of all the first pixel points, min{ } represents the minimum of the numbers in { }, and max{ } represents the maximum of the numbers in { }; then turning to S305;
s305, if the value of the current variable k1 is smaller than N1, increasing the value of k1 by 1, and turning to S302; if the value of the current variable k1 is equal to or greater than N1, go to S306;
s306, creating a blank set Geo { }, adding Geo_Re (rem (und (1))), geo_Re (rem (und (2))), …, geo_Re (rem (und (N1))) to the set Geo { }, and recording the average value of all elements in the set Geo { } as GeoA.
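For an axis-aligned image, the construction in steps S302–S304 reduces to the box between the minimum-brightness pixel soc and the nearest image corner (the intersection dau of the two nearest edges capA and capB). The sketch below assumes that reading, a list-of-lists image, and a top/left preference on distance ties; the Geo_Re formula itself appears only as a figure in the patent and is not reproduced here:

```python
def geometric_region(image):
    """Sketch of S302-S304: return the coordinates of the geometric pixel
    points, i.e. the rectangular region gro between the minimum-brightness
    pixel soc and the nearest image corner."""
    h, w = len(image), len(image[0])
    # soc: the pixel point with the minimum brightness value (first on ties)
    soc = min(((r, c) for r in range(h) for c in range(w)),
              key=lambda rc: image[rc[0]][rc[1]])
    r, c = soc
    # Nearest horizontal and nearest vertical image edge (capA and capB);
    # their intersection is the corner dau.
    edge_row = 0 if r <= h - 1 - r else h - 1
    edge_col = 0 if c <= w - 1 - c else w - 1
    r0, r1 = sorted((r, edge_row))
    c0, c1 = sorted((c, edge_col))
    return [(rr, cc) for rr in range(r0, r1 + 1) for cc in range(c0, c1 + 1)]

image = [
    [9, 9, 9, 9, 9],
    [9, 1, 9, 9, 9],   # minimum brightness at (1, 1)
    [9, 9, 9, 9, 9],
    [9, 9, 9, 9, 9],
]
print(geometric_region(image))  # → [(0, 0), (0, 1), (1, 0), (1, 1)]
```

The remaining pixels (the "first pixel points") are simply the image minus this region; their brightness statistics feed the Geo_Re formula described in S304.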
The beneficial effects of this step are: in image registration, using the key samples in the data source is an effective method. These key samples are representative samples that can be used to calculate the degree of geometric matching between different images. The degree of matching at the geometric level indicates how well different spatial positions in different remote sensing images match, and can be used to determine fusion positions with a high degree of matching. Local processing is carried out on the fusion positions with a high degree of matching in order to eliminate possible splicing errors, thereby realizing seamless fusion of images. Meanwhile, because remote sensing images differ in imaging quality and shooting angle, a single remote sensing image can only reflect the local characteristics of the target area. The method of this step uses the key samples in the data source to calculate their geometric degree of overlap, which indicates whether the fusion degrees of different spatial positions in different remote sensing images are sufficiently matched. Local processing of the fusion positions with a high degree of matching improves how completely the overall characteristics of the target area are reflected, gives the fused image a better quantity of geographic information, and improves the precision and reliability of remote sensing image fusion.
Further, in step S400, the method for performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data sources specifically includes:
s401, initializing an integer variable j2, wherein the initial value of the variable j2 is 1, the value range of the variable j2 is [1, N ], N is the number of a plurality of remote sensing images, traversing the variable j2 from j2 = 1, and turning to S402;
s402, recording the pixel point with the maximum brightness value in the current rem(j2) as pag(j2); marking the critical pixel points in rem(j2) whose brightness value is greater than cla as second pixel points, and turning to S403; wherein cla = GeoA × (the brightness value of pag(j2)), and the critical pixel points in rem(j2) are defined as the pixel points whose distance from the edge of rem(j2) is less than T (namely, a critical pixel point is a pixel point at a distance of less than T from the edge of rem(j2)); T is a distance of between 3 and 50 pixel points;
s403, if the value of the current variable j2 is smaller than N, the value of the variable j2 is increased by 1 and the process goes to S402.
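Steps S401–S403 can be sketched as below. The function name and the distance convention (Chebyshev distance of a pixel to the nearest image edge) are illustrative assumptions, and cla is taken as GeoA times the brightness of pag(j2) as described above:

```python
def second_pixels(image, geo_a, t):
    """Sketch of S402 for one image: return the 'second pixel points',
    i.e. critical pixels (closer than t to the image edge) whose
    brightness exceeds cla = geo_a * (maximum brightness)."""
    h, w = len(image), len(image[0])
    pag = max(max(row) for row in image)   # brightness of pag(j2)
    cla = geo_a * pag
    result = []
    for r in range(h):
        for c in range(w):
            edge_dist = min(r, h - 1 - r, c, w - 1 - c)
            if edge_dist < t and image[r][c] > cla:
                result.append((r, c))
    return result

image = [
    [5, 9, 5],
    [1, 0, 1],
    [5, 9, 5],
]
# cla = 0.5 * 9 = 4.5; with t = 1 only border pixels qualify.
print(second_pixels(image, geo_a=0.5, t=1))
# → [(0, 0), (0, 1), (0, 2), (2, 0), (2, 1), (2, 2)]
```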
The beneficial effects of this step are: in the process of data fusion, the pixel points located at the edge of a remote sensing image are particularly critical, since they have a decisive effect on the quality of the remote sensing image generated after fusion. The second pixel points among the critical pixel points are screened out using the geometric degree of overlap; the peripheral pixels of the second pixel points restore the geometric form information of the target area to a high degree, and carrying out local pixel-level processing on the second pixel points can improve the level of detail and local quality of the remote sensing image, extract the key information of the target area more accurately, and provide a more reliable data basis for subsequent geographic information analysis and application.
Further, in step S400, data fusion is performed on the plurality of remote sensing images according to the geometric degree of overlap of the data sources, and the method further includes: filtering and smoothing the second pixel point in the plurality of remote sensing images by using a neighborhood mean value method, and carrying out data fusion on all the remote sensing images subjected to filtering and smoothing pretreatment; the data fusion method is any one of pixel-level image fusion, feature-level image fusion and decision-level image fusion.
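A common reading of the "neighborhood mean value method" named in this step is a 3×3 mean filter applied only at the second pixel points; the sketch below assumes that reading (the window size and border clipping are assumptions, not stated in the source):

```python
def neighborhood_mean_smooth(image, targets):
    """Replace each target pixel with the mean of its 3x3 neighborhood,
    clipped at the image border; all other pixels are left untouched."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for r, c in targets:
        neigh = [image[rr][cc]
                 for rr in range(max(0, r - 1), min(h, r + 2))
                 for cc in range(max(0, c - 1), min(w, c + 2))]
        out[r][c] = sum(neigh) / len(neigh)
    return out

image = [
    [0, 0, 0],
    [0, 9, 0],
    [0, 0, 0],
]
print(neighborhood_mean_smooth(image, [(1, 1)])[1][1])  # → 1.0
```

After this preprocessing, the smoothed images would be passed to whichever fusion scheme is chosen (pixel-level, feature-level, or decision-level).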
The application also provides a geospatial data fusion system comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the geospatial data fusion method when executing the computer program. The geospatial data fusion system may run on computing devices such as desktop computers, notebook computers, mobile phones, tablet computers, palmtop computers, and cloud data centers. The executable system may include, but is not limited to, a processor, a memory, and a server cluster, and the processor executes the computer program organized into the following units:
the image acquisition unit is used for acquiring a plurality of remote sensing images and extracting geographic space data in the plurality of remote sensing images;
the data integration unit is used for integrating the geospatial data in the remote sensing images to obtain a data source;
the parameter calculation unit is used for calculating the geometric degree of the data source;
and the data fusion unit is used for carrying out data fusion on the plurality of remote sensing images according to the geometric degree of the data source.
The beneficial effects of the application are as follows: the method can be used for efficiently fusing the geospatial data, does not need to manually adjust part of the data, greatly improves the data fusion speed, can more accurately reflect the geospatial information and the ground feature change in the region, can also improve the quality and the precision of the geospatial data, and greatly reduces the data deviation or error caused by a plurality of different remote sensing images in the fusion process.
Drawings
The above and other features of the present application will become more apparent from the detailed description of the embodiments given below in conjunction with the accompanying drawings, in which like reference characters designate like or similar elements. The drawings in the following description are merely some examples of the present application, and other drawings may be obtained from them by a person of ordinary skill in the art without inventive effort. In the drawings:
FIG. 1 is a flow chart of a geospatial data fusion method;
FIG. 2 is a system architecture diagram of a geospatial data fusion system.
Detailed Description
The conception, specific structure, and technical effects produced by the present application will be clearly and completely described below with reference to the embodiments and the drawings to fully understand the objects, aspects, and effects of the present application. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
In the description of the present application, "a number of" means one or more, "a plurality of" means two or more, and "greater than", "less than", "exceeding", etc. are understood as not including the stated number, while "above", "below", "within", etc. are understood as including the stated number. The terms "first" and "second" are used only for distinguishing technical features, and should not be construed as indicating or implying relative importance, implicitly indicating the number of technical features indicated, or implicitly indicating the precedence of the technical features indicated.
Referring now to FIG. 1, a flowchart of a geospatial data fusion method in accordance with the present application is shown, and a geospatial data fusion method in accordance with an embodiment of the present application is described below in conjunction with FIG. 1.
The application provides a geospatial data fusion method, which comprises the following steps:
s100, acquiring a plurality of remote sensing images, and extracting geographic space data in the plurality of remote sensing images;
s200, integrating the geospatial data in the plurality of remote sensing images to obtain a data source;
s300, calculating the geometric degree of overlap of the data source;
s400, carrying out data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source.
Further, in step S100, the remote sensing image is a remote sensing digital image, the remote sensing digital image is stored in a digital form, and a basic unit of the remote sensing digital image is a pixel, and the pixel has a corresponding brightness value.
Further, in step S100, the steps of acquiring a plurality of remote sensing images and extracting the geospatial data in the plurality of remote sensing images specifically include: acquiring a plurality of remote sensing images through remote sensing monitoring; taking the brightness value of each pixel point in a remote sensing image and the spatial coordinates of that pixel point as the geospatial data of that remote sensing image; and sequentially extracting and storing the geospatial data of each of the plurality of remote sensing images.
Further, in step S200, the step of integrating the geospatial data in the plurality of remote sensing images to obtain the data source specifically includes:
s201, representing the i-th remote sensing image among the plurality of remote sensing images by rem(i), and recording the number of the plurality of remote sensing images as N (namely, the specific number of the plurality of remote sensing images is N); initializing an integer variable j1 with initial value 1 and value range [1, N]; traversing j1 from j1 = 1, creating a blank set lan{ }, and turning to S202;
s202, recording the number of all pixel points in the current rem(j1) as M_j1; letting alr(j) denote the brightness value of the j-th pixel point in the current rem(j1), j = 1, 2, …, M_j1; representing by tha(j1) the average of the brightness values of all pixel points in the current rem(j1), namely tha(j1) = (alr(1) + alr(2) + … + alr(M_j1))/M_j1; adding the value of the current tha(j1) to the set lan{ }, and proceeding to S203;
s203, if the value of the current j1 is smaller than N, the value of the current j1 is increased by 1, and the process goes to S202; if the value of current j1 is equal to or greater than N, go to S204;
s204, representing the i-th element in the set lan{ } by lan(i), i = 1, 2, …, N; recording the element with the largest value in the set lan{ } as lan(M1), and the element with the smallest value in the set lan{ } as lan(M2); creating a blank set mis{ }, and adding to the set mis{ } all the elements remaining after removing the elements lan(M1) and lan(M2) from the set lan{ }; recording tow = mis_a/(lan(M1) − lan(M2)), where mis_a represents the sum of all the elements in the set mis{ }; resetting the value of the variable j1 to 1, creating a blank set und{ }, and turning to S205;
s205, if the value of the current lan(j1) is larger than the value of rounddup(tow), adding the value of the current variable j1 to the set und{ } and turning to S206; if the value of the current lan(j1) is less than or equal to the value of rounddup(tow), turning directly to S206; wherein rounddup(tow) is the value obtained by rounding tow up to the nearest integer;
s206, if the value of the current j1 is smaller than N, the value of the current j1 is increased by 1, and the process goes to S205; if the value of current j1 is equal to or greater than N, go to S207;
s207, denoting the number of all elements in the set und{ } by N1, and denoting the i1-th element in the set und{ } by und(i1), i1 = 1, 2, …, N1; rem(und(1)), rem(und(2)), …, rem(und(N1)) are sequentially stored as the data source.
Further, in step S300, the step of calculating the geometric degree of overlap of the data source specifically includes:
s301, initializing an integer variable k1, wherein the initial value of the variable k1 is 1, the value range of the variable k1 is [1, N1], N1 is the number of all elements in a set und { }, and turning to S302;
s302, selecting rem(und(k1)) in the data source, namely the und(k1)-th remote sensing image among the plurality of remote sensing images; marking the pixel point at the upper left corner of rem(und(k1)) as par1, the pixel point at the upper right corner as par2, the pixel point at the lower right corner as par3, and the pixel point at the lower left corner as par4; connecting the pixel points par1 and par2 to obtain a straight line cap1, connecting the pixel points par2 and par3 to obtain a straight line cap2, connecting the pixel points par3 and par4 to obtain a straight line cap3, and connecting the pixel points par4 and par1 to obtain a straight line cap4; and turning to S303;
s303, selecting the pixel point with the minimum brightness value in the current rem(und(k1)) and marking it as soc; selecting, from the straight lines cap1, cap2, cap3, and cap4, the straight line with the shortest distance to the pixel point soc and marking it as capA; selecting, from the straight lines cap1, cap2, cap3, and cap4, the two straight lines perpendicular to the straight line capA and marking them as capC1 and capC2 respectively; selecting, from the straight lines capC1 and capC2, the straight line with the shorter distance to the pixel point soc and marking it as capB; and turning to S304;
s304, drawing a perpendicular from the pixel point soc to the straight line capA to obtain the foot of perpendicular exaA, and drawing a perpendicular from the pixel point soc to the straight line capB to obtain the foot of perpendicular exaB; recording the intersection point of the straight line capA and the straight line capB as dau; connecting soc, exaA, dau, exaB in sequence to obtain a rectangular region gro; recording all pixel points of the current rem(und(k1)) within the rectangular region gro as geometric pixel points; creating a blank set fut{ } and sequentially adding the brightness values corresponding to all geometric pixel points to the set fut{ } (each pixel point corresponds to one brightness value); recording the number of all elements in the set fut{ } as M2, and recording the k2-th element in the set fut{ } as fut(k2), k2 = 1, 2, …, M2; removing all geometric pixel points from the current rem(und(k1)) and marking the remaining pixel points as first pixel points; the geometric degree of overlap Geo_Re(rem(und(k1))) of the current rem(und(k1)) is calculated by the following formula:
wherein fut_a is the smallest-valued element in the set fut{}, soc_b is the brightness value of the darkest of the first pixel points, k3 is an accumulation variable, fut(k3) is the k3-th element in the set fut{}, hav is the average brightness value of the first pixel points, min{} denotes the minimum of the numbers in {}, and max{} denotes the maximum of the numbers in {}; going to S305;
S305, if the value of the current variable k1 is smaller than N1, increasing the value of k1 by 1 and going to S302; if the value of the current variable k1 is greater than or equal to N1, going to S306;
S306, creating a blank set Geo{}, adding Geo_Re(rem(und(1))), Geo_Re(rem(und(2))), …, Geo_Re(rem(und(N1))) to the set Geo{}, and recording the average value of all elements in Geo{} as GeoA.
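Because cap1 to cap4 are the borders of the image, the rectangle gro spanned by soc, exaA, dau and exaB reduces to the axis-aligned block between the darkest pixel and the nearest image corner. The following Python sketch illustrates S302 to S304 under that reading; the Geo_Re formula itself appears only as an image in the source, so only the quantities it is defined over (fut_a, soc_b, hav) are computed here, and all function names are illustrative.

```python
def geometric_pixels(img):
    # img: list of rows of brightness values (one remote sensing image).
    h, w = len(img), len(img[0])
    # soc: pixel point with the smallest brightness value (S303).
    r, c = min(((i, j) for i in range(h) for j in range(w)),
               key=lambda p: img[p[0]][p[1]])
    # capA/capB are the two image borders nearest to soc, so the rectangle
    # gro spanned by soc, exaA, dau and exaB is the block between soc and
    # the nearest image corner (S304).
    er = 0 if r <= h - 1 - r else h - 1   # nearest horizontal border row
    ec = 0 if c <= w - 1 - c else w - 1   # nearest vertical border column
    return {(i, j) for i in range(min(r, er), max(r, er) + 1)
                   for j in range(min(c, ec), max(c, ec) + 1)}

def overlap_inputs(img):
    # Quantities entering the (unreproduced) Geo_Re formula:
    # fut_a, soc_b and hav as defined in the text.
    gro = geometric_pixels(img)
    fut = [img[i][j] for (i, j) in gro]                # set fut{}
    first = [img[i][j] for i in range(len(img))
             for j in range(len(img[0])) if (i, j) not in gro]
    return min(fut), min(first), sum(first) / len(first)
```

On a gradient image whose darkest pixel sits in the top-left corner, gro collapses to that single corner pixel and the remaining pixels form the first pixel points.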
Further, in step S400, the method for performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source specifically includes:
S401, initializing an integer variable j2 with initial value 1 and value range [1, N], N being the number of the plurality of remote sensing images; traversing the variable j2 from j2 = 1 and going to S402;
S402, marking the pixel point with the largest brightness value in the current rem(j2) as pag(j2); marking each critical pixel point in rem(j2) whose brightness value is greater than cla as a second pixel point, and going to S403; wherein cla = GeoA × (the brightness value of pag(j2)), and a critical pixel point in rem(j2) is a pixel point whose distance from the edge of rem(j2) is less than T; T is a distance of [3, 50] pixel points;
S403, if the value of the current variable j2 is smaller than N, increasing the value of the variable j2 by 1 and going to S402.
Further, in step S400, performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source further includes: filtering and smoothing the second pixel points in the plurality of remote sensing images by a neighborhood mean method, and performing data fusion on all the remote sensing images after the filtering and smoothing preprocessing; the data fusion method is any one of pixel-level image fusion, feature-level image fusion and decision-level image fusion.
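Steps S401 to S403 together with the neighborhood-mean preprocessing can be sketched in Python as follows. Reading pag(j2) as the brightness value of the brightest pixel and using a 3×3 window for the neighborhood mean are assumptions of this sketch, as are the function names.

```python
def second_pixels(img, geo_a, t=3):
    # S402: a critical pixel point lies within t of the image edge; it is a
    # "second pixel point" when its brightness exceeds cla = GeoA * pag(j2),
    # pag(j2) being taken here as the largest brightness value in img.
    h, w = len(img), len(img[0])
    cla = geo_a * max(max(row) for row in img)
    return {(i, j) for i in range(h) for j in range(w)
            if min(i, h - 1 - i, j, w - 1 - j) < t and img[i][j] > cla}

def neighborhood_mean(img, pts):
    # Replace each marked pixel by the mean of its 3x3 neighborhood,
    # clipped at the image borders.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for (i, j) in pts:
        nb = [img[x][y] for x in range(max(0, i - 1), min(h, i + 2))
                        for y in range(max(0, j - 1), min(w, j + 2))]
        out[i][j] = sum(nb) / len(nb)
    return out
```

A single over-bright border pixel is detected and pulled toward the brightness of its neighbors, while interior values are left unchanged.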
The geospatial data fusion system includes a processor, a memory, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the steps in the geospatial data fusion method embodiments described above are implemented. The geospatial data fusion system may run on computing devices such as desktop computers, notebook computers, mobile phones, tablet computers, palmtop computers and cloud data centers, and may include, but is not limited to, a processor, a memory, and a server cluster.
As shown in fig. 2, a geospatial data fusion system according to an embodiment of the present application includes a processor, a memory, and a computer program stored in the memory and executable on the processor. When the processor executes the computer program, the steps in the geospatial data fusion method embodiment described above are implemented, and the program runs in the following system units:
the image acquisition unit is used for acquiring a plurality of remote sensing images and extracting geographic space data in the plurality of remote sensing images;
the data integration unit is used for integrating the geospatial data in the remote sensing images to obtain a data source;
the parameter calculation unit is used for calculating the geometric degree of overlap of the data source;
and the data fusion unit is used for performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source.
The geospatial data fusion system can run on computing devices such as desktop computers, notebook computers, palmtop computers and cloud data centers. The geospatial data fusion system includes, but is not limited to, a processor and a memory. Those skilled in the art will appreciate that the above examples are merely illustrative of the geospatial data fusion method and system and are not limiting; the system may include more or fewer components than shown, combine certain components, or use different components. For example, the geospatial data fusion system may also include input and output devices, network access devices, buses, and the like.
The processor may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor or any conventional processor; the processor is the control center of the geospatial data fusion system and connects the various parts of the whole system through various interfaces and lines.
The memory may be used to store the computer program and/or modules; the processor implements the various functions of the geospatial data fusion method and system by running or executing the computer program and/or modules stored in the memory and invoking the data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to use (such as audio data or a phonebook). In addition, the memory may include high-speed random access memory and non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The application provides a geospatial data fusion method comprising: obtaining a plurality of remote sensing images; extracting the geospatial data in the plurality of remote sensing images; integrating the geospatial data in the plurality of remote sensing images to obtain a data source; calculating the geometric degree of overlap of the data source; and performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source. The method fuses geospatial data efficiently without manual adjustment of part of the data, greatly improves the speed of data fusion, reflects the geospatial information and ground feature changes in a region more accurately, improves the quality and precision of the geospatial data, and greatly reduces the data deviations or errors introduced by the different remote sensing images during fusion. Although the present application has been described in considerable detail with respect to several embodiments, it is not limited to those details or embodiments; the description is intended to cover equivalent modifications that fall within the scope of the application.
Claims (7)
1. A method of geospatial data fusion, the method comprising the steps of:
S100, acquiring a plurality of remote sensing images, and extracting the geospatial data in the plurality of remote sensing images;
S200, integrating the geospatial data in the plurality of remote sensing images to obtain a data source;
S300, calculating the geometric degree of overlap of the data source;
S400, performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source;
in step S300, the step of calculating the geometric degree of overlap of the data source specifically includes:
S301, initializing an integer variable k1 with initial value 1 and value range [1, N1], N1 being the number of all elements in the set und{}; going to S302;
S302, selecting rem(und(k1)) from the data source, rem(und(k1)) being the und(k1)-th remote sensing image among the plurality of remote sensing images; marking the pixel point at the upper left corner of rem(und(k1)) as par1, the pixel point at the upper right corner as par2, the pixel point at the lower right corner as par3, and the pixel point at the lower left corner as par4; connecting the pixel points par1 and par2 to obtain a straight line cap1, connecting the pixel points par2 and par3 to obtain a straight line cap2, connecting the pixel points par3 and par4 to obtain a straight line cap3, and connecting the pixel points par4 and par1 to obtain a straight line cap4; going to S303;
S303, selecting the pixel point with the smallest brightness value in the current rem(und(k1)) and marking it as soc; among the straight lines cap1, cap2, cap3 and cap4, selecting the line closest to the pixel point soc and marking it as capA; marking the two of cap1, cap2, cap3 and cap4 that are perpendicular to capA as capC1 and capC2 respectively; of capC1 and capC2, selecting the line closest to the pixel point soc and marking it as capB; going to S304;
S304, dropping a perpendicular from the pixel point soc onto the straight line capA to obtain the foot of perpendicular exaA, and dropping a perpendicular from soc onto the straight line capB to obtain the foot of perpendicular exaB; marking the intersection point of capA and capB as dau; connecting soc, exaA, dau and exaB in turn to obtain a rectangular region gro; marking all pixel points of the current rem(und(k1)) that lie in the region gro as geometric pixel points; creating a blank set fut{} and adding the brightness value of each geometric pixel point to fut{} in turn (each pixel point corresponds to one brightness value); denoting the number of elements in fut{} by M2 and the k2-th element of fut{} by fut(k2), k2=1,2,…,M2; marking the pixel points of the current rem(und(k1)) other than the geometric pixel points as first pixel points; the geometric degree of overlap Geo_Re(rem(und(k1))) of the current rem(und(k1)) is calculated by:
wherein fut_a is the smallest-valued element in the set fut{}, soc_b is the brightness value of the darkest of the first pixel points, k3 is an accumulation variable, fut(k3) is the k3-th element in the set fut{}, hav is the average brightness value of the first pixel points, min{} denotes the minimum of the numbers in {}, and max{} denotes the maximum of the numbers in {}; going to S305;
S305, if the value of the current variable k1 is smaller than N1, increasing the value of k1 by 1 and going to S302; if the value of the current variable k1 is greater than or equal to N1, going to S306;
S306, creating a blank set Geo{}, adding Geo_Re(rem(und(1))), Geo_Re(rem(und(2))), …, Geo_Re(rem(und(N1))) to the set Geo{}, and recording the average value of all elements in Geo{} as GeoA.
2. The geospatial data fusion method of claim 1, wherein in step S100, each remote sensing image is a remote sensing digital image stored in digital form, the basic unit of which is the pixel, each pixel having a corresponding brightness value.
3. The method of claim 1, wherein in step S100, the step of acquiring a plurality of remote sensing images and extracting the geospatial data in the plurality of remote sensing images specifically includes: acquiring a plurality of remote sensing images through remote sensing monitoring, taking the brightness values of the pixel points in the remote sensing images and the spatial coordinates of the pixel points as the geospatial data of the remote sensing images, and sequentially extracting and storing the geospatial data of each of the plurality of remote sensing images.
4. The geospatial data fusion method according to claim 1, wherein in step S200, the step of integrating the geospatial data in the plurality of remote sensing images to obtain a data source specifically includes:
S201, representing the i-th remote sensing image among the plurality of remote sensing images by rem(i) and recording the number of the plurality of remote sensing images as N; initializing an integer variable j1 with initial value 1 and value range [1, N]; traversing j1 from j1 = 1, creating a blank set lan{}, and going to S202;
S202, recording the number of all pixel points in the current rem(j1) as M_j1; using alr(j) to denote the brightness value of the j-th pixel point in the current rem(j1), j=1,2,…,M_j1; using tha(j1) to denote the average brightness value of all pixel points in the current rem(j1); adding the value of the current tha(j1) to the set lan{}, and going to S203;
S203, if the value of the current j1 is smaller than N, increasing the value of the current j1 by 1 and going to S202; if the value of the current j1 is greater than or equal to N, going to S204;
S204, representing the i-th element of the set lan{} by lan(i), i=1,2,…,N; recording the largest element of lan{} as lan(M1) and the smallest element as lan(M2); creating a blank set mis{} and adding to it all the elements that remain after removing lan(M1) and lan(M2) from lan{}; recording tow = mis_a/(lan(M1) - lan(M2)), where mis_a denotes the sum of all elements in the set mis{}; resetting the value of the variable j1 to 1, creating a blank set und{}, and going to S205;
S205, if the value of the current lan(j1) is greater than the value of rounddup(tow), adding the value of the current variable j1 to the set und{} and going to S206; if the value of the current lan(j1) is less than or equal to the value of rounddup(tow), going to S206; wherein rounddup(tow) is the value obtained by rounding tow up;
S206, if the value of the current j1 is smaller than N, increasing the value of the current j1 by 1 and going to S205; if the value of the current j1 is greater than or equal to N, going to S207;
S207, denoting the number of all elements in the set und{} by N1 and the i1-th element of und{} by und(i1), i1=1,2,…,N1; rem(und(1)), rem(und(2)), …, rem(und(N1)) are stored in turn as the data source.
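Steps S201 to S207 can be sketched in Python as follows. Reading "rounddup" as the ceiling function is an assumption of this sketch, and the case where the largest and smallest means coincide (which would make the divisor zero) is not handled.

```python
import math

def integrate(images):
    # S201-S203: lan{} holds the mean brightness tha(j1) of each image,
    # where each image is a list of rows of brightness values.
    lan = [sum(v for row in img for v in row) /
           sum(len(row) for row in img) for img in images]
    # S204: remove the largest and smallest means, sum the rest (mis_a),
    # and form tow = mis_a / (lan(M1) - lan(M2)).
    m1, m2 = max(lan), min(lan)
    mis_a = sum(lan) - m1 - m2
    tow = mis_a / (m1 - m2)
    # S205-S207: und{} keeps the 1-based indices whose mean exceeds
    # rounddup(tow); rem(und(1)), ..., rem(und(N1)) form the data source.
    return [j for j, v in enumerate(lan, start=1) if v > math.ceil(tow)]
```

For three single-row images with mean brightnesses 1, 5 and 10, mis_a = 5, tow = 5/9 and rounddup(tow) = 1, so the second and third images are retained as the data source.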
5. The geospatial data fusion method according to claim 1, wherein in step S400, the method for performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source is specifically as follows:
S401, initializing an integer variable j2 with initial value 1 and value range [1, N], N being the number of the plurality of remote sensing images; traversing the variable j2 from j2 = 1 and going to S402;
S402, marking the pixel point with the largest brightness value in the current rem(j2) as pag(j2); marking each critical pixel point in rem(j2) whose brightness value is greater than cla as a second pixel point, and going to S403; wherein cla = GeoA × (the brightness value of pag(j2)), and a critical pixel point in rem(j2) is a pixel point whose distance from the edge of rem(j2) is less than T; T is a distance of [3, 50] pixel points;
S403, if the value of the current variable j2 is smaller than N, increasing the value of the variable j2 by 1 and going to S402.
6. The geospatial data fusion method of claim 1, wherein in step S400, performing data fusion on the plurality of remote sensing images according to the geometric degree of overlap of the data source further comprises: filtering and smoothing the second pixel points in the plurality of remote sensing images by a neighborhood mean method, and performing data fusion on all the remote sensing images after the filtering and smoothing preprocessing; the data fusion method is any one of pixel-level image fusion, feature-level image fusion and decision-level image fusion.
7. A geospatial data fusion system, comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the geospatial data fusion method according to any one of claims 1-6, and the geospatial data fusion system runs on a computing device such as a desktop computer, notebook computer, palmtop computer or cloud data center.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310501962.4A CN116579960B (en) | 2023-05-06 | 2023-05-06 | Geospatial data fusion method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310501962.4A CN116579960B (en) | 2023-05-06 | 2023-05-06 | Geospatial data fusion method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116579960A CN116579960A (en) | 2023-08-11 |
CN116579960B true CN116579960B (en) | 2023-12-08 |
Family
ID=87538972
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310501962.4A Active CN116579960B (en) | 2023-05-06 | 2023-05-06 | Geospatial data fusion method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116579960B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103646027A (en) * | 2013-10-30 | 2014-03-19 | 广东省数字广东研究院 | Geographic spatial database updating method and system |
CN110186820A (en) * | 2018-12-19 | 2019-08-30 | 河北中科遥感信息技术有限公司 | Multisource data fusion and environomental pollution source and pollutant distribution analysis method |
CN111581308A (en) * | 2020-03-26 | 2020-08-25 | 中国农业科学院农业信息研究所 | Agricultural remote sensing monitoring data multidimensional fusion method and system |
CN112258464A (en) * | 2020-10-14 | 2021-01-22 | 宁波大学 | Full-reference remote sensing image fusion quality evaluation method |
WO2021056538A1 (en) * | 2019-09-29 | 2021-04-01 | 深圳市大疆创新科技有限公司 | Image processing method and device |
WO2021077847A1 (en) * | 2019-10-23 | 2021-04-29 | 北京建筑大学 | Seawater-polluted area identification method based on high-resolution remote-sensing image, and device |
WO2022048196A1 (en) * | 2020-09-03 | 2022-03-10 | 深圳前海微众银行股份有限公司 | Method and device for monitoring industrial production index |
WO2022160895A1 (en) * | 2021-01-28 | 2022-08-04 | 北京迈格威科技有限公司 | Image processing method, image processing apparatus, electronic system and readable storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8145677B2 (en) * | 2007-03-27 | 2012-03-27 | Faleh Jassem Al-Shameri | Automated generation of metadata for mining image and text data |
Non-Patent Citations (5)
Title |
---|
Research on optimal length and route layout of feeder buses for urban rail transit; Guo Benfeng; Zhang Jielin; Li Tiezhu; Journal of Transportation Engineering and Information (04); full text *
Cloud detection in remote sensing images based on Gaussian mixture models; Yang Fan; Zhao Zengpeng; Zhang Lei; Journal of Nanjing Forestry University (Natural Sciences Edition) (04); full text *
Theory and technology of multi-source remote sensing image data fusion; Han Ling, Wu Hanning; Journal of Northwest University (Natural Science Edition) (04); full text *
Application of image fusion technology in geoscience information processing; Pan Cunling; Li Weifeng; Geomatics Technology and Equipment (04); full text *
Discussion on key technologies of remote sensing image fusion; Yang Yujing; Beijing Surveying and Mapping (02); full text *
Also Published As
Publication number | Publication date |
---|---|
CN116579960A (en) | 2023-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021017998A1 (en) | Method and system for positioning text position, and method and system for training model | |
JP2013025799A (en) | Image search method, system, and program | |
TWI581207B (en) | Computing method for ridesharing path, computing apparatus and recording medium using the same | |
CN110399847B (en) | Key frame extraction method and device and electronic equipment | |
CN111160288A (en) | Gesture key point detection method and device, computer equipment and storage medium | |
CN111192239A (en) | Method and device for detecting change area of remote sensing image, storage medium and electronic equipment | |
JP2008020225A (en) | Self position estimation program, self position estimation method and self position estimation apparatus | |
CN113793370A (en) | Three-dimensional point cloud registration method and device, electronic equipment and readable medium | |
CN116415020A (en) | Image retrieval method, device, electronic equipment and storage medium | |
CN114359383A (en) | Image positioning method, device, equipment and storage medium | |
US10354409B2 (en) | Image processing device, image processing method, and non-transitory computer-readable recording medium | |
CN113537026A (en) | Primitive detection method, device, equipment and medium in building plan | |
CN116579960B (en) | Geospatial data fusion method | |
CN113158773A (en) | Training method and training device for living body detection model | |
JP6365117B2 (en) | Information processing apparatus, image determination method, and program | |
US20230401670A1 (en) | Multi-scale autoencoder generation method, electronic device and readable storage medium | |
CN111199188A (en) | Pixel processing method and device for remote sensing image difference map, storage medium and equipment | |
CN111870954B (en) | Altitude map generation method, device, equipment and storage medium | |
CN111968030B (en) | Information generation method, apparatus, electronic device and computer readable medium | |
WO2021139178A1 (en) | Image synthesis method and related device | |
JP2016045538A (en) | Information processing apparatus, image determination method, and program | |
CN110766996B (en) | Click-to-read content positioning method and device, electronic equipment and storage medium | |
CN114626483A (en) | Landmark image generation method and device | |
CN114756634A (en) | Method and device for discovering interest point change, electronic equipment and storage medium | |
CN116523884B (en) | Remote sensing image data intelligent interpretation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
PE01 | Entry into force of the registration of the contract for pledge of patent right |
Denomination of invention: A Geospatial Data Fusion Method Granted publication date: 20231208 Pledgee: Huangpu sub branch of Guangzhou Rural Commercial Bank Co.,Ltd. Pledgor: Guangzhou Nano Technology Co.,Ltd. Registration number: Y2024980024820 |