CN109271928B - Road network updating method based on vector road network fusion and remote sensing image verification

Road network updating method based on vector road network fusion and remote sensing image verification

Info

Publication number
CN109271928B
CN109271928B (application CN201811074936.3A)
Authority
CN
China
Prior art keywords
road
vector
road network
edge
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811074936.3A
Other languages
Chinese (zh)
Other versions
CN109271928A (en)
Inventor
眭海刚
孙向东
周明婷
程效猛
徐川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201811074936.3A
Publication of CN109271928A
Application granted
Publication of CN109271928B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/182: Network patterns, e.g. roads or rivers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/136: Segmentation; Edge detection involving thresholding
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an automatic road network updating method based on vector road network fusion and high-resolution remote sensing image verification. The method first registers the historical road vector, the new-period navigation road network vector and the remote sensing image; it then fuses the historical road vector and the new-period navigation road network vector by applying coarse matching followed by fine matching, identifying unchanged roads and roads suspected of having changed; it then extracts edge features, spectral features and vegetation features from the high-resolution remote sensing image to construct a multi-feature verification model, verifies whether each suspected-change road is indeed a road, and thereby detects road changes; finally, it performs road network fusion on the unchanged roads and the changed roads according to their geometric features to obtain an updated road network. The invention mines position, geometric, topological and semantic information from the vector road networks and extracts the road network in combination with the scene features of roads in the high-resolution remote sensing image, giving it strong practicability and high accuracy.

Description

Road network updating method based on vector road network fusion and remote sensing image verification
Technical Field
The invention relates to the technical field of remote sensing image application, in particular to an automatic road network updating method based on vector road network fusion and high-resolution remote sensing image verification.
Background
Road networks are the ties that connect regions. With the progress of urbanization in China, demand for road traffic and transportation keeps growing, which has driven the rapid development of road construction. High-precision extraction and updating of road network information plays an important role in traffic management, urban planning and automatic navigation, and rapid acquisition and updating of road element data has become an important task in the construction of China's basic geographic information. With the development of remote sensing image processing technology, automatic extraction of road information from high-resolution remote sensing images has gradually become a research hotspot for scholars at home and abroad. However, no robust automatic road extraction algorithm yet exists, so manual interpretation of high-resolution remote sensing images remains the main means of updating road elements; it suffers from low productivity and cannot meet the demand for rapid road network updating.
On high-resolution remote sensing images, a road appears as a long, narrow, spectrally homogeneous strip with clearly visible boundaries and pavement markings, and many road extraction algorithms have been derived from these characteristics. These methods can be roughly divided into four categories: classification-based methods, knowledge-rule-based methods, mathematical morphology methods, and active contour methods. Classification-based methods use the geometric, radiometric, textural and other structural features of roads to classify pixels or homogeneous objects as road or non-road; support vector machines, Markov random fields and neural networks are the most common road classifiers, and confusion between road and non-road targets is the main problem of this category. Deep learning methods, which have developed vigorously in recent years, have also been applied to road extraction; in essence they classify pixels to obtain road targets, but because roads are narrow, information is gradually lost during downsampling in deep neural networks, so these methods perform poorly on roads, especially minor rural roads. Knowledge-rule-based methods construct road extraction rules from expert knowledge and combine these rules to extract roads; road tracking is one of the most typical examples. Mathematical morphology methods apply various morphological operators to the remote sensing image, filtering out interfering objects and noise while enhancing the long, narrow characteristics of roads; morphology-based road extraction is usually combined with other techniques, such as image segmentation and edge detection, to extract the road skeleton. The main principle of the active contour model is to construct an energy functional so that a contour curve, driven by the minimization of the energy function, gradually approaches the edge of the object to be detected and finally segments the target; the snake model and the level set model are the active contour models commonly used in road extraction and are widely applied to semi-automatic road extraction.
These road extraction algorithms have greatly promoted the development of automatic road network updating technology, but no single method adapts perfectly to the varied road scenes of a road network extraction task. First, imaging factors: differences in sensors, spectral and spatial resolution, and illumination change the radiometric characteristics of roads, which makes automatic road extraction difficult. Second, factors of the road itself: on detail-rich high-resolution remote sensing images, a road appears as an aggregate of various ground objects, such as vehicles, road signs, lane lines and street trees, so road elements are highly heterogeneous internally while road objects and adjacent ground objects are strongly correlated in their features, making it hard for automatic methods to identify road objects accurately. Third, environmental factors: occlusion by shadows and other ground objects makes automatic road extraction from remote sensing images even more difficult, so seeking assistance from other data sources is a necessary trend.
Driven by the need for rapid updating of the basic geographic information road network, and considering the difficulty of road extraction from high-resolution remote sensing images, the invention introduces the navigation electronic map and the historical road vectors from the national geographic conditions census to assist road network extraction. It makes full use of the road network prior information, with its currency and reliability, provided by the navigation electronic map and the historical road vectors, constructs an automatic road network updating technical system based on vector road network fusion and high-resolution remote sensing image verification, and studies the automatic updating method progressively at two levels: extraction of existing roads, and updating and verification of new roads. The method comprises four steps: registration of vector data and remote sensing image, vector fusion of the navigation road network and the historical roads, verification of possibly changed roads based on multi-feature evidence, and road network connection. It has high practicability and high accuracy.
Disclosure of Invention
Aiming at the problems in the background art, the invention provides a road network automatic updating method based on vector road network fusion and high-resolution remote sensing image verification. The technical scheme of the invention is as follows:
a road network automatic updating method based on vector road network fusion and high-resolution remote sensing image verification comprises the following steps:
step 1, registering vector data and a remote sensing image, wherein the vector data is obtained by cutting a navigation road network and a historical road vector according to an image range;
step 2, fusing the registered navigation road network with the historical road vector, and taking the navigation road network segments matched with the historical road vector as unchanged roads; road segments of the navigation road network that cannot be matched with the historical road vector are regarded as possibly changed roads;
step 3, verifying the road which is possibly changed in the step 2 by adopting an edge characteristic, spectral characteristic and vegetation characteristic multi-characteristic evidence model, and further separating the changed road and the non-road;
and 4, connecting the changed road in the step 3 with the unchanged road in the step 2 according to the geometric characteristics of the road sections to form a road network.
Further, in step 1, the vector data and the remote sensing image are registered in a human-computer interaction mode. The specific process is as follows: the navigation road network and the historical road vector are clipped to the remote sensing image extent to obtain the vector data, several homonymous (tie) points are manually selected on the vector data and on the remote sensing image respectively, and a global affine transformation is applied to the road network vector data.
Further, the specific implementation of step 2 is as follows:
step 2.1, taking the historical road vector as the reference, a vector section between two nodes is defined as a road segment, and a buffer of the road width is generated segment by segment;
step 2.2, within the buffer, the navigation road network vector is first coarsely matched with the historical road vector; the matching criterion is whether the navigation road network vector passes through the buffer: if so, the coarse matching succeeds, otherwise the matching fails;
step 2.3, for the navigation road network vectors that pass the coarse matching, fine matching is performed under two constraints, the length similarity S_len and the direction similarity S_ang, expressed as formulas (1) and (2):
[Formula (1), defining the length similarity S_len in terms of L_OSM and L, is shown as an image in the original publication.]
S_ang = cos θ   (2)
where L_OSM is the total length of all navigation road network vector segments within the buffer, L is the length of the current historical road vector segment, and θ is the included angle between the navigation road network vector and the historical road vector;
the final similarity index is a weighted combination of the length similarity S_len and the direction similarity S_ang, as shown in formula (3),
S = λ·S_ang + (1-λ)·S_len   (3)
the smaller the value of S, the closer the navigation road network is to the historical road vector;
step 2.4, let T_match be the fine matching threshold; if S < T_match, the segment passes the fine matching; road segments of the navigation road network data that pass the fine matching are regarded as unchanged roads and are not processed further; road segments of the navigation road network that cannot be matched with the historical road vector are regarded as possibly changed roads.
Further, step 3 is implemented as follows:
step 3.1, for road segments of the navigation road network data that do not pass the fine matching, buffers are constructed segment by segment and image slices are obtained according to the buffer extent;
step 3.2, for each road segment, a multi-feature evidence model is used to judge whether it is a road, as follows:
firstly, the edge feature evidence model: Canny edge detection is performed on the high-resolution remote sensing image within the buffer; the direction of each edge is calculated, the total edge length corresponding to each direction is counted, and the direction with the longest total length is taken as the main direction of the road segment; the edge density in the main direction is calculated as the edge feature evidence R_edge, as shown in formula (4):
R_edge = L_ang / L   (4)
where ang is the main direction of the road segment, L_ang is the total length of the edges in the main direction, and L is the length of the road segment; let T_edge be the effective edge feature threshold; if R_edge < T_edge, the road existence probability is positively linearly related to the edge feature value; if R_edge ≥ T_edge, the road is considered to exist; the road existence probability of the edge feature evidence model is given by formula (5):
[Formula (5), giving the road existence probability P_edge in terms of R_edge, T_edge, α_edge and ε, is shown as an image in the original publication.]
where α_edge is the probability weight corresponding to the edge feature and ε is a predefined small probability value;
secondly, the spectral feature evidence model: let std(DN) be the standard deviation of the pixel gray values within the buffer in each band and T_spec the effective spectral feature threshold; then the spectral feature is R_spec = std(DN)/T_spec; if R_spec < 1, the road existence probability is considered to be negatively linearly correlated with R_spec; if R_spec ≥ 1, the road is considered to exist; the road existence probability of the spectral evidence model is given by formula (6):
[Formula (6), giving the road existence probability P_spec in terms of R_spec and α_spec, is shown as an image in the original publication.]
where α_spec is the probability weight corresponding to the spectral feature;
thirdly, the vegetation feature evidence model: first, the normalized difference vegetation index NDVI of the image within the buffer is calculated, and vegetation in the buffer is roughly extracted by threshold segmentation; let D_plant be the set of direction angles of the vegetation objects in the buffer of the navigation segment, D_road the angle of the main direction of the current road segment, and T_plant the direction angle difference threshold; the vegetation feature R_plant is given by formula (7):
R_plant = std(D_plant - D_road) / T_plant   (7)
when R_plant < 1, the road existence probability is negatively correlated with R_plant; when R_plant ≥ 1, the vegetation direction is considered unrelated to the road trend, and the object is more likely to be a non-road object; the road existence probability of the vegetation evidence model is given by formula (8):
[Formula (8), giving the road existence probability P_plant in terms of R_plant and α_plant, is shown as an image in the original publication.]
fourthly, after the road existence probabilities of the edge, spectral and vegetation evidence models are calculated, the road existence probability based on the multi-evidence model is obtained by taking the maximum of the three probabilities, as shown in formula (9):
P = max{P_edge, P_spec, P_plant}   (9)
after the road existence probability based on the multi-evidence model is calculated, if P ≥ 0.5 the segment is considered a changed road; otherwise it is considered a non-road.
Further, the size of the buffer in step 3.1 is set to twice the road width.
The invention uses a high-resolution remote sensing image, a historical road vector and a new-period navigation road network vector as input data sources, comprehensively exploits the position, geometric, topological and semantic information in the navigation road network and historical road vector data together with the scene features of the high-resolution image, and combines prior knowledge of real road structure to complete the task of automatically extracting road network elements. The invention has strong practicability and high accuracy, and is characterized in that:
(1) the navigation road network vector and the historical road vector are used to assist road network extraction, making full use of the currency of the navigation road network and the reliability of the historical road vector; unchanged roads are extracted directly and only possibly added or removed road areas are processed, which greatly improves road network extraction efficiency;
(2) for road segments that may have changed, three features (edge, spectral and vegetation features) are used as evidence to verify whether a segment is a road, so the method is effective for roads with obvious features and also performs well on occluded or disturbed road segments.
Drawings
FIG. 1 is a flow chart of an automatic road network updating method based on vector road network fusion and high-resolution remote sensing image verification.
FIG. 2 is a schematic diagram of a rough matching between a navigation road network and historical road vectors.
FIG. 3 is a schematic diagram of road network connections.
Detailed Description
The present invention provides an automatic road network updating method based on vector road network fusion and high-resolution remote sensing image verification, which is described in further detail below with reference to the accompanying drawings to facilitate understanding and implementation by those skilled in the art.
The technical scheme of the invention is shown in a flow chart in figure 1, and specifically comprises the following steps:
the method comprises the following steps: and registering the vector data with the remote sensing image. If the vector and the image have large position deviation, the influence on road extraction can be generated, and the scheme adopts a man-machine interaction mode to register the vector data and the remote sensing image so as to reduce the deviation of the vector data and the remote sensing image. Firstly, a navigation road network and a historical road vector are cut according to an image range to obtain vector data, a plurality of homonymous points are manually selected from the vector and the image respectively, and the road network vector is subjected to overall affine transformation. To ensure proper performance of the subsequent road extraction process, the offset distance triggering the affine transformation is determined by the buffer radius of the road extraction, which is equal to the sum of the road half-width and the acceptable registration error. And after the navigation road network is superposed with the image, if the position deviation distance of the homonymous road is greater than the buffer radius, manual correction is required.
Step two: fuse the registered navigation road network with the historical road vector, taking navigation road network segments matched with the historical road vector as unchanged roads; road segments of the navigation road network that cannot be matched with the historical road vector are regarded as possibly changed roads. The specific steps are as follows:
1) Taking the historical road vector as the reference, a vector section between two nodes is defined as a road segment, and a buffer of the road width is generated segment by segment. Specifically, the maximum and minimum horizontal and vertical coordinates of all nodes of the historical road vector segment are first computed as the bounds of a circumscribed rectangle, and a buffer of the road width is then added around the rectangle, as shown in fig. 2.
2) Within the buffer, the navigation road network is first coarsely matched with the historical road vector; the matching criterion is whether the navigation road network passes through the buffer: if so, the coarse matching succeeds, otherwise the matching fails. As shown in fig. 2, the black segments of the navigation road network satisfy the coarse matching condition and the gray segments do not.
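As a concrete illustration of the buffer construction in item 1) and the coarse matching test in item 2), the sketch below builds the per-segment buffer as the bounding rectangle of the historical segment's nodes expanded by the road width and checks whether any vertex of a navigation segment falls inside it. Treating "passes through the buffer" as a vertex-in-rectangle test is a simplification assumed here, not a detail given in the patent.

```python
def segment_buffer(nodes, road_width):
    """Axis-aligned buffer for one historical road segment.

    nodes: list of (x, y) vertices of the historical segment.
    Returns (xmin, ymin, xmax, ymax): the circumscribed rectangle of the
    vertices expanded by road_width on every side, as described in item 1).
    """
    xs = [p[0] for p in nodes]
    ys = [p[1] for p in nodes]
    return (min(xs) - road_width, min(ys) - road_width,
            max(xs) + road_width, max(ys) + road_width)

def passes_coarse_match(nav_nodes, buf):
    """Coarse matching test of item 2), simplified to a vertex-in-buffer check."""
    xmin, ymin, xmax, ymax = buf
    return any(xmin <= x <= xmax and ymin <= y <= ymax for (x, y) in nav_nodes)

# Hypothetical segments: one historical segment and one navigation segment.
hist_segment = [(0.0, 0.0), (50.0, 2.0), (100.0, 0.0)]
nav_segment = [(2.0, 4.0), (98.0, 3.0)]
buf = segment_buffer(hist_segment, road_width=8.0)
print(passes_coarse_match(nav_segment, buf))   # True: the segment lies in the buffer
```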
3) For the navigation road network segments that pass the coarse matching, fine matching is performed under two constraints, the length similarity S_len and the direction similarity S_ang. The length similarity S_len describes how close the length of the navigation road network is to that of the historical road vector segment, and the direction similarity S_ang is used to filter out navigation segments whose direction differs greatly from the local historical road vector. They are computed by formulas (1) and (2):
[Formula (1), defining the length similarity S_len in terms of L_OSM and L, is shown as an image in the original publication.]
S_ang = |cos θ|   (2)
where L_OSM is the total length of all navigation road network vector segments within the buffer, L is the length of the current historical road vector segment, and θ is the included angle between the navigation road network and the historical road vector. Note that if the buffer contains several navigation road network vector segments, the similarity is computed segment by segment. The final similarity index is a weighted combination of the length similarity S_len and the direction similarity S_ang, expressed as formula (3):
S = λ·S_ang + (1-λ)·S_len   (3)
In formula (3), λ is a balance factor that adjusts the weights of the length and direction similarities; in general λ = 0.5, and it can be fine-tuned according to the actual situation. The smaller the value of S, the closer the navigation road network is to the historical road vector.
4) Let T_match be the fine matching threshold; if S < T_match, the navigation road network segment passes the fine matching. Segments of the navigation road network data that pass the fine matching are regarded as unchanged and are not processed further; segments that fail to match the historical road vector are regarded as possibly changed roads, and only the segments that do not pass the fine matching are processed in step three. T_match is an empirical value that must be determined through repeated experiments; an initial value in [0.5, 0.8] is generally recommended. If matches are missed, increase T_match; if false matches occur, decrease T_match.
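The fine-matching similarity of formulas (1) to (3) and the threshold test can be sketched as follows. Formula (1) is reproduced only as an image in the original, so the relative length difference used for S_len here is a stand-in assumption; S_ang follows formula (2) of this description (|cos θ|), λ takes the default value from the text, and T_match is set inside the recommended initial range.

```python
import math

def included_angle(seg_a, seg_b):
    """Included angle in [0, pi/2] (radians) between two 2-D line segments."""
    (ax1, ay1), (ax2, ay2) = seg_a
    (bx1, by1), (bx2, by2) = seg_b
    va = (ax2 - ax1, ay2 - ay1)
    vb = (bx2 - bx1, by2 - by1)
    dot = va[0] * vb[0] + va[1] * vb[1]
    denom = math.hypot(*va) * math.hypot(*vb) + 1e-12
    return math.acos(min(1.0, abs(dot) / denom))

def fine_match_score(l_osm, l_hist, theta, lam=0.5):
    """Combined similarity S of formula (3): S = lam*S_ang + (1-lam)*S_len.

    S_ang follows formula (2) (|cos(theta)|); formula (1) for S_len is given
    only as an image in the original, so the relative length difference below
    is a stand-in assumption.
    """
    s_ang = abs(math.cos(theta))                        # formula (2)
    s_len = abs(l_osm - l_hist) / max(l_hist, 1e-9)     # stand-in for formula (1)
    return lam * s_ang + (1 - lam) * s_len              # formula (3)

# Hypothetical segments: one historical segment and one navigation segment.
hist_seg = ((0.0, 0.0), (100.0, 0.0))
nav_seg = ((1.0, 3.0), (97.0, 6.0))
theta = included_angle(hist_seg, nav_seg)
S = fine_match_score(l_osm=96.0, l_hist=100.0, theta=theta)
passes = S < 0.65   # T_match: empirical, initial value within the recommended [0.5, 0.8]
```

As the text notes, when several navigation vector segments fall inside one buffer, each segment would be scored separately in this way.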
Step three: verify the possibly changed roads with a multi-feature evidence model based on edge, spectral and vegetation features. The specific steps are as follows:
1) For road segments of the navigation road network data that do not pass the fine matching, buffers are constructed segment by segment (the buffer size is set to twice the road width), and image slices are obtained according to the buffer extent. Note that if a road segment is longer than a certain length (a value of 1000 pixels is recommended), it should be split into several segments for processing, to avoid excessive spectral differences within a segment; a splitting sketch is given below.
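A minimal sketch of the segment-splitting step mentioned above, assuming the polyline is split only at existing vertices (so a single edge longer than the limit is kept whole); the 1000-pixel limit is the value recommended in the text.

```python
import math

def split_long_segment(nodes, max_len_px=1000.0):
    """Split a road polyline into chunks of roughly max_len_px or less.

    nodes: list of (x, y) vertices in image coordinates.
    Splits only at existing vertices, which keeps the sketch simple; a chunk
    therefore exceeds the limit only if a single edge is longer than it.
    """
    chunks, current, acc = [], [nodes[0]], 0.0
    for (x0, y0), (x1, y1) in zip(nodes[:-1], nodes[1:]):
        d = math.hypot(x1 - x0, y1 - y0)
        if acc + d > max_len_px and len(current) > 1:
            chunks.append(current)            # close the current chunk
            current, acc = [(x0, y0)], 0.0    # restart from the shared vertex
        current.append((x1, y1))
        acc += d
    chunks.append(current)
    return chunks
```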
2) A multi-feature evidence model is used to judge whether each segment is a road, as follows:
Firstly, the edge feature evidence model: in the absence of occlusion, a road usually has salient, direction-consistent edge features, so edge features can serve as strong evidence for road verification. The specific calculation is as follows: Canny edge detection is performed on the high-resolution remote sensing image within the buffer; the direction of each edge (value range [0, π]) is calculated, the total edge length corresponding to each direction is counted, and the direction with the longest total length is taken as the main direction of the road segment; the edge density in the main direction is calculated as the edge feature evidence R_edge, as shown in formula (4):
R_edge = L_ang / L   (4)
where ang is the angle of the main direction of the road segment, L_ang is the total length of the edges in the main direction, and L is the length of the road segment. Let T_edge be the effective edge feature threshold; if R_edge < T_edge, the road existence probability is positively linearly related to the edge feature value; if R_edge ≥ T_edge, the road is considered to exist. The road existence probability of the edge evidence model is given by formula (5):
[Formula (5), giving the road existence probability P_edge in terms of R_edge, T_edge, α_edge and ε, is shown as an image in the original publication.]
where α_edge is the probability weight corresponding to the edge feature, generally 0.33, adjustable according to the actual situation: if the edges of the road segment are clear, α_edge can be increased. ε is a predefined small probability value, typically 0.001.
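A sketch of the edge feature evidence under stated assumptions: OpenCV's Canny detector and Sobel gradients approximate the edge directions, the per-direction "total edge length" is approximated by counting edge pixels per orientation bin, and the linear ramp stands in for formula (5), which is shown only as an image in the original. The Canny thresholds, bin count and default parameter values are illustrative.

```python
import cv2
import numpy as np

def edge_feature_evidence(gray_slice, segment_length_px, t_edge=0.6,
                          alpha_edge=0.33, eps=0.001, n_bins=18):
    """Edge evidence R_edge = L_ang / L and a piecewise road-existence score.

    gray_slice: 8-bit grayscale image slice of the segment buffer.
    segment_length_px: length L of the road segment in pixels.
    """
    edges = cv2.Canny(gray_slice, 50, 150)
    gx = cv2.Sobel(gray_slice, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray_slice, cv2.CV_32F, 0, 1, ksize=3)
    # Edge direction is perpendicular to the gradient; fold into [0, pi).
    theta = (np.arctan2(gy, gx) + np.pi / 2.0) % np.pi
    ys, xs = np.nonzero(edges)
    if len(xs) == 0:
        return eps                                  # no edges: weakest evidence
    hist, _ = np.histogram(theta[ys, xs], bins=n_bins, range=(0.0, np.pi))
    l_ang = hist.max()                              # edge "length" in the main direction
    r_edge = l_ang / max(segment_length_px, 1e-9)   # formula (4)
    if r_edge >= t_edge:
        return alpha_edge                           # edge evidence: road considered to exist
    return alpha_edge * (r_edge / t_edge) + eps     # positively linearly related below T_edge
```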
Second, spectral feature evidence model: the ideal road surface spectral feature has small variation and large gray scale difference with the adjacent non-road background. However, the situation that the road is shielded by other background ground objects or the road has large spectral difference in the road section due to self factors generally exists, and therefore the spectral characteristics cannot be used as the evidence of road disappearance. However, when the difference of the spectral features in the road sections corresponding to the navigation road network data is small, the spectral features can be used as strong evidence of the existence of the road.
The gray value standard deviation of the pixels in the range of the buffer area at each wave band is std (DN), TspecIs the effective spectral feature threshold, then the spectral feature Rspec=std(DN)/Tspec. If R isspecIf < 1, the probability of road existence and R are consideredspecIs in negative linear correlation; if R isspecAnd if the road is more than or equal to 1, the road is considered to exist. The road existence probability of the spectrum evidence model is as follows (6):
Figure BDA0001800536270000082
αspecthe probability weight corresponding to the spectral feature is generally 0.33, and can be adjusted according to actual conditions: if the spectral characteristics of the road surface are uniform, alpha can be increasedspecAnd (4) taking values. Epsilon is a predefined small probability value, typically 0.001.
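A sketch of the spectral feature evidence. The feature R_spec = std(DN)/T_spec follows the text; averaging the per-band standard deviations, the value of T_spec, and the piecewise score (which mirrors the verbal description because formula (6) itself appears only as an image) are all assumptions of this sketch.

```python
import numpy as np

def spectral_feature_evidence(bands, t_spec=20.0, alpha_spec=0.33, eps=0.001):
    """Spectral evidence for one buffer slice.

    bands: array-like of shape (n_bands, H, W) with pixel gray values (DN).
    Computes R_spec = std(DN) / T_spec and maps it to a road-existence score
    following the verbal description; formula (6) is not reproduced in the
    text, so this mapping is an assumption.
    """
    bands = np.asarray(bands, dtype=float)
    std_dn = float(np.mean([band.std() for band in bands]))  # std per band, averaged
    r_spec = std_dn / t_spec
    if r_spec >= 1.0:
        return alpha_spec + eps                 # text: the road is considered to exist
    # R_spec < 1: score negatively (linearly) related to R_spec.
    return alpha_spec * (1.0 - r_spec) + eps
```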
And thirdly, vegetation characteristic evidence model: road vegetation is the feature of land that is common in road scenes, and is composed of street trees, greenbelts and greenbelts in reality; from the geometrical aspect, the road vegetation is usually close to the road, wherein the green zones of the road are usually distributed continuously along the road and have the direction characteristics consistent with the trend of the road, and the green land is surrounded by the road to form a regular geometrical aspect; on the image, the road vegetation reflects a special spectral curve. Through the analysis, the direction characteristics of the vegetation object in the road scene can be used as evidence for checking whether the road exists or not. The specific calculation method comprises the following steps: firstly, calculating a normalized vegetation index NDVI of an image in a buffer area, and roughly extracting vegetation in the buffer area by adopting a threshold segmentation method; let DplantSet of direction angles corresponding to the implanted objects in the buffer zone for the navigation section DroadFor the main direction corresponding angle, T, of the current road sectionplantAs a direction angle difference threshold, vegetation characteristic RplantSee formula (7):
Rplant=std(Dplant-Droad)/Tplant (7)
when R isplantWhen < 1, the probability of road existence and RplantNegative correlation; when R isplantWhen the vegetation direction is not less than 1, the vegetation direction is considered to be irrelevant to the road trend, and the probability of being a non-road object is higher. The road existence probability of the vegetation evidence model is as follows (8):
Figure BDA0001800536270000091
αplantfor the probability weight that vegetation characteristic corresponds, generally take the value to be 0.33, can adjust according to actual conditions: if there are road trees on both sides of the road section, alpha can be increasedplantAnd (4) taking values. Epsilon is predefinedThe small probability value of (2) is generally 0.001.
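A sketch of the vegetation feature evidence, assuming red and near-infrared bands are available for the NDVI, a fixed NDVI threshold for the rough vegetation extraction, vegetation-object orientations estimated from second-order image moments, and a score mapping that stands in for formula (8) (shown only as an image in the original). All numeric defaults are illustrative.

```python
import numpy as np
from scipy import ndimage

def vegetation_feature_evidence(red, nir, road_dir, t_plant=np.pi / 6,
                                ndvi_thresh=0.3, alpha_plant=0.33, eps=0.001):
    """Vegetation evidence R_plant = std(D_plant - D_road) / T_plant.

    red, nir: 2-D arrays of the red and near-infrared bands of the buffer slice.
    road_dir: main direction of the road segment in radians, folded to [0, pi),
    in the same image coordinate convention as the bands.
    """
    red = red.astype(float)
    nir = nir.astype(float)
    ndvi = (nir - red) / (nir + red + 1e-9)          # normalized vegetation index
    mask = ndvi > ndvi_thresh                        # rough vegetation extraction
    labels, n = ndimage.label(mask)                  # connected vegetation objects
    angles = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(xs) < 20:                             # skip tiny clutter regions
            continue
        # Orientation from second-order central moments, folded into [0, pi).
        x0, y0 = xs.mean(), ys.mean()
        mu20 = ((xs - x0) ** 2).mean()
        mu02 = ((ys - y0) ** 2).mean()
        mu11 = ((xs - x0) * (ys - y0)).mean()
        angles.append((0.5 * np.arctan2(2 * mu11, mu20 - mu02)) % np.pi)
    if not angles:
        return eps                                   # no usable vegetation evidence
    diffs = np.array(angles) - road_dir
    diffs = (diffs + np.pi / 2) % np.pi - np.pi / 2  # wrap direction differences
    r_plant = float(np.std(diffs)) / t_plant         # formula (7)
    if r_plant >= 1.0:
        return eps             # vegetation direction unrelated to the road trend
    return alpha_plant * (1.0 - r_plant) + eps       # negatively related to R_plant
```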
And fourthly, after the road existence probabilities corresponding to the edge evidence model, the spectrum evidence model and the vegetation evidence model are respectively calculated, obtaining the road existence probability based on the multi-evidence model by solving the maximum values of the three probability densities, wherein the formula is as follows (7):
P=max{Pedge,Pspec,Pplant} (9)
and after the road existence probability based on the multi-evidence model is obtained through calculation, if P is more than or equal to 0.5, the road is considered as a changed road, and otherwise, the road is considered as a non-road.
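The fusion and decision step of formula (9) is a one-liner; the sketch below simply mirrors it and the P ≥ 0.5 decision rule stated above.

```python
def multi_evidence_probability(p_edge, p_spec, p_plant):
    """Formula (9): the road existence probability is the maximum of the three."""
    return max(p_edge, p_spec, p_plant)

def is_changed_road(p):
    """Decision rule from the text: P >= 0.5 means the segment is a changed road."""
    return p >= 0.5
```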
Step four: road network connection. The changed roads and the unchanged roads are connected according to the geometric features of the road segments to form a road network. The geometric features constraining segment connection mainly include the endpoint distance and the direction difference between the connecting segment and the existing road segment. A schematic diagram of segment connection based on geometric features is shown in fig. 3. In the figure, the branch segments E11E12 and E21E22 are broken segments that need to be connected to the main segment C1C2. According to the distance constraint, a detection circle of radius R centered at endpoint E12 is used to obtain candidate connection nodes C1 and C2 (R is determined by the road width and is typically set to twice the road width); endpoint E12 is then connected to candidate nodes C1 and C2 respectively, the included angle between each connecting segment and the horizontal line is calculated (the two angle expressions appear as images in the original publication), and the node corresponding to the minimum included angle is taken as the connection node. In fig. 3, the solid black line E12C2 represents the finally selected connection, and the black dotted line E12C1 indicates a candidate connecting segment that was not selected.
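A sketch of the candidate-node selection for road network connection described above. It follows the text literally: candidates are filtered by the distance constraint (radius R around the dangling endpoint), and the node whose connecting segment makes the smallest included angle with the horizontal line is chosen. The coordinates in the usage lines are hypothetical.

```python
import math

def pick_connection_node(endpoint, candidates, search_radius):
    """Select a connection node for a dangling segment end (cf. FIG. 3).

    endpoint: dangling end point (x, y), e.g. E12 in FIG. 3.
    candidates: iterable of candidate nodes (x, y) on existing road segments.
    search_radius: R; the text sets it to about twice the road width.
    """
    ex, ey = endpoint
    best, best_angle = None, None
    for cx, cy in candidates:
        if math.hypot(cx - ex, cy - ey) > search_radius:
            continue                                 # outside the distance constraint
        # Included angle with the horizontal line, folded into [0, pi/2].
        ang = abs(math.atan2(cy - ey, cx - ex))
        ang = min(ang, math.pi - ang)
        if best_angle is None or ang < best_angle:
            best, best_angle = (cx, cy), ang
    return best

# Hypothetical numbers matching the FIG. 3 setup: endpoint E12 and candidates C1, C2.
e12 = (10.0, 5.0)
chosen = pick_connection_node(e12, [(14.0, 9.0), (16.0, 5.5)], search_radius=8.0)
```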
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (4)

1. A road network automatic updating method based on vector road network fusion and high-resolution remote sensing image verification is characterized by comprising the following steps:
step 1, registering vector data and a remote sensing image, wherein the vector data is obtained by cutting a navigation road network and a historical road vector according to an image range;
step 2, fusing the registered navigation road network with the historical road vector, and taking the navigation road network segments matched with the historical road vector as unchanged roads; road segments of the navigation road network that cannot be matched with the historical road vector are regarded as possibly changed roads;
the specific implementation of step 2 is as follows:
step 2.1, taking the historical road vector as the reference, defining a vector section between two nodes as a road segment, and generating a buffer of the road width segment by segment;
step 2.2, within the buffer, first coarsely matching the navigation road network vector with the historical road vector, the matching criterion being whether the navigation road network vector passes through the buffer: if so, the coarse matching succeeds, otherwise the matching fails;
step 2.3, for the navigation road network vectors that pass the coarse matching, performing fine matching under two constraints, the length similarity S_len and the direction similarity S_ang, expressed as formulas (1) and (2):
[Formula (1), defining the length similarity S_len in terms of L_OSM and L, is shown as an image in the original publication.]
S_ang = cos θ   (2)
wherein L_OSM is the total length of all navigation road network vector segments within the buffer, L is the length of the current historical road vector segment, and θ is the included angle between the navigation road network vector and the historical road vector;
the final similarity index being a weighted combination of the length similarity S_len and the direction similarity S_ang, as shown in formula (3),
S = λ·S_ang + (1-λ)·S_len   (3)
the smaller the value of S, the closer the navigation road network is to the historical road vector;
step 2.4, letting T_match be the fine matching threshold; if S < T_match, the segment passes the fine matching; road segments of the navigation road network data that pass the fine matching are regarded as unchanged roads and are not processed further; road segments of the navigation road network that cannot be matched with the historical road vector are regarded as possibly changed roads;
step 3, verifying the road which is possibly changed in the step 2 by adopting an edge characteristic, spectral characteristic and vegetation characteristic multi-characteristic evidence model, and further separating the changed road and the non-road;
and 4, connecting the changed road in the step 3 with the unchanged road in the step 2 according to the geometric characteristics of the road sections to form a road network.
2. The method for automatically updating the road network based on the vector road network fusion and the high-resolution remote sensing image verification as claimed in claim 1, wherein the method comprises the following steps: in the step 1, vector data and a remote sensing image are registered by adopting a human-computer interaction mode, and the specific process is as follows: and cutting the navigation road network and the historical road vector according to the remote sensing image range to obtain vector data, manually selecting a plurality of homonymous points from the vector data and the remote sensing image respectively, and carrying out integral affine transformation on the vector data of the road network.
3. The method for automatically updating the road network based on the vector road network fusion and the high-resolution remote sensing image verification as claimed in claim 1, wherein step 3 is implemented as follows:
step 3.1, constructing buffers segment by segment for the road segments of the navigation road network data that do not pass the fine matching, and obtaining image slices according to the buffer extent;
step 3.2, judging, for each road segment, whether it is a road by using a multi-feature evidence model, wherein:
firstly, the edge feature evidence model: Canny edge detection is performed on the high-resolution remote sensing image within the buffer; the direction of each edge is calculated, the total edge length corresponding to each direction is counted, and the direction with the longest total length is taken as the main direction of the road segment; the edge density in the main direction is calculated as the edge feature evidence R_edge, as shown in formula (4):
R_edge = L_ang / L   (4)
where ang is the main direction of the road segment, L_ang is the total length of the edges in the main direction, and L is the length of the road segment; let T_edge be the effective edge feature threshold; if R_edge < T_edge, the road existence probability is positively linearly related to the edge feature value; if R_edge ≥ T_edge, the road is considered to exist; the road existence probability of the edge feature evidence model is given by formula (5):
[Formula (5), giving the road existence probability P_edge in terms of R_edge, T_edge, α_edge and ε, is shown as an image in the original publication.]
where α_edge is the probability weight corresponding to the edge feature and ε is a predefined small probability value;
secondly, the spectral feature evidence model: let std(DN) be the standard deviation of the pixel gray values within the buffer in each band and T_spec the effective spectral feature threshold; then the spectral feature is R_spec = std(DN)/T_spec; if R_spec < 1, the road existence probability is considered to be negatively linearly correlated with R_spec; if R_spec ≥ 1, the road is considered to exist; the road existence probability of the spectral evidence model is given by formula (6):
[Formula (6), giving the road existence probability P_spec in terms of R_spec and α_spec, is shown as an image in the original publication.]
where α_spec is the probability weight corresponding to the spectral feature;
thirdly, the vegetation feature evidence model: first, the normalized difference vegetation index NDVI of the image within the buffer is calculated, and vegetation in the buffer is roughly extracted by threshold segmentation; let D_plant be the set of direction angles of the vegetation objects in the buffer of the navigation segment, D_road the angle of the main direction of the current road segment, and T_plant the direction angle difference threshold; the vegetation feature R_plant is given by formula (7):
R_plant = std(D_plant - D_road) / T_plant   (7)
when R_plant < 1, the road existence probability is negatively correlated with R_plant; when R_plant ≥ 1, the vegetation direction is considered unrelated to the road trend, and the object is more likely to be a non-road object; the road existence probability of the vegetation evidence model is given by formula (8):
[Formula (8), giving the road existence probability P_plant in terms of R_plant and α_plant, is shown as an image in the original publication.]
where α_plant is the probability weight corresponding to the vegetation feature;
fourthly, after the road existence probabilities corresponding to the edge evidence model, the spectral evidence model and the vegetation evidence model are respectively calculated, obtaining the road existence probability based on the multi-evidence model by taking the maximum of the three probabilities, as shown in formula (9):
P = max{P_edge, P_spec, P_plant}   (9)
after the road existence probability based on the multi-evidence model is calculated, if P ≥ 0.5, the segment is considered a changed road; otherwise it is considered a non-road.
4. The method for automatically updating the road network based on the vector road network fusion and the high-resolution remote sensing image verification as claimed in claim 3, wherein the method comprises the following steps: the size of the buffer in step 3.1 is set to twice the road width.
CN201811074936.3A 2018-09-14 2018-09-14 Road network updating method based on vector road network fusion and remote sensing image verification Active CN109271928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811074936.3A CN109271928B (en) 2018-09-14 2018-09-14 Road network updating method based on vector road network fusion and remote sensing image verification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811074936.3A CN109271928B (en) 2018-09-14 2018-09-14 Road network updating method based on vector road network fusion and remote sensing image verification

Publications (2)

Publication Number Publication Date
CN109271928A CN109271928A (en) 2019-01-25
CN109271928B true CN109271928B (en) 2021-04-02

Family

ID=65189059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811074936.3A Active CN109271928B (en) 2018-09-14 2018-09-14 Road network updating method based on vector road network fusion and remote sensing image verification

Country Status (1)

Country Link
CN (1) CN109271928B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110084107A (en) * 2019-03-19 2019-08-02 安阳师范学院 A kind of high-resolution remote sensing image method for extracting roads and device based on improvement MRF
CN110543885B (en) * 2019-08-13 2022-03-04 武汉大学 Method for interactively extracting high-resolution remote sensing image road and generating road network
CN111145157B (en) * 2019-12-27 2023-08-04 国交空间信息技术(北京)有限公司 Road network data automatic quality inspection method based on high-resolution remote sensing image
US11380093B2 (en) * 2020-07-30 2022-07-05 GM Global Technology Operations LLC Detecting road edges by fusing aerial image and telemetry evidences
CN112115817B (en) * 2020-09-01 2024-06-07 国交空间信息技术(北京)有限公司 Remote sensing image road track correctness checking method and device based on deep learning
CN112509453B (en) * 2020-12-14 2022-08-16 广西壮族自治区自然资源遥感院 Electronic navigation method and system for scenic spot live-action navigation map based on mobile equipment
CN113065594B (en) * 2021-04-01 2023-05-05 中科星图空间技术有限公司 Road network extraction method and device based on Beidou data and remote sensing image fusion
CN113076387B (en) * 2021-04-08 2022-09-20 北京星天地信息科技有限公司 Road network matching method and device based on multi-element map matching
CN113177456B (en) * 2021-04-23 2023-04-07 西安电子科技大学 Remote sensing target detection method based on single-stage full convolution network and multi-feature fusion
CN113408457B (en) * 2021-06-29 2022-10-21 西南交通大学 Road information intelligent extraction method combining high-resolution image and video image
CN114064835B (en) * 2021-11-18 2023-05-26 中国公路工程咨询集团有限公司 Multi-source vector road network updating method based on change point detection and electronic equipment
CN116860906B (en) * 2023-09-05 2023-11-28 高德软件有限公司 Track generation method, track generation device, track generation equipment and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101364259A (en) * 2008-04-09 2009-02-11 武汉大学 Method for extracting road various information of multi-level knowledge driven panchromatic remote sensing image
KR101476172B1 (en) * 2013-07-15 2014-12-24 (주)시터스 Method for Automatic Change Detection Based on Areal Feature Matching in Different Network Datasets
CN106778605A (en) * 2016-12-14 2017-05-31 武汉大学 Remote sensing image road net extraction method under navigation data auxiliary

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Automatic change detection of geo-spatial data from imagery; Deren L et al.; Geo-Spatial Information Science; 2003-09-30; Vol. 6, No. 3; pp. 1-7 *
Road damage detection in remote sensing imagery based on road network vector data; Xu Yang et al.; Bulletin of Surveying and Mapping; 2011-04-25, No. 4; pp. 14-16 * (in Chinese)

Also Published As

Publication number Publication date
CN109271928A (en) 2019-01-25

Similar Documents

Publication Publication Date Title
CN109271928B (en) Road network updating method based on vector road network fusion and remote sensing image verification
Zai et al. 3-D road boundary extraction from mobile laser scanning data via supervoxels and graph cuts
CN111028277B (en) SAR and optical remote sensing image registration method based on pseudo-twin convolution neural network
CN106778605B (en) Automatic remote sensing image road network extraction method under assistance of navigation data
Jin et al. An integrated system for automatic road mapping from high-resolution multi-spectral satellite imagery by information fusion
Qin Change detection on LOD 2 building models with very high resolution spaceborne stereo imagery
Maurya et al. Road extraction using k-means clustering and morphological operations
CN109919944B (en) Combined superpixel graph-cut optimization method for complex scene building change detection
Matkan et al. Road extraction from lidar data using support vector machine classification
Karantzalos et al. A region-based level set segmentation for automatic detection of man-made objects from aerial and satellite images
CN105956542B (en) High-resolution remote sensing image road extraction method based on statistical matching of structural wire harnesses
CN112396612B (en) Vector information assisted remote sensing image road information automatic extraction method
CN115717894A (en) Vehicle high-precision positioning method based on GPS and common navigation map
Izadi et al. A new neuro-fuzzy approach for post-earthquake road damage assessment using GA and SVM classification from QuickBird satellite images
Yao et al. Automatic extraction of road markings from mobile laser-point cloud using intensity data
CN109543498A (en) A kind of method for detecting lane lines based on multitask network
Zhang et al. Knowledge-based image analysis for 3D edge extraction and road reconstruction
Senthilnath et al. Automatic road extraction using high resolution satellite image based on texture progressive analysis and normalized cut method
WO2018042208A1 (en) Street asset mapping
Peng et al. Building change detection by combining lidar data and ortho image
Doxani et al. Object-based building change detection from a single multispectral image and pre-existing geospatial information
Liu et al. Color component–based road feature extraction from airborne lidar and imaging data sets
Kasemsuppakorn et al. Pedestrian network extraction from fused aerial imagery (orthoimages) and laser imagery (lidar)
Thiede et al. Uncertainty quantification for the extraction of informal roads from remote sensing images of South Africa
Shackelford et al. Urban road network extraction from high-resolution multispectral data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant