CN113066154B - Method and system for real-time superposition of earth surface image and underground space image - Google Patents
- Publication number
- CN113066154B CN113066154B CN202110262776.0A CN202110262776A CN113066154B CN 113066154 B CN113066154 B CN 113066154B CN 202110262776 A CN202110262776 A CN 202110262776A CN 113066154 B CN113066154 B CN 113066154B
- Authority
- CN
- China
- Prior art keywords
- camera
- space image
- underground space
- image
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
The application relates to a method and a system for superimposing an earth surface image and an underground space image in real time. The method comprises the following steps: obtaining the geographic coordinates of a first position and of a second position through a high-precision positioning device; arranging a camera at the second position to collect an earth surface image; obtaining a regional underground space image of a preset region containing the second position; intercepting the regional underground space image to obtain a preset underground space image within a preset range; and superimposing the preset underground space image onto the earth surface image to obtain a superimposed space image.
Description
Technical Field
The application relates to the field of geographic information processing, in particular to a method and a system for superposing a surface image and an underground space image in real time.
Background
Geographic information describes the spatial location and distribution of targets in the real world and is one of humanity's most important and basic information resources. Geographic information obtained by surveying and mapping can be roughly divided into surface information and underground space information.
At present, no effective solution has been proposed for the problems in the related art that surface information and the corresponding underground space information are not organically combined, making the observation and study of geographic information inefficient and inaccurate.
Disclosure of Invention
The embodiments of the present application provide a method and a system for superimposing a surface image and an underground space image in real time, which at least solve the problem in the related art that the surface image and the corresponding underground space image cannot be integrated and unified, making the mapping image inaccurate.
In a first aspect, an embodiment of the present application provides a method for superimposing an earth surface image and a subsurface space image in real time, where the method includes:
acquiring the geographical coordinates of the first position through high-precision positioning equipment, and acquiring the geographical coordinates of the second position through high-precision positioning equipment;
a camera is arranged at the second position to collect a ground surface image;
acquiring an area underground space image in a preset area where the second position is located;
intercepting a preset underground space image in a preset range from the regional underground space image;
and overlapping the preset underground space image to the earth surface image to obtain an overlapped space image.
In some of these embodiments, before the camera is arranged at the second position to capture the earth surface image, the method further comprises: calibrating the camera to obtain the internal parameters of the camera, wherein the internal parameters comprise a focal length, principal point coordinates and a pixel size.
In some embodiments, after obtaining the geographical coordinates of the first location by the high-precision positioning device and obtaining the geographical coordinates of the second location by the high-precision positioning device, the method further comprises: and obtaining external parameters of the camera according to the geographic coordinates of the first position and the second position, wherein the external parameters comprise rotation parameters and translation parameters.
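The text does not spell out how the rotation and translation parameters fall out of the two geographic coordinates beyond "azimuth of the lens orientation". The sketch below assumes a local east/north/up frame, that the lens at the second position points toward the first position, and a yaw-only rotation; the function name and all conventions are illustrative:

```python
import math
import numpy as np

def extrinsics_from_positions(p1, p2):
    """Rotation and translation derived from two geographic positions,
    treated here as local planar (east, north, up) coordinates in meters.

    Assumptions (not fixed by the text): the lens at the second position p2
    points toward the first position p1, the azimuth is measured from north
    clockwise, and the camera only rotates about the vertical axis (yaw).
    """
    east, north = p1[0] - p2[0], p1[1] - p2[1]
    azimuth = math.atan2(east, north)        # 0 = due north, pi/2 = due east
    c, s = math.cos(azimuth), math.sin(azimuth)
    R = np.array([[c, -s, 0.0],              # yaw rotation about the up axis
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    T = np.array(p2, dtype=float)            # translation = camera position
    return azimuth, R, T

# Target 100 m due north of the camera -> azimuth 0.
azimuth, R, T = extrinsics_from_positions((100.0, 100.0, 0.0), (100.0, 0.0, 0.0))
print(math.degrees(azimuth))  # 0.0
```

This matches the patent's description that R is an azimuth computed from the two positions and T is simply the geographic coordinate of the second position.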
In some embodiments, the step of intercepting the preset underground space image within the preset range from the regional underground space image includes: according to the geographic coordinates of the second position where the camera is located, intercepting an underground space image within a range of 300 x 100 meters in front of the camera from the regional underground space image to obtain the preset underground space image, wherein the preset underground space image comprises vector diagrams or/and grid diagrams of subway corridors, pipeline corridors, working wells, single or multiple pipelines and their auxiliary facilities, geological cavities and underground structures.
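The "300 x 100 meters in front of the camera" clipping window can be located once the camera position and azimuth are known. A sketch under the assumption that the rectangle is 300 m deep along the view direction and 100 m wide, with the camera at the midpoint of its near edge (the interpretation of the two dimensions, and the helper name, are assumptions):

```python
import math

def footprint_corners(cam_xy, azimuth_deg, depth=300.0, width=100.0):
    """Corner coordinates of the clipping rectangle 'in front of' the camera.

    Assumptions: the window is 300 m deep along the view direction and
    100 m wide, with the camera at the midpoint of the near edge; azimuth
    is measured from north (the y axis) clockwise in a local east/north frame.
    """
    a = math.radians(azimuth_deg)
    fwd = (math.sin(a), math.cos(a))     # unit vector along the view direction
    right = (math.cos(a), -math.sin(a))  # unit vector to the camera's right
    x0, y0 = cam_xy
    half = width / 2.0
    return [(x0 + d * fwd[0] + w * right[0], y0 + d * fwd[1] + w * right[1])
            for d, w in [(0.0, -half), (0.0, half), (depth, half), (depth, -half)]]

# Camera at the origin looking due north: the far edge is 300 m up the y axis.
print(footprint_corners((0.0, 0.0), 0.0))
```

The four corners returned here would be used to clip the regional underground space image (vector or raster) before the perspective transformation.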
In some embodiments, obtaining the superimposed space image by superimposing the preset underground space image on the earth surface image comprises: according to the external parameters and the internal parameters of the camera, performing perspective transformation on the preset underground space image through a preset formula and then superimposing it onto the earth surface image to obtain the superimposed space image, wherein the preset formula is z_c · [u, v, 1]^T = K · [R | T] · [x_w, y_w, z_w, 1]^T, with K the intrinsic matrix of the camera and [R | T] its extrinsic matrix.
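The preset formula, as reconstructed from claim 1, can be exercised numerically. A sketch that projects one world point to pixel coordinates, assuming the camera z axis points along the view direction; the example intrinsics are illustrative, not from the patent:

```python
import numpy as np

def project_point(K, R, T, world_point):
    """Project a world point to pixel coordinates with the preset formula
    z_c [u, v, 1]^T = K [R | T] [x_w, y_w, z_w, 1]^T."""
    p_cam = R @ np.asarray(world_point, dtype=float) + T  # world -> camera frame
    uvw = K @ p_cam                                       # pixel coords scaled by z_c
    return float(uvw[0] / uvw[2]), float(uvw[1] / uvw[2])

# Illustrative intrinsics: focal length 2000 px, principal point (960, 540).
K = np.array([[2000.0, 0.0, 960.0],
              [0.0, 2000.0, 540.0],
              [0.0, 0.0, 1.0]])
# Identity rotation, camera at the world origin; point 10 m ahead, offset.
u, v = project_point(K, np.eye(3), np.zeros(3), (1.0, 0.5, 10.0))
print(u, v)  # 1160.0 640.0
```

Applying this mapping to every vertex (or pixel) of the clipped underground image is what the perspective transformation step amounts to.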
In a second aspect, the present application provides a system for real-time superposition of a surface image and a subsurface space image, the system comprising a camera, a high-precision positioning device, a surveying instrument and a processor;
the high-precision positioning equipment acquires the geographic coordinate of a first position, and the high-precision positioning equipment acquires the geographic coordinate of a second position;
the camera collects earth surface images at the second position;
the processor acquires an underground space image of the area in the preset area where the second position is located;
the processor intercepts the regional underground space image to obtain a preset underground space image within a preset range;
and the processor superimposes the preset underground space image on the earth surface image to obtain a superimposed space image.
In some of these embodiments, before the camera captures the earth surface image at the second position, the camera is further calibrated to obtain the internal parameters of the camera, wherein the internal parameters comprise a focal length, principal point coordinates and a pixel size.
In some of these embodiments, after the high-precision positioning device obtains the geographic coordinates of the first location and the high-precision positioning device obtains the geographic coordinates of the second location, the system further comprises: and obtaining external parameters of the camera according to the geographic coordinates of the first position and the second position, wherein the external parameters comprise rotation parameters and translation parameters.
In some embodiments, the processor intercepting the preset underground space image within the preset range from the regional underground space image includes: the processor intercepts an underground space image within a range of 300 x 100 m in front of the camera from the regional underground space image according to the geographic coordinates of the second position where the camera is located, so as to obtain the preset underground space image, wherein the preset underground space image comprises vector diagrams or/and grid diagrams of subway corridors, pipeline corridors, working wells, single or multiple pipelines and their auxiliary facilities, geological cavities and underground structures.
In some embodiments, the processor superimposing the preset underground space image onto the earth surface image to obtain the superimposed space image includes: the processor performs perspective transformation on the preset underground space image according to the external parameters and the internal parameters of the camera through a preset formula and then superimposes it onto the earth surface image to obtain the superimposed space image, wherein the preset formula is z_c · [u, v, 1]^T = K · [R | T] · [x_w, y_w, z_w, 1]^T.
Compared with the prior art, in the method and system for superimposing an earth surface image and an underground space image in real time provided by the embodiments of the application, the geographic coordinates of a first position and of a second position are acquired by a high-precision positioning device, a camera is arranged at the second position to collect the earth surface image, a regional underground space image of the preset region containing the second position is obtained, a preset underground space image within a preset range is intercepted from the regional underground space image, and the preset underground space image is superimposed onto the earth surface image to obtain a superimposed space image. This solves the problems that surface information is not organically combined with the corresponding underground space information and that observation and study of geographic information are inefficient and inaccurate, realizes real-time superposition of the earth surface image and the corresponding underground space image, and makes it convenient for the user to observe the invisible underground space in combination with the earth surface image.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a block diagram of a system for real-time superposition of earth surface images and subsurface space images according to an embodiment of the present application;
FIG. 2 is a diagram illustrating relationships of four coordinate systems according to an embodiment of the present application;
FIG. 3 is a flow chart of a method for real-time superposition of a surface image and a subsurface image according to an embodiment of the application;
fig. 4 is a flow chart of a method for real-time superposition of a surface image and a subsurface image according to a specific embodiment of the present application.
Description of the drawings: 11. a camera; 12. high-precision positioning equipment; 13. a processor.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the present application, and that it is also possible for a person skilled in the art to apply the present application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number and may refer to the singular or the plural. The terms "including," "comprising," "having," and any variations thereof in this application are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. References to "connected," "coupled," and the like in this application are not limited to physical or mechanical connections but may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "And/or" describes an association relationship of associated objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The terms "first," "second," "third," and the like herein merely distinguish similar objects and do not denote a particular ordering.
The embodiment of the application provides a system for superimposing a surface image and an underground space image in real time. Fig. 1 is a structural block diagram of the system for superimposing the surface image and the underground space image in real time according to the embodiment of the application; as shown in fig. 1, the system comprises a camera 11, a high-precision positioning device 12 and a processor 13;
the high-precision positioning device 12 obtains the geographical coordinates of a first position, and the high-precision positioning device 12 obtains the geographical coordinates of a second position;
the camera 11 collects an image of the earth's surface at the second location;
the processor 13 acquires the underground space data within a first preset range with the second position as the center, and generates a regional underground space image according to the underground space data;
the processor 13 intercepts the underground space image of the region to obtain a preset underground space image within a preset range;
the processor 13 superimposes the preset underground space image on the surface image in real time to obtain a superimposed space image.
According to this embodiment of the application, the high-precision positioning device 12 obtains the geographic coordinates of the first and second positions, the camera 11 is arranged at the second position to collect the earth surface image, the processor obtains the regional underground space image of the preset region containing the second position, intercepts a preset underground space image within a preset range from it, and superimposes the preset underground space image onto the earth surface image in real time to obtain a superimposed space image. This solves the problems that surface information is not organically combined with the corresponding underground space information and that observation and study of geographic information are inefficient and inaccurate, realizes real-time superposition of the earth surface image and the corresponding underground space image, and makes it convenient for the user to observe the invisible underground space in combination with the earth surface image.
In some of these embodiments, before the camera 11 captures the earth surface image at the second position, the camera 11 is further calibrated to obtain its internal parameters, wherein the internal parameters include a focal length, principal point coordinates and a pixel size.
In some of these embodiments, after the high precision positioning device 12 obtains the geographical coordinates of the first location and the high precision positioning device 12 obtains the geographical coordinates of the second location, the system further comprises: from the geographical coordinates of the first and second positions, the extrinsic parameters of the camera 11 are obtained, which include rotation parameters and translation parameters.
In some embodiments, the processor 13 intercepting the preset underground space image within the preset range from the regional underground space image includes: the processor 13 intercepts an underground space image within a range of 300 × 100 m in front of the camera from the regional underground space image according to the geographic coordinates of the second position where the camera 11 is located, so as to obtain the preset underground space image.
The present embodiment provides a method and system for real-time superposition of earth surface images and subsurface space images, fig. 2 is a schematic diagram of the relationship of four coordinate systems according to the present embodiment, as shown in fig. 2,
world coordinate system (world coordinate system): a user-defined coordinate system of the three-dimensional world. o_w is the origin of the world coordinate system and x_w, y_w and z_w are its three axes; its purpose is to describe the position of a target object in the real world. For example, the coordinates of a point P in the world coordinate system are (x_w, y_w, z_w), in meters (m);
camera coordinate system (camera coordinate system): a coordinate system established with the camera 11 as the origin. o_c is the origin of the camera coordinate system and x_c, y_c and z_c are its three axes; it describes the position of the target object from the perspective of the camera 11 and is the intermediate link connecting the world coordinate system with the image and pixel coordinate systems. For example, the coordinates of the point P in the camera coordinate system are (x_c, y_c, z_c), in meters (m);
image coordinate system (image coordinate system): o is the origin of the image coordinate system and x and y are its two axes; it describes the projective relationship of an object from the camera coordinate system to the image plane during imaging, and coordinates in the pixel coordinate system can be further obtained from the image coordinate system. For example, the coordinates of the point P in the image coordinate system are (x, y), in meters (m);
pixel coordinate system (pixel coordinate system): u and v are the two axes of the pixel coordinate system; it describes the coordinates of a pixel after an object is imaged on the digital image and is the coordinate system in which information is actually read out from the camera 11. For example, the coordinates of the point P in the pixel coordinate system are (u, v), in pixels;
the intrinsic parameters of the camera 11 include the focal length, the principal point coordinates and the pixel size, where the focal length f is the distance from the origin o_c of the camera coordinate system to the origin o of the image coordinate system, the principal point coordinates (u0, v0) are obtained in the pixel coordinate system, and the pixel size is (dx, dy). The extrinsic parameters of the camera 11 include a rotation parameter and a translation parameter: the rotation parameter R is the azimuth angle of the lens orientation (lens axis) of the camera 11 at the second position, calculated from the geographic coordinates of the first and second positions; the translation parameter T is the geographic coordinate of the second position where the camera 11 is located;
acquiring geographical coordinates of a first position and a second position through high-precision positioning equipment 12, acquiring a ground surface image at the second position by using a camera 11, acquiring a regional underground space image in a preset region where the second position is located, and intercepting the underground space image within a range of 300 x 100 meters in front of a camera from the regional underground space image to obtain a preset underground space image;
according to a preset formula:

z_c · [u, v, 1]^T = K · [R | T] · [x_w, y_w, z_w, 1]^T

that is, the preset underground space image is subjected to perspective transformation and superimposed onto the earth surface image in real time to obtain a superimposed space image, wherein

K = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]]

is the intrinsic matrix of the camera and [R | T] is the extrinsic matrix of the camera.
Through this embodiment of the application, the high-precision positioning device 12 obtains the geographic coordinates of the first and second positions, a camera 11 is arranged at the second position to collect the earth surface image, the regional underground space image of the preset region containing the second position is obtained, a preset underground space image within a preset range is intercepted from it, the preset underground space image is perspective-transformed through the preset formula, and the result is finally superimposed onto the earth surface image in real time to obtain a superimposed space image. This solves the problems that surface information is not organically combined with the corresponding underground space information and that observation and study of geographic information are inefficient and inaccurate, realizes real-time superposition of the earth surface image and the corresponding underground space image, and makes it convenient for the user to observe the invisible underground space in combination with the earth surface image.
The embodiment of the present application provides a method for superimposing a surface image and an underground space image in real time, fig. 3 is a flowchart of a method for superimposing a surface image and an underground space image in real time according to an embodiment of the present application, and as shown in fig. 3, the method includes the following steps:
s302, acquiring the geographical coordinates of a first position through the high-precision positioning equipment 12, and acquiring the geographical coordinates of a second position through the high-precision positioning equipment 12;
s304, arranging a camera 11 at the second position to collect a ground surface image;
s306, acquiring an underground space image of the area in the preset area where the second position is located;
s308, intercepting the underground space image in the region to obtain a preset underground space image in a preset range;
and S310, overlaying the preset underground space image on the surface image in real time to obtain an overlaid space image.
Through steps S302 to S310 in this embodiment of the application, the high-precision positioning device 12 obtains the geographic coordinates of the first and second positions, the camera 11 is arranged at the second position to collect the earth surface image, the regional underground space image of the preset region containing the second position is obtained, a preset underground space image within a preset range is intercepted from the regional underground space image, and the preset underground space image is superimposed onto the earth surface image in real time to obtain the superimposed space image.
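One way to realize steps S302 to S310 in code, under the simplifying assumption that the intercepted underground image lies on a single horizontal plane z_w = 0, is to reduce the preset formula to a 3 × 3 homography and warp the underground layer with it before blending it onto each camera frame (e.g. with `cv2.warpPerspective` plus alpha blending). The sketch below only builds the homography and maps one ground point; all helper names are illustrative:

```python
import numpy as np

def plane_homography(K, R, T):
    """Homography taking points on the world plane z_w = 0 to pixels.
    For z_w = 0 the preset formula z_c [u, v, 1]^T = K [R | T] [x_w, y_w, z_w, 1]^T
    reduces to z_c [u, v, 1]^T = K [r1 r2 T] [x_w, y_w, 1]^T, where r1 and r2
    are the first two columns of R."""
    H = K @ np.column_stack((R[:, 0], R[:, 1], T))
    return H / H[2, 2]   # normalize so the last entry is 1

def warp_point(H, xy):
    """Apply the homography to a single ground-plane point."""
    u, v, w = H @ np.array([xy[0], xy[1], 1.0])
    return float(u / w), float(v / w)

# Example: camera 10 m above the plane, looking straight down, no rotation.
K = np.array([[2000.0, 0.0, 960.0],
              [0.0, 2000.0, 540.0],
              [0.0, 0.0, 1.0]])
H = plane_homography(K, np.eye(3), np.array([0.0, 0.0, 10.0]))
print(warp_point(H, (0.0, 0.0)))  # plane origin lands on the principal point
```

The single-plane assumption is a simplification of the patent's 3D underground features; in practice each depth layer (or each vertex of a vector feature) would be projected with the full formula instead.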
In some of these embodiments, before the camera 11 is arranged at the second position to capture the earth surface image, the method further comprises: calibrating the camera 11 to obtain the internal parameters of the camera 11, wherein the internal parameters include a focal length, principal point coordinates and a pixel size.
In some embodiments, after obtaining the geographical coordinates of the first location by the high precision positioning device 12 and obtaining the geographical coordinates of the second location by the high precision positioning device 12, the method further comprises: from the geographical coordinates of the first and second positions, extrinsic parameters of the camera 11 are obtained, which include rotation parameters and translation parameters.
In some embodiments, the step of intercepting the preset underground space image within the preset range from the regional underground space image includes: according to the geographic coordinates of the second position where the camera 11 is located, intercepting an underground space image within a range of 300 × 100 meters in front of the camera from the regional underground space image to obtain the preset underground space image, wherein the preset underground space image comprises vector diagrams or/and grid diagrams of subway corridors, pipeline corridors, working wells, single or multiple pipelines and their auxiliary facilities, geological cavities and underground structures.
In some embodiments, obtaining the superimposed space image by superimposing the preset underground space image onto the earth surface image in real time includes: according to the external parameters and the internal parameters of the camera 11, performing perspective transformation on the preset underground space image through a preset formula and then superimposing it onto the earth surface image in real time to obtain the superimposed space image, wherein the preset formula is z_c · [u, v, 1]^T = K · [R | T] · [x_w, y_w, z_w, 1]^T.
The embodiment of the present application provides a method for superimposing a surface image and an underground space image in real time, fig. 4 is a flowchart of a method for superimposing a surface image and an underground space image in real time according to the embodiment of the present application, and as shown in fig. 4, the method includes the following steps:
s402, acquiring the geographical coordinates of a first position through the high-precision positioning equipment 12, and acquiring the geographical coordinates of a second position through the high-precision positioning equipment 12;
s404, arranging a camera 11 at the second position to collect a ground surface image;
s406, acquiring an underground space image of the area in the preset area where the second position is located;
s408, intercepting the underground space image in the region to obtain a preset underground space image in a preset range;
s410, performing perspective transformation on the preset underground space image through the preset formula z_c · [u, v, 1]^T = K · [R | T] · [x_w, y_w, z_w, 1]^T;
and S412, superposing the preset underground space image subjected to perspective transformation on the earth surface image in real time to obtain a superposed space image.
Through steps S402 to S412 in this embodiment of the application, the high-precision positioning device 12 obtains the geographic coordinates of the first and second positions, a camera 11 is arranged at the second position to collect the earth surface image, the regional underground space image of the preset region containing the second position is obtained, a preset underground space image within a preset range is intercepted from it, the preset underground space image is perspective-transformed through the preset formula, and the result is finally superimposed onto the earth surface image in real time to obtain a superimposed space image. This solves the problems that surface information is not organically combined with the corresponding underground space information and that observation and study of geographic information are inefficient and inaccurate, realizes real-time superposition of the earth surface image and the corresponding underground space image, and makes it convenient for the user to observe the invisible underground space in combination with the earth surface image.
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (8)
1. A method for superposing a surface image and a subsurface space image in real time is characterized by comprising the following steps:
acquiring the geographical coordinates of the first position through high-precision positioning equipment, and acquiring the geographical coordinates of the second position through high-precision positioning equipment; a camera is arranged at the second position to collect a ground surface image;
acquiring an area underground space image in a preset area where the second position is located;
intercepting a preset underground space image in a preset range from the regional underground space image;
according to the external parameters and the internal parameters of the camera, superimposing the preset underground space image onto the earth surface image after perspective transformation through a preset formula

z_c · [u, v, 1]^T = K · [R | T] · [x_w, y_w, z_w, 1]^T

to obtain a superimposed space image, wherein K = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]] is the intrinsic matrix of the camera, the focal length f in the intrinsic parameters being the distance from the origin of the camera coordinate system to the origin of the image coordinate system, u0 and v0 the principal point coordinates obtained in the pixel coordinate system, and dx and dy the pixel sizes; [R | T] is the extrinsic matrix of the camera, the rotation parameter R in the extrinsic parameters being the azimuth angle of the camera lens orientation calculated from the geographic coordinates of the first position and the second position, and the translation parameter T being the geographic coordinate of the second position where the camera is located; (u, v) are the pixel coordinates of a point, (x_w, y_w, z_w) are the world coordinates of the point, and z_c is the z-axis coordinate of the point in the camera coordinate system.
2. The method of claim 1, wherein, in addition to arranging a camera at the second position to collect the earth surface image, the method further comprises: calibrating the camera to obtain the internal parameters of the camera, wherein the internal parameters comprise a focal length, principal point coordinates and a pixel size.
3. The method of claim 1, wherein, after acquiring the geographic coordinates of the first position and the geographic coordinates of the second position with the high-precision positioning equipment, the method further comprises: obtaining external parameters of the camera according to the geographic coordinates of the first position and the second position, wherein the external parameters comprise rotation parameters and translation parameters.
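A minimal sketch of the rotation-parameter step in claim 3, assuming the two geographic positions are given as planar (easting, northing) coordinates and that the azimuth is measured clockwise from north (the patent does not fix these conventions):

```python
import math

def azimuth_deg(p_from, p_to):
    """Azimuth in degrees, clockwise from north, from p_from to p_to,
    with positions as planar (easting, northing) pairs."""
    de = p_to[0] - p_from[0]
    dn = p_to[1] - p_from[1]
    return math.degrees(math.atan2(de, dn)) % 360.0

def rotation_about_z(azimuth):
    """Rotation matrix R for a camera yawed by `azimuth` degrees
    about the vertical axis (a simplified, yaw-only extrinsic rotation)."""
    a = math.radians(azimuth)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

print(azimuth_deg((0.0, 0.0), (100.0, 100.0)))  # ~45: toward the north-east
```

A full extrinsic rotation would also include pitch and roll; the yaw-only form above matches the claim's use of a single azimuth angle derived from the two positions.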
4. The method of claim 1, wherein the step of intercepting the preset underground space image within a preset range from the regional underground space image comprises: according to the geographic coordinates of the second position where the camera is located, intercepting from the regional underground space image the underground space image within a range of 300 x 100 meters in front of the camera to obtain the preset underground space image, wherein the preset underground space image comprises vector diagrams and/or grid diagrams of subway corridors, pipeline corridors, working wells, single or multiple pipelines and their auxiliary facilities, geological cavities and underground structures.
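As a sketch of the interception step in claim 4 (not part of the claims), the corners of the 300 x 100 m rectangle in front of the camera can be computed from the camera's planar position and azimuth; the (easting, northing) frame and clockwise-from-north azimuth are assumptions:

```python
import math

def region_in_front(camera_xy, azimuth_deg, depth=300.0, width=100.0):
    """Corner coordinates of a depth x width rectangle directly in front
    of the camera, in the same planar (easting, northing) frame."""
    a = math.radians(azimuth_deg)
    fwd = (math.sin(a), math.cos(a))   # unit vector in the viewing direction
    left = (-fwd[1], fwd[0])           # unit vector 90 degrees counter-clockwise
    cx, cy = camera_xy
    half = width / 2.0
    corners = []
    for d, w in ((0.0, -half), (0.0, half), (depth, half), (depth, -half)):
        corners.append((cx + d * fwd[0] + w * left[0],
                        cy + d * fwd[1] + w * left[1]))
    return corners

print(region_in_front((0.0, 0.0), 0.0))
```

The resulting polygon would then be used to clip the regional underground space image (vector layers by polygon intersection, raster layers by masking) to obtain the preset underground space image.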
5. A system for superposing a surface image and an underground space image in real time is characterized by comprising a camera, high-precision positioning equipment and a processor;
the high-precision positioning equipment acquires the geographic coordinate of a first position, and the high-precision positioning equipment acquires the geographic coordinate of a second position;
the camera collects earth surface images at the second position;
the processor acquires an underground space image of the area in the preset area where the second position is located;
the processor intercepts the regional underground space image to obtain a preset underground space image within a preset range;
the processor superimposes, according to the external parameters and the internal parameters of the camera, the preset underground space image, after perspective transformation through a preset formula

z_c [u, v, 1]^T = [[f/d_x, 0, u_0], [0, f/d_y, v_0], [0, 0, 1]] [R | T] [x_w, y_w, z_w, 1]^T,

onto the earth surface image to obtain a superimposed space image, wherein [[f/d_x, 0, u_0], [0, f/d_y, v_0], [0, 0, 1]] is the intrinsic parameter matrix of the camera: the focal length f in the intrinsic parameters is the distance from the origin of the camera coordinate system to the origin of the image coordinate system, u_0 and v_0 are the principal point coordinates obtained from the pixel coordinate system, and d_x and d_y are the pixel sizes; [R | T] is the extrinsic parameter matrix of the camera: the rotation parameter R in the external parameters is the azimuth angle of the camera lens orientation, calculated from the geographic coordinates of the first position and the second position, and the translation parameter T is the geographic coordinates of the second position where the camera is located; u and v are the pixel coordinates of a point, x_w, y_w and z_w are the world coordinates of the point, and z_c is the z-axis coordinate of the point in the camera coordinate system.
6. The system of claim 5, wherein, in addition to the camera collecting the earth surface image at the second position, the camera is calibrated to obtain the internal parameters of the camera, wherein the internal parameters comprise a focal length, principal point coordinates and a pixel size.
7. The system of claim 5, wherein, after the high-precision positioning equipment acquires the geographic coordinates of the first position and the geographic coordinates of the second position, external parameters of the camera are obtained according to the geographic coordinates of the first position and the second position, wherein the external parameters comprise rotation parameters and translation parameters.
8. The system of claim 5, wherein the processor intercepting the regional underground space image to obtain a preset underground space image within a preset range comprises: the processor, according to the geographic coordinates of the second position where the camera is located, intercepting from the regional underground space image the underground space image within a range of 300 x 100 meters in front of the camera to obtain the preset underground space image, wherein the preset underground space image comprises vector diagrams and/or grid diagrams of subway corridors, pipeline corridors, working wells, single or multiple pipelines and their auxiliary facilities, geological cavities and underground structures.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110262776.0A CN113066154B (en) | 2021-03-10 | 2021-03-10 | Method and system for real-time superposition of earth surface image and underground space image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113066154A CN113066154A (en) | 2021-07-02 |
CN113066154B true CN113066154B (en) | 2021-11-30 |
Family
ID=76560475
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110262776.0A Active CN113066154B (en) | 2021-03-10 | 2021-03-10 | Method and system for real-time superposition of earth surface image and underground space image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113066154B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5214045B1 (en) * | 2012-04-12 | 2013-06-19 | 株式会社フリーソフトネット | Map data input method and system |
CN103970919A (en) * | 2013-02-04 | 2014-08-06 | 上海市城市建设设计研究总院 | Automatic building information modeling data processing method |
CN104793909A (en) * | 2014-01-17 | 2015-07-22 | 浙江图维电力科技有限公司 | Two-channel space picture synchronous display and control method and implementing system thereof |
CN111562791A (en) * | 2019-03-22 | 2020-08-21 | 沈阳上博智像科技有限公司 | System and method for identifying visual auxiliary landing of unmanned aerial vehicle cooperative target |
CN112037130A (en) * | 2020-08-27 | 2020-12-04 | 江苏提米智能科技有限公司 | Adaptive image splicing and fusing method and device, electronic equipment and storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107705364A (en) * | 2016-08-08 | 2018-02-16 | 国网新疆电力公司 | A kind of immersion virtual display system based on three-dimensional geographic information |
Non-Patent Citations (2)
Title |
---|
An Overall Deformation Monitoring Method of Structure Based on Tracking Deformation Contour; Xi Chu et al.; Innovative Methods and Materials in Structural Health Monitoring of Civil Infrastructures; 2019-10-25; Vol. 9, No. 21; pp. 1-20 * |
Photogrammetry of the Spatial Geometric Position of Three-Dimensional Objects Based on a Digital Camera; Guan Yepeng et al.; Acta Electronica Sinica; 2002-06-25; No. 6; pp. 849-852 * |
Also Published As
Publication number | Publication date |
---|---|
CN113066154A (en) | 2021-07-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Teller et al. | Calibrated, registered images of an extended urban area | |
CN104748728B (en) | Intelligent machine attitude matrix calculation method and its applied to photogrammetric method | |
AU2011312140C1 (en) | Rapid 3D modeling | |
US8098958B2 (en) | Processing architecture for automatic image registration | |
CN100557634C (en) | A kind of camera marking method based on double 1-dimension drone | |
CN102692214B (en) | Narrow space binocular vision measuring and positioning device and method | |
CN112629431B (en) | Civil structure deformation monitoring method and related equipment | |
CA2568617A1 (en) | Digital 3d/360 degree camera system | |
CN103226838A (en) | Real-time spatial positioning method for mobile monitoring target in geographical scene | |
CN112470092A (en) | Surveying and mapping system, surveying and mapping method, device, equipment and medium | |
KR20090000186A (en) | Point of interest displaying apparatus and method for using augmented reality | |
CA2705809A1 (en) | Method and apparatus of taking aerial surveys | |
CN107563959B (en) | Panorama generation method and device | |
CN108769569A (en) | A kind of 360 degree of stereoscopic full views observation systems and method for unmanned plane | |
CN104159036A (en) | Display method and shooting equipment of image direction information | |
CN105374067A (en) | Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof | |
CN108957507A (en) | Fuel gas pipeline leakage method of disposal based on augmented reality | |
CN115841487A (en) | Hidden danger positioning method and terminal along power transmission line | |
CN111527375B (en) | Planning method and device for surveying and mapping sampling point, control terminal and storage medium | |
JP5862865B2 (en) | Composite image display device and composite image display program | |
CN113066154B (en) | Method and system for real-time superposition of earth surface image and underground space image | |
JP2013196388A (en) | Image processor, image processing method and image processing program | |
CN108955723A (en) | The calibration method of augmented reality city planting ductwork | |
CN109461116B (en) | 720 panorama unfolding monitoring method based on opengl | |
CN108954016A (en) | Fuel gas pipeline leakage disposal system based on augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||