CN109561295A - Mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS - Google Patents
Mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS
Info
- Publication number
- CN109561295A (application CN201811505248.8A)
- Authority
- CN
- China
- Prior art keywords
- data
- gateway
- fusion
- dimensional
- video stream
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS / G06—COMPUTING; CALCULATING OR COUNTING / G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics; G06T19/006—Mixed reality
- G06T15/00—3D [Three Dimensional] image rendering; G06T15/04—Texture mapping
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects; G06T17/05—Geographic models
- G06T5/00—Image enhancement or restoration; G06T5/70—Denoising; Smoothing; G06T5/80—Geometric correction; G06T5/90—Dynamic range modification of images or parts thereof
- G06T7/00—Image analysis; G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/20—Special algorithmic details; G06T2207/20212—Image combination; G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
An embodiment of the invention discloses a mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS. The system comprises: a structured data acquisition device, an unstructured data acquisition device, a first gateway, a second gateway, and an image fusion device. The image fusion device comprises an augmented reality fusion module and a three-dimensional geographic information fusion module. The structured data acquisition device acquires video stream data and sends it to the first gateway; the unstructured data acquisition device acquires second data and sends it to the second gateway. The augmented reality fusion module receives the video stream data forwarded by the first gateway and the second data forwarded by the second gateway, fuses them by a preset fusion algorithm, and sends the fusion result to the three-dimensional geographic information fusion module. The three-dimensional geographic information fusion module realizes, within the three-dimensional geographic information scene, the fused application of surveillance video pictures, intelligent analysis data, and IoT perception data.
Description
Technical field
The present invention relates to the field of mixed reality technology, and in particular to a mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS.
Background technique
Video fusion technology is a branch, or a development stage, of virtual reality technology. Three-dimensional video fusion refers to matching and merging one or more camera image sequences (videos) with an associated three-dimensional virtual scene, generating a new dynamic virtual scene or model of that scene and thereby fusing the virtual scene with real-time video: a combination of the virtual and the real.
Relying on a standalone 3D engine, three-dimensional video fusion can realize the fused application of resources such as video within a small-scale or partial three-dimensional scene. In practical applications, however, several problems remain.
Real-scene video becomes distorted when fused into the scene. Because the scene is three-dimensional while real-scene video is two-dimensional, the real-scene camera must be corrected during the fusion process before the geographic information value of the video can be properly expressed. This is the opposite of the problem faced by augmented reality live-view maps: a live-view map restores the real scene well but cannot accurately reflect geographic information values, whereas three-dimensional video fusion accurately reflects three-dimensional geographic information values but, constrained by camera installation positions, shrinks the video picture as a whole.
How to apply augmented reality fusion within the fused application of three-dimensional geographic information scenes is therefore an urgent problem to be solved.
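The camera correction mentioned above can be sketched concretely. The following is a minimal, purely illustrative example (the patent does not specify a camera model; the single-coefficient radial distortion model and all parameter values are assumptions) of removing lens distortion from pixel coordinates before a 2D video frame is registered to a 3D scene:

```python
import numpy as np

def undistort_points(pts, k1, fx, fy, cx, cy, iters=15):
    """Remove single-coefficient radial distortion from pixel coordinates.

    pts: (N, 2) array of distorted pixel coordinates.
    k1:  assumed radial distortion coefficient.
    fx, fy, cx, cy: assumed pinhole intrinsics.
    Inverts x_d = x_u * (1 + k1 * r_u^2) by fixed-point iteration.
    """
    pts = np.asarray(pts, dtype=float)
    # Normalize pixels to camera coordinates.
    xn = (pts[:, 0] - cx) / fx
    yn = (pts[:, 1] - cy) / fy
    xu, yu = xn.copy(), yn.copy()
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2
        xu = xn / factor
        yu = yn / factor
    # Map corrected coordinates back to pixels.
    return np.stack([xu * fx + cx, yu * fy + cy], axis=1)
```

A round trip (distort a point with the same model, then undistort it) recovers the original pixel to sub-millipixel accuracy, which is the property a fusion pipeline would rely on before registering the frame to the scene.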
Summary of the invention
Embodiments of the present invention aim to provide a mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS, so as to solve the above problems in the prior art.
To achieve the above object, an embodiment of the present invention provides a mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS, comprising: a structured data acquisition device, an unstructured data acquisition device, a first gateway, a second gateway, and an image fusion device. The image fusion device comprises: an augmented reality fusion module and a three-dimensional geographic information fusion module. The structured data acquisition device acquires video stream data and sends the video stream data to the first gateway. The unstructured data acquisition device acquires second data and sends the second data to the second gateway. The augmented reality fusion module receives the video stream data forwarded by the first gateway and the second data forwarded by the second gateway, fuses them by a preset fusion algorithm, and sends the fusion result to the three-dimensional geographic information fusion module. The three-dimensional geographic information fusion module realizes, within the three-dimensional geographic information scene, the fused application of the video stream data and the second data.
Optionally, the first gateway forwards the video stream data to the augmented reality fusion module via the GB/T 28181 protocol or an SDK.
Optionally, the second gateway forwards the second data to the augmented reality fusion module via an SDK.
Optionally, the augmented reality fusion module is specifically configured to: apply geometric correction, noise removal, chrominance/luminance adjustment, and registration to the first panoramic image; find the maximum correlation between the first panoramic image and the three-dimensional map scene; generate a transformation model of the three-dimensional map scene; unify the coordinate transforms of the first panoramic image and the three-dimensional map scene; and project the first panoramic image into the three-dimensional map scene by texture projection, thereby realizing the fusion of the video image.
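The texture-projection step can be illustrated with a small sketch (an assumption about one common realization, not the patent's actual algorithm): given a hypothetical pinhole camera with known intrinsics and pose, compute for each 3D scene vertex the video-frame texture coordinate that a texture projector would sample.

```python
import numpy as np

def project_texture_coords(points_3d, K, R, t, width, height):
    """Project 3D scene vertices into a video frame to get texture coords.

    points_3d: (N, 3) scene vertices in world coordinates.
    K: (3, 3) assumed camera intrinsic matrix.
    R, t: assumed camera rotation (3, 3) and translation (3,).
    Returns (N, 2) texture coordinates (u, v) normalized to [0, 1],
    plus a boolean mask of vertices lying in front of the camera.
    """
    pts = np.asarray(points_3d, dtype=float)
    cam = pts @ R.T + t                    # world -> camera coordinates
    in_front = cam[:, 2] > 1e-9            # cull points behind the camera
    pix = cam @ K.T                        # camera -> homogeneous pixels
    pix = pix[:, :2] / pix[:, 2:3]         # perspective divide
    uv = pix / np.array([width, height])   # normalize to texture space
    return uv, in_front
```

In a real engine the same projection is usually done on the GPU by multiplying scene vertices with a texture matrix built from the camera's view and projection matrices; occlusion handling (shadow-map-style depth comparison) is omitted here.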
To achieve the above object, an embodiment of the present invention also provides a mixed reality three-dimensional dynamic spatio-temporal visualization method based on AR+3DGIS, comprising: acquiring video stream data, and sending the video stream data; acquiring second data, and sending the second data; receiving the video stream data and the second data, fusing them by a preset fusion algorithm to obtain a fusion result, and sending the fusion result; and realizing, within the three-dimensional geographic information scene, the fused application of the video stream data and the second data.
Optionally, the first gateway forwards the video stream data to the augmented reality fusion module via the GB/T 28181 protocol or an SDK.
Optionally, the second gateway forwards the second data to the augmented reality fusion module via an SDK.
Optionally, receiving the video stream data and the second data and fusing them by a preset fusion algorithm to obtain a fusion result specifically comprises: applying geometric correction, noise removal, chrominance/luminance adjustment, and registration to the first panoramic image; finding the maximum correlation between the first panoramic image and the three-dimensional map scene; generating a transformation model of the three-dimensional map scene; unifying the coordinate transforms of the first panoramic image and the three-dimensional map scene; and projecting the first panoramic image into the three-dimensional map scene by texture projection, thereby realizing the fusion of the video image.
The beneficial effects of the embodiments of the present invention are described in the detailed embodiments below.
Detailed description of the invention
Fig. 1 is a structural schematic diagram of a mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS provided by Embodiment 1 of the present invention.
Fig. 2 is a structural schematic diagram of another mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS provided by Embodiment 2 of the present invention.
Fig. 3 is a schematic diagram of a mixed reality three-dimensional dynamic spatio-temporal visualization method based on AR+3DGIS provided by Embodiment 3 of the present invention.
Specific embodiment
Embodiments of the present invention are illustrated below by way of specific examples; those skilled in the art can readily understand other advantages and effects of the present invention from the content disclosed in this specification.
It should be noted that the structures, proportions, and sizes depicted in the drawings accompanying this specification serve only to complement the disclosed content for the understanding of those skilled in the art; they do not limit the conditions under which the invention may be practiced and carry no essential technical significance. Any structural modification, change of proportion, or adjustment of size that does not affect the effects achievable by, or the objects attainable by, the present invention shall still fall within the scope covered by the disclosed technical content. Likewise, terms such as "upper", "lower", "left", "right", and "middle" cited in this specification are used merely for clarity of description rather than to limit the practicable scope of the invention; changes or adjustments of such relative relationships, made without substantive change to the technical content, are also considered within the practicable scope of the present invention.
Embodiment 1
Embodiment 1 of the present invention provides a mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS. At the augmented reality (AR) level, the system fuses structured data and unstructured data with real panoramic video images; at the three-dimensional geographic information system (3DGIS) level, it fuses structured data and unstructured data with the three-dimensional geographic information scene; at the mixed reality (MR) level, the augmented reality scene and the three-dimensional geographic information scene are matched and invoked in a unified way, so that, while the respective advantages of each are applied, a consistent fusion application of the real scene and the three-dimensional geographic information scene is realized.
Fig. 1 is a structural schematic diagram of the mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS provided by Embodiment 1 of the present invention. As shown in Fig. 1, the system includes: a structured data acquisition device 11, an unstructured data acquisition device 12, a first gateway 13, a second gateway 14, and an image fusion device 15.
The structured data acquisition device 11 acquires video stream data and sends it to the first gateway 13. The unstructured data acquisition device 12 acquires second data and sends it to the second gateway 14. The image fusion device 15 includes an augmented reality fusion module and a three-dimensional geographic information fusion module. The augmented reality fusion module receives the video stream data forwarded by the first gateway 13 and the second data forwarded by the second gateway 14, fuses them by a preset fusion algorithm, and sends the fusion result to the three-dimensional geographic information fusion module. The three-dimensional geographic information fusion module realizes, within the three-dimensional geographic information scene, the fused application of surveillance video pictures, intelligent analysis data, and IoT perception data.
Through mixed reality technology, Embodiment 1 efficiently and uniformly integrates augmented reality real-scene video applications with three-dimensional geographic information system fusion applications, realizing the centralized, unified invocation of the security video surveillance system, the big data analysis system, the IoT perception system, and the three-dimensional geographic information system. It resolves the isolation and dispersion of the massive data held by each system, and realizes the spatio-temporally consistent fusion and association of each system's data with the three-dimensional geographic information scene, yielding holistic, intuitive, and orderly perception and data application.
Embodiment 2
Fig. 2 is a structural schematic diagram of another mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS, provided by Embodiment 2 of the present invention. As shown in Fig. 2, the system includes: a video capture device 21, an intelligent analysis device 22, an IoT perception device 23, a video application gateway 24, an intelligent analysis data gateway 25, an IoT perception data gateway 26, and a video fusion device 27.
The video capture device 21 includes high-point cameras, bullet cameras, and dome (PTZ) cameras; the high-point cameras acquire the video picture of a video acquisition point, while the bullet and dome cameras acquire video data in real time. The intelligent analysis device 22 acquires analysis data for the video acquisition point. The IoT perception device 23 acquires data such as temperature at the video acquisition point. The augmented reality fusion module in the video fusion device 27 realizes the fused application of video data, intelligent analysis data, and IoT perception data within the high-point real-scene video picture. The three-dimensional geographic information fusion module in the video fusion device 27 realizes, within the three-dimensional geographic information scene, the fused application of surveillance video pictures, intelligent analysis data, and IoT perception data.
In the mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS provided by Embodiment 2, at the three-dimensional geographic information system (3DGIS) level, video data, intelligent analysis data, and IoT perception data are fused with the three-dimensional geographic information scene; at the mixed reality (MR) level, the augmented reality scene and the three-dimensional geographic information scene are matched and invoked in a unified way, realizing the consistent fusion application of the real scene and the three-dimensional geographic information scene while the respective advantages of each are applied.
Optionally, the three-dimensional geographic information fusion module applies geometric correction and registration to the surveillance video pictures, realizing the virtual-real combination, in spatial position, of surveillance video pictures, intelligent analysis data, IoT perception data, and the three-dimensional geographic information scene, based on precise geographic coordinates such as longitude, latitude, and altitude.
Specifically, through the video application gateway, the intelligent analysis data gateway, and the IoT perception data gateway, the three-dimensional geographic information fusion module fuses video pictures, intelligent analysis application data, and IoT perception data into the three-dimensional geographic information scene according to precise geographic position coordinates such as longitude, latitude, and altitude, realizing the virtual fusion application of the three-dimensional geographic information scene with real-world data.
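Placing data at precise longitude/latitude/altitude coordinates inside a local 3D scene typically requires a geodetic-to-local conversion. The following sketch (an assumption about one common way to do this; the patent does not specify a conversion) maps WGS-84 coordinates to local east-north-up (ENU) offsets around a scene origin:

```python
import math

A = 6378137.0              # WGS-84 semi-major axis (m)
E2 = 6.69437999014e-3      # WGS-84 first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic coordinates to Earth-centered Earth-fixed (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return x, y, z

def geodetic_to_enu(lat_deg, lon_deg, alt_m, lat0, lon0, alt0):
    """East/north/up offsets (m) of a point relative to a scene origin."""
    x, y, z = geodetic_to_ecef(lat_deg, lon_deg, alt_m)
    x0, y0, z0 = geodetic_to_ecef(lat0, lon0, alt0)
    dx, dy, dz = x - x0, y - y0, z - z0
    lat, lon = math.radians(lat0), math.radians(lon0)
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    return east, north, up
```

With such a conversion, a sensor reading tagged with longitude, latitude, and altitude can be positioned as a metric offset in the 3DGIS scene's local frame.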
Optionally, the video application gateway 24 realizes video surveillance device access and streaming media forwarding via the GB/T 28181 protocol or an SDK.
Specifically, the video application gateway 24 manages and schedules surveillance video resources in a unified way. Via the GB/T 28181 protocol or an SDK, it realizes the access of video surveillance devices and the forwarding of streaming media: access means taking video streams in from the video capture device 21, and forwarding means passing the video stream data on to the video fusion device 27.
Optionally, the intelligent analysis data gateway 25 realizes, via the GB/T 28181 protocol or an SDK, the access of the intelligent analysis applications of various video surveillance devices or third-party platforms and the forwarding of analysis data.
Specifically, the intelligent analysis data gateway 25 manages, in a unified way, the intelligent analysis data of surveillance video resources for which video analysis has been configured, realizing the invocation of intelligent analysis functions and the display and forwarding of intelligent analysis alarm data. Via the GB/T 28181 protocol or an SDK, it accesses and forwards the analysis data of the intelligent analysis applications of various video surveillance devices or third-party platforms, including face recognition, vehicle recognition, tripwire analysis, boundary-crossing analysis, and lost/abandoned object analysis; all are converged and connected into the fusion application system in a unified way. The gateway is the relay module through which intelligent analysis data is used in fusion applications.
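A relay of this kind can be sketched as follows (illustrative only; the class, field names, and analysis types below are assumptions, since the patent does not specify data formats): it normalizes alarm events from heterogeneous analysis sources into one record shape before forwarding them to the fusion application.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AlarmEvent:
    """Unified alarm record forwarded to the fusion application."""
    source: str       # originating device or third-party platform
    kind: str         # e.g. "face", "vehicle", "tripwire", "boundary"
    camera_id: str
    timestamp: float
    payload: dict     # analysis-specific details

class AnalysisDataGateway:
    """Relay that converges intelligent-analysis alarms into one stream."""
    def __init__(self):
        self._adapters: Dict[str, Callable[[dict], AlarmEvent]] = {}
        self.forwarded: List[AlarmEvent] = []

    def register(self, kind, adapter):
        """Register a normalizer for one vendor/analysis format."""
        self._adapters[kind] = adapter

    def ingest(self, kind, raw):
        event = self._adapters[kind](raw)   # normalize vendor format
        self.forwarded.append(event)        # stand-in for forwarding
        return event

gw = AnalysisDataGateway()
gw.register("tripwire", lambda raw: AlarmEvent(
    source=raw["dev"], kind="tripwire", camera_id=raw["cam"],
    timestamp=raw["ts"], payload={"line_id": raw["line"]}))
ev = gw.ingest("tripwire", {"dev": "nvr-1", "cam": "C07", "ts": 1e9, "line": 3})
```

The adapter-per-source design mirrors the gateway's role as a relay module: each device or platform keeps its native SDK format, and only the gateway knows how to map it onto the unified record.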
Optionally, the IoT perception data gateway 26 realizes, via an SDK, the access of the perception data of multiple sensor devices and the forwarding of dynamic data.
Specifically, the IoT perception data gateway 26 manages, in a unified way, the IoT perception data obtained from multi-source sensors, realizing the configuration and management of the IoT perception system. Via an SDK, it accesses and forwards the perception data of multiple sensor devices, including temperature/humidity sensors, air quality sensors, smoke alarm sensors, and lidar sensors; all are converged and connected into the fusion application system in a unified way. The gateway is the relay module through which IoT perception data is used in fusion applications.
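The multi-source sensor convergence can be sketched like this (a hypothetical minimal gateway; the sensor identifiers and record fields are assumptions): it keeps the latest reading per sensor and exposes a snapshot of the dynamic data for the fusion application.

```python
import time

class PerceptionDataGateway:
    """Converges IoT sensor readings and forwards the latest values."""
    def __init__(self):
        self._latest = {}   # sensor_id -> (timestamp, kind, value)

    def push(self, sensor_id, kind, value, ts=None):
        """Accept one reading, e.g. from a sensor SDK callback."""
        self._latest[sensor_id] = (
            ts if ts is not None else time.time(), kind, value)

    def snapshot(self):
        """Dynamic data forwarded to the fusion application."""
        return {sid: {"ts": ts, "kind": kind, "value": value}
                for sid, (ts, kind, value) in self._latest.items()}

gw = PerceptionDataGateway()
gw.push("th-01", "temperature_c", 21.5, ts=100.0)
gw.push("th-01", "temperature_c", 22.0, ts=101.0)  # newer reading wins
gw.push("aq-02", "pm2_5", 35, ts=100.5)
snap = gw.snapshot()
```

Keeping only the latest value per sensor matches the "dynamic data forwarding" role described above: the fusion scene needs the current state of each sensor, not its full history.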
Through the video application gateway, the intelligent analysis data gateway, and the IoT perception data gateway, the augmented reality fusion module in Embodiment 2 overlays video labels, intelligent analysis application data, and IoT perception data information onto the high-point real-scene video picture, realizing three-dimensional live-view prevention-and-control applications and the fused application of actual service data within the real picture.
Through mixed reality technology, Embodiment 2 efficiently and uniformly integrates augmented reality real-scene video applications with three-dimensional geographic information system fusion applications, realizing the centralized, unified invocation of the security video surveillance system, the big data analysis system, the IoT perception system, and the three-dimensional geographic information system. It resolves the isolation and dispersion of the massive data held by each system, and realizes the spatio-temporally consistent fusion and association of each system's data with the three-dimensional geographic information scene, yielding holistic, intuitive, and orderly perception and data application.
Embodiment 3
Fig. 3 illustrates a mixed reality three-dimensional dynamic spatio-temporal visualization method based on AR+3DGIS provided by Embodiment 3 of the present invention. The method includes:
Step S301: acquiring video stream data, and sending the video stream data;
Step S302: acquiring second data, and sending the second data;
Step S303: receiving the video stream data and the second data, fusing them by a preset fusion algorithm to obtain a fusion result, and sending the fusion result;
Step S304: realizing, within the three-dimensional geographic information scene, the fused application of the video stream data and the second data.
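Steps S301 to S304 can be sketched as a simple queue-based pipeline (illustrative only; the dictionary merge stands in for the unspecified "preset fusion algorithm", and the data shapes are assumptions):

```python
from queue import Queue

def acquire_video(out_q):            # S301: acquire and send video stream data
    for frame_id in range(3):
        out_q.put({"frame": frame_id, "pixels": f"frame-{frame_id}"})
    out_q.put(None)                  # end-of-stream marker

def acquire_second_data(out_q):      # S302: acquire and send second data
    for frame_id in range(3):
        out_q.put({"frame": frame_id, "temperature_c": 20.0 + frame_id})
    out_q.put(None)

def fuse(video_q, data_q, fused_q):  # S303: fuse and send the fusion result
    while True:
        v, d = video_q.get(), data_q.get()
        if v is None or d is None:
            fused_q.put(None)
            return
        fused_q.put({**v, **d})      # placeholder for the preset algorithm

def apply_in_scene(fused_q):         # S304: fused application in the scene
    results = []
    while (item := fused_q.get()) is not None:
        results.append(item)         # a real system would render into 3DGIS
    return results

video_q, data_q, fused_q = Queue(), Queue(), Queue()
acquire_video(video_q)
acquire_second_data(data_q)
fuse(video_q, data_q, fused_q)
fused = apply_in_scene(fused_q)
```

The queues here play the role of the two gateways: producers and the fusion stage are decoupled, so each step can run on its own device, as in the system embodiments above.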
Optionally, the first gateway forwards the video stream data to the augmented reality fusion module via the GB/T 28181 protocol or an SDK.
Optionally, the second gateway forwards the second data to the augmented reality fusion module via an SDK.
Optionally, receiving the video stream data and the second data and fusing them by a preset fusion algorithm to obtain a fusion result specifically comprises: applying geometric correction, noise removal, chrominance/luminance adjustment, and registration to the first panoramic image; finding the maximum correlation between the first panoramic image and the three-dimensional map scene; generating a transformation model of the three-dimensional map scene; unifying the coordinate transforms of the first panoramic image and the three-dimensional map scene; and projecting the first panoramic image into the three-dimensional map scene by texture projection, thereby realizing the fusion of the video image.
Although the present invention has been described in detail above by way of general descriptions and specific embodiments, modifications and improvements may be made on this basis, as will be apparent to those skilled in the art. Accordingly, such modifications and improvements made without departing from the spirit of the present invention fall within the scope of protection claimed by the present invention.
Claims (8)
1. A mixed reality three-dimensional dynamic spatio-temporal visualization system based on AR+3DGIS, characterized in that the system comprises: a structured data acquisition device, an unstructured data acquisition device, a first gateway, a second gateway, and an image fusion device;
wherein the image fusion device comprises: an augmented reality fusion module and a three-dimensional geographic information fusion module;
the structured data acquisition device is configured to acquire video stream data and send the video stream data to the first gateway;
the unstructured data acquisition device is configured to acquire second data and send the second data to the second gateway;
the augmented reality fusion module is configured to receive the video stream data forwarded by the first gateway and the second data forwarded by the second gateway, fuse them by a preset fusion algorithm, and send the fusion result to the three-dimensional geographic information fusion module;
the three-dimensional geographic information fusion module is configured to realize, within the three-dimensional geographic information scene, the fused application of the video stream data and the second data.
2. The system according to claim 1, characterized in that the first gateway forwards the video stream data to the augmented reality fusion module via the GB/T 28181 protocol or an SDK.
3. The system according to claim 1, characterized in that the second gateway forwards the second data to the augmented reality fusion module via an SDK.
4. The system according to claim 2 or 3, characterized in that the augmented reality fusion module is specifically configured to:
apply geometric correction, noise removal, chrominance/luminance adjustment, and registration to the first panoramic image;
find the maximum correlation between the first panoramic image and the three-dimensional map scene, generate a transformation model of the three-dimensional map scene, unify the coordinate transforms of the first panoramic image and the three-dimensional map scene, and project the first panoramic image into the three-dimensional map scene by texture projection, thereby realizing the fusion of the video image.
5. A mixed reality three-dimensional dynamic spatio-temporal visualization method based on AR+3DGIS, characterized in that the method comprises:
acquiring video stream data, and sending the video stream data;
acquiring second data, and sending the second data;
receiving the video stream data and the second data, fusing them by a preset fusion algorithm to obtain a fusion result, and sending the fusion result;
realizing, within the three-dimensional geographic information scene, the fused application of the video stream data and the second data.
6. The method according to claim 5, characterized in that the first gateway forwards the video stream data to the augmented reality fusion module via the GB/T 28181 protocol or an SDK.
7. The method according to claim 5, characterized in that the second gateway forwards the second data to the augmented reality fusion module via an SDK.
8. The method according to claim 6 or 7, characterized in that receiving the video stream data and the second data and fusing them by a preset fusion algorithm to obtain a fusion result specifically comprises:
applying geometric correction, noise removal, chrominance/luminance adjustment, and registration to the first panoramic image;
finding the maximum correlation between the first panoramic image and the three-dimensional map scene, generating a transformation model of the three-dimensional map scene, unifying the coordinate transforms of the first panoramic image and the three-dimensional map scene, and projecting the first panoramic image into the three-dimensional map scene by texture projection, thereby realizing the fusion of the video image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811505248.8A CN109561295A (en) | 2018-12-10 | 2018-12-10 | Mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811505248.8A CN109561295A (en) | 2018-12-10 | 2018-12-10 | Mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS
Publications (1)
Publication Number | Publication Date |
---|---|
CN109561295A true CN109561295A (en) | 2019-04-02 |
Family
ID=65869760
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811505248.8A Pending CN109561295A (en) | Mixed reality three-dimensional dynamic spatio-temporal visualization system and method based on AR+3DGIS
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109561295A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110879964A (en) * | 2019-10-08 | 2020-03-13 | 北京智汇云舟科技有限公司 | Large scene density analysis system and method based on three-dimensional geographic information |
CN111562547A (en) * | 2020-07-13 | 2020-08-21 | 中铁第一勘察设计院集团有限公司 | 3D visualization method and system for monitoring element |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103595974A (en) * | 2013-12-01 | 2014-02-19 | 北京航空航天大学深圳研究院 | Video geographic information system and method for urban areas |
CN103716586A (en) * | 2013-12-12 | 2014-04-09 | 中国科学院深圳先进技术研究院 | Monitoring video fusion system and monitoring video fusion method based on three-dimension space scene |
CN103795976A (en) * | 2013-12-30 | 2014-05-14 | 北京正安融翰技术有限公司 | Full space-time three-dimensional visualization method |
CN106373148A (en) * | 2016-08-31 | 2017-02-01 | 中国科学院遥感与数字地球研究所 | Equipment and method for realizing registration and fusion of multipath video images to three-dimensional digital earth system |
CN106791613A (en) * | 2016-11-30 | 2017-05-31 | 江苏省邮电规划设计院有限责任公司 | A kind of intelligent monitor system being combined based on 3DGIS and video |
Events
Date | Description |
---|---|
2018-12-10 | Application CN201811505248.8A filed in China (CN); published as CN109561295A; status Pending |
Non-Patent Citations (1)
Title |
---|
Ma Yuanye et al., "Design and Implementation of a 3DGIS and Multi-Video Fusion ***", Computer Applications and Software * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103607568B (en) | Stereo street scene video projection method and system | |
CN106294474B (en) | Show processing method, the apparatus and system of data | |
CN103795976B (en) | A kind of full-time empty 3 d visualization method | |
CN109068103A (en) | Dynamic video space-time virtual reality fusion method and system based on three-dimensional geographic information | |
WO2018011944A1 (en) | Crowd monitoring device and crowd monitoring system | |
CN106204595A (en) | A kind of airdrome scene three-dimensional panorama based on binocular camera monitors method | |
CN114399606A (en) | Interactive display system, method and equipment based on stereoscopic visualization | |
CN110060230B (en) | Three-dimensional scene analysis method, device, medium and equipment | |
CN109561295A (en) | A kind of mixed reality Three-Dimensional Dynamic space-time visible system and method based on AR+3DGIS | |
CN107438152A (en) | A kind of motion cameras is to panorama target fast positioning method for catching and system | |
Aykut et al. | Realtime 3D 360-degree telepresence with deep-learning-based head-motion prediction | |
CN112015170A (en) | Moving object detection and intelligent driving control method, device, medium and equipment | |
CN114442805A (en) | Monitoring scene display method and system, electronic equipment and storage medium | |
CN114358112A (en) | Video fusion method, computer program product, client and storage medium | |
US20140098138A1 (en) | Method and system for augmented reality based smart classroom environment | |
KR101383997B1 (en) | Real-time video merging method and system, visual surveillance system and virtual visual tour system using the real-time video merging | |
CN107371011B (en) | The method that wide angle picture is converted into map projection's image and perspective projection image | |
Chamberlain et al. | A distributed robotic vision service | |
CN105224570B (en) | A kind of display methods and system of point of interest | |
CN113507599B (en) | Education cloud service platform based on big data analysis | |
Psarras et al. | Visual saliency in navigation: Modelling navigational behaviour using saliency and depth analysis | |
CN107426561A (en) | The virtual reality live broadcasting method and device of a kind of 3D360 degree | |
CN110267087B (en) | Dynamic label adding method, device and system | |
Song et al. | Systems, control models, and codec for collaborative observation of remote environments with an autonomous networked robotic camera | |
CN113179496A (en) | Video analysis framework based on MEC and indoor positioning system under framework |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190402 |