CN110990506A - Multi-dimensional synthesis command system integrating AR technology - Google Patents

Multi-dimensional synthesis command system integrating AR technology

Info

Publication number
CN110990506A
Authority
CN
China
Prior art keywords
module
display
dimensional
mixed reality
technology
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201911162537.7A
Other languages
Chinese (zh)
Inventor
苏东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Wisdom Daoshu Shanghai Technology Co Ltd
Original Assignee
New Wisdom Daoshu Shanghai Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Wisdom Daoshu Shanghai Technology Co Ltd filed Critical New Wisdom Daoshu Shanghai Technology Co Ltd
Priority to CN201911162537.7A
Publication of CN110990506A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval of structured data, e.g. relational data
    • G06F16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/26: Government or public services

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Tourism & Hospitality (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Marketing (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a multi-dimensional synthesis command system fusing AR technology, which at least comprises: a display module, which displays the command system's two-dimensional map in two-dimensional form and plots command elements as points on the map; a mixed reality module, which uses the tracking registration technique of AR technology to form a virtual scene corresponding to the real scene of the real environment, and uses virtual-real fusion display technology to form a composite image for display; an operating system, which controls the operation of the display module and the mixed reality module; and a communication module, which carries the communication between the operating system and the display module, between the display module and the mixed reality module, and between the operating system and the mixed reality module. The scheme uses AR (augmented reality) technology to form a multi-dimensional synthesis command system: the screen is linked with three-dimensional information in free space, yielding a command system that is closer to actual combat and restores a truly multi-dimensional synthesis command experience of practical significance.

Description

Multi-dimensional synthesis command system integrating AR technology
Technical Field
The invention relates to the technical field of information, in particular to a multidimensional synthesis command system fused with AR technology.
Background
At present, synthesis command centers in industries such as public security, traffic police and road administration across the country are undergoing a new round of information-system transformation and upgrading, with the aim of creating a new type of synthesis command center that integrates situation awareness and command, i.e. the integration of information, command and action dispatch. Specifically, this "situation-command integration" service reform embodies five characteristics: online, rapid, flat, precise and standardized. The "situation-command linkage" center is the "brain", and the synthetic combat action teams are undoubtedly the "trunk and limbs" responsible for pursuit and action: the brain turns investigated clues into efficient, rapid commands, and the limbs must act as quickly and deftly as arms and fingers. The synthesis command system is the information system that serves this "situation-command integration".
Although current synthesis command systems integrate temporal and spatial GIS map data as basic data support, they are still screen-bound: commands are synthesized on a flat large screen, displayed in 2D/2.5D on a screen carrier. They lack presence, the feeling of actually commanding on the spot, and they lack the capability to drill in a simulated real environment.
AR (Augmented Reality) technology superimposes three-dimensional virtual objects and scenes onto real space to form application scenes that blend the virtual and the real. A new multi-dimensional synthesis command system built by fusing AR technology can not only perform the functions of a traditional synthesis command system, but also bring presence to command decisions and drills, making synthetic command closer to reality and giving the system actual-combat characteristics.
Accordingly, those skilled in the art have sought to develop a multi-dimensional synthesis command system fusing AR technology to solve the above problems.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a multi-dimensional synthesis command system fusing AR technology that enhances the commander's sense of presence and the realism of synthetic command, thereby solving the problems described in the background section.
To solve the above problems, the present invention provides a multi-dimensional synthesis command system fusing AR technology, which at least includes:
a display module, which displays the command system's two-dimensional map in two-dimensional form and plots command elements as points on the map;
a mixed reality module, which uses the tracking registration technique of AR technology to form a virtual scene corresponding to the real scene of the real environment, and uses virtual-real fusion display technology to form a composite image for display;
an operating system, the general control center of the synthesis command system, which controls the operation of the display module and the mixed reality module;
a communication module, which carries the communication between the operating system and the display module, between the display module and the mixed reality module, and between the operating system and the mixed reality module.
Further, the two-dimensional map on the display module has at least a zoom function, a move function and a click-to-highlight function.
Further, the mixed reality module has a data acquisition module, a data processing module and an imaging display module. The data acquisition module acquires spatial position data of the real scene of the real environment; the data processing module analyzes the acquired data with the tracking registration technique to obtain the relative position of the virtual and real scenes, reconstructs the three-dimensional virtual scene, and performs virtual-real fusion calculation; and the imaging display module displays the reconstructed virtual scene in three dimensions.
Further, the data processing module in the mixed reality module includes at least a recognition interaction module, which performs data updating and human-computer interaction: it captures the user's actions and applies the corresponding transformation to the virtual scene.
Further, when reconstructing the virtual scene, the data processing module constructs the coordinate system from the placement surface of the three-dimensional virtual scene on the command center ground.
Further, interaction between the operating system and the display module is performed with a mouse: clicking the operating system interface with the mouse selects the corresponding two-dimensional map position on the display module.
Further, the operating system, the display module and the mixed reality module are linked: when the operating system selects an area on the display module with the mouse, the selected area is highlighted, the display module communicates with the mixed reality module through the communication module, and the mixed reality module acquires the real scene of the highlighted area and constructs the corresponding virtual scene.
Further, the command elements of the highlighted area on the display module are correspondingly displayed in the mixed reality module.
By implementing the multi-dimensional synthesis command system fusing the AR technology, provided by the invention, the following technical effects are achieved:
according to the technical scheme, the multi-dimensional synthesis command system is formed by utilizing the AR augmented reality technology, the defect that commands are synthesized on a plane only depending on a screen is overcome, the screen is combined with free space three-dimensional information linkage, people, three-dimensional virtual scenes and a large screen display system are organically integrated, the multi-dimensional synthesis command system which is closer to the actual combat is formed, and the real multi-dimensional synthesis command system with the actual combat significance is restored.
Drawings
The conception, specific structure and technical effects of the present invention will be further described below with reference to the accompanying drawings, so that the objects, features and effects of the invention can be fully understood.
FIG. 1 is a block diagram of the command system's modules in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a practical application of the command system in an embodiment of the present invention.
In the figure:
1. synthesis command center; 10. command center ground; 11. virtual imaging area;
2. operating system;
3. display screen/display module; 30. two-dimensional map; 31. highlight region;
4. mixed reality module; 40. virtual scene; 41. data acquisition module; 42. data processing module; 43. imaging display module; 44. recognition interaction module;
5. communication module;
6. commander.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to those embodiments. The described embodiments are obviously only some of the embodiments of the invention, not all of them; all other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
The technical solution of the present invention will be described in detail with specific embodiments.
Fig. 1 is a block diagram of a multidimensional synthesis command system incorporating AR technology, and the command system at least includes:
the display module 3, which displays the two-dimensional map 30 of the command system in two-dimensional form and plots command elements as points on the map; the two-dimensional map 30 on the display module 3 has at least a zoom function, a move function and a click-to-highlight function.
The mixed reality module 4 at least comprises a data acquisition module 41, a data processing module 42 and an imaging display module 43. The data acquisition module 41 acquires spatial position data of the real scene of the real environment; the data processing module 42 analyzes the acquired data with the tracking registration technique to obtain the relative position of the virtual scene 40 and the real scene, reconstructs the three-dimensional virtual scene 40, and performs virtual-real fusion calculation; the imaging display module 43 displays the virtual scene 40 reconstructed by the data processing module 42 in three dimensions. When reconstructing the virtual scene 40, the data processing module 42 constructs the coordinate system from the placement surface of the three-dimensional virtual scene 40 on the command center ground.
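As a minimal sketch of the idea that the data processing module 42 anchors the reconstructed scene to the command center ground (all class and field names below are illustrative assumptions, not from the patent), the placement surface can serve as the origin plane of the scene coordinate system:

```python
from dataclasses import dataclass


@dataclass
class Sample:
    """One spatial measurement of the real scene (hypothetical schema)."""
    x: float
    y: float
    z: float


class DataProcessingModule:
    """Toy stand-in for module 42: expresses acquired points relative to
    the placement surface (the command center ground), so reconstructed
    virtual geometry stands on the ground plane."""

    def __init__(self, ground_height: float = 0.0):
        self.ground_height = ground_height  # z of the placement surface

    def to_scene_coords(self, s: Sample) -> tuple:
        # Shift the vertical axis so z = 0 is the command center ground.
        return (s.x, s.y, s.z - self.ground_height)
```

With the ground as the shared origin, every module that renders or updates the virtual scene 40 agrees on where "standing on the floor" is.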
The data processing module 42 in the mixed reality module 4 includes at least a recognition interaction module 44, which performs data updating and human-computer interaction: it captures the user's actions and applies the corresponding transformation to the virtual scene 40.
The operating system, the general control center of the synthesis command system, controls the operation of the display module 3 and the mixed reality module 4. Interaction between the operating system and the display module 3 is performed with a mouse: clicking the operating system interface selects the corresponding position on the two-dimensional map 30 of the display module 3.
The communication module 5 carries the communication between the operating system and the display module 3, between the display module 3 and the mixed reality module 4, and between the operating system and the mixed reality module 4.
The operating system, the display module 3 and the mixed reality module 4 are linked: when the operating system selects an area on the display module 3 with the mouse, the selected area is highlighted, the display module 3 communicates with the mixed reality module 4 through the communication module 5, and the mixed reality module 4 acquires the real scene of the highlighted area and constructs the corresponding virtual scene 40.
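The linkage chain described here (mouse selection, highlight, notification over the communication module, scene construction) can be sketched as a tiny publish/subscribe bus. Class and topic names are assumptions for illustration only, not the patent's design:

```python
class CommunicationModule:
    """Toy stand-in for module 5: a minimal message bus linking the
    operating system, display module and mixed reality module."""

    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers.get(topic, []):
            handler(payload)


class DisplayModule:
    """Highlights the selected area, then notifies the MR module."""

    def __init__(self, bus):
        self.bus = bus
        self.highlight = None
        bus.subscribe("select_area", self.on_select)

    def on_select(self, area):
        self.highlight = area                       # show highlight region
        self.bus.publish("area_highlighted", area)  # notify MR module


class MixedRealityModule:
    """Builds a virtual scene for whatever area gets highlighted."""

    def __init__(self, bus):
        self.scene = None
        bus.subscribe("area_highlighted", self.build_scene)

    def build_scene(self, area):
        self.scene = f"virtual scene for {area}"    # reconstruct the scene
```

Wiring the three objects to one bus and publishing a `select_area` event then drives the whole chain: the display highlights the area and the mixed reality module constructs the matching scene.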
In practical application, the command system is mainly deployed in a synthesis command center 1 for public security, traffic police, road administration and the like. Typically the synthesis command center 1 has a three-dimensional space as shown in fig. 2, with at least a command center ground 10 and at least one unbroken wall. The three-dimensional virtual scene 40 is formed on the command center ground 10, which serves as its construction surface; the free space above the ground 10 is the virtual imaging area 11 in which the three-dimensional virtual scene 40 is displayed. The display module 3 is typically a large screen, i.e. display screen 3, mounted on the unbroken wall of the command center.
The display screen 3 communicates with the operating system through the communication module 5: the operating system turns the display screen 3 on, and the operating interface is mirrored onto the screen at enlarged scale. As a command center for public security, traffic police, road administration and the like, the two-dimensional map 30 is displayed on the display screen 3, and command elements (such as alarm incidents, police forces, checkpoints and video points) are plotted as points on the two-dimensional map 30. The two-dimensional map 30 supports zooming, moving, and clicking key areas to highlight them. The commander 6 can zoom and move the two-dimensional map 30 with a pointing device such as a mouse, and select an area to emphasize on the two-dimensional map 30; the selected area is shown as highlight region 31.
The three-dimensional virtual scene 40 displayed on the command center floor 10 corresponds to the selected highlight region 31.
In actual operation, the data acquisition module 41 in the mixed reality module 4 generally acquires temporal and spatial information about a location with devices such as cameras and sensors: for example, a camera records image and video information of the current shooting area, and sensors acquire temperature and humidity information of the current acquisition area. The data acquisition module 41 can combine various device types according to actual requirements. Because these devices produce different kinds of information at different acquisition times, the collected data must be processed and integrated by the data processing module 42 into the data content needed to construct the virtual scene 40.
Analyzing the temporal and spatial information collected by the devices of the data acquisition module 41 and constructing the scene requires the tracking registration technique, i.e. three-dimensional registration: the three-dimensional virtual scene 40 is constructed according to the scene to be displayed, and changes as data is updated in real time. Three-dimensional registration is the core of AR technology: two-dimensional or three-dimensional objects in the real scene serve as markers, and the virtual information is aligned and matched with the real scene information, meaning that the position, size and motion path of a virtual object must match the real environment exactly, so that the virtual and the real merge into one.
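The registration step can be illustrated with the usual rigid-transform arithmetic: a virtual point defined relative to a tracked marker is mapped into real-scene coordinates by the marker's recovered rotation R and translation t, as p_r = R p_v + t. The numbers below are an arbitrary example, not from the patent:

```python
def register_point(R, t, p_v):
    """Map a marker-relative virtual point into real-scene coordinates.

    R   -- 3x3 rotation matrix (list of rows) recovered by tracking
    t   -- translation vector placing the marker in the real scene
    p_v -- virtual point expressed in the marker's own frame
    """
    return [
        sum(R[i][j] * p_v[j] for j in range(3)) + t[i]
        for i in range(3)
    ]


# Example pose: a 90-degree rotation about the z axis, with the marker
# located 2 m along the y axis of the real scene.
R90 = [[0, -1, 0],
       [1,  0, 0],
       [0,  0, 1]]
t = [0.0, 2.0, 0.0]
```

A point one unit along the marker's x axis lands one unit along the scene's y axis, offset by the marker's position; this is exactly the alignment that makes virtual objects appear fixed in the real environment.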
The three-dimensionally registered virtual scene 40 is displayed by the imaging display module 43 (for example an AR headset, or a smart mobile device with its camera, gyroscope, sensors and other accessories), which renders the composite image in the three-dimensional space above the ground of the synthesis command center 1 and presents it to the commander 6.
The commander 6 issues control signals through the devices of the recognition interaction module 44, for example interaction accessories on an AR headset or smart mobile device such as a microphone, eye tracker, infrared sensor, camera or other sensor. Gestures, for instance, can perform operations such as switching and zooming, triggering the corresponding human-computer interaction and information update and realizing augmented reality interaction.
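A sketch of how the recognition interaction module 44 might map recognized gestures to scene transformations; the gesture vocabulary, action names and state fields are purely illustrative assumptions:

```python
# Hypothetical gesture-to-action table for module 44.
GESTURE_ACTIONS = {
    "swipe": "switch_scene",
    "pinch_out": "zoom_in",
    "pinch_in": "zoom_out",
}


def handle_gesture(gesture, scene_state):
    """Apply the transformation a recognized gesture requests to a
    simple virtual-scene state dict ({'scene_id': int, 'scale': float})."""
    action = GESTURE_ACTIONS.get(gesture)
    if action == "switch_scene":
        scene_state["scene_id"] += 1    # advance to the next scene
    elif action == "zoom_in":
        scene_state["scale"] *= 1.25    # enlarge the virtual scene
    elif action == "zoom_out":
        scene_state["scale"] /= 1.25    # shrink the virtual scene
    # Unrecognized gestures leave the scene unchanged.
    return scene_state
```

In a real system the table would be populated from whatever the headset's or device's recognition pipeline reports; the point is only the dispatch from recognized action to scene transformation.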
Thus, when the commander 6 interacts with the display screen 3 through the mouse and lights up the highlight region 31 on the display screen 3, the devices and modules of the mixed reality module 4 load and display the three-dimensional virtual scene of that area in the space above the command center ground 10, and the command elements of the highlight region 31 on the two-dimensional map 30 are displayed simultaneously in the free space around the three-dimensional virtual scene. When the highlight region 31 on the display screen 3 changes, the three-dimensional virtual scene follows: for example, when switching to a key area and inspecting the details of a command element in that area, such as an alarm incident, the details appear at the same time on the two-dimensional map 30 of the display screen 3 and in the three-dimensional virtual scene of that key area. Conversely, when the commander 6 standing on the command center ground 10 interacts with the three-dimensional virtual scene through gestures, for example a wave of the hand to switch scenes, the two-dimensional map 30 on the display screen 3 is linked with the three-dimensional virtual scene.
It is to be understood that unless otherwise defined, technical or scientific terms used herein have the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any uses or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the present invention is not limited to the structures that have been described above and shown in the drawings, and that various modifications and changes can be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (8)

1. A multi-dimensional synthesis command system fusing AR technology, characterized by at least comprising:
a display module, which displays the command system's two-dimensional map in two-dimensional form and plots command elements as points on the map;
a mixed reality module, which uses the tracking registration technique of AR technology to form a virtual scene corresponding to the real scene of the real environment, and uses virtual-real fusion display technology to form a composite image for display;
an operating system, the general control center of the synthesis command system, which controls the operation of the display module and the mixed reality module;
a communication module, which carries the communication between the operating system and the display module, between the display module and the mixed reality module, and between the operating system and the mixed reality module.
2. The command system of claim 1, wherein the two-dimensional map on the display module has at least a zoom function, a move function and a click-to-highlight function.
3. The command system of claim 1, wherein the mixed reality module has a data acquisition module, a data processing module and an imaging display module, wherein the data acquisition module acquires spatial position data of the real scene of the real environment; the data processing module analyzes the data acquired by the data acquisition module with the tracking registration technique to obtain the relative position of the virtual and real scenes, reconstructs the three-dimensional virtual scene and performs virtual-real fusion calculation; and the imaging display module displays the virtual scene reconstructed by the data processing module in three dimensions.
4. The command system of claim 3, wherein the data processing module in the mixed reality module includes at least a recognition interaction module, which performs data updating and human-computer interaction and captures the user's actions to apply the corresponding transformation to the virtual scene.
5. The command system of claim 3, wherein during virtual scene reconstruction the data processing module constructs the coordinate system from the placement surface of the three-dimensional virtual scene on the command center ground.
6. The command system of claim 1, wherein interaction between the operating system and the display module is performed with a mouse, and clicking the operating system interface with the mouse selects the corresponding two-dimensional map position on the display module.
7. The command system of claim 2, wherein the operating system, the display module and the mixed reality module are linked: when the operating system selects an area on the display module with the mouse, the selected area is highlighted, the display module communicates with the mixed reality module through the communication module, and the mixed reality module acquires the real scene of the highlighted area and constructs the corresponding virtual scene.
8. The command system of claim 7, wherein the command elements of the highlighted area on the display module are correspondingly displayed in the mixed reality module.
CN201911162537.7A 2019-11-25 2019-11-25 Multi-dimensional synthesis command system integrating AR technology Withdrawn CN110990506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911162537.7A CN110990506A (en) 2019-11-25 2019-11-25 Multi-dimensional synthesis command system integrating AR technology


Publications (1)

Publication Number Publication Date
CN110990506A true CN110990506A (en) 2020-04-10

Family

ID=70086319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911162537.7A Withdrawn CN110990506A (en) 2019-11-25 2019-11-25 Multi-dimensional synthesis command system integrating AR technology

Country Status (1)

Country Link
CN (1) CN110990506A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140306996A1 (en) * 2013-04-15 2014-10-16 Tencent Technology (Shenzhen) Company Limited Method, device and storage medium for implementing augmented reality
CN106909215A (en) * 2016-12-29 2017-06-30 深圳市皓华网络通讯股份有限公司 Based on the fire-fighting operation three-dimensional visualization command system being accurately positioned with augmented reality
CN108230440A (en) * 2017-12-29 2018-06-29 杭州百子尖科技有限公司 Chemical industry whole process operating system and method based on virtual augmented reality


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Lei; Jiang Xiaodong: "Design of a combat demonstration *** based on augmented reality"

Similar Documents

Publication Publication Date Title
US11120628B2 (en) Systems and methods for augmented reality representations of networks
CN105760106B (en) A kind of smart home device exchange method and device
CN107045844B (en) A kind of landscape guide method based on augmented reality
King et al. ARVino-outdoor augmented reality visualisation of viticulture GIS data
US11272160B2 (en) Tracking a point of interest in a panoramic video
CN111696216B (en) Three-dimensional augmented reality panorama fusion method and system
CN109828658B (en) Man-machine co-fusion remote situation intelligent sensing system
CN106683197A (en) VR (virtual reality) and AR (augmented reality) technology fused building exhibition system and VR and AR technology fused building exhibition method
CN109561282B (en) Method and equipment for presenting ground action auxiliary information
CN103270540A (en) Tracking moving objects using a camera network
JP2012084146A (en) User device and method providing augmented reality (ar)
CN107479705A (en) A kind of command post's work compound goods electronic sand map system based on HoloLens
US20200143600A1 (en) Augmented reality method, system and terminal device of displaying and controlling virtual content via interaction device
CN108594999A (en) Control method and device for panoramic picture display systems
Baldauf et al. KIBITZER: a wearable system for eye-gaze-based mobile urban exploration
CN112711458A (en) Method and device for displaying prop resources in virtual scene
CN111222190A (en) Ancient building management system
CN109099902A (en) A kind of virtual reality panoramic navigation system based on Unity 3D
CN109656319B (en) Method and equipment for presenting ground action auxiliary information
CN113115015A (en) Multi-source information fusion visualization method and system
JPH07271546A (en) Image display control method
CN113253842A (en) Scene editing method and related device and equipment
DE102018106108A1 (en) INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING PROCESS AND PROGRAM
CN112182286B (en) Intelligent video management and control method based on three-dimensional live-action map
CN110990506A (en) Multi-dimensional synthesis command system integrating AR technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20200410)